Cluster Configuration
HBase (Hadoop Database) is a highly reliable, high-performance, column-oriented, scalable distributed storage system. With HBase you can build large-scale structured storage clusters on inexpensive commodity PC servers.
1. Basic Environment
JDK: 1.8.0_65 (1.6 or later required)
ZooKeeper: 3.4.10
Hadoop: 2.7.2
HBase: 2.0.0-alpha4
Number of hosts: 3 (at least 3 required, and the count must be odd, because of ZooKeeper's leader-election algorithm, which needs a majority)
Server list:
| Hostname | IP address    | JDK      | ZooKeeper | myid |
|----------|---------------|----------|-----------|------|
| Master   | 10.116.33.109 | 1.8.0_65 | server.1  | 1    |
| Slave1   | 10.27.185.72  | 1.8.0_65 | server.2  | 2    |
| Slave2   | 10.25.203.67  | 1.8.0_65 | server.3  | 3    |
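For a standalone ZooKeeper ensemble (section 2 below), each server additionally needs a `myid` file in its `dataDir` whose content matches its `server.N` entry in the table above. A minimal sketch (the `/tmp/zk-demo` path is an assumption for illustration; use the `dataDir` configured in your `zoo.cfg`):

```shell
# Assumed dataDir for illustration; replace with the dataDir from zoo.cfg.
ZK_DATA=/tmp/zk-demo
mkdir -p "$ZK_DATA"

# Master is server.1, so its myid contains 1 (use 2 on Slave1, 3 on Slave2).
echo 1 > "$ZK_DATA/myid"
cat "$ZK_DATA/myid"
```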
2. ZooKeeper and Hadoop Installation
3. HBase Installation (HBase-managed ZooKeeper)
Download and extract HBase
cd /data/spark/
wget "http://archive.apache.org/dist/hbase/2.0.0-alpha4/hbase-2.0.0-alpha4-bin.tar.gz"
tar zxvf hbase-2.0.0-alpha4-bin.tar.gz
Environment variables
vim ~/.bash_profile
export HBASE_HOME=/data/spark/hbase-2.0.0-alpha4
export PATH=$HBASE_HOME/bin:$PATH
export HBASE_CLASSPATH=/data/spark/hbase-2.0.0-alpha4/conf
Run the following so the environment variables take effect:
source ~/.bash_profile
Configure JAVA_HOME in hbase-env.sh:
export JAVA_HOME=/opt/jdk1.8.0_65
export HBASE_MANAGES_ZK=true   # use the ZooKeeper bundled with HBase
Modify the configuration file hbase-site.xml:
<configuration>
<property>
<name>hbase.rootdir</name>
<value>hdfs://Master:9000/hbase</value>
</property>
<property>
<name>hbase.cluster.distributed</name>
<value>true</value>
</property>
<property>
<name>hbase.zookeeper.quorum</name>
<value>Master,Slave1,Slave2</value>
</property>
<property>
<name>hbase.tmp.dir</name>
<value>/data/spark/hbase-2.0.0-alpha4/tmp</value>
</property>
<property>
<name>hbase.zookeeper.property.dataDir</name>
<value>/data/spark/hbase-2.0.0-alpha4/tmp/zookeeper</value>
</property>
<property>
<name>hbase.master.info.port</name>
<value>60010</value>
</property>
</configuration>
Note: the hbase.zookeeper.quorum setting must use IP addresses; using hostnames raises errors and conflicts with MapReduce.
Copy HBase to each of the other hosts:
scp -r /data/spark/hbase-2.0.0-alpha4 Slave1:/data/spark
scp -r /data/spark/hbase-2.0.0-alpha4 Slave2:/data/spark
Start HBase
Start the Hadoop and ZooKeeper clusters before starting HBase.
As with Hadoop, the HBase start command only needs to be executed on a single node:
$HADOOP_HOME/sbin/start-all.sh
cd /data/spark/hbase-2.0.0-alpha4/bin
./start-hbase.sh
Check that HBase started
Process check on a slave:
29541 NodeManager
30456 HQuorumPeer
29435 DataNode
30541 HRegionServer
30718 Jps
Process check on the master:
532 Jps
14278 NodeManager
18376 HMaster
18313 HQuorumPeer
14172 DataNode
18511 HRegionServer
HBase web UI
http://<Master IP>:60010/master-status
Note: when HBase manages ZooKeeper itself, the hosts file must contain the IP mapping for each hostname; otherwise errors such as the following occur:
zookeeper.ClientCnxn: Opening socket connection to server iZwz9evsidoafzcicmva9nZ/10.25.203.67:2181. Will not attempt to authenticate using SASL (unknown error)
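The error above is ZooKeeper failing to reach a server by its resolved name. A hosts-file sketch for the three machines in this cluster (append the same entries on every node; the cloud-internal hostname in the log, such as iZwz9evsidoafzcicmva9nZ, must also resolve to the right address):

```
10.116.33.109  Master
10.27.185.72   Slave1
10.25.203.67   Slave2
```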
Importing data
hdfs dfs -put test_hbase.csv /lw6/test
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator="," -Dimporttsv.columns=HBASE_ROW_KEY,f1 -Dimporttsv.bulk.output=/lw6/test/output.txt t1 /lw6/test/test_hbase.csv
The ImportTsv tool imports data into HBase by converting the import into a MapReduce job.
Note: if the import tasks never reach the Running state and the YARN logs report that the host cannot be found, it is usually because the machines' hostnames were never actually changed and only the node names were configured.
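Note also that with -Dimporttsv.bulk.output, ImportTsv does not write into the table directly: it only generates HFiles under the output directory, which must then be bulk-loaded in a second step. A sketch reusing the table name and path from the command above (in HBase 1.x this class lives in the org.apache.hadoop.hbase.mapreduce package; in 2.x it was moved to org.apache.hadoop.hbase.tool):

```shell
# Step 2 of the bulk load: adopt the HFiles generated by ImportTsv into table t1.
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles /lw6/test/output.txt t1
```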
Basic HBase commands
Reference: https://www.cnblogs.com/xiaolei2017/p/7245299.html
Like a relational database, HBase has DML and DDL operations, but the commands differ:
| Operation | Command |
|-----------|---------|
| Create a table | create 'table_name', 'family1', 'family2', ..., 'familyN' |
| Add a record | put 'table_name', 'row_key', 'family:column', 'value' |
| Get a record | get 'table_name', 'row_key' |
| Count the records in a table | count 'table_name' |
| Delete a record | delete 'table_name', 'row_key', 'family:column' |
| Drop a table | the table must be disabled before it can be dropped: first disable 'table_name', then drop 'table_name' |
| Scan all records | scan 'table_name' |
| Scan all data in one column of a table | scan 'table_name', {COLUMNS => 'family:column'} |
| Update a record | write the same cell again with put; the new value overwrites the old |
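Putting the commands above together, a minimal hbase shell session might look like this (the table name t2, family f1, and cell values are illustrative; this requires a running cluster):

```shell
# Feed a short script to the HBase shell via a heredoc.
hbase shell <<'EOF'
create 't2', 'f1'                  # table with one column family
put 't2', 'row1', 'f1:name', 'lw'  # insert a cell
get 't2', 'row1'                   # read the row back
count 't2'                         # row count
scan 't2'                          # full table scan
disable 't2'                       # a table must be disabled before drop
drop 't2'
EOF
```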