
HBase Standalone Installation Tutorial (CentOS 6.5)
2019-02-09 01:51:21
Tags: HBase, standalone, installation, tutorial, CentOS 6.5

Note: Hadoop and HBase must be compatible versions; check the Hadoop/HBase support matrix before choosing releases.

1 Basic configuration

1.1 Disable the firewall

chkconfig --list | grep iptables

Confirm that all entries show off.

Otherwise run chkconfig iptables off, then:

service iptables stop

1.2 Map the host IP to its hostname in /etc/hosts. The hostname can be checked with hostname. (Note: if you use hostnames in the configuration files, this mapping is mandatory!)
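For example, assuming the IP 192.168.56.101 and the hostname wll used later in this tutorial (both placeholder values; substitute your own), the mapping line looks like this. The sketch writes to a temp file so it is safe to run anywhere; on the real machine, append the same line to /etc/hosts:

```shell
# Placeholder IP and hostname -- check yours with `hostname` and `ip addr`.
# Written to a temp file for illustration; append to /etc/hosts on the real host.
cat > /tmp/hosts.example <<'EOF'
192.168.56.101  wll
EOF
grep 'wll' /tmp/hosts.example
```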

1.3 Check SELinux with sestatus

If it does not show disabled, edit /etc/selinux/config with vi and set SELINUX=disabled.

1.4 Modify the hostname

vi /home/hadoop3.1.0/etc/hadoop/slaves

and change the entry to your own hostname.

2 JDK installation

2.1 Place jdk-8u144-linux-x64.rpm in the root directory

After uploading, the file has only read/write permission and no execute permission.

-> Run chmod 755 jdk-8u144-linux-x64.rpm to grant execute permission

-> Run rpm -ivh jdk-8u144-linux-x64.rpm to install

If the following error appears during installation:

warning: waiting for transaction lock on /var/lib/rpm/.rpm.lock

ʹÓÃÈçÏÂÃüÁîÀ´½øÐа²×°

sudo rpm -ivh jdk-8u144-linux-x64.rpm

If that still fails, force-remove the lock with the command below, then install again:

sudo rm /var/lib/rpm/.rpm.lock

2.2 After the JDK installs successfully, it is placed under /usr/java by default.

2.3 Configure the environment variables

vi /etc/profile

Append the following at the end of the file, then save and exit:

export JAVA_HOME=/usr/java/jdk1.8.0_144

export PATH=$JAVA_HOME/bin:$PATH

export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

ÔÙʹÓÃsource /etc/profileʹ»·¾³±äÁ¿ÉúЧ

ÊäÈëÈçÏÂÃüÁîÑéÖ¤»·¾³±äÁ¿ÊÇ·ñÉúЧ

echo $PATH

ÊäÈëÈçÏÂÃüÁî²é¿´jdk°æ±¾

java ¨Cversion

3 Hadoop installation

3.1 Extract hadoop-2.7.6.tar.gz

tar -zvxf hadoop-2.7.6.tar.gz

then move the extracted directory under /home.

3.2 Update /etc/profile, appending:

export HADOOP_HOME=/home/hadoop

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native

export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

export PATH=.:${JAVA_HOME}/bin:${HADOOP_HOME}/bin:$PATH

Then run source /etc/profile

3.3 Modify the configuration files

3.3.1 Before editing the configuration files, create a few directories under /root:

mkdir /root/hadoop

mkdir /root/hadoop/tmp

mkdir /root/hadoop/var

mkdir /root/hadoop/dfs

mkdir /root/hadoop/dfs/name

mkdir /root/hadoop/dfs/data
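The six mkdir calls above can be collapsed into a single command with mkdir -p, which creates parent directories as needed. A sketch, pointed at a temp path so it is safe to run anywhere (the tutorial's base is /root/hadoop):

```shell
# BASE is /root/hadoop in the tutorial; a temp path is used here for safety.
BASE=/tmp/hadoop-dirs-demo
mkdir -p "$BASE/tmp" "$BASE/var" "$BASE/dfs/name" "$BASE/dfs/data"
ls "$BASE"
```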

3.3.2 Modify core-site.xml

Add the following configuration:

<configuration>

<property>

<name>hadoop.tmp.dir</name>

<value>/root/hadoop/tmp</value>

<description>A base for other temporary directories.</description>

</property>

<property>

<name>fs.default.name</name>

<value>hdfs://wll:9000</value>

</property>

</configuration>
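Note: fs.default.name still works here but is a deprecated name on Hadoop 2.x and later; the current key is fs.defaultFS, so the same property can also be written as:

```xml
<property>
<name>fs.defaultFS</name>
<value>hdfs://wll:9000</value>
</property>
```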

3.3.3 Modify hadoop-env.sh

export JAVA_HOME=/usr/java/jdk1.8.0_144

3.3.4 Modify hdfs-site.xml

Add the following configuration:

<configuration>

<property>

<name>dfs.name.dir</name>

<value>/root/hadoop/dfs/name</value>

<description>Path on the local filesystem where the NameNode stores the namespace and transaction logs persistently.</description>

</property>

<property>

<name>dfs.data.dir</name>

<value>/root/hadoop/dfs/data</value>

<description>Comma-separated list of paths on the local filesystem of a DataNode where it should store its blocks.</description>

</property>

<property>

<name>dfs.replication</name>

<value>1</value>

</property>

</configuration>
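Likewise, dfs.name.dir and dfs.data.dir are the deprecated names; on Hadoop 2.x and later the same settings can be written with the current keys:

```xml
<property>
<name>dfs.namenode.name.dir</name>
<value>/root/hadoop/dfs/name</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>/root/hadoop/dfs/data</value>
</property>
```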

3.3.5 Modify mapred-site.xml

Add the following inside <configuration>:

<property>

<name>mapred.local.dir</name>

<value>/root/hadoop/var</value>

</property>

<property>

<name>mapreduce.framework.name</name>

<value>yarn</value>

</property>

With that, the Hadoop standalone configuration is complete.

4 ÃâÃܵǼ

ÏȲâÊÔÊÇ·ñÄÜÃâÃܵǼ£¬ÊäÈëÊÔÊÔlocalhost£¬³öÏÖÒÔÏÂÔò²»ÄÜÃâÃܵǼ

È¥µô /etc/ssh/sshd_configÖеÄÁ½ÐÐ×¢ÊÍ£¬ÈçûÓÐÔòÌí¼Ó

Éú³ÉÃØÔ¿£¬ÊäÈëÃüÁî ssh-keygen -t rsa È»ºóһ·»Ø³µ

¸´ÖƵ½¹«¹²ÃÜÔ¿ÖÐ

cp /root/.ssh/id_rsa.pub /root/.ssh/authorized_keys

²âÊÔÊÇ·ñ³É¹¦
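The key-generation step can also be scripted non-interactively. A sketch, using a throwaway directory so it can run anywhere (on the real host the files live in /root/.ssh, and authorized_keys should be chmod 600 or sshd may ignore it):

```shell
# -N '' sets an empty passphrase, replacing the "press Enter at every prompt" step.
# A temp dir stands in for /root/.ssh so the sketch is safe to run.
DIR=$(mktemp -d)
ssh-keygen -q -t rsa -N '' -f "$DIR/id_rsa"
cp "$DIR/id_rsa.pub" "$DIR/authorized_keys"
chmod 600 "$DIR/authorized_keys"
ls "$DIR"
```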

5 Starting Hadoop

The first Hadoop startup requires initialization. Switch to the /home/hadoop/bin directory and run:

hadoop namenode -format

(On newer releases this command is deprecated in favor of hdfs namenode -format.)

After initialization succeeds, a new current directory and several files appear under /root/hadoop/dfs/name.

Switch to the /home/hadoop/hadoop3.1.0/sbin directory and start HDFS and YARN:

start-dfs.sh

start-yarn.sh

Run jps: if the Hadoop daemons (NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager) are listed, the configuration is complete.

Error 1:

Run start-dfs.sh

If it fails at this step, run source ~/.bash_profile and retry.

Error 2:

If running start-dfs.sh reports an error caused by missing user definitions, edit both the start and stop scripts:

vi sbin/start-dfs.sh

vi sbin/stop-dfs.sh

and add at the top:

HDFS_DATANODE_USER=root

HADOOP_SECURE_DN_USER=hdfs

HDFS_NAMENODE_USER=root

HDFS_SECONDARYNAMENODE_USER=root

Error 3:

start-yarn.sh reports an error

This is also caused by missing user definitions, so edit both its start and stop scripts:

vi sbin/start-yarn.sh

vi sbin/stop-yarn.sh

and add at the top:

YARN_RESOURCEMANAGER_USER=root

HADOOP_SECURE_DN_USER=yarn

YARN_NODEMANAGER_USER=root

6 HBase installation

6.1 Extract hbase-2.0.1-bin.tar.gz

tar -zvxf hbase-2.0.1-bin.tar.gz

Then move it to /home/hbase:

mv hbase-2.0.1 /home/hbase

6.2 Edit the /etc/profile file

Note: extend the PATH set earlier; HBASE_HOME must be exported before PATH references it:

export HBASE_HOME=/home/hbase

export PATH=.:${JAVA_HOME}/bin:${HADOOP_HOME}/bin:${HBASE_HOME}/bin:$PATH

Run source /etc/profile to apply the change.

Check the HBase version with hbase version.

6.3 Modify the configuration files

Create new directories under /root:

mkdir /root/hbase

mkdir /root/hbase/tmp

mkdir /root/hbase/pids

Edit the hbase-site.xml file and add the following configuration:

<configuration>

<!-- Storage directory -->

<property>

<name>hbase.rootdir</name>

<value>hdfs://wll:9000/hbase</value>

</property>

<!-- ZooKeeper configuration. For a cluster, add the other host addresses as well -->

<property>

<name>hbase.zookeeper.quorum</name>

<value>wll</value>

</property>

<property>

<name>hbase.tmp.dir</name>

<value>/root/hbase/tmp</value>

</property>

</configuration>

7 Starting HBase

Start HBase with start-hbase.sh. The startup log may show the following SLF4J warning:

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/home/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/home/hadoop3.1.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

These two jars are duplicate SLF4J bindings; deleting either copy (for example the one under /home/hbase/lib) resolves the warning.

After startup:

Enter the command execution interface with hbase shell
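A minimal smoke test inside the shell, with arbitrary example table and column-family names:

```
create 'test', 'cf'
put 'test', 'row1', 'cf:a', 'value1'
scan 'test'
list
exit
```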

¡¾´ó ÖРС¡¿¡¾´òÓ¡¡¿ ¡¾·±Ìå¡¿¡¾Í¶¸å¡¿¡¾Êղء¿ ¡¾ÍƼö¡¿¡¾¾Ù±¨¡¿¡¾ÆÀÂÛ¡¿ ¡¾¹Ø±Õ¡¿ ¡¾·µ»Ø¶¥²¿¡¿
ÉÏһƪ£ºhbaseÆô¶¯³É¹¦£¬½øÈëhbase shell.. ÏÂһƪ£ºspark´Óhbase¶ÁȡдÈëÊý¾Ý

×îÐÂÎÄÕÂ

ÈÈÃÅÎÄÕÂ

Hot ÎÄÕÂ

Python

C ÓïÑÔ

C++»ù´¡

´óÊý¾Ý»ù´¡

linux±à³Ì»ù´¡

C/C++ÃæÊÔÌâÄ¿