Why are the hostnames inconsistent across the nodes?
namenode: hadoop1
datanode: localhost
secondarynamenode: 0.0.0.0
[hadoop@localhost hadoop-2.7.2]$ sbin/start-dfs.sh
Starting namenodes on [hadoop1]
hadoop@hadoop1's password:
hadoop1: starting namenode, logging to /home/hadoop/hadoop-2.7.2/logs/hadoop-hadoop-namenode-hadoop1.hadoopdomain.out
hadoop@localhost's password:
localhost: starting datanode, logging to /home/hadoop/hadoop-2.7.2/logs/hadoop-hadoop-datanode-hadoop1.hadoopdomain.out
Starting secondary namenodes [0.0.0.0]
hadoop@0.0.0.0's password:
0.0.0.0: starting secondarynamenode, logging to /home/hadoop/hadoop-2.7.2/logs/hadoop-hadoop-secondarynamenode-hadoop1.hadoopdomain.out
It keeps prompting you for a password, which very likely means SSH is not set up properly. If ssh localhost logs you in without a password, SSH is configured correctly.
You need to append the namenode's public key (e.g. id_rsa.pub) to the authorized_keys file in the .ssh directory on every datanode.
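A minimal sketch of that key setup, assuming a single-node (pseudo-distributed) install, which your log suggests since all three daemons write their logs on hadoop1.hadoopdomain. Run as the hadoop user:

# Generate a key pair with an empty passphrase, if you do not have one yet
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
# Authorize the key for logins back into this same machine
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# Accept the host key once for each name start-dfs.sh connects to
ssh hadoop1 exit
ssh localhost exit
ssh 0.0.0.0 exit

After that, sbin/start-dfs.sh should start all daemons without any password prompts. On a real multi-node cluster, push the key to each datanode with ssh-copy-id hadoop@<datanode-host> (placeholder hostname) instead of appending it locally.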
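As for the hostname inconsistency itself: start-dfs.sh reads each name from a different source, so mixed names are normal with a default configuration. In Hadoop 2.x, the namenode host comes from fs.defaultFS in core-site.xml (hadoop1 in your case), the datanode list comes from etc/hadoop/slaves (which still contains the stock default, localhost), and 0.0.0.0 is the default bind address of dfs.namenode.secondary.http-address. A sketch of how to make them all read hadoop1, assuming the stock secondary namenode port 50090: put the hostname in etc/hadoop/slaves, one datanode per line:

hadoop1

and pin the secondary namenode in hdfs-site.xml:

<property>
  <name>dfs.namenode.secondary.http-address</name>
  <value>hadoop1:50090</value>
</property>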