$HADOOP_PREFIX/sbin/start-dfs.sh fails to start, stuck at "node2: starting datanode, logging to ......"

This problem had been bothering me for a long time. Today I carefully went through the steps again, following the guide Hadoop + Hive + Spark 完整安装攻略 (complete installation guide).

But it kept getting stuck here:

# Start HDFS
$HADOOP_PREFIX/sbin/start-dfs.sh

As shown in the figure, it prints:

node2: starting datanode, logging to /usr/local/hadoop/logs/hadoop-root-datanode-10.211.55.102.out

It would get stuck at this line and not move on; after that, the following appeared:

Permission denied (publickey,gssapi-keyex,gssapi-with-mic)

Following this page, "ssh use under Linux (based on personal experience summary)", I made the following modifications:

(5) ssh login shows: Permission denied (publickey,gssapi-with-mic)
Solution:
Modify the /etc/ssh/sshd_config file as follows (a sketch of the resulting file is below):
  change PermitRootLogin no to yes
  make sure PubkeyAuthentication yes is set
  uncomment the AuthorizedKeysFile .ssh/authorized_keys line (remove the leading #)
  change PasswordAuthentication no to yes
Finally, restart the sshd service (service sshd restart).
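For reference, after those edits the relevant directives in /etc/ssh/sshd_config would look roughly like this (a minimal sketch; these are standard sshd options, but check your distribution's defaults):

    # /etc/ssh/sshd_config (relevant lines only)
    PermitRootLogin yes
    PubkeyAuthentication yes
    AuthorizedKeysFile .ssh/authorized_keys
    PasswordAuthentication yes

    # apply the change
    service sshd restart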

That still had no effect. I finally found the answer here: "Help! hdfs not start properly!"

Sure enough, I found that I had set a passphrase at this step (when generating the SSH key):

That is why it failed to start! After redoing that step, it finally worked (a sketch of the fix is below).
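If you run into the same thing, the passphrase can be stripped from the existing key, or the key pair can simply be regenerated with an empty passphrase. A minimal sketch, assuming the key pair lives at ~/.ssh/id_rsa (adjust the path if yours differs):

    # option 1: strip the passphrase from the existing key (prompts for the old one)
    ssh-keygen -p -f ~/.ssh/id_rsa -N ""

    # option 2: regenerate the key pair with an empty passphrase
    # (this overwrites the old key, so redistribute the new id_rsa.pub afterwards)
    ssh-keygen -t rsa -f ~/.ssh/id_rsa -N ""

    # confirm that passwordless login to a node now works
    ssh root@node2 hostname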


I'm copying the content of the Hadoop + Hive + Spark 完整安装攻略 guide below, in case the page gets deleted later (will take it down if it infringes):

  • download files
    mkdir files
    sh download_file.sh

  • vagrant up
    This step takes a long time to come up.

  • Everything should run normally; don't panic if some red text appears.
  • copy ssh key 
    Log in to the root account on master and copy id_rsa.pub into authorized_keys.

Then push the public key to each node.

After returning to master, confirm that the connections work (a minimal sketch of these steps follows).
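A minimal sketch of the key-distribution steps above, assuming root on master with an existing key pair and nodes reachable as node1/node2 (the hostnames are assumptions; use your own):

    # on master: trust the master's own key
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    chmod 600 ~/.ssh/authorized_keys

    # push the public key to each node
    ssh-copy-id root@node1
    ssh-copy-id root@node2

    # back on master: confirm passwordless login works
    ssh root@node2 hostname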

HADOOP

  • Format namenode
    hdfs namenode -format

  • Start HDFS
    $HADOOP_PREFIX/sbin/start-dfs.sh

  • Verify
    hdfs dfsadmin -report
  • Start YARN
    $HADOOP_PREFIX/sbin/start-yarn.sh
  • netstat -nlopt
  • Start the Spark cluster
    /usr/local/spark/sbin/start-all.sh
  • netstat -nlopt  

      When the output shows that ports 7077 (Spark master) and 8080 (Spark UI) are both in use, Spark has started successfully (see the sketch below).
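As a minimal sketch of that check (7077/8080 are Spark defaults; 50070 and 8088 are assumed Hadoop 2.x web-UI defaults, so adjust if your configuration differs):

    # Spark: master RPC port and web UI
    netstat -nlopt | grep -E ':7077|:8080'

    # Hadoop (assumed 2.x defaults): NameNode web UI and ResourceManager web UI
    netstat -nlopt | grep -E ':50070|:8088'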

  • /usr/local/spark/sbin/start-all.sh
  • netstat -nlopt  
  • pyspark --master spark://10.211.55.100:7077
  • Running pyspark by itself defaults to executing on the local machine; the command above makes it run on the master.

Origin blog.csdn.net/yuxeaotao/article/details/90017378