[root@node001 bin]# spark-shell
Error message:
Caused by: org.apache.spark.sql.AnalysisException:
java.lang.RuntimeException: java.net.ConnectException: Call From
node001/192.168.100.201 to node001:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:
http://wiki.apache.org/hadoop/ConnectionRefused;
Caused by: java.lang.RuntimeException: java.net.ConnectException: Call From node001/192.168.100.201 to node001:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Call From node001/192.168.100.201 to node001:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Solution:
The Hadoop cluster was not running (port 8020 is the NameNode RPC port, so nothing was listening there).
Start the Hadoop cluster, then launch spark-shell again.
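A minimal sketch of the fix, assuming a standard Hadoop install with its sbin directory on the PATH and fs.defaultFS set to hdfs://node001:8020 (the hostname and port come from the error above; adjust for your cluster):

```shell
# Start HDFS (NameNode + DataNodes) and YARN
start-dfs.sh
start-yarn.sh

# Verify the NameNode JVM is up
jps | grep -i namenode

# Verify the NameNode RPC port 8020 is actually listening
# (on older systems, use: netstat -tlnp | grep 8020)
ss -tlnp | grep 8020

# Retry the shell once the NameNode is up
spark-shell
```

If jps shows no NameNode after start-dfs.sh, check the NameNode log under $HADOOP_HOME/logs before retrying.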
Why does this bug occur?
The error appeared when starting Spark in local mode.
Reason 1:
After configuring Spark on YARN cluster mode, the location of Hadoop's configuration files was specified, which ties Spark to Hadoop.
From then on, starting Spark makes it contact Hadoop (here, HDFS at node001:8020), so Hadoop must already be running.
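The Hadoop link described above is typically created in conf/spark-env.sh. A sketch, assuming a hypothetical install path of /opt/hadoop (substitute your own $HADOOP_HOME):

```shell
# conf/spark-env.sh
# Pointing Spark at Hadoop's config files (core-site.xml, hdfs-site.xml,
# yarn-site.xml). Once these are set, spark-shell resolves the default
# filesystem to hdfs://node001:8020 on startup and fails with
# ConnectionRefused if the NameNode is not running.
export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop   # hypothetical path
export YARN_CONF_DIR=/opt/hadoop/etc/hadoop
```

This is why the same spark-shell command that worked before the on-YARN configuration now requires the Hadoop cluster to be started first, even in local mode.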