[HDFS] Fixing the createBlockOutputStream exception when running hadoop fs -put

Uploading a file to HDFS fails with the following exception:

INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Got error, status message , ack with firstBadLink as 192.168.234.132:50010
        at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:142)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1497)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1400)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554)
18/12/21 19:36:53 INFO hdfs.DFSClient: Abandoning BP-1635234535-192.168.234.133-1545332720741:blk_1073741841_1017
18/12/21 19:36:53 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.234.132:50010,DS-94a817d7-e6a5-4a04-b234-8650abc6437c,DISK]
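The error typically surfaces on an ordinary upload such as the one below (file name and target path are hypothetical). The firstBadLink address in the message identifies the datanode in the write pipeline that the client could not reach, here 192.168.234.132:50010.

hadoop fs -put test.txt /input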

In my case, the cause was that the firewall on the node named in the exception (192.168.234.132) had not been turned off, so the client could not open a connection to its data-transfer port.
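A minimal check-and-disable sequence, assuming the datanode runs CentOS 7 with firewalld (on CentOS 6, service iptables stop is the equivalent); 50010 is the default datanode data-transfer port in Hadoop 2.x:

# on the datanode: check whether the firewall is active
systemctl status firewalld
# stop it for the current session
systemctl stop firewalld
# prevent it from starting again on reboot
systemctl disable firewalld

# from the client: verify the data-transfer port is now reachable
telnet 192.168.234.132 50010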

If the error persists after the firewall is off, check the processes on that machine to see whether a DataNode is actually running, as shown below.
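jps, which ships with the JDK, lists the running Java processes. On a healthy worker node you should see a DataNode entry; the PIDs and the NodeManager entry below are illustrative:

$ jps
2215 Jps
1998 NodeManager
1873 DataNode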

If it is not, run hadoop-daemon.sh start datanode to start the DataNode daemon on that machine.
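For example (the log path in the output depends on where Hadoop is installed; the one below is hypothetical):

$ hadoop-daemon.sh start datanode
starting datanode, logging to /opt/hadoop/logs/hadoop-root-datanode-node2.out
$ jps | grep DataNode
1873 DataNode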

Reposted from blog.csdn.net/hr786250678/article/details/85159681