Notes on the pitfalls I hit running wordcount after setting up Hadoop.

1  Error Launching job : org.apache.hadoop.mapred.SafeModeException: JobTracker is in safe mode

Solution:
Leave safe mode:

 hadoop dfsadmin -safemode leave
2 /usr/bin/hadoop-config.sh: No such file or directory
vim /etc/profile
	export JAVA_HOME=/usr/local/src/jdk1.6.0_45       # absolute path to the JDK
	export HADOOP_HOME=/usr/local/hadoop-1.2.1        # absolute path to Hadoop
	export PATH=.:$HADOOP_HOME/bin:$JAVA_HOME/bin:$PATH
	export HADOOP_HOME_WARN_SUPPRESS=1                # suppresses "Warning: $HADOOP_HOME is deprecated."

After making the changes, run:

source /etc/profile
3 Input path does not exist: hdfs://192.168.37.10:9000/wordcount.txt

Solution: upload the input file to HDFS first:

hadoop fs -put wordcount.txt /  
4 Error: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201911200438_0004_m_000000

The submission script:

HADOOP_CMD="/usr/local/hadoop-1.2.1/bin/hadoop"
STREAM_JAR_PATH="/usr/local/hadoop-1.2.1/contrib/streaming/hadoop-streaming-1.2.1.jar"

INPUT_FILE_PATH_1="/wordcount.txt"
OUTPUT_PATH="/output"

$HADOOP_CMD fs -rmr -skipTrash $OUTPUT_PATH

# Step 1.
$HADOOP_CMD jar $STREAM_JAR_PATH \
    -input $INPUT_FILE_PATH_1 \
    -output $OUTPUT_PATH \
    -mapper "python3 map.py" \
    -reducer "python3 reducer.py" 
    -file ./map.py \
    -file ./reducer.py

Solution: check the script.
The -reducer "python3 reducer.py" line was missing its trailing "\", so the -file options below it were never passed to the job.
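The map.py and reducer.py scripts the job submits are not shown in the original post. A minimal wordcount pair for Hadoop Streaming might look like the sketch below (shown as two functions in one file for brevity; in practice they would be the two separate scripts). The tab-separated "word\t1" line format is an assumption based on the usual streaming convention:

```python
import sys
from itertools import groupby

def mapper(lines):
    """map.py: emit 'word\t1' for every whitespace-separated word on stdin."""
    for line in lines:
        for word in line.split():
            yield "%s\t1" % word

def reducer(lines):
    """reducer.py: sum the counts for each word.

    Hadoop delivers mapper output to the reducer sorted by key, so
    consecutive lines sharing a word can be grouped and summed.
    """
    pairs = (line.rstrip("\n").split("\t", 1) for line in lines)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield "%s\t%d" % (word, sum(int(count) for _, count in group))

if __name__ == "__main__":
    # Run as: python3 this_file.py map   (or: reduce)
    step = mapper if (len(sys.argv) < 2 or sys.argv[1] == "map") else reducer
    for out in step(sys.stdin):
        print(out)
```

Before submitting to the cluster, the pair can be smoke-tested locally with `cat wordcount.txt | python3 map.py | sort | python3 reducer.py`, where `sort` stands in for the shuffle phase; a broken script then fails with a readable traceback instead of "# of failed Map Tasks exceeded allowed limit".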


Reposted from blog.csdn.net/u014644167/article/details/103171945