Fixing the "Unable to load native-hadoop library for your platform" warning when running spark-shell

1. After starting Spark, running bin/spark-shell prints the following warning:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
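Before changing anything, it can help to confirm that the native libraries are actually present in the Hadoop installation. The quick check below assumes $HADOOP_HOME is set and Hadoop's bin directory is on the PATH:

# list the native libraries shipped with Hadoop
ls $HADOOP_HOME/lib/native/
# ask Hadoop which native components it can actually load
hadoop checknative -a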
2. Solutions:

(1) Method 1: add Hadoop's native library directory to the Linux shared-library path through the system environment variables. Open /etc/profile, add the export line shown below, then reload the file:

vim /etc/profile
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native/:$LD_LIBRARY_PATH
source /etc/profile
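To verify that the change took effect in the current shell (a quick check; the export line above must have been saved into /etc/profile before sourcing it):

# the Hadoop native directory should now appear at the front of the path
echo $LD_LIBRARY_PATH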

(2) Method 2: set JAVA_LIBRARY_PATH in /etc/profile and reference it from conf/spark-env.sh. First, open /etc/profile, add the export line shown below, then reload the file:

vim /etc/profile
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native/
source /etc/profile

Then edit conf/spark-env.sh under the Spark installation directory (if the file does not exist yet, copy it from conf/spark-env.sh.template) and add the export line:

vim conf/spark-env.sh
export LD_LIBRARY_PATH=$JAVA_LIBRARY_PATH
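To recap the second method, the relevant lines end up split across two files, roughly as sketched below; the export in spark-env.sh relies on JAVA_LIBRARY_PATH already being exported system-wide by /etc/profile:

# /etc/profile
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native/

# conf/spark-env.sh (under the Spark installation directory)
export LD_LIBRARY_PATH=$JAVA_LIBRARY_PATH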

Restart spark-shell after applying either method, and the warning should no longer appear.
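As a final check (assuming spark-shell is launched from the Spark installation directory, as above):

bin/spark-shell
# the "WARN util.NativeCodeLoader: Unable to load native-hadoop library" line
# should no longer appear during startup; type :quit to exit the shell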



Reposted from blog.csdn.net/csdnliu123/article/details/105488895