Spark startup error

When starting Spark, the following error is thrown:
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "DBCP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.

The cause is that, when using Hive on Spark, the MySQL JDBC driver cannot be found on the classpath at startup.
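
The driver class name in the error comes from the Hive metastore connection settings. A typical hive-site.xml entry looks roughly like this (illustrative only; the exact values depend on your installation):

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>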
Solution:

1. Pass the MySQL driver jar when launching: spark-shell --master local[2] --jars $HIVE_HOME/lib/<mysql driver jar> (see the combined sketch after this list).

2. Add the following line to spark-env.sh:
export SPARK_CLASSPATH=$HIVE_HOME/lib/mysql-connector-java-5.1.17.jar
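
For concreteness, a minimal sketch of both options, assuming the driver jar under $HIVE_HOME/lib is mysql-connector-java-5.1.17.jar (adjust the file name to match your environment):

# Option 1: pass the driver jar at launch time
spark-shell --master local[2] --jars $HIVE_HOME/lib/mysql-connector-java-5.1.17.jar

# Option 2: add the driver to Spark's classpath in conf/spark-env.sh
export SPARK_CLASSPATH=$HIVE_HOME/lib/mysql-connector-java-5.1.17.jar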


Reproduced from blog.csdn.net/Lu_Xiao_Yue/article/details/86376734