Spark + Hive: no write permission at runtime

When Spark connects to Hive, whether a job is submitted with spark-submit or run interactively through spark-shell or spark-sql, it fails with the following error:

Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------

    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:116)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
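
The message itself names the cause. Before changing anything, it may help to confirm the current permissions on the HDFS scratch directory; a minimal check, assuming the hadoop CLI is on the PATH of a user with HDFS access:

    # List /tmp/hive itself rather than its contents (-d)
    hadoop fs -ls -d /tmp/hive
    # When the error occurs, the mode column shows drwx------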

This needs to be looked at from two angles: the permissions on the HDFS directory, and stale copies of the directory itself.

Solution:

  1. Change the permissions of the HDFS directory /tmp/hive: hadoop fs -chmod 777 /tmp/hive
  2. Alternatively, delete the /tmp/hive directory on both HDFS and the local filesystem (both are recreated on the next run; see the verification sketch after this list):

    hadoop fs -rm -r /tmp/hive
    rm -rf /tmp/hive
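
Either way, the scratch directory is recreated the next time a Spark session starts. A quick verification sketch, assuming the default scratch-dir setting (hive.exec.scratchdir=/tmp/hive) has not been changed:

    # After option 1, the HDFS directory should be world-writable
    hadoop fs -ls -d /tmp/hive    # expect the mode drwxrwxrwx
    # After option 2, both copies should be gone until the next session recreates them
    ls -ld /tmp/hive              # expect "No such file or directory"

If a non-default scratch directory is configured via hive.exec.scratchdir, apply the same commands to that path instead.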

Reprinted from blog.csdn.net/yangbosos/article/details/89455838