When Spark connects to Hive, whether the job is submitted via spark-submit or run from spark-shell or spark-sql, the following error is reported:
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:116)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
There are two places to check: HDFS and the local filesystem.
Solutions:
- Change the permissions of the HDFS directory /tmp/hive: hadoop fs -chmod 777 /tmp/hive
- Or delete the /tmp/hive directory on both HDFS and the local filesystem, so it is recreated with fresh permissions:
hadoop fs -rm -r /tmp/hive
rm -rf /tmp/hive
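To see why the reported mode rwx------ (octal 700) blocks other users while 777 does not, here is a small local sketch using a hypothetical scratch directory (/tmp/hive_demo stands in for the real scratch dir; it does not touch HDFS):

```shell
# Mimic Hive's default scratch dir: owner-only access (rwx------ = 700).
mkdir -p /tmp/hive_demo
chmod 700 /tmp/hive_demo
stat -c '%a' /tmp/hive_demo   # shows 700: only the owner can read/write/enter

# Apply the same fix the HDFS command performs: world-writable (rwxrwxrwx = 777).
chmod 777 /tmp/hive_demo
stat -c '%a' /tmp/hive_demo   # shows 777: any user running Spark can now write here

# Clean up the demo directory.
rm -rf /tmp/hive_demo
```

The HDFS-side check is analogous: hadoop fs -ls /tmp will show the mode of /tmp/hive, and after the chmod it should read drwxrwxrwx.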