Kylin startup on macOS: find: -printf: unknown primary or operator / spark jars not found

Copyright notice: this is an original article by the blogger 九师兄 (QQ group: spark source code 198279782, technical discussion welcome); reposting without the blogger's permission is prohibited. https://blog.csdn.net/qq_21383435/article/details/89924114
lcc@lcc apache-kylin-2.6.0-hbase1x$ bin/kylin.sh start
Retrieving hadoop conf dir...
KYLIN_HOME is set to /Users/lcc/soft/kylin/apache-kylin-2.6.0-hbase1x
19/05/07 16:26:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/05/07 16:26:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/05/07 16:26:30 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
netstat: n: unknown or uninstrumented protocol
Retrieving hive dependency...
Retrieving hbase dependency...
Retrieving hadoop conf dir...
Retrieving kafka dependency...
Retrieving Spark dependency...
find: -printf: unknown primary or operator
spark jars not found

Cause: the BSD find shipped with macOS does not support the -printf '%p:' primary (it is a GNU findutils extension). The failure comes from bin/find-spark-dependency.sh. The offending code is:

spark_dependency=`find -L $spark_home/jars -name '*.jar' ! -name '*slf4j*' ! -name '*calcite*' ! -name '*doc*' ! -name '*test*' ! -name '*sources*' -printf '%p:' | sed 's/:$//'`

This command finds the relevant jar files and joins their absolute paths into one string separated by colons.
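To see what that colon-joining idiom produces, here is a throwaway sketch. The temporary directory and jar names are made up for illustration; only the awk/sed pipeline mirrors the script.

```shell
# Sketch of the colon-joining idiom on a temporary directory
# (made-up jar names; not the real Spark jars directory).
dir=$(mktemp -d)
touch "$dir/a.jar" "$dir/b.jar" "$dir/b-sources.jar"

# awk appends ':' after every path; sed strips the trailing ':'.
# sort is added here only to make the output order predictable.
deps=$(find -L "$dir" -name '*.jar' ! -name '*sources*' \
    | sort | awk '{printf "%s:", $1}' | sed 's/:$//')
echo "$deps"

rm -rf "$dir"
```

The *-sources* jar is filtered out, and the remaining paths come back as a single classpath-style string such as /tmp/xxx/a.jar:/tmp/xxx/b.jar.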

Fix: replace the unsupported -printf with an awk pipeline. Edit the script:

vim  bin/find-spark-dependency.sh

spark_dependency=`find -L $spark_home/jars -name '*.jar' ! -name '*slf4j*' ! -name '*calcite*' ! -name '*doc*' ! -name '*test*' ! -name '*sources*' | awk '{printf "%s:", $1}'  | sed 's/:$//'`
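As an aside, an equivalent macOS-friendly variant joins the paths with tr instead of awk. This is only an alternative sketch, not what the script above uses; $spark_home is assumed to already point at your Spark installation.

```shell
# Alternative portable join: tr turns each newline into ':',
# then sed drops the trailing ':'.
# Assumption: $spark_home points at your Spark installation.
spark_dependency=$(find -L "$spark_home/jars" -name '*.jar' \
    ! -name '*slf4j*' ! -name '*calcite*' ! -name '*doc*' \
    ! -name '*test*' ! -name '*sources*' \
    | tr '\n' ':' | sed 's/:$//')
echo "$spark_dependency"
```

Both variants rely only on POSIX tools, so they behave the same under the BSD userland on macOS and under GNU coreutils on Linux.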

