Running spark-submit with arguments from a shell script

Read local files from multiple directories: the directories are iterated in a loop, and each one is passed to spark-submit as an argument:

#!/bin/bash

# Submit one Spark job per numbered input directory.
i=0
while [ "$i" -lt 10000 ]
do
    echo "i=$i"
    # The loop index $i selects both the input directory and the output path.
    spark-submit --class com.link.fblx.readFromPath \
        --driver-memory 20G --executor-memory 20G \
        --num-executors 1 --executor-cores 25 --total-executor-cores 25 \
        --jars jsoup-1.8.1.jar \
        /root/sparkdemo_jar.jar \
        file:///home/zl/data/$i/* \
        /test/zl/fblx_link/20190109/output$i
    ((i++))
done
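
If the number of directories or the base paths change between runs, they can themselves be passed to the shell script as positional arguments instead of being hard-coded. Below is a minimal sketch of that variant; the script name, defaults, and variable names are illustrative and not taken from the original post:

#!/bin/bash
# Hypothetical parameterized variant, e.g.:
#   ./submit_all.sh 10000 /home/zl/data /test/zl/fblx_link/20190109/output
count=${1:-10000}                                      # how many numbered directories to process
input_base=${2:-/home/zl/data}                         # local base directory holding the numbered subdirectories
output_base=${3:-/test/zl/fblx_link/20190109/output}   # output path prefix; the index is appended

i=0
while [ "$i" -lt "$count" ]
do
    # Quoting the input path keeps the shell from expanding the glob;
    # the pattern is passed through and expanded by Spark instead.
    spark-submit --class com.link.fblx.readFromPath \
        --driver-memory 20G --executor-memory 20G \
        --num-executors 1 --executor-cores 25 --total-executor-cores 25 \
        --jars jsoup-1.8.1.jar \
        /root/sparkdemo_jar.jar \
        "file://$input_base/$i/*" \
        "$output_base$i"
    ((i++))
done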

Reposted from www.cnblogs.com/tianziru/p/10245463.html