CentOS 7 Spark Standalone Mode Setup

1. First set up Spark in local mode (see the previous post):

https://blog.csdn.net/starkpan/article/details/86437089

2. Go to the conf directory under the Spark installation directory and create spark-env.sh from the template:

cp spark-env.sh.template spark-env.sh

3. Edit spark-env.sh and add the following settings:

SPARK_MASTER_HOST=hadoopOne       # hostname the Master binds to
SPARK_WORKER_CORES=2              # CPU cores each Worker may hand out to executors
SPARK_WORKER_MEMORY=2g            # total memory each Worker may hand out to executors
SPARK_WORKER_INSTANCES=1          # number of Worker processes per machine
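
The values above are only examples and should match the machine's actual resources. If the Master's ports ever need to change, the corresponding optional settings (not part of the original steps; the values shown are Spark's defaults) also go in spark-env.sh:

SPARK_MASTER_PORT=7077            # Master RPC port (default)
SPARK_MASTER_WEBUI_PORT=8080      # Master web UI port (default)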

4. Go to the sbin directory and edit spark-config.sh:

vi spark-config.sh
# add JAVA_HOME
export JAVA_HOME=/home/hadoop/app/jdk1.8.0_181
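
A quick sanity check before starting the cluster, assuming the JDK path above, is to confirm that the java binary is actually there:

/home/hadoop/app/jdk1.8.0_181/bin/java -version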

5. Start Spark (still in the sbin directory):

./start-all.sh
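
Once the script finishes, the running daemons can be confirmed with jps from the JDK (assuming it is on the PATH); the output should include a Master process on hadoopOne and a Worker process on each worker node:

jps    # should list Master and Worker among the running JVMs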

6. Check the Master web UI:

http://ip:8080
  • URL: spark://hadoopOne:7077
  • REST URL: spark://hadoopOne:6066 (cluster mode)
  • Alive Workers: 1
  • Cores in use: 2 Total, 0 Used
  • Memory in use: 2.0 GB Total, 0.0 B Used
  • Applications: 0 Running, 0 Completed
  • Drivers: 0 Running, 0 Completed 
  • Status: ALIVE

If the above information is shown, the standalone cluster is up. To run 2 Workers on a single machine, set SPARK_WORKER_INSTANCES=2 in spark-env.sh and restart the cluster as shown below.
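
A restart from the sbin directory picks up the new Worker count (a minimal sketch):

./stop-all.sh
./start-all.sh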

7. Launch spark-shell against the cluster. With the daemons started via ./start-all.sh, spark-shell can connect to the Master:

spark-shell --master spark://hadoopOne:7077
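
To confirm that jobs actually run on the cluster rather than in local mode, the bundled SparkPi example can also be submitted to the same master URL. This is a sketch run from the Spark installation directory; the examples jar name depends on the Spark and Scala versions, hence the wildcard:

./bin/spark-submit \
  --master spark://hadoopOne:7077 \
  --class org.apache.spark.examples.SparkPi \
  examples/jars/spark-examples_*.jar 100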


Reposted from blog.csdn.net/starkpan/article/details/86438904