Spark Standalone Mode

Spark fundamentals: https://blog.csdn.net/weixin_45102492/article/details/104318250

Spark installation and Local mode: https://blog.csdn.net/weixin_45102492/article/details/104318738
Spark Yarn mode: https://blog.csdn.net/weixin_45102492/article/details/104319175

Standalone Mode

Build a Spark cluster consisting of a Master plus Slave (worker) nodes, with Spark running on the cluster itself.

Modify the spark-env.sh file

Comment out YARN_CONF_DIR=/opt/module/Hadoop/hadoop-2.7.7/etc/hadoop; that line was the change I made for Yarn mode.

# host on which the Master starts when Spark launches
#YARN_CONF_DIR=/opt/module/Hadoop/hadoop-2.7.7/etc/hadoop
SPARK_MASTER_HOST=node01
SPARK_MASTER_PORT=7077

Modify the slaves file

List the machines on which workers run. I only have one machine here, so there is only one worker.

# Add the following
node01
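On a multi-node cluster, the slaves file would instead list one worker hostname per line (node02 and node03 below are hypothetical hosts), and the same Spark installation directory would need to be present on each of those machines:

```
node01
node02
node03
```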

Start Spark from the Spark installation directory

[root@node01 spark-3.0.0-preview2-bin-hadoop2.7]# sbin/start-all.sh

If a "JAVA_HOME is not set" error appears, add a line like the following to the spark-config.sh file in the sbin directory:
export JAVA_HOME=XXXX

[root@node01 job]# echo $JAVA_HOME
/opt/module/Java/jdk1.8.0_212
[root@node01 spark-3.0.0-preview2-bin-hadoop2.7]# vi sbin/spark-config.sh
# add the following
export JAVA_HOME=/opt/module/Java/jdk1.8.0_212

Start Spark again, then submit a job

[root@node01 spark-3.0.0-preview2-bin-hadoop2.7]# sbin/start-all.sh
[root@node01 spark-3.0.0-preview2-bin-hadoop2.7]# bin/spark-submit \
--class org.apache.spark.examples.SparkPi \
--master spark://node01:7077 \
--executor-memory 1g \
--total-executor-cores 2 \
./examples/jars/spark-examples_2.12-3.0.0-preview2.jar \
100
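The trailing 100 is the number of slices (tasks) the work is split into, while --executor-memory and --total-executor-cores cap the resources the job may take from the cluster. SparkPi itself estimates π with a Monte Carlo method: each task throws random points into a square and counts how many land inside the inscribed circle. As a rough illustration of the computation only (plain Python, no Spark; the real example is org.apache.spark.examples.SparkPi in Scala), a sketch might look like:

```python
import random

def sample_partition(n: int, seed: int) -> int:
    """Count how many of n random points in [-1,1]^2 fall inside the unit circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n):
        x = rng.random() * 2 - 1
        y = rng.random() * 2 - 1
        if x * x + y * y <= 1:
            inside += 1
    return inside

def estimate_pi(partitions: int = 100, points_per_partition: int = 10_000) -> float:
    # Spark would run each partition as a task on an executor;
    # here we simply loop over the partitions sequentially.
    total = sum(sample_partition(points_per_partition, seed=i)
                for i in range(partitions))
    return 4.0 * total / (partitions * points_per_partition)

print(estimate_pi())  # roughly 3.14
```

In the real job, the 100 partitions are scheduled across the 2 executor cores requested above, and the driver sums the per-task counts.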

Starting the shell in Standalone mode

[root@node01 spark-3.0.0-preview2-bin-hadoop2.7]# bin/spark-shell --master spark://node01:7077 --executor-memory 1g --total-executor-cores 2
20/02/14 16:21:53 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
20/02/14 16:22:16 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
Spark context Web UI available at http://node01:4041
Spark context available as 'sc' (master = spark://node01:7077, app id = app-20200214162223-0001).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.0-preview2
      /_/
         
Using Scala version 2.12.10 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_212)
Type in expressions to have them evaluated.
Type :help for more information.

scala>


Reposted from blog.csdn.net/weixin_45102492/article/details/104319485