Spark 2.4.3 Deployment

Scala download: https://www.scala-lang.org/download/

Spark download: https://spark.apache.org/downloads.html

Installation steps

Install Scala

tar -zxvf scala-2.13.0.tgz
scp -r scala-2.13.0 node102:/root
scp -r scala-2.13.0 node103:/root

Note: Spark 2.4.x is built against Scala 2.11 (2.12 builds are also available); Scala 2.13 is not supported, so installing a 2.11 or 2.12 release avoids version conflicts.

Configure the environment variables (on every node): vi /etc/profile

#scala
export SCALA_HOME=/root/scala-2.13.0
export PATH=$SCALA_HOME/bin:$PATH

Apply the environment variables

source /etc/profile
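The profile fragment can be sanity-checked before it is relied on; the sketch below writes the same exports to a temporary file (a stand-in for /etc/profile), sources it, and confirms the variables took effect:

```shell
# Write the same exports to a temporary file (stand-in for /etc/profile)
cat > /tmp/scala_profile.sh <<'EOF'
export SCALA_HOME=/root/scala-2.13.0
export PATH=$SCALA_HOME/bin:$PATH
EOF

# Source it and confirm the variables took effect
source /tmp/scala_profile.sh
echo "SCALA_HOME=$SCALA_HOME"
echo "$PATH" | grep -q "$SCALA_HOME/bin" && echo "PATH updated"
```

After sourcing the real /etc/profile, `scala -version` should print the installed version on each node.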

Install Spark

Extract the archive
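The extraction step can be sketched as below; the tarball name is an assumption (the exact file depends on the Hadoop-prebuilt package downloaded from the link above), and the directory is renamed so it matches the SPARK_HOME used later:

```shell
# Assumed tarball name -- adjust to the file you actually downloaded
tar -zxvf spark-2.4.3-bin-hadoop2.7.tgz
mv spark-2.4.3-bin-hadoop2.7 /root/spark-2.4.3   # match SPARK_HOME below
```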

Configure the environment variables

#spark
export SPARK_HOME=/root/spark-2.4.3
export PATH=$PATH:$SPARK_HOME/bin

spark-env.sh configuration

export JAVA_HOME=/root/jdk1.8.0_211
export SCALA_HOME=/root/scala-2.13.0

export HADOOP_HOME=/root/hadoop-3.1.2
export HADOOP_CONF_DIR=/root/hadoop-3.1.2/etc/hadoop
export SPARK_WORKER_MEMORY=500m
export SPARK_WORKER_CORES=1
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=hadoop1:2181,hadoop2:2181,hadoop3:2181,hadoop4:2181 -Dspark.deploy.zookeeper.dir=/spark"

Note: spark.deploy.zookeeper.url must point at your actual ZooKeeper ensemble; the hadoop1..hadoop4 hostnames above do not match the node101..node103 cluster used elsewhere in this guide.

slaves configuration

node101
node102
node103
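Both spark-env.sh and slaves live in $SPARK_HOME/conf and are typically created from the templates that ship with Spark, e.g.:

```shell
cd /root/spark-2.4.3/conf
cp spark-env.sh.template spark-env.sh   # then append the exports above
cp slaves.template slaves               # then list the worker hostnames
```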

Sync the Spark directory to the other nodes

scp -r spark-2.4.3/ node102:/root
scp -r spark-2.4.3/ node103:/root

Start Spark

cd /root/spark-2.4.3/sbin
./start-all.sh

Invoke the script as ./start-all.sh so it is not shadowed by Hadoop's start-all.sh on the PATH.
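Once start-all.sh returns, the daemons can be verified with jps on each node (assuming the Master runs on node101 and Workers on node101-node103, per the slaves file above):

```shell
# On the master node: expect a Master process (plus a Worker,
# since node101 is also listed in slaves)
jps

# On the other nodes: expect a Worker process
ssh node102 jps
ssh node103 jps
```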

Check the Spark nodes in the Master web UI (by default port 8080 on the master node, e.g. http://node101:8080)


Reposted from www.cnblogs.com/wwbz/p/11286830.html