Spark01 - Installing a Standalone (Single-Machine) Environment

1. Verify the Hadoop environment

hadoop version

2. Verify the Scala environment
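
Assuming Scala is already installed and on the PATH, check it the same way as Hadoop above

scala -version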

3. Download Spark

http://spark.apache.org/downloads.html

Watch out for version compatibility: choose a build that matches your installed Hadoop, e.g. the spark-3.1.1-bin-hadoop2.7 package used below is pre-built for Hadoop 2.7.x.
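
The release can also be fetched from the command line; the URL below follows the usual Apache archive layout for this version and should be double-checked against the downloads page

wget https://archive.apache.org/dist/spark/spark-3.1.1/spark-3.1.1-bin-hadoop2.7.tgz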

4. Install Spark

Extract the tarball

tar -zxvf spark-3.1.1-bin-hadoop2.7.tgz
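
The SPARK_HOME set in the next step assumes the extracted directory lives under /Library; if you unpacked it somewhere else, either move it there (sudo is typically needed for /Library on macOS) or adjust the path accordingly

sudo mv spark-3.1.1-bin-hadoop2.7 /Library/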

5. Configure the Spark environment

Add the environment variables (the two export lines go inside ~/.bash_profile; vim and source are run in the terminal)

vim ~/.bash_profile
export SPARK_HOME=/Library/spark-3.1.1-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
source ~/.bash_profile
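
A quick sanity check that the profile change took effect; in a new terminal these should print the install path and the Spark version, respectively

echo $SPARK_HOME
spark-submit --version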

6. Verify the installation

Start the interactive shell

spark-shell

Then run a quick query against a local JSON file from the Scala prompt

spark.read.json("/Users/liyapeng/Desktop/json.json").where("age > 21").show()
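
The json.json path above is specific to the original author's machine, and spark.read.json expects one JSON object per line (JSON Lines) unless the multiLine option is enabled. Without such a file at hand, a roughly equivalent check can be run entirely inside spark-shell with a made-up inline dataset:

// spark-shell already provides the SparkSession as `spark` and imports
// spark.implicits._, which supplies the toDF conversion used below.
val people = Seq(("Andy", 30), ("Justin", 19), ("Michael", 35)).toDF("name", "age")
people.where("age > 21").show()   // should print only the rows with age > 21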
 

Reposted from blog.csdn.net/lucklilili/article/details/115415220