Spark - Spark cluster setup

Contents
    Scala download and configuration
    Spark download
    Spark configuration
    Spark cluster startup
Scala
    Download
        http://www.scala-lang.org/

        cd /opt
        mkdir scala
        cp /home/hserver1/desktop/scala-2.12.2.tgz /opt/scala
        cd /opt/scala
        tar -xvf scala-2.12.2.tgz
    Configuration
        gedit /etc/profile
            export SCALA_HOME=/opt/scala/scala-2.12.2
        source /etc/profile
        Verify Scala
            scala -version
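    For scala -version to work from any shell, the Scala bin directory normally has to be on the PATH as well. A minimal /etc/profile sketch; the PATH line is an assumption, it is not shown in the original notes:

        # /etc/profile additions for Scala (PATH line assumed, not in the original notes)
        export SCALA_HOME=/opt/scala/scala-2.12.2
        export PATH=${SCALA_HOME}/bin:$PATH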

Spark
    Download
        http://spark.apache.org/downloads.html

        cd /opt
        mkdir spark
        Copy the downloaded tarball from the desktop into /opt/spark and extract it with tar -zxvf (concrete commands are sketched below).
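    Assuming the downloaded package on the desktop is spark-2.2.0-bin-hadoop2.7.tgz, matching the version used in the configuration below (the exact file name depends on what was downloaded), the copy-and-extract steps would look like:

        cd /opt/spark
        cp /home/hserver1/desktop/spark-2.2.0-bin-hadoop2.7.tgz /opt/spark   # file name assumed
        tar -zxvf spark-2.2.0-bin-hadoop2.7.tgz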

    Configuration

        gedit /etc/profile
            export SPARK_HOME=/opt/spark/spark-2.2.0-bin-hadoop2.7
            export PATH=${PATH}:${SPARK_HOME}/bin
        source /etc/profile
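        A quick sanity check after sourcing the profile (assuming the PATH export above) is to ask Spark for its version:

            source /etc/profile
            spark-submit --version    # should report Spark version 2.2.0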
    Configure the conf directory
        Create spark-env.sh
            cd /opt/spark/spark-2.2.0-bin-hadoop2.7/conf
            cp spark-env.sh.template spark-env.sh
            gedit spark-env.sh
                export SCALA_HOME=/opt/scala/scala-2.12.2
                export JAVA_HOME=/opt/jdk1.8.0_152
                export HADOOP_HOME=/opt/hadoop/hadoop-2.8.2
                export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
                export SPARK_HOME=/opt/spark/spark-2.2.0-bin-hadoop2.7
                export SPARK_MASTER_IP=hserver1
                export SPARK_EXECUTOR_MEMORY=1G
        Create slaves
            cd /opt/spark/spark-2.2.0-bin-hadoop2.7/conf
            cp slaves.template slaves
            gedit slaves
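        The slaves file lists the worker hostnames, one per line. The names below are assumed examples (the original notes only name the master, hserver1); use the actual hostnames of the worker machines:

            # contents of conf/slaves (worker hostnames are assumed examples)
            hserver2
            hserver3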

Reference: http://blog.csdn.net/pucao_cug/article/details/72353701

Startup

    Start the Hadoop cluster
        cd /opt/hadoop-2.8.2/bin
        ./hadoop namenode -format    (only needed on first setup; reformatting wipes HDFS metadata)
        cd /opt/hadoop-2.8.2/sbin
        ./start-all.sh
        http://169.254.254.11:50070  (HDFS NameNode web UI, port 50070)
        http://169.254.254.11:8088   (YARN ResourceManager web UI, port 8088)
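        One way to confirm that the Hadoop daemons actually started (an extra check, not part of the original notes) is to run jps on the master node:

            jps
            # on the master this should list processes such as NameNode,
            # SecondaryNameNode and ResourceManager; workers should show
            # DataNode and NodeManager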
    Start the Spark cluster
        cd /opt/spark/spark-2.2.0-bin-hadoop2.7/sbin
        ./start-all.sh
        http://169.254.254.11:8080/  (Spark Master web UI, port 8080)
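        To verify that the cluster can actually run jobs, one option is to submit the SparkPi example that ships with the distribution; the master URL below assumes the standalone master runs on hserver1 with the default port 7077:

            cd /opt/spark/spark-2.2.0-bin-hadoop2.7
            ./bin/spark-submit \
                --class org.apache.spark.examples.SparkPi \
                --master spark://hserver1:7077 \
                examples/jars/spark-examples_2.11-2.2.0.jar 10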
Install IDEA
    https://www.jetbrains.com/idea
    Extract the download and enter the extracted installation directory  /opt/idea/idea-IC
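    From that directory, the IDE is normally launched with the idea.sh script under bin (the exact directory name includes the build number of the downloaded release):

        cd /opt/idea/idea-IC*    # exact directory name depends on the downloaded build
        ./bin/idea.sh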

Reprinted from www.cnblogs.com/ycx95/p/9177231.html