Configuration to do before setting up Spark

Installing Spark
1. Download VMware Workstation

http://sw.bos.baidu.com/sw-search-sp/software/a08321b624453/VMware_workstation_full_12.5.2.exe

2. Install VMware. Use Ctrl+Alt to release the mouse and keyboard from the VM back to the host.

3. Download Ubuntu

http://old-releases.ubuntu.com/releases/14.04.1/ubuntu-14.04.1-server-amd64.iso

4. Install Ubuntu

5. Install a desktop environment for Ubuntu; see this blog post:

http://bdxnote.blog.163.com/blog/static/84442352013222112540421/
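If the linked post is unavailable, one common route on Ubuntu Server (an assumption; the original blog may use a different desktop) is to install the stock desktop package:

sudo apt-get update
sudo apt-get install ubuntu-desktop   # pulls in the full default desktop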

6. Install VMware Tools in the guest so the mouse moves between host and guest without Ctrl+Alt.
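Instead of the VMware Tools ISO, the open-vm-tools packages from the Ubuntu repositories (assuming your release ships them) provide the same integration:

sudo apt-get install open-vm-tools open-vm-tools-desktop   # guest utilities plus desktop integration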


7. Install SSH

Install the SSH server:

sudo apt-get install openssh-server

For details, see: http://jingyan.baidu.com/article/9c69d48fb9fd7b13c8024e6b.html
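Hadoop's and Spark's standalone start scripts log in over SSH, so it is worth setting up passwordless login to localhost now (a typical sequence; the key paths are the defaults and may differ on your machine):

ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa         # generate a key pair with an empty passphrase
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys  # authorize the new key on this machine
ssh localhost                                    # should now log in without a password prompt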

8. Install the JDK on Linux

wget --no-check-certificate --no-cookies --header "Cookie: oraclelicense=accept-securebackup-cookie" http://download.oracle.com/otn-pub/java/jdk/8u121-b13/e9e7ea248e2c4826b92b3f075a80e441/jdk-8u121-linux-x64.tar.gz
Then set the environment variables.
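A typical sequence, assuming the tarball above was downloaded to the current directory and using /usr/lib/jvm as the install location (adjust paths as needed):

sudo mkdir -p /usr/lib/jvm
sudo tar -zxvf jdk-8u121-linux-x64.tar.gz -C /usr/lib/jvm   # unpack the JDK

Append to ~/.bashrc:

export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_121
export PATH=$JAVA_HOME/bin:$PATH

Then reload the shell configuration and verify:

source ~/.bashrc
java -version   # should report 1.8.0_121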
9. Download Hadoop
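For example (a sketch; the mirror and the 2.7.3 version are assumptions, so pick a release that matches your Spark build):

wget https://archive.apache.org/dist/hadoop/common/hadoop-2.7.3/hadoop-2.7.3.tar.gz
tar -zxvf hadoop-2.7.3.tar.gz -C ~/   # unpack into the home directory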
10. Download Spark
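A matching download might look like this (the 2.1.0 version and the Hadoop 2.7 prebuilt package are assumptions):

wget https://archive.apache.org/dist/spark/spark-2.1.0/spark-2.1.0-bin-hadoop2.7.tgz
tar -zxvf spark-2.1.0-bin-hadoop2.7.tgz -C ~/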

11. Start Spark
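Assuming Spark was unpacked to ~/spark-2.1.0-bin-hadoop2.7 as in the sketch above, the quickest check is the local-mode shell; a one-machine standalone cluster can then be brought up with the bundled scripts:

cd ~/spark-2.1.0-bin-hadoop2.7
./bin/spark-shell                                # local mode, no cluster needed
./sbin/start-master.sh                           # standalone master (web UI on port 8080)
./sbin/start-slave.sh spark://$(hostname):7077   # attach a worker to the master
./bin/spark-shell --master spark://$(hostname):7077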

Reposted from: xiangkuifu-163-com.iteye.com/blog/2378602