First, package the code below into a jar.
Method 1
Run the Maven package goal first, set a breakpoint, then execute the main method in Debug mode.
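If you package from the command line instead of IDEA's Maven tool window, the equivalent step is (assuming Maven is on the PATH and you run it from the project root):

mvn clean package

With the pom below, this produces target/hello-spark-1.0.jar, which is the jar path that setJars in the code points at.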
Adding the arguments in IDEA:
① Click Edit Configurations under the Run menu.
② On the Configuration tab, fill in the Program arguments field to pass arguments in from IDEA; separate them with spaces.
For example: file:///root/wc.txt (reads the local CentOS path /root/wc.txt) file:///root/wc00 (writes the output to /root/wc00).
If the cluster is started in Docker, fill in HDFS paths instead:
hdfs://192.168.1.101:9000/root/wc/input.txt hdfs://192.168.1.101:9000/root/wc00
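These two values arrive in main as args(0) (the input path) and args(1) (the output path). As a hedged addition that is not in the original code, you could fail fast at the top of main when the arguments are missing:

// Hypothetical guard; the original code assumes both arguments are present
if (args.length < 2) {
  System.err.println("Usage: WordCount <input path> <output path>")
  sys.exit(1)
}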
You may hit the following exception:
Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: user=Administrator, access=WRITE, inode="/root":root:supergroup:drwxr-xr-x
If so, set the accessing user in code to your Hadoop user name:
// Set the user for HDFS access; replace "bigdata" with your Hadoop user name
System.setProperty("user.name", "bigdata")
// Alternatively, see https://blog.csdn.net/xiaoshunzi111/article/details/52062640
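A minimal sketch of where this workaround sits, assuming the HDFS user should be root (substitute your own user name); the HADOOP_USER_NAME property is an alternative knob that Hadoop 2.x's UserGroupInformation also honors:

object WordCount {
  def main(args: Array[String]) {
    // Must be set before the SparkContext (and thus the Hadoop FileSystem) is created
    System.setProperty("user.name", "root")        // assumed user name; use your Hadoop user
    System.setProperty("HADOOP_USER_NAME", "root") // alternative read by UserGroupInformation
    // ... build the SparkConf and SparkContext as in the code below ...
  }
}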
The code to run is as follows:

package cn.itcast.spark.day1

import org.apache.spark.{SparkConf, SparkContext}

/**
  * Created by root on 2016/5/14.
  */
object WordCount {
  def main(args: Array[String]) {
    // Crucial: the SparkContext is the entry point to the Spark cluster
    val conf = new SparkConf().setAppName("WC")
      // Absolute path of the packaged jar
      .setJars(Array("D:\\ideaWorkbase\\HelloSpark\\target\\hello-spark-1.0.jar"))
      // Master URL: the Spark master's IP address and port
      .setMaster("spark://192.168.1.101:7077")
    val sc = new SparkContext(conf)
    // textFile produces two RDDs: HadoopRDD -> MapPartitionsRDD
    sc.textFile(args(0)).cache()
      // produces one RDD: MapPartitionsRDD
      .flatMap(_.split(" "))
      // produces one RDD: MapPartitionsRDD
      .map((_, 1))
      // produces one RDD: ShuffledRDD
      .reduceByKey(_ + _)
      // produces one RDD: MapPartitionsRDD
      .saveAsTextFile(args(1))
    sc.stop()
  }
}
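One caveat worth knowing: saveAsTextFile throws if the output path already exists. A minimal sketch of a cleanup step, assuming it is placed after the SparkContext is created and that deleting the old output is acceptable (the FileSystem calls are standard Hadoop API; this step is not in the original code):

// Destructive: removes any previous output so saveAsTextFile can write
val outputPath = new org.apache.hadoop.fs.Path(args(1))
val fs = org.apache.hadoop.fs.FileSystem.get(outputPath.toUri, sc.hadoopConfiguration)
if (fs.exists(outputPath)) fs.delete(outputPath, true) // true = recursive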
The pom.xml is as follows:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>cn.itcast.spark</groupId>
    <artifactId>hello-spark</artifactId>
    <version>1.0</version>

    <properties>
        <maven.compiler.source>1.7</maven.compiler.source>
        <maven.compiler.target>1.7</maven.compiler.target>
        <encoding>UTF-8</encoding>
        <scala.version>2.10.6</scala.version>
        <spark.version>1.6.1</spark.version>
        <hadoop.version>2.6.4</hadoop.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-flume_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.38</version>
        </dependency>
        <dependency>
            <groupId>redis.clients</groupId>
            <artifactId>jedis</artifactId>
            <version>2.8.1</version>
        </dependency>
    </dependencies>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
        <plugins>
            <!-- Compiles the Scala sources -->
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.2.2</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                            <args>
                                <arg>-make:transitive</arg>
                                <arg>-dependencyfile</arg>
                                <arg>${project.build.directory}/.scala_dependencies</arg>
                            </args>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <!-- Builds an uber jar at package time -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.4.3</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <filters>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <!-- Strip jar signature files, which would otherwise
                                             cause a SecurityException in the shaded jar -->
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>