Differences between Spark Streaming code written in IDEA and the same code in spark-shell, and how to resolve them

Spark Streaming in IDEA
The IDEA setup is covered in an earlier post: https://blog.csdn.net/qq_43688472/article/details/86499291
so I won't repeat those steps here.

[hadoop@hadoop001 bin]$ ./spark-shell

When you move Spark Streaming code written in IDEA over to spark-shell, this line
val ssc = new StreamingContext(conf, Seconds(10))
fails as soon as you type it in: there is no conf.

scala> conf
<console>:24: error: not found: value conf
       conf
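
The reason is simple: in the IDEA project, conf only exists because you build it yourself before creating the StreamingContext. A minimal sketch of that driver code (the object and app name are placeholders, just for illustration):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingApp {  // hypothetical name, for illustration only
  def main(args: Array[String]): Unit = {
    // In IDEA we construct the SparkConf ourselves -- this is the conf the shell lacks
    val conf = new SparkConf().setAppName("StreamingApp").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(10))
    // ... sources, transformations, outputs ...
    ssc.start()
    ssc.awaitTermination()
  }
}

spark-shell, by contrast, creates sc (a SparkContext) and spark (a SparkSession) for you at startup, and defines no variable named conf.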

With no conf to be found, what now? Replace conf with sc, the SparkContext the shell has already created:

val ssc = new StreamingContext(sc, Seconds(10))

Run it again:

scala> import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}

scala> val ssc = new StreamingContext(sc, Seconds(10))

ssc: org.apache.spark.streaming.StreamingContext = org.apache.spark.streaming.StreamingContext@6a7cbeed

That works because StreamingContext also has a constructor that takes an existing SparkContext directly. But what if you absolutely must pass a conf? The obvious attempt is sc.getConf:

scala> val ssc = new StreamingContext(sc.getConf, Seconds(10))
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:

The shell rejects this: only one SparkContext may run per JVM. Passing a SparkConf makes the StreamingContext constructor build a brand-new SparkContext internally, while the shell's sc is still alive. The details are in SPARK-2243:
https://issues.apache.org/jira/browse/SPARK-2243
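
If you really insist on starting from a conf, one hedged workaround (assuming Spark 1.4 or later) is SparkContext.getOrCreate, which hands back the already-running context instead of building a second one:

import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

// getOrCreate returns the shell's existing SparkContext when one is active,
// so no second context is created and the SPARK-2243 check passes
val ssc = new StreamingContext(SparkContext.getOrCreate(sc.getConf), Seconds(10))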

And if what you actually want from conf is a configuration value, you don't need your own conf variable at all:

Where a conf is in scope (e.g. the IDEA driver): conf.get("spark…")
From inside streaming code, go through the context: ssc.sparkContext.getConf.get("spark…")
The shell always gives you sc and spark to start from.
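
In spark-shell that looks like this (spark.app.name is only a stand-in key; substitute whichever spark.* setting you actually need):

// sc and spark are pre-created in the shell; reach the conf through them
val fromContext   = sc.getConf.get("spark.app.name")                  // outside streaming code
val fromStreaming = ssc.sparkContext.getConf.get("spark.app.name")    // inside, via the StreamingContext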

Going further

Every Spark Streaming program is built around four points:
the programming entry point
the data input point
the data transformation point
the data output point
Even if you rarely touch all of them, make sure you understand these four; a minimal sketch marking each one follows.
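
A minimal socket word count with the four points marked (assuming something like nc -lk 9999 feeding port 9999 on localhost; names are illustrative):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object FourPoints {  // hypothetical name, for illustration only
  def main(args: Array[String]): Unit = {
    // 1. Programming entry point: the StreamingContext
    val conf = new SparkConf().setAppName("FourPoints").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // 2. Data input point: a socket source (feed it with `nc -lk 9999`)
    val lines = ssc.socketTextStream("localhost", 9999)

    // 3. Data transformation point: the classic word count
    val counts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)

    // 4. Data output point: print each batch to the console
    counts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}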

Reposted from blog.csdn.net/qq_43688472/article/details/86606652