Converting a Spark DataFrame to JSON

First, create a DataFrame:

import org.apache.spark.sql.SparkSession
// Note: scala.util.parsing.json is deprecated and was removed from the
// standard library in Scala 2.13
import scala.util.parsing.json.{JSON, JSONArray, JSONObject}

val spark = SparkSession.builder().appName("TTyb").master("local").getOrCreate()
val testDataFrame = spark.createDataFrame(Seq(
  ("1", "asf"),
  ("2", "2143"),
  ("3", "rfds")
)).toDF("label", "col")
testDataFrame.show()
testDataFrame.show()

The output looks like:

+-----+----+
|label| col|
+-----+----+
|    1| asf|
|    2|2143|
|    3|rfds|
+-----+----+

Spark's built-in function

val sparkFunction = testDataFrame.toJSON.collectAsList.toString
println(sparkFunction)
// Result:
// [{"label":"1","col":"asf"}, {"label":"2","col":"2143"}, {"label":"3","col":"rfds"}]
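Note that collectAsList.toString relies on Java's List.toString for the brackets and the ", " separator. If you want control over the separators, you can join the collected row strings yourself with mkString. A minimal Spark-free sketch, assuming rows holds the strings that toJSON.collect() would return:

```scala
// Hypothetical row strings, as testDataFrame.toJSON.collect() would yield them
val rows = Array(
  """{"label":"1","col":"asf"}""",
  """{"label":"2","col":"2143"}""",
  """{"label":"3","col":"rfds"}"""
)
// mkString lets you choose the brackets and separator explicitly
val jsonArrayString = rows.mkString("[", ",", "]")
println(jsonArrayString)
// [{"label":"1","col":"asf"},{"label":"2","col":"2143"},{"label":"3","col":"rfds"}]
```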

List-style JSON

But if you want the first column to be the key and the second column the value, write it like this:

val df2Array: Array[(String, String)] = testDataFrame.collect().map { row => (row(0).toString, row(1).toString) }
val jsonData: Array[JSONObject] = df2Array.map { i =>
  new JSONObject(Map(i._1 -> i._2))
}
val jsonArray: JSONArray = new JSONArray(jsonData.toList)
println(jsonArray)
// [{"1" : "asf"}, {"2" : "2143"}, {"3" : "rfds"}]
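Since the scala.util.parsing.json classes are deprecated, the same {"key" : "value"} shape can also be produced with plain string interpolation. A sketch under the assumption that the keys and values need no JSON escaping:

```scala
// Column pairs as they would come from testDataFrame.collect()
val pairs = Array(("1", "asf"), ("2", "2143"), ("3", "rfds"))
// Build each {"key" : "value"} object by hand (no escaping performed)
val jsonObjects = pairs.map { case (k, v) => s"""{"$k" : "$v"}""" }
val jsonArrayString = jsonObjects.mkString("[", ", ", "]")
println(jsonArrayString)
// [{"1" : "asf"}, {"2" : "2143"}, {"3" : "rfds"}]
```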

Merging the JSONArray's key:value pairs

But notice that each key:value pair above sits in its own braces. How do we merge them into a single object? A bit of string processing does it:

val df2Array: Array[(String, String)] = testDataFrame.collect().map { row => (row(0).toString, row(1).toString) }
val jsonData: Array[JSONObject] = df2Array.map { i =>
  new JSONObject(Map(i._1 -> i._2))
}
val jsTest = jsonData.mkString(",").replace("},{",",")
println(jsTest)
// {"1" : "asf","2" : "2143","3" : "rfds"}
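One caveat: replace("},{", ",") would also fire if that character sequence ever appeared inside a value. A slightly more robust variant strips each object's own braces before joining, sketched here on plain strings:

```scala
// The per-pair objects, as jsonData.map(_.toString) would produce them
val objs = Array("""{"1" : "asf"}""", """{"2" : "2143"}""", """{"3" : "rfds"}""")
// Drop each object's outer braces, then wrap the joined body exactly once
val merged = objs.map(_.stripPrefix("{").stripSuffix("}")).mkString("{", ",", "}")
println(merged)
// {"1" : "asf","2" : "2143","3" : "rfds"}
```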

How do we turn this string into a Map so that values can be looked up by key? Just define a small function:

def regJson(json: Option[Any]): Map[String, Any] = json match {
  case Some(map: Map[String, Any] @unchecked) => map
  case _ => Map.empty // JSON.parseFull returns None on malformed input
}
println(regJson(JSON.parseFull(jsTest)))
// Map(1 -> asf, 2 -> 2143, 3 -> rfds)
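With the Map in hand, the lookup-by-key goal is straightforward. A small usage sketch, with `parsed` hard-coded here to the result above so it stays self-contained:

```scala
// `parsed` stands in for the Map that regJson(JSON.parseFull(jsTest)) returns
val parsed: Map[String, Any] = Map("1" -> "asf", "2" -> "2143", "3" -> "rfds")
println(parsed("2"))                  // direct lookup: 2143
println(parsed.get("9"))              // safe lookup: None instead of an exception
println(parsed.getOrElse("9", "n/a")) // lookup with a default: n/a
```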


Reposted from www.cnblogs.com/TTyb/p/12698417.html