Spark: [error] Converting a DataFrame to a Dataset fails

Error:(45, 63) Unable to find encoder for type stored in a Dataset.  Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._  Support for serializing other types will be added in future releases.
    val ds = spark.read.json("F:\\BigData\\employees.json").as[Employee]

 

Solution:

After creating the SparkSession, add the following import statement:

  import spark.implicits._
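A minimal end-to-end sketch of the fix (the file path is the one from the error above; the `Employee` fields `name` and `salary` are an assumption about the JSON schema). Note two details that commonly trip this up: the import refers to the `spark` instance, so it must come *after* the `SparkSession` is created, and the case class must be defined at the top level, not inside the method that calls `.as[Employee]`:

```scala
import org.apache.spark.sql.{Dataset, SparkSession}

// Define the case class at top level so Spark can derive an Encoder for it.
case class Employee(name: String, salary: Long)

object DatasetExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("DatasetExample")
      .master("local[*]")
      .getOrCreate()

    // Instance import: brings Encoder[Employee] into scope for .as[...]
    import spark.implicits._

    val ds: Dataset[Employee] = spark.read
      .json("F:\\BigData\\employees.json")
      .as[Employee]

    ds.show()
    spark.stop()
  }
}
```

The import is an instance member of `SparkSession`, which is why a file-top `import spark.implicits._` before the `val spark = ...` line cannot compile.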

 


Origin: blog.csdn.net/drl_blogs/article/details/93064635