json - Exception in thread "main" org.apache.spark.SparkException: Task not serializable


I am getting the above error while running the code below. I can see there is a serialization problem, but I couldn't trace out exactly where. Can anyone explain it here? Thanks in advance.

    def checkForType(json: String): String = {
      val parsedJson = parse(json)
      val res = (parsedJson \\ "head" \\ "type").extract[String]
      res
    }

    val dstream = KafkaUtils.createStream(ssc, zkQuorum, group, Map("topic" -> 1)).map(_._2)
    val ptype = dstream.map(checkForType)
    ptype.map(rdd => {
      val pkt = rdd.toString()
      if (pkt.equals("p300")) {
        val t300 = dstream.map(par300)
        t300.print()
      } else if (pkt.equals("p30")) {
        val t30 = dstream.map(par30)
        t30.print()
      } else if (pkt.equals("p6")) {
        val t6 = dstream.map(par6)
        t6.print()
      }
    })

This happens when you pass an object into a transformation and that object is not serializable. Spark ships the closure of a transformation to the executors, so everything the closure captures, including the enclosing class, must be serializable. Note also that in your code you reference `dstream` inside the closure passed to `ptype.map`; a DStream itself is not serializable and cannot be used inside another DStream's transformation.
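One common fix is to move the parsing helper into a top-level object that extends `Serializable`, so the closure captures only that object rather than the enclosing non-serializable class. A minimal sketch (assuming json4s is the parsing library, as the `parse` / `\\` / `extract` calls in the question suggest; `JsonHelpers` is a hypothetical name):

```scala
import org.json4s._
import org.json4s.jackson.JsonMethods.parse

// Keeping the helper in a standalone serializable object avoids
// dragging the enclosing (possibly non-serializable) class into
// the task closure that Spark ships to executors.
object JsonHelpers extends Serializable {
  // json4s needs an implicit Formats in scope for extract[T]
  implicit val formats: Formats = DefaultFormats

  def checkForType(json: String): String =
    (parse(json) \\ "head" \\ "type").extract[String]
}

// In the streaming job, refer to the object's method directly:
// val ptype = dstream.map(JsonHelpers.checkForType)
```

The branching on the extracted type would then need to happen inside a single transformation (or via `foreachRDD`), rather than by mapping over `dstream` again from within another closure.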

I found an interesting post on this: https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-tips-and-tricks-sparkexception-task-not-serializable.html

Maybe that can help you solve the problem. Good luck!

