json - Exception in thread "main" org.apache.spark.SparkException: Task not serializable
I am getting the above error while running the code below. I can see there is a serialization problem, but I couldn't trace out exactly where it is. Can anyone explain it? Thanks in advance.
    def checkForType(json: String): String = {
      val parsedJson = parse(json)
      val res = (parsedJson \\ "head" \\ "type").extract[String]
      res
    }

    val dstream = KafkaUtils.createStream(ssc, zkQuorum, group, Map("topic" -> 1)).map(_._2)
    val ptype = dstream.map(checkForType)

    ptype.map(rdd => {
      val pkt = rdd.toString()
      if (pkt.equals("p300")) {
        val t300 = dstream.map(par300)
        t300.print()
      } else if (pkt.equals("p30")) {
        val t30 = dstream.map(par30)
        t30.print()
      } else if (pkt.equals("p6")) {
        val t6 = dstream.map(par6)
        t6.print()
      }
    })
This happens when an object that is not serializable is passed into a transformation. In your code, the closure you pass to ptype.map references the outer dstream; Spark has to serialize that closure to ship it to the executors, and a DStream (which holds a reference to the StreamingContext) is not serializable. Creating new DStream transformations like dstream.map(par300) inside another DStream's map is not supported anyway, so the branching has to happen on the message values themselves.
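One way around it is to do the type check and the per-type parsing inside a single transformation, so the closure only captures plain serializable functions and values. This is a minimal sketch, not a drop-in replacement for your pipeline: it reuses the ssc, zkQuorum, and group values from your snippet and assumes par300, par30, and par6 are ordinary String => String functions.

    import org.json4s._
    import org.json4s.jackson.JsonMethods.parse
    import org.apache.spark.streaming.kafka.KafkaUtils

    // Extract the message type from the JSON payload.
    def checkForType(json: String): String = {
      implicit val formats: Formats = DefaultFormats
      val parsedJson = parse(json)
      (parsedJson \\ "head" \\ "type").extract[String]
    }

    val messages = KafkaUtils.createStream(ssc, zkQuorum, group, Map("topic" -> 1)).map(_._2)

    // Branch on the type inside ONE map, so the closure never
    // references another DStream or the StreamingContext.
    val parsed = messages.map { pkt =>
      checkForType(pkt) match {
        case "p300" => par300(pkt)
        case "p30"  => par30(pkt)
        case "p6"   => par6(pkt)
        case other  => s"unknown type: $other"
      }
    }
    parsed.print()

The key point is that everything the closure captures (checkForType and the par* functions) is serializable, while the DStream itself stays outside. If checkForType lives in a non-serializable enclosing class, move it into a serializable object or define it as a local function.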
I also found an interesting post on this: https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-tips-and-tricks-sparkexception-task-not-serializable.html. Maybe it can help you solve the problem!