SparkException: Task not serializable on class: org.apache.avro.generic.GenericDatumReader

Problem description (votes: 1, answers: 1)

I have JSON input with two fields (size: BigInteger and data: String), where data holds a ZStd-compressed Avro record. The task is to decode these records, and I am using spark-avro for that, but I keep getting a Task not serializable exception.

Sample data

{
"data": "7z776qOPevPJF5/0Dv9Rzx/1/i8gJJiQD5MTDGdbeNKKT"
"size" : 231
}

Code

import java.util.Base64
import com.github.luben.zstd.Zstd
import org.apache.avro.Schema
import com.twitter.bijection.Injection
import org.apache.avro.generic.GenericRecord
import com.twitter.bijection.avro.GenericAvroCodecs
import com.databricks.spark.avro.SchemaConverters
import org.apache.spark.sql.types.StructType
import com.databricks.spark.avro.SchemaConverters._

def decode2(input: String, size: Int, avroBijection: Injection[GenericRecord, Array[Byte]], sqlType: StructType): GenericRecord = {
    // Base64-decode the payload, ZStd-decompress it back to the original Avro bytes,
    // then invert the bijection to get the GenericRecord
    val compressedGenericRecordBytes = Base64.getDecoder.decode(input)
    val genericRecordBytes = Zstd.decompress(compressedGenericRecordBytes, size)
    avroBijection.invert(genericRecordBytes).get
}

val myRdd = spark.read.format("json").load("/path").rdd

val rows = myRdd.mapPartitions {
    // schema, bijection and sqlType are declared outside the (iterator) => ... closure
    lazy val schema = new Schema.Parser().parse(schemaStr)
    lazy val avroBijection: Injection[GenericRecord, Array[Byte]] = GenericAvroCodecs.toBinary(schema)
    lazy val sqlType = SchemaConverters.toSqlType(schema).dataType.asInstanceOf[StructType]
    (iterator) => {
        val myList = iterator.toList
        myList.map { x =>
            val size = x(1).asInstanceOf[Long].intValue
            val data = x(0).asInstanceOf[String]
            decode2(data, size, avroBijection, sqlType)
        }.iterator
    }
}

Exception

files: org.apache.spark.rdd.RDD[org.apache.spark.sql.Row] = MapPartitionsRDD[987] at rdd at <console>:346
org.apache.spark.SparkException: Task not serializable
  at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:298)
  at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
  at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
  at org.apache.spark.SparkContext.clean(SparkContext.scala:2287)
  at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1.apply(RDD.scala:794)
  at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1.apply(RDD.scala:793)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
  at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
  at org.apache.spark.rdd.RDD.mapPartitions(RDD.scala:793)
  ... 112 elided
Caused by: java.io.NotSerializableException: org.apache.avro.generic.GenericDatumReader
Serialization stack:
    - object not serializable (class: org.apache.avro.generic.GenericDatumReader, value: org.apache.avro.generic.GenericDatumReader@4937cd88)
    - field (class: com.twitter.bijection.avro.BinaryAvroCodec, name: reader, type: interface org.apache.avro.io.DatumReader)
    - object (class com.twitter.bijection.avro.BinaryAvroCodec, com.twitter.bijection.avro.BinaryAvroCodec@6945439c)
    - field (class: $$$$79b2515edf74bd80cfc9d8ac1ba563c6$$$$iw, name: avroBijection, type: interface com.twitter.bijection.Injection)

SO posts already tried

  1. Spark: java.io.NotSerializableException: org.apache.avro.Schema$RecordSchema

Following this post, I updated decode2 to take schemaStr as input and to build the schema and SqlType inside the method. The exception stayed the same.
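
The revised method looked roughly like this (a sketch reconstructed from the description above; the exact signature is my assumption):

def decode2(input: String, size: Int, schemaStr: String): GenericRecord = {
    // parse the schema and build the bijection inside the method instead of
    // passing non-serializable objects in from the driver
    val schema = new Schema.Parser().parse(schemaStr)
    val avroBijection: Injection[GenericRecord, Array[Byte]] = GenericAvroCodecs.toBinary(schema)
    val sqlType = SchemaConverters.toSqlType(schema).dataType.asInstanceOf[StructType]  // derived here, not passed in
    val compressedGenericRecordBytes = Base64.getDecoder.decode(input)
    val genericRecordBytes = Zstd.decompress(compressedGenericRecordBytes, size)
    avroBijection.invert(genericRecordBytes).get
}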

  2. Use schema to convert AVRO messages with Spark to DataFrame

I used the code provided in that post to create an object Injection and then used it. That did not work either.
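
What I ended up with was roughly this (a sketch following that post; the object name is illustrative):

// Hold the schema and Injection in an object so they are created lazily on each
// executor instead of being serialized with the closure; schemaStr is the same
// schema string as above.
object AvroInjection {
    lazy val schema: Schema = new Schema.Parser().parse(schemaStr)
    lazy val avroBijection: Injection[GenericRecord, Array[Byte]] = GenericAvroCodecs.toBinary(schema)
}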

scala apache-spark serialization avro
1 Answer
0 votes

Have you tried the following?

val rows = myRdd.mapPartitions {
    (iterator) => {
        val myList = iterator.toList
        myList.map { x =>
            // build the non-serializable helpers inside the closure that runs on the executor
            lazy val schema = new Schema.Parser().parse(schemaStr)
            lazy val avroBijection: Injection[GenericRecord, Array[Byte]] = GenericAvroCodecs.toBinary(schema)
            lazy val sqlType = SchemaConverters.toSqlType(schema).dataType.asInstanceOf[StructType]
            val size = x(1).asInstanceOf[Long].intValue
            val data = x(0).asInstanceOf[String]
            decode2(data, size, avroBijection, sqlType)
        }.iterator
    }
}
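
Parsing the schema for every record is wasteful, though. An alternative along the same lines (a sketch, not tested against your setup) is to build everything once per partition, inside the function body that runs on the executor:

val rows = myRdd.mapPartitions { iterator =>
    // constructed on the executor, once per partition, so nothing
    // non-serializable has to be shipped with the closure
    val schema = new Schema.Parser().parse(schemaStr)
    val avroBijection: Injection[GenericRecord, Array[Byte]] = GenericAvroCodecs.toBinary(schema)
    val sqlType = SchemaConverters.toSqlType(schema).dataType.asInstanceOf[StructType]
    iterator.map { x =>
        val size = x(1).asInstanceOf[Long].intValue
        val data = x(0).asInstanceOf[String]
        decode2(data, size, avroBijection, sqlType)
    }
}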