Why is the constructor parameter of my Flink operator function empty?

Question · 0 votes · 1 answer

I am studying Flink and want to build an operator function that extends ProcessWindowFunction, adding a constructor whose parameter is stored in a field of the class. But when the class is actually running, the field is empty, which confuses me.

import com.aliyun.datahub.client.model.Field;
import com.aliyun.datahub.client.model.FieldType;
import com.aliyun.datahub.client.model.PutRecordsResult;
import io.github.streamingwithflink.chapter8.PoJoElecMeterSource;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

public class DataHubSinkDemo {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setStreamTimeCharacteristic(TimeCharacteristic.ProcessingTime);
        env.enableCheckpointing(10_000L);
        env.setParallelism(2);

        RecordSchemaSer schema = new RecordSchemaSer();

        schema.addField(new Field("id", FieldType.STRING));            

        DataStream<PutRecordsResult> out = env
                .addSource(new PoJoElecMeterSource())
                .keyBy( r -> r.getId())
                .window(TumblingProcessingTimeWindows.of(Time.seconds(3))) 
                .process(new PutDatahubFunction<>(schema));  // PutDatahubFunction is my custom operator function class

        env.execute();
    }
}

The variable schema, an instance of the class RecordSchemaSer, is the argument I want to pass to the constructor.

import com.aliyun.datahub.client.model.RecordSchema;
import java.io.Serializable;

public class RecordSchemaSer
        extends RecordSchema
        implements Serializable {

}

PutDatahubFunction is my subclass of ProcessWindowFunction; the code is as follows:

import com.aliyun.datahub.client.model.*;

import io.github.streamingwithflink.chapter8.PUDAPoJo;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.windowing.ProcessWindowFunction;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;

import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.List;

public class PutDatahubFunction<IN extends PUDAPoJo, KEY>
        extends ProcessWindowFunction<IN, PutRecordsResult, KEY, TimeWindow> {

    private DataHubBase dataHubHandler;
    private List<RecordEntry> recordEntries;
    private RecordSchema schema;

    public PutDatahubFunction(RecordSchema schema) {

        this.schema = schema;
        System.out.println("field 'id' not exist ? " + this.schema.containsField("id"));  // it's true
    }

    @Override
    public void open(Configuration parameters) throws Exception {
        // ...
    }

    @Override
    public void process(KEY key,
                        Context context,
                        Iterable<IN> elements,
                        Collector<PutRecordsResult> out)
            throws Exception {

        RecordEntry entry = new RecordEntry();

        for (IN e : elements) {
            System.out.println("field 'id' not exist ? " + this.schema.containsField("id")); // it's false
            // ...
        }

    }
}

The first System.out call is in the constructor, where this.schema.containsField("id") is true; but the second, in the process method, prints false! Why? I also printed the class name in both places, and both instances are PutDatahubFunction.
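This behavior can be reproduced without Flink at all. Flink ships operator functions to the task managers via Java serialization, and Java serialization skips every field inherited from a non-Serializable superclass: on deserialization it simply re-runs that superclass's no-arg constructor. A minimal, self-contained sketch (Schema and SchemaSer are hypothetical stand-ins for RecordSchema and RecordSchemaSer, not the DataHub SDK classes):

```java
import java.io.*;

// Stand-in for RecordSchema: a NON-serializable base class holding the state.
class Schema {
    private String fieldName;  // lives in the non-serializable base class
    Schema() {}                // no-arg constructor, re-run on deserialization
    void setField(String name) { this.fieldName = name; }
    boolean containsField(String name) { return name.equals(fieldName); }
}

// Mirrors RecordSchemaSer: a Serializable subclass that adds no fields of its own.
class SchemaSer extends Schema implements Serializable {}

public class SerDemo {
    public static void main(String[] args) throws Exception {
        SchemaSer schema = new SchemaSer();
        schema.setField("id");
        System.out.println(schema.containsField("id"));   // true

        // Round-trip through Java serialization, as Flink does when
        // distributing the function instance to the workers.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(bos);
        oos.writeObject(schema);
        oos.flush();
        ObjectInputStream in =
                new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
        SchemaSer copy = (SchemaSer) in.readObject();

        // The base-class field was never written out, so it is null again here.
        System.out.println(copy.containsField("id"));     // false
    }
}
```

The constructor runs on the client (in main), while process runs on the deserialized copy inside a task manager, which is why the two printouts disagree.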

Using ValueState does not work either, because getRuntimeContext() cannot be called from the constructor; it throws java.lang.IllegalStateException: The runtime context has not been initialized. in the "main" thread. The code is as follows:

private ValueState<RecordSchema>  schema;


public PutTupleDatahubFunction(RecordSchema schema) throws IOException {
    ValueStateDescriptor schemaDes =
            new ValueStateDescriptor("datahub schema", TypeInformation.of(RecordSchema.class));
    /*
     * error Exception in thread "main" java.lang.IllegalStateException:
     * The runtime context has not been initialized.
     */
    this.schema = getRuntimeContext().getState(schemaDes);
    this.schema.update(schema);
}

I am confused. Can anyone tell me the reason? Is there any way to pass a parameter to the constructor of an operator function class?
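On the "how to pass a parameter" part: a common Flink pattern is to pass only cheaply serializable values through the constructor and rebuild any non-serializable object in open(), which runs on each worker after the function has been deserialized. A sketch under that assumption (PutDatahubFunctionFixed and the fieldNames parameter are illustrative names, not the author's code):

```java
import com.aliyun.datahub.client.model.Field;
import com.aliyun.datahub.client.model.FieldType;
import com.aliyun.datahub.client.model.PutRecordsResult;
import com.aliyun.datahub.client.model.RecordSchema;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.windowing.ProcessWindowFunction;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;

import java.util.List;

public class PutDatahubFunctionFixed<IN, KEY>
        extends ProcessWindowFunction<IN, PutRecordsResult, KEY, TimeWindow> {

    // Serializable description of the schema: survives the trip to the workers.
    private final List<String> fieldNames;

    // Rebuilt in open() on each worker; transient so serialization ignores it.
    private transient RecordSchema schema;

    public PutDatahubFunctionFixed(List<String> fieldNames) {
        this.fieldNames = fieldNames;
    }

    @Override
    public void open(Configuration parameters) {
        schema = new RecordSchema();
        for (String name : fieldNames) {
            schema.addField(new Field(name, FieldType.STRING));
        }
    }

    @Override
    public void process(KEY key, Context context, Iterable<IN> elements,
                        Collector<PutRecordsResult> out) {
        // schema is fully populated here on every worker
    }
}
```

The key point is that the constructor runs once on the client, while open() runs once per parallel instance on the workers, after deserialization, so anything built in open() is always present when process() is called.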

apache-flink flink-streaming

1 Answer

0 votes

I finally figured out why: the cause is serialization and deserialization. Flink serializes the function instance, including its schema field, before shipping it to the workers. My RecordSchemaSer only declares implements Serializable; I never wrote any serialization logic for the state inherited from RecordSchema, which is not itself Serializable, so that state comes back null after deserialization.

public class RecordSchemaSer
        extends RecordSchema
        implements Serializable
{


}
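If the subclass approach is kept, one option is to add custom writeObject/readObject hooks that persist the inherited state explicitly. A hedged sketch, assuming RecordSchema exposes getFields() returning List&lt;Field&gt; and that Field exposes getName()/getType() (check the DataHub SDK for the real accessor names):

```java
import com.aliyun.datahub.client.model.Field;
import com.aliyun.datahub.client.model.FieldType;
import com.aliyun.datahub.client.model.RecordSchema;

import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.List;

public class RecordSchemaSer extends RecordSchema implements Serializable {

    private void writeObject(ObjectOutputStream out) throws IOException {
        out.defaultWriteObject();
        // Write the base-class state by hand, since RecordSchema is not Serializable
        // and Java serialization will not record it for us.
        List<Field> fields = getFields();            // assumed SDK accessor
        out.writeInt(fields.size());
        for (Field f : fields) {
            out.writeUTF(f.getName());               // assumed SDK accessor
            out.writeUTF(f.getType().name());        // assumed SDK accessor
        }
    }

    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        // Rebuild the base-class state on the worker side.
        int n = in.readInt();
        for (int i = 0; i < n; i++) {
            addField(new Field(in.readUTF(), FieldType.valueOf(in.readUTF())));
        }
    }
}
```

Whether this or the rebuild-in-open() pattern is preferable depends on how much state RecordSchema carries; for a plain list of fields, either works.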