Debezium Connect: java.lang.NoClassDefFoundError: io/confluent/connect/schema/AbstractDataConfig


I am trying to do something similar following the steps in these posts, but it is not working for me. I am using Debezium 2.5.Final.

My Dockerfile looks like this:

# https://packages.confluent.io/maven/io/confluent/

ARG DEBEZIUM_VERSION 
FROM quay.io/debezium/connect:$DEBEZIUM_VERSION
#FROM quay.io/debezium/connect-base:$DEBEZIUM_VERSION

ARG KAFKA_VERSION
RUN echo ${KAFKA_VERSION}



ADD --chown=kafka:kafka --chmod=775 https://repo1.maven.org/maven2/com/ibm/informix/jdbc/4.50.10/jdbc-4.50.10.jar /kafka/connect/debezium-connector-informix/
ADD --chown=kafka:kafka --chmod=775 https://repo1.maven.org/maven2/com/ibm/informix/ifx-changestream-client/1.1.3/ifx-changestream-client-1.1.3.jar /kafka/connect/debezium-connector-informix/

ADD --chown=kafka:kafka --chmod=775 https://packages.confluent.io/maven/io/confluent/kafka-connect-avro-converter/${KAFKA_VERSION}/kafka-connect-avro-converter-${KAFKA_VERSION}.jar /kafka/connect/kafka-connect-avro-converter/
ADD --chown=kafka:kafka --chmod=775 https://packages.confluent.io/maven/io/confluent/kafka-connect-avro-data/${KAFKA_VERSION}/kafka-connect-avro-data-${KAFKA_VERSION}.jar /kafka/connect/kafka-connect-avro-data/
ADD --chown=kafka:kafka --chmod=775 https://packages.confluent.io/maven/io/confluent/kafka-avro-serializer/${KAFKA_VERSION}/kafka-avro-serializer-${KAFKA_VERSION}.jar /kafka/connect/kafka-avro-serializer/
ADD --chown=kafka:kafka --chmod=775 https://packages.confluent.io/maven/io/confluent/kafka-schema-serializer/${KAFKA_VERSION}/kafka-schema-serializer-${KAFKA_VERSION}.jar /kafka/connect/kafka-schema-serializer/
ADD --chown=kafka:kafka --chmod=775 https://packages.confluent.io/maven/io/confluent/kafka-schema-registry-client/${KAFKA_VERSION}/kafka-schema-registry-client-${KAFKA_VERSION}.jar /kafka/connect/kafka-schema-registry-client/
ADD --chown=kafka:kafka --chmod=775 https://packages.confluent.io/maven/io/confluent/common-config/${KAFKA_VERSION}/common-config-${KAFKA_VERSION}.jar /kafka/connect/common-config/
ADD --chown=kafka:kafka --chmod=775 https://packages.confluent.io/maven/io/confluent/common-utils/${KAFKA_VERSION}/common-utils-${KAFKA_VERSION}.jar /kafka/connect/common-utils/
ADD --chown=kafka:kafka --chmod=775 https://repo1.maven.org/maven2/org/apache/avro/avro/1.11.3/avro-1.11.3.jar /kafka/connect/avro/
ADD --chown=kafka:kafka --chmod=775 https://repo1.maven.org/maven2/com/google/guava/guava/33.0.0-jre/guava-33.0.0-jre.jar /kafka/connect/guava/
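
For reference, this is roughly how the image gets built; the image tag and build context are taken from the compose file below, and the version values are assumptions. Note that KAFKA_VERSION is substituted into the packages.confluent.io Maven URLs above, so it has to be a Confluent Platform release (e.g. 7.5.x), not an Apache Kafka version.

# Example build invocation; 2.5 and 7.5.3 are assumed values
docker build \
  --build-arg DEBEZIUM_VERSION=2.5 \
  --build-arg KAFKA_VERSION=7.5.3 \
  -t debezium/connect-ifx:2.5 \
  ./debezium-ifx-init/ifxconnect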

My docker-compose looks like this:

connect:
    image: debezium/connect-ifx:${DEBEZIUM_VERSION}
    container_name: connect
    build:
      context: ./debezium-ifx-init/ifxconnect
      args:
        DEBEZIUM_VERSION: ${DEBEZIUM_VERSION}
        KAFKA_VERSION: ${KAFKA_VERSION}
    ports:
      - 8083:8083
    depends_on:
      - kafka
      - informix
      - schema-registry
    environment:
      GROUP_ID: 1
      # Kafka config
      CONFIG_STORAGE_TOPIC: my_connect_configs
      OFFSET_STORAGE_TOPIC: my_connect_offsets
      STATUS_STORAGE_TOPIC: my_connect_statuses
      BOOTSTRAP_SERVERS: kafka:9092
      # Avro config
      KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
      VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: http://schema-registry:8081
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: http://schema-registry:8081
      KAFKA_CONNECT_PLUGINS_DIR: /kafka/connect
      CLASSPATH: /kafka/connect/kafka-connect-avro-converter/*:/kafka/connect/kafka-connect-avro-data/*:/kafka/connect/kafka-avro-serializer/*:/kafka/connect/kafka-schema-serializer/*:/kafka/connect/kafka-schema-registry-client/*:/kafka/connect/common-config/*:/kafka/connect/common-utils/*:/kafka/connect/avro/*:/kafka/connect/guava/*
    volumes:
      - ./data/connect/data:/var/lib/kafka/data
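
Before registering anything, it can help to confirm that the worker started and that the downloaded JARs actually ended up in the image. A minimal sketch, assuming the stack is started with Docker Compose and port 8083 is published to localhost as above:

# Rebuild and start only the connect service (service/container name as above)
docker compose up -d --build connect

# List the connector plugins the worker discovered; the Informix connector
# should appear here
curl -s http://localhost:8083/connector-plugins

# Check that the Confluent JARs were really added (paths from the Dockerfile)
docker exec connect ls /kafka/connect/kafka-connect-avro-converter /kafka/connect/kafka-schema-registry-client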

But when I register the source connector, it gives me this error:

{
    "name": "source-testdb-ifx",
    "config": {
        "connector.class" : "io.debezium.connector.informix.InformixConnector",
        "tasks.max" : "1",
        "topic.prefix" : "ifxserver",
        "database.hostname" : "informix",
        "database.port" : "9088",
        "database.user" : "informix",
        "database.password" : "in4mix",
        "database.dbname" : "testdb",
        "schema.history.internal.kafka.bootstrap.servers" : "kafka:9092",
        "schema.history.internal.kafka.topic": "schema-changes.testdb",
        "key.converter": "io.confluent.connect.avro.AvroConverter",
        "key.converter.schema.registry.url": "http://schema-registry:8081",
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schema.registry.url": "http://schema-registry:8081"
    }
}
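
For completeness, this is the kind of request used to register the connector and then inspect the failing task; source-testdb-ifx.json is a hypothetical file name holding the JSON above:

# Register the connector (file name is hypothetical)
curl -s -X POST -H "Content-Type: application/json" \
  --data @source-testdb-ifx.json \
  http://localhost:8083/connectors

# The NoClassDefFoundError is thrown when the task starts, so check its status
curl -s http://localhost:8083/connectors/source-testdb-ifx/status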

Error || Failed to start task source-testdb-ifx-0 [org.apache.kafka.connect.runtime.Worker] java.lang.NoClassDefFoundError: io/confluent/connect/schema/AbstractDataConfig

Any idea what it could be?

avro confluent-schema-registry debezium debezium-connect
1 Answer

I ran into a similar problem, and it turned out I had not included enough JAR files. To fix it:

  1. Go to the connector's page on Confluent Hub, for example,
    https://www.confluent.io/hub/confluentinc/kafka-connect-avro-converter
  2. Click the "Download" button.
  3. After downloading the ZIP file, navigate to the "bin" directory.
  4. In the "bin" directory you will find all of the required JAR files.
Include those JAR files in your project (see the sketch below) and it should work.
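
A rough sketch of those steps, assuming the archive from the "Download" button is saved next to the Dockerfile; the file name below is hypothetical and the directory layout inside the ZIP can vary between versions:

# Unpack the archive downloaded from Confluent Hub (hypothetical file name)
unzip confluentinc-kafka-connect-avro-converter.zip -d avro-converter

# Collect every bundled JAR into a single local directory
mkdir -p kafka-connect-avro-converter
find avro-converter -name '*.jar' -exec cp {} kafka-connect-avro-converter/ \;

# Then, instead of ADDing individual JARs, copy the whole directory in the
# Dockerfile, e.g.:
#   COPY --chown=kafka:kafka kafka-connect-avro-converter/ /kafka/connect/kafka-connect-avro-converter/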
