Debezium Connect: java.lang.NoClassDefFoundError: com/google/common/base/Ticker


Problem description

I am having an issue using Avro serialization with Debezium and the Confluent Schema Registry. I am following Debezium's official documentation, which mentions that, starting from version 2.0, the required jar files have to be installed inside the Connect container.

Here is the list of required jar files:

  • kafka-connect-avro-converter
  • kafka-connect-avro-data
  • kafka-avro-serializer
  • kafka-schema-serializer
  • kafka-schema-registry-client
  • common-config
  • common-utils

What I did: I downloaded all the required jar files and installed them into the /kafka/connect/avro_jar_files directory:

Debezium version: 2.4

Confluent JAR files version: 7.5.2
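
The download step can be sketched roughly like this (a sketch, assuming the standard Maven repository layout on packages.confluent.io and that the local ./jar_files directory is the one mounted into the container):

mkdir -p jar_files && cd jar_files

CONFLUENT_VERSION=7.5.2
CONFLUENT_REPO=https://packages.confluent.io/maven/io/confluent

# Download each Confluent artifact listed in the Debezium documentation
for artifact in kafka-connect-avro-converter kafka-connect-avro-data \
                kafka-avro-serializer kafka-schema-serializer \
                kafka-schema-registry-client common-config common-utils; do
  curl -fLO "${CONFLUENT_REPO}/${artifact}/${CONFLUENT_VERSION}/${artifact}-${CONFLUENT_VERSION}.jar"
done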

Whenever I try to register a new connector (the MySQL connector), I run into the following error:

java.lang.NoClassDefFoundError: com/google/common/base/Ticker
        at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.<init>(CachedSchemaRegistryClient.java:181)
        at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.<init>(CachedSchemaRegistryClient.java:164)
        at io.confluent.kafka.schemaregistry.client.SchemaRegistryClientFactory.newClient(SchemaRegistryClientFactory.java:36)
        at io.confluent.connect.avro.AvroConverter.configure(AvroConverter.java:71)
        at org.apache.kafka.connect.runtime.isolation.Plugins.newConverter(Plugins.java:328)
        at org.apache.kafka.connect.runtime.Worker.startTask(Worker.java:620)
        at org.apache.kafka.connect.runtime.Worker.startSourceTask(Worker.java:548)
        at org.apache.kafka.connect.runtime.distributed.DistributedHerder.startTask(DistributedHerder.java:1833)
        at org.apache.kafka.connect.runtime.distributed.DistributedHerder.lambda$getTaskStartingCallable$32(DistributedHerder.java:1850)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.ClassNotFoundException: com.google.common.base.Ticker
        at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:476)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:594)
        at org.apache.kafka.connect.runtime.isolation.PluginClassLoader.loadClass(PluginClassLoader.java:136)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:527)

From the error, my guess is that the com/google/common/base/Ticker class is not included in the kafka-schema-registry-client-7.5.2.jar file.

I could not find any page that lists which versions of these jar files are compatible with Debezium 2.4.
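
One quick way to confirm the guess above is to list the jar's entries and grep for the class (a sanity-check sketch):

# Check whether the Guava Ticker class is packaged inside the schema-registry-client jar
jar tf kafka-schema-registry-client-7.5.2.jar | grep 'com/google/common/base/Ticker'

# The same check with unzip, if the JDK's jar tool is not available
unzip -l kafka-schema-registry-client-7.5.2.jar | grep Ticker

# No output means the class is not in this jar and must come from a separate Guava jar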

docker-compose.yaml

version: '3'

services:
  zookeeper:
    image: quay.io/debezium/zookeeper:2.4
    container_name: zookeeper
    ports:
      - 2181:2181
      - 2888:2888
      - 3888:3888

  kafka:
    image: quay.io/debezium/kafka:2.4
    container_name: kafka
    ports:
      - 9092:9092
    environment:
      ZOOKEEPER_CONNECT: zookeeper:2181
    depends_on:
      - zookeeper

  mysql:
    image: quay.io/debezium/example-mysql:2.4
    container_name: mysql
    ports:
      - 3306:3306
    environment:
      MYSQL_ROOT_PASSWORD: debezium
      MYSQL_USER: mysqluser
      MYSQL_PASSWORD: mysqlpw

  mysql_cli:
    image: mysql:8.0
    container_name: mysql_cli
    environment:
      MYSQL_ALLOW_EMPTY_PASSWORD: true
    depends_on:
      - mysql

  connect:
    image: quay.io/debezium/connect:2.4
    container_name: connect
    ports:
      - 8083:8083
    environment:
      GROUP_ID: 1
      # Kafka config
      CONFIG_STORAGE_TOPIC: my_connect_configs
      OFFSET_STORAGE_TOPIC: my_connect_offsets
      STATUS_STORAGE_TOPIC: my_connect_statuses
      BOOTSTRAP_SERVERS: kafka:9092
      # Avro config
      KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
      VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: http://schema_registry:8081
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: http://schema_registry:8081
      KAFKA_CONNECT_PLUGINS_DIR: /kafka/connect
    volumes:
      - ./jar_files:/kafka/connect/avro_jar_files
    depends_on:
      - kafka
      - schema_registry
      - mysql
  
  schema_registry:
    image: confluentinc/cp-schema-registry:7.1.10 
    container_name: schema_registry
    ports:
      - 8081:8081
    environment:
      SCHEMA_REGISTRY_KAFKASTORE_CONNECTION_URL: zookeeper:2181
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: PLAINTEXT://kafka:9092
      SCHEMA_REGISTRY_HOST_NAME: schema_registry
      SCHEMA_REGISTRY_LISTENERS: http://0.0.0.0:8081
    depends_on:
      - zookeeper

  kafdrop:
    image: obsidiandynamics/kafdrop:4.0.1
    container_name: kafdrop
    ports:
      - 9000:9000
    environment:
      KAFKA_BROKERCONNECT: kafka:9092
      # JVM_OPTS: "-Xms32M -Xmx64M"
      # SERVER_SERVLET_CONTEXTPATH: "/"
    depends_on:
      - kafka
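
Once the stack is up, the Connect REST API can be used to check whether the converter jars were picked up (a sanity-check sketch; on Kafka Connect 3.2+, which Debezium 2.4 is based on, the connectorsOnly=false flag also lists converters):

# Verify that the AvroConverter shows up among the loaded plugins
curl -s "http://localhost:8083/connector-plugins?connectorsOnly=false" | grep -i avroconverter

# Or look for plugin-scanning messages in the container logs
docker logs connect 2>&1 | grep -i "AvroConverter"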

cURL request:

curl -i -X POST -H "Accept:application/json" -H "Content-Type:application/json" localhost:8083/connectors/ -d '{
  "name":"inventory-connector",
  "config":{
    "connector.class":"io.debezium.connector.mysql.MySqlConnector",
    "tasks.max":"1",
    "database.hostname":"mysql",
    "database.port":"3306",
    "database.user":"debezium",
    "database.password":"dbz",
    "database.server.id":"184054",
    "topic.prefix":"dbserver1",
    "database.include.list":"inventory",
    "schema.history.internal.kafka.bootstrap.servers":"kafka:9092",
    "schema.history.internal.kafka.topic":"schemahistory.inventory",
    "key.converter":"io.confluent.connect.avro.AvroConverter",
    "value.converter":"io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url":"http://schema_registry:8081",
    "value.converter.schema.registry.url":"http://schema_registry:8081"
  }
}'
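
The failed task and its stack trace can also be inspected, and the task restarted once the classpath is fixed, through the standard Connect REST endpoints (a sketch):

# Inspect the connector and its tasks; a FAILED task carries the stack trace
curl -s http://localhost:8083/connectors/inventory-connector/status

# Restart task 0 once the missing jars are in place
curl -s -X POST http://localhost:8083/connectors/inventory-connector/tasks/0/restart
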
1 Answer

The problem with Debezium and its Avro integration in versions > 2 is that it depends on more libraries (JAR files) than the ones listed in its official documentation.

Debezium version: 2.5 (January 2024). According to its documentation, the required JAR files are:

  • kafka-connect-avro-converter
  • kafka-connect-avro-data
  • kafka-avro-serializer
  • kafka-schema-serializer
  • kafka-schema-registry-client
  • common-config
  • common-utils

But they are not enough. The error raised in the question comes from a missing Google library (Guava) dependency:

java.lang.NoClassDefFoundError: com/google/common/base/Ticker

I also checked the Confluent link, but that still was not enough!

After facing many issues, I found that the full set of required JAR files is the following:

  • avro-1.11.3.jar
  • common-config-7.5.3.jar
  • common-utils-7.5.3.jar
  • commons-compress-1.21.jar
  • failureaccess-1.0.1.jar
  • guava-31.0.1-jre.jar
  • jackson-annotations-2.14.2.jar
  • jackson-core-2.14.2.jar
  • jackson-databind-2.14.2.jar
  • jackson-dataformat-csv-2.14.2.jar
  • kafka-avro-serializer-7.5.3.jar
  • kafka-connect-avro-converter-7.5.3.jar
  • kafka-connect-avro-data-7.5.3.jar
  • kafka-schema-registry-client-7.5.3.jar
  • kafka-schema-serializer-7.5.3.jar
  • logredactor-1.0.12.jar
  • logredactor-metrics-1.0.12.jar
  • minimal-json-0.9.5.jar
  • re2j-1.6.jar
  • slf4j-api-1.7.36.jar
  • snakeyaml-2.0.jar
  • swagger-annotations-2.1.10.jar
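
The Guava jars are the ones that actually resolve the NoClassDefFoundError above. A minimal sketch of pulling them from Maven Central into the mounted plugin directory (assuming the ./jar_files layout from the question's compose file):

cd jar_files

# Guava provides com.google.common.base.Ticker; failureaccess is a companion jar that Guava 27+ expects
curl -fLO https://repo1.maven.org/maven2/com/google/guava/guava/31.0.1-jre/guava-31.0.1-jre.jar
curl -fLO https://repo1.maven.org/maven2/com/google/guava/failureaccess/1.0.1/failureaccess-1.0.1.jar

# Restart the connect container so the plugin path is rescanned
docker compose restart connect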

P.S. I created a repository where I put every script and the Dockerfile with the problem solved. For more information, please check this link.
