Error setting up CDC from Azure Database for MySQL to Apache Kafka

Question · Votes: 0 · Answers: 1

Hi everyone, I have some questions about CDC from MySQL to Kafka using Azure Database for MySQL. I have been following this tutorial: https://techcommunity.microsoft.com/t5/azure-database-for-mysql-blog/cdc-in-azure-database-for-mysql-flexible-server-using-kafka/ba-p/2780943

However, I got stuck at the step where the Kafka connector is created, and I get this error:

{"error_code":400,"message":"Connector configuration is invalid and contains the following 1 error(s):\nUnable to connect: Communications link failure\n\nThe last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.\nYou can also find the above list of errors at the endpoint `/connector-plugins/{connectorType}/config/validate`"}

Kafka connector configuration:

{
    "name": "sql-server-connection",
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "database.hostname": "localhost",
        "database.port": "3306",
        "database.user": "soleluna",
        "database.dbname": "cdcdatabase",
        "database.password": "mypassword",
        "database.server.id": "1",
        "database.server.name": "userserver",
        "table.whitelist": "dbo.users",
        "database.history": "io.debezium.relational.history.MemoryDatabaseHistory",
        "topic.prefix": "cdc.kafkadev"
    }
}

connect-distributed.properties configuration:

bootstrap.servers=goldwing.servicebus.windows.net:9093
group.id=connect-cluster-group-1

# connect internal topic names, auto-created if not exists
config.storage.topic=connect-cluster-configs
offset.storage.topic=connect-cluster-offsets
status.storage.topic=connect-cluster-status

# internal topic replication factors - auto 3x replication in Azure Storage
config.storage.replication.factor=1
offset.storage.replication.factor=1
status.storage.replication.factor=1

rest.advertised.host.name=connect
offset.flush.interval.ms=10000

key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter

internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false

# required EH Kafka security settings
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://goldwing.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=********";

producer.security.protocol=SASL_SSL
producer.sasl.mechanism=PLAIN
producer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://goldwing.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=********";

consumer.security.protocol=SASL_SSL
consumer.sasl.mechanism=PLAIN
consumer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://goldwing.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=********";

plugin.path=/opt/kafka/libs

The connect-cluster-configs, connect-cluster-offsets, and connect-cluster-status topics were auto-created in the Azure namespace. Can anyone tell me where my mistake is? Any answer would be greatly appreciated, thanks.

What I expect is that the Debezium Kafka connector topics get created and data is replicated from Azure MySQL to the Kafka topics (Event Hubs).

azure apache-kafka apache-kafka-connect debezium
1 Answer
0 votes

@Dityudha This may be hard to resolve without more logs. Please check the following and post more logs:

Check whether CDC (row-based binary logging) is enabled for the table.
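Debezium's MySQL connector reads the binary log, so row-based binlogging must be on. A minimal sketch for checking the relevant server variables with the mysql client (the host name below is a placeholder for your flexible server FQDN, and SSL is assumed to be required on the Azure side):

mysql -h your-server.mysql.database.azure.com -u soleluna -p --ssl-mode=REQUIRED \
  -e "SHOW VARIABLES LIKE 'log_bin'; SHOW VARIABLES LIKE 'binlog_format'; SHOW VARIABLES LIKE 'binlog_row_image';"

Debezium expects binlog_format to be ROW; binlog_row_image should normally be FULL. On Azure Database for MySQL flexible server these are server parameters you can change in the portal.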

Check the Kafka Connect logs and the connector status:

curl localhost:8083/connectors
curl localhost:8083/connectors/sql-server-connection/tasks/0/status
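The error message itself points at the validate endpoint. A sketch of calling it directly (assuming Connect listens on localhost:8083 and that connector-config.json contains just the flat key/value map from the "config" section above):

curl -s -X PUT -H "Content-Type: application/json" \
  --data @connector-config.json \
  localhost:8083/connector-plugins/io.debezium.connector.mysql.MySqlConnector/config/validate

The response reports errors per configuration field, which is usually more detail than the 400 returned when creating the connector.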

Are there any other hints in the logs?

Is there a problem with the privileges of the MySQL user the connector uses? Are there any logs on that side?
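The Debezium MySQL connector documentation lists the privileges the database user needs for snapshots and binlog reading. A sketch of granting them, run as the server admin account (the user and host pattern here simply mirror the config above, adjust as needed):

mysql -h your-server.mysql.database.azure.com -u your-admin-user -p --ssl-mode=REQUIRED \
  -e "GRANT SELECT, RELOAD, SHOW DATABASES, REPLICATION SLAVE, REPLICATION CLIENT ON *.* TO 'soleluna'@'%';"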

Are there any network issues between where the connector is deployed and the MySQL server?
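One thing worth checking against the error above: the connector config uses database.hostname=localhost, while the tutorial targets an Azure Database for MySQL flexible server. Unless Kafka Connect runs on the same machine as MySQL (or goes through a tunnel), "Communications link failure" is what you get when the host/port is simply unreachable, and the hostname should instead be the server's FQDN. A quick connectivity sketch (placeholder host name, assuming port 3306 and a firewall rule that allows the Connect machine):

nc -vz your-server.mysql.database.azure.com 3306
mysql -h your-server.mysql.database.azure.com -u soleluna -p --ssl-mode=REQUIRED -e "SELECT 1;"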
