Kafka consumer not consuming messages from a secured Kafka cluster


I am using kafka-console-consumer and it works fine. Here is the configuration file (admin.conf):

security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="admin" password="admin-secret";
ssl.truststore.location=F:\\temp\\kafka.server.truststore.jks
ssl.truststore.password=serversecret
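
A quick sanity check that is not in the original post: the truststore path and password can be verified independently of Kafka by opening the file as a JKS keystore. A minimal sketch, assuming the same path and password as admin.conf above:

    import java.io.FileInputStream;
    import java.security.KeyStore;

    public class TruststoreCheck {
        public static void main(String[] args) throws Exception {
            // Throws an exception if the path is wrong or the password does not match
            KeyStore ts = KeyStore.getInstance("JKS");
            try (FileInputStream in = new FileInputStream("F:\\temp\\kafka.server.truststore.jks")) {
                ts.load(in, "serversecret".toCharArray());
            }
            System.out.println("Truststore opened, entries: " + ts.size());
        }
    }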

The console consumer command, with its output:

PS F:\Soft\confluent-5.1.0-2.11\confluent-5.1.0\bin\windows> ./kafka-console-consumer.bat --bootstrap-server prd-app-kafka-0:9093,prd-app-kafka-1:9093,prd-app-kafka-2:9093 --from-beginning --topic test_topic3 --consumer.config admin.conf
dgdsfgdf
sfgsfg
rherht
errger
11111111111
aaaaaaaaaaaa
cvvvc
2222222222222222
aaaaa
csdacv
2222222222222222222222
12121121212
35364646346
Processed a total of 13 messages

But the Java client consumer does not consume any messages. Here is my Kafka consumer Java code:

        // Required imports (at the top of the file):
        // import java.time.Duration;
        // import java.util.Arrays;
        // import java.util.Properties;
        // import org.apache.kafka.clients.consumer.ConsumerRecord;
        // import org.apache.kafka.clients.consumer.ConsumerRecords;
        // import org.apache.kafka.clients.consumer.KafkaConsumer;

        //System.setProperty("java.security.auth.login.config", "F:\\temp\\jaas.conf");

        Properties props = new Properties();
        props.put("bootstrap.servers", "prd-app-kafka-0:9093,prd-app-kafka-1:9093,prd-app-kafka-2:9093");
        props.put("sasl.jaas.config", "org.apache.kafka.common.security.scram.ScramLoginModule required username=\"admin\" password=\"admin-secret\";");
        props.put("group.id", "demo-consumer");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("auto.offset.reset", "earliest");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-256");
        props.put("ssl.truststore.location", "F:\\temp\\kafka.server.truststore.jks");
        props.put("ssl.truststore.password", "serversecret");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        System.out.println("Consumer created...");
        consumer.subscribe(Arrays.asList("test_topic3"));
        try {
            while (true) {
                System.out.println("Consumer polling started....");
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(10000));
                System.out.println("records.count():: " + records.count());
                for (ConsumerRecord<String, String> record : records)
                    System.out.println(record.key() + ": " + record.value());
            }
        } finally {
            // The original try block had no catch/finally and would not compile;
            // closing the consumer also cleanly leaves the consumer group.
            consumer.close();
        }
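
A diagnostic step worth adding (an editorial sketch, not part of the original question): opening an AdminClient with the same properties separates authentication problems from consumer-group problems, because a misconfigured SASL/SSL setup fails here quickly instead of producing endless empty polls:

    // import java.util.concurrent.TimeUnit;
    // import org.apache.kafka.clients.admin.AdminClient;

    // Reuses the same `props` map built for the consumer above
    try (AdminClient admin = AdminClient.create(props)) {
        int brokers = admin.describeCluster().nodes().get(30, TimeUnit.SECONDS).size();
        System.out.println("Reachable brokers: " + brokers);
    }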

Here are the Maven dependencies for the Kafka client libraries:

        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>2.4.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-streams</artifactId>
            <version>2.4.1</version>
        </dependency>

Here is my jaas.conf file:

KafkaClient {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="admin123"
    password="admin-secret";
};
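
An editorial note on how these two configuration routes interact: a Kafka client reads SASL credentials either from the sasl.jaas.config client property (as the Java code above does) or from a JAAS file passed via the java.security.auth.login.config system property; when the client property is set, it takes precedence over the file. If the file route is used instead, the property must be set before the consumer is constructed:

    // Only needed when sasl.jaas.config is NOT set in the client properties;
    // the path is the jaas.conf from the question.
    System.setProperty("java.security.auth.login.config", "F:\\temp\\jaas.conf");

Note also that this file authenticates as admin123, while the working admin.conf used on the CLI authenticates as admin.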

Here is my output:

2020-04-07 17:12:45 INFO  ConsumerConfig - ConsumerConfig values: 
    allow.auto.create.topics = true
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [prd-app-kafka-0:9093, prd-app-kafka-1:9093, prd-app-kafka-2:9093]
    check.crcs = true
    client.dns.lookup = default
    client.id = 
    client.rack = 
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = true
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = demo-consumer
    group.instance.id = null
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = [hidden]
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = SCRAM-SHA-256
    security.protocol = SASL_SSL
    security.providers = null
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = F:\temp\kafka.server.truststore.jks
    ssl.truststore.password = [hidden]
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer

2020-04-07 17:12:45 DEBUG KafkaConsumer - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Initializing the Kafka consumer
2020-04-07 17:12:45 INFO  AbstractLogin - Successfully logged in.
2020-04-07 17:12:46 DEBUG SslEngineBuilder - Created SSL context with keystore null, truststore SecurityStore(path=F:\temp\kafka.server.truststore.jks, modificationTime=Mon Apr 06 18:56:42 GMT+03:00 2020), provider SunJSSE.
2020-04-07 17:12:46 INFO  AppInfoParser - Kafka version: 5.4.1-ccs
2020-04-07 17:12:46 INFO  AppInfoParser - Kafka commitId: bd7407ab4c5d30c1
2020-04-07 17:12:46 INFO  AppInfoParser - Kafka startTimeMs: 1586268766652
2020-04-07 17:12:46 DEBUG KafkaConsumer - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Kafka consumer initialized
Consumer created...
2020-04-07 17:12:46 INFO  KafkaConsumer - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Subscribed to topic(s): test_topic3
Consumer polling started....
2020-04-07 17:12:46 DEBUG AbstractCoordinator - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Sending FindCoordinator request to broker prd-app-kafka-0:9093 (id: -1 rack: null)
2020-04-07 17:12:46 DEBUG NetworkClient - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Initiating connection to node prd-app-kafka-0:9093 (id: -1 rack: null) using address prd-app-kafka-0/10.10.54.72
2020-04-07 17:12:47 DEBUG SaslClientAuthenticator - Set SASL client state to SEND_APIVERSIONS_REQUEST
2020-04-07 17:12:47 DEBUG SaslClientAuthenticator - Creating SaslClient: client=null;service=kafka;serviceHostname=prd-app-kafka-0;mechs=[SCRAM-SHA-256]
2020-04-07 17:12:47 DEBUG ScramSaslClient - Setting SASL/SCRAM_SHA_256 client state to SEND_CLIENT_FIRST_MESSAGE
2020-04-07 17:12:47 DEBUG Selector - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Created socket with SO_RCVBUF = 65536, SO_SNDBUF = 131072, SO_TIMEOUT = 0 to node -1
2020-04-07 17:12:47 DEBUG NetworkClient - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Completed connection to node -1. Fetching API versions.
records.count():: 0
Consumer polling started....
records.count():: 0
Consumer polling started....
records.count():: 0
Consumer polling started....
2020-04-07 17:13:16 DEBUG AbstractCoordinator - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Coordinator discovery failed, refreshing metadata
records.count():: 0
Consumer polling started....
2020-04-07 17:13:26 DEBUG AbstractCoordinator - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Sending FindCoordinator request to broker prd-app-kafka-0:9093 (id: -1 rack: null)
records.count():: 0
Consumer polling started....
records.count():: 0
Consumer polling started....
2020-04-07 17:13:56 DEBUG AbstractCoordinator - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Coordinator discovery failed, refreshing metadata
records.count():: 0
Consumer polling started....
2020-04-07 17:13:56 DEBUG AbstractCoordinator - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Sending FindCoordinator request to broker prd-app-kafka-0:9093 (id: -1 rack: null)
records.count():: 0

Please let me know if any required configuration is missing.

java ssl apache-kafka jaas
1 Answer

Based on your CMD command, admin.conf and your truststore are not in F:\\temp\\ (the relative path passed as --consumer.config admin.conf resolves against the working directory, here bin\windows).

Also, you need to add the SASL JAAS config to the client. Check the log for sasl.jaas.config = null; if it appears, the JAAS configuration was not picked up. Then verify the client settings against the ones that work on the CLI.
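
To make the client match the CLI exactly, one option (a minimal sketch, not from the original answer; the absolute path to admin.conf is an assumption based on the working directory of the CMD command) is to load the very same file the console consumer uses and add only the consumer-specific keys on top:

    // import java.io.FileInputStream;
    // import java.util.Properties;

    Properties props = new Properties();
    // Load the exact file the working kafka-console-consumer command reads
    try (FileInputStream in = new FileInputStream(
            "F:\\Soft\\confluent-5.1.0-2.11\\confluent-5.1.0\\bin\\windows\\admin.conf")) {
        props.load(in);
    }
    // Settings that admin.conf does not contain
    props.put("bootstrap.servers", "prd-app-kafka-0:9093,prd-app-kafka-1:9093,prd-app-kafka-2:9093");
    props.put("group.id", "demo-consumer");
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("auto.offset.reset", "earliest");

This rules out any drift between what the CLI reads and what the Java client builds by hand.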
