Spring Cloud Stream Kafka consumer/producer API exactly-once semantics (transactional)

Problem description

I am running into a problem with Spring Cloud Stream Kafka when transactions are enabled and dynamic destinations are used. I have two different services:

  • The first service listens on a Solace queue and produces the messages to Kafka topic-1 (transactions enabled).
  • The second service listens on Kafka topic-1 and writes the messages to another Kafka topic-2. We do not commit manually, transactions are enabled for producing to the other topic, auto-commit of offsets is false, and isolation.level is set to read_committed. The target topic name is determined at runtime, so we use the dynamic destination resolver.

The problem is with the second service. If I run it with just @StreamListener and @SendTo, it works fine (a sketch of that variant follows the stack trace below). But as soon as I switch to dynamic destinations, I get the following error:

Caused by: java.lang.IllegalStateException: Cannot perform operation after producer has been closed
    at org.apache.kafka.clients.producer.KafkaProducer.throwIfProducerClosed(KafkaProducer.java:810) ~[kafka-clients-2.0.0.jar:na]
    at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:819) ~[kafka-clients-2.0.0.jar:na]
    at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:803) ~[kafka-clients-2.0.0.jar:na]
    at org.springframework.kafka.core.DefaultKafkaProducerFactory$CloseSafeProducer.send(DefaultKafkaProducerFactory.java:423) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
    at org.springframework.kafka.core.KafkaTemplate.doSend(KafkaTemplate.java:351) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
    at org.springframework.kafka.core.KafkaTemplate.send(KafkaTemplate.java:209) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
    at org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler.handleRequestMessage(KafkaProducerMessageHandler.java:382) ~[spring-integration-kafka-3.1.0.RELEASE.jar:3.1.0.RELEASE]
    at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:123) [spring-integration-core-5.1.0.RELEASE.jar:5.1.0.RELEASE]
    at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:169) [spring-integration-core-5.1.0.RELEASE.jar:5.1.0.RELEASE]
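
For reference, this is roughly what the non-dynamic variant that works looks like. It is only a sketch: the method name and msgService.transform(...) are placeholders, not the real code; the point is that the result is published to the statically bound Processor.OUTPUT inside the binder-managed transaction.

    // Sketch of the working, non-dynamic variant: consume from Processor.INPUT and
    // let @SendTo publish the return value to the statically bound Processor.OUTPUT.
    // msgService.transform(...) is a placeholder for the actual transformation.
    @StreamListener(Processor.INPUT)
    @SendTo(Processor.OUTPUT)
    public Object process(@Payload Object inMessage) {
        return msgService.transform((String) inMessage);
    }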

I tried both approaches to the dynamic destination resolver. YAML for Spring Cloud Stream Kafka:

spring: 
  cloud.stream:
      bindings:
        input:
          destination: test_input
          content-type: application/json
          group: test_group
        output:
          destination: test_output
          content-type: application/json
      kafka.binder: 
          configuration: 
            isolation.level: read_committed
            security.protocol: SASL_SSL
            sasl.mechanism: GSSAPI
            sasl.kerberos.service.name: kafka
            ssl.truststore.location: jks
            ssl.truststore.password: 
            ssl.endpoint.identification.algorithm: null            
          brokers: broker1:9092,broker2:9092,broker3:9092
          auto-create-topics: false
          transaction:
            transaction-id-prefix: trans-2
            producer:
              configuration:
                retries: 2000
                acks: all
                security.protocol: SASL_SSL
                sasl.mechanism: GSSAPI
                sasl.kerberos.service.name: kafka
                ssl.truststore.location: jks
                ssl.truststore.password: 
                ssl.endpoint.identification.algorithm: null
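
The consumer-side setting described above (auto-commit of offsets disabled, no manual commit) sits on the input binding; isolation.level is already set at the binder level in the YAML above. A rough sketch of where the binding-level property would go, assuming the Kafka binder property names of this version (not the complete configuration):

    spring:
      cloud.stream:
        kafka:
          bindings:
            input:
              consumer:
                # offsets are committed as part of the binder transaction, not auto-committed
                autoCommitOffset: false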

This is the background for this question: Spring Cloud Stream for Kafka with consumer/producer API exactly once semantics with transaction-id-prefix is not working as expected

Updated code:


    @Autowired
    private BinderAwareChannelResolver resolver;

    @StreamListener(target = Processor.INPUT)
    public void consumer(@Payload Object inMessage, @Headers Map<String, Object> headers) {
        String topicName = null;
        String itemType = null;
        try {
            TransactionSynchronizationManager.setActualTransactionActive(true);
            // Derive the target topic from the incoming payload
            itemType = msgService.itemTypeExtract((String) inMessage);
            topicName = msgService.getTopicName(itemType, (String) inMessage);

            Map<String, Object> headersMap = new HashMap<>();
            headersMap.put(MessageHeaders.CONTENT_TYPE, MediaType.APPLICATION_JSON_VALUE);

            // Resolve the dynamically determined destination and send with a 10s timeout
            resolver.resolveDestination(topicName)
                    .send(MessageBuilder.createMessage(inMessage, new MessageHeaders(headersMap)), 10000);
        } catch (Exception e) {
            // Swallowing the exception here means the listener container never sees a
            // failure, so the binder transaction will not be rolled back
            LOGGER.error("error " + e.getMessage(), e);
        }
    }

Tags: kafka-consumer-api, kafka-producer-api, spring-cloud-stream
1 Answer

There is a bug in the binder; I opened an issue to get it fixed.
