Spring Cloud Stream (Hoxton) Kafka producer/consumer not working with EmbeddedKafka in an integration test

Problem description (votes: 0, answers: 1)

I have a working application that uses a Producer with the latest updates shipped with Hoxton. Now I am trying to add some integration tests asserting that the producer actually produces messages as expected. The problem is that the consumer I use in the test never reads anything from the topic.

To make the problem reproducible, I reused a project from the Spring Cloud Stream samples (spring-cloud-stream-samples/source-samples/dynamic-destination-source-kafka) and adjusted it as follows:

DynamicDestinationSourceApplication (the EmitterProcessor is now a bean)


@SpringBootApplication
@RestController
public class DynamicDestinationSourceApplication {

    @Autowired
    private ObjectMapper jsonMapper;

    @Autowired
    private EmitterProcessor<Message<?>> processor;

    public static void main(String[] args) {
        SpringApplication.run(DynamicDestinationSourceApplication.class, args);
    }

    @SuppressWarnings("unchecked")
    @RequestMapping(path = "/", method = POST, consumes = "*/*")
    @ResponseStatus(HttpStatus.ACCEPTED)
    public void handleRequest(@RequestBody String body, @RequestHeader(HttpHeaders.CONTENT_TYPE) Object contentType) throws Exception {
        Map<String, String> payload = jsonMapper.readValue(body, Map.class);
        String destinationName = payload.get("id");
        Message<?> message = MessageBuilder.withPayload(payload)
                .setHeader("spring.cloud.stream.sendto.destination", destinationName).build();
        processor.onNext(message);
    }

    @Bean
    public Supplier<Flux<Message<?>>> supplier() {
        return () -> processor;
    }

    @Bean
    public EmitterProcessor<Message<?>> processor(){
        return EmitterProcessor.create();
    }

    // The following sink is used as the test consumer; it logs the data received through the consumer.
    static class TestSink {

        private final Log logger = LogFactory.getLog(getClass());

        @Bean
        public Consumer<String> receive1() {
            return data -> logger.info("Data received from customer-1..." + data);
        }

        @Bean
        public Consumer<String> receive2() {
            return data -> logger.info("Data received from customer-2..." + data);
        }
    }
}
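
For context, the sample's functional beans are bound through configuration rather than code. A minimal sketch of what that configuration typically looks like (the `spring.cloud.function.definition` property and `<name>-in-0` binding convention are standard Spring Cloud Stream; the destination names here are illustrative assumptions, not taken from the question):

```yaml
spring:
  cloud:
    function:
      # Activate the supplier and both consumers as bindable functions.
      definition: supplier;receive1;receive2
    stream:
      bindings:
        # Input bindings follow the <functionName>-in-0 convention.
        receive1-in-0:
          destination: customer-1
        receive2-in-0:
          destination: customer-2
```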

ModuleApplicationTests

@EmbeddedKafka
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(classes = DynamicDestinationSourceApplication.class)
@WebAppConfiguration
@DirtiesContext
public class ModuleApplicationTests {

    private static String TOPIC = "someTopic";

    @Autowired
    private EmbeddedKafkaBroker embeddedKafkaBroker;

    @Autowired
    private EmitterProcessor<Message<?>> processor;

    @Test
    public void shouldProduceAndConsume() {

        Map<String, Object> configs = new HashMap<>(KafkaTestUtils.consumerProps("consumer", "false", embeddedKafkaBroker));
        Consumer<String, String> consumer = new DefaultKafkaConsumerFactory<>(configs, new StringDeserializer(), new StringDeserializer()).createConsumer();
        consumer.subscribe(Collections.singleton(TOPIC));
        consumer.poll(0);

        Message<?> message = MessageBuilder.withPayload(new HashMap<String,String>(){{put("somekey", "somevalue");}})
                .setHeader("spring.cloud.stream.sendto.destination", TOPIC).build();
        processor.onNext(message);

        ConsumerRecord<String, String> someRecord = KafkaTestUtils.getSingleRecord(consumer, TOPIC);
        System.out.println(someRecord);

    }

}
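
One thing worth checking in a setup like this: `@EmbeddedKafka` publishes the broker list under the `spring.embedded.kafka.brokers` property, while the Kafka binder reads its broker list from `spring.cloud.stream.kafka.binder.brokers`, so the two usually need to be wired together explicitly or the binder will try to reach `localhost:9092`. A sketch of the annotation-level wiring (the test body stays as above; this is a suggested configuration, not code from the question):

```java
@EmbeddedKafka
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(
        classes = DynamicDestinationSourceApplication.class,
        properties = {
                // Point the Kafka binder at the embedded broker started by @EmbeddedKafka.
                "spring.cloud.stream.kafka.binder.brokers=${spring.embedded.kafka.brokers}"
        })
@DirtiesContext
public class ModuleApplicationTests {
    // ... test body unchanged ...
}
```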

The test fails with `No records found for topic`. Why does this not work during the test?

UPDATE:

My actual project behaves somewhat differently from the project above: what I see there is that emitterProcessor.onNext() never ends up calling AbstractMessageHandler.onNext().

Debugging into emitterProcessor.onNext(), I can see that it calls drain(), and at FluxPublish.PubSubInner<T>[] a = subscribers; the subscribers array is empty, whereas during a normal application run it contains a subscriber.

apache-kafka spring-cloud-stream spring-cloud-stream-binder-kafka
1 Answer (score: 0)

I had mistakenly added testImplementation("org.springframework.cloud:spring-cloud-stream-test-support") as a dependency. This pulls in the test binder, which is not intended to be used in integration tests.
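
In Gradle terms, the fix is simply to remove that artifact from the test classpath so the real Kafka binder is used during tests (Kotlin DSL shown as an illustration; `spring-kafka-test` is the artifact that provides `@EmbeddedKafka`):

```kotlin
dependencies {
    implementation("org.springframework.cloud:spring-cloud-stream")
    implementation("org.springframework.cloud:spring-cloud-stream-binder-kafka")

    // Do NOT add this: it swaps the Kafka binder for the in-memory test binder,
    // so nothing is ever sent to the embedded broker.
    // testImplementation("org.springframework.cloud:spring-cloud-stream-test-support")

    // Test against the real binder plus an embedded broker instead.
    testImplementation("org.springframework.kafka:spring-kafka-test")
}
```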
