Kafka Connect sink to Cassandra :: java.lang.VerifyError: Bad return type

Question · 1 vote · 1 answer

I'm trying to set up a Kafka Connect sink to collect data from a topic into a Cassandra table, using the DataStax connector: https://downloads.datastax.com/#akc

I'm running a standalone worker directly on the broker host, which runs Kafka 0.10.2.2-1:

name=dse-sink
connector.class=com.datastax.kafkaconnector.DseSinkConnector
tasks.max=1
datastax-java-driver.advanced.protocol.version = V4

key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false

internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false

plugin.path=/usr/share/java/kafka-connect-dse/kafka-connect-dse-1.2.1.jar

topics=connect-test
contactPoints=172.16.0.48
loadBalancing.localDc=datacenter1
port=9042

ignoreErrors=true

topic.connect-test.cdrs.test.mapping= kafkakey=key, value=value
topic.connect-test.cdrs.test.consistencyLevel=LOCAL_QUORUM
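
For context: with a standalone worker, connector settings like the above are normally paired with a separate worker properties file passed as the first argument to connect-standalone.sh. A minimal sketch of such a worker config (the values below are illustrative placeholders, not taken from the question):

# connect-standalone.properties -- hypothetical worker config
bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
# File where the standalone worker stores source offsets
offset.storage.file.filename=/tmp/connect.offsets
# Directory containing the DataStax connector jar
plugin.path=/usr/share/java/kafka-connect-dse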

But I get the following error:

[2019-12-23 16:58:43,165] ERROR Task dse-sink-0 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask)
java.lang.VerifyError: Bad return type
Exception Details:
  Location:
    com/fasterxml/jackson/databind/cfg/MapperBuilder.streamFactory()Lcom/fasterxml/jackson/core/TokenStreamFactory; @7: areturn
  Reason:
    Type 'com/fasterxml/jackson/core/JsonFactory' (current frame, stack[0]) is not assignable to 'com/fasterxml/jackson/core/TokenStreamFactory' (from method signature)
  Current Frame:
    bci: @7
    flags: { }
    locals: { 'com/fasterxml/jackson/databind/cfg/MapperBuilder' }
    stack: { 'com/fasterxml/jackson/core/JsonFactory' }
  Bytecode:
    0x0000000: 2ab4 0002 b600 08b0                    

	at com.fasterxml.jackson.databind.json.JsonMapper.builder(JsonMapper.java:114)
	at com.datastax.dsbulk.commons.codecs.json.JsonCodecUtils.getObjectMapper(JsonCodecUtils.java:36)
	at com.datastax.kafkaconnector.codecs.CodecSettings.init(CodecSettings.java:131)
	at com.datastax.kafkaconnector.state.LifeCycleManager.lambda$buildInstanceState$9(LifeCycleManager.java:423)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.HashMap$ValueSpliterator.forEachRemaining(HashMap.java:1625)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
	at com.datastax.kafkaconnector.state.LifeCycleManager.buildInstanceState(LifeCycleManager.java:457)
	at com.datastax.kafkaconnector.state.LifeCycleManager.lambda$startTask$0(LifeCycleManager.java:106)
	at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
	at com.datastax.kafkaconnector.state.LifeCycleManager.startTask(LifeCycleManager.java:101)
	at com.datastax.kafkaconnector.DseSinkTask.start(DseSinkTask.java:74)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:244)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:145)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:139)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:182)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

There is no other error on the Cassandra or Kafka side. I can see active connections on the Cassandra node, but nothing ever reaches the keyspace.

Any idea why?


cassandra apache-kafka datastax apache-kafka-connect
1 Answer

0 votes

I suspect this is caused by the use of the JSON internal converters with BigDecimal data (see this related SO question). As described in the following blog post, internal.key.converter and internal.value.converter have been deprecated since Kafka 2.0 and should not be set explicitly. Could you comment out all of the internal. properties and try again?
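
For illustration, a minimal sketch of that change to the connector properties from the question (only the converter section is shown; the commented-out lines are the ones to drop):

key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false

# Deprecated since Kafka 2.0 -- leave these to the worker's defaults:
#internal.key.converter=org.apache.kafka.connect.json.JsonConverter
#internal.value.converter=org.apache.kafka.connect.json.JsonConverter
#internal.key.converter.schemas.enable=false
#internal.value.converter.schemas.enable=false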
