I've cloned the DL4J examples and am just trying to run one of them — LogDataExample.java. The project built successfully and everything seemed fine, but on launching it the following exception was thrown:
Exception in thread "main" java.lang.NoSuchMethodError: io.netty.util.concurrent.SingleThreadEventExecutor.<init>(Lio/netty/util/concurrent/EventExecutorGroup;Ljava/util/concurrent/Executor;ZLjava/util/Queue;Lio/netty/util/concurrent/RejectedExecutionHandler;)V
at io.netty.channel.SingleThreadEventLoop.<init>(SingleThreadEventLoop.java:65)
at io.netty.channel.nio.NioEventLoop.<init>(NioEventLoop.java:138)
at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:138)
at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:37)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:58)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:47)
at io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:59)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:78)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:73)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:60)
at org.apache.spark.network.util.NettyUtils.createEventLoop(NettyUtils.java:50)
at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:102)
at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at org.datavec.transform.logdata.LogDataExample.main(LogDataExample.java:85)
I haven't been able to find anything online that helps me solve this. My code is identical to the example's. The pom.xml contains the following:
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.46.Final</version>
</dependency>
I think you are forcing a newer version of netty than Spark supports.
By running mvn dependency:tree you can see which netty version Spark expects, and use that one instead of the version you defined.
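As a sketch of what that looks like in practice, you could either remove your own netty-all entry entirely and let Spark's transitive version win, or pin the version that dependency:tree reports via dependencyManagement. The version number below is only a placeholder — substitute whatever `mvn dependency:tree -Dincludes=io.netty` actually prints for your Spark version:

```xml
<!-- Illustrative only: replace the version with the one
     `mvn dependency:tree -Dincludes=io.netty` reports for Spark. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>io.netty</groupId>
      <artifactId>netty-all</artifactId>
      <version>${spark.netty.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```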
If you don't care about Spark and just want to use DataVec to transform your data, take a look at https://www.dubs.tech/guides/quickstart-with-dl4j/. It is somewhat outdated regarding dependencies, but the datavec section shows how to use it without Spark.