NoClassDefFoundError when trying to run a Java Spark application from the command line


I'm trying to run a Java application that uses Spark (the sparkjava web framework, com.sparkjava), but I keep hitting a NoClassDefFoundError when I run mvn package; mvn exec:java.
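For reference, mvn exec:java picks up its entry point from the exec-maven-plugin configuration; a minimal setup looks roughly like the sketch below, where app.Main is a hypothetical placeholder rather than the project's real class name:

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>1.6.0</version>
    <configuration>
        <!-- app.Main is a placeholder; point this at your actual main class -->
        <mainClass>app.Main</mainClass>
    </configuration>
</plugin>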

I get the same error (below) whether I run the program from PowerShell or from IntelliJ. When I remove a handful of Maven dependencies, the error goes away and the servlet runs on localhost.

Error message:

Exception in thread "Thread-0" java.lang.NoClassDefFoundError: javax/servlet/http/HttpSessionIdListener
    at org.eclipse.jetty.server.session.SessionHandler.<clinit>(SessionHandler.java:140)
    at spark.embeddedserver.jetty.EmbeddedJettyFactory.create(EmbeddedJettyFactory.java:43)
    at spark.embeddedserver.EmbeddedServers.create(EmbeddedServers.java:65)
    at spark.Service.lambda$init$2(Service.java:497)
    at java.base/java.lang.Thread.run(Thread.java:835)
Caused by: java.lang.ClassNotFoundException: javax.servlet.http.HttpSessionIdListener
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
    ... 5 more
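HttpSessionIdListener was introduced in Servlet 3.1, so this stack trace suggests an older (pre-3.1) servlet API is winning on the classpath. One way to see which artifact supplies it is the maven-dependency-plugin's tree goal, filtered to the servlet group:

    # List every javax.servlet artifact and the dependency path that pulls it in
    mvn dependency:tree -Dincludes=javax.servlet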

The dependencies I removed from pom.xml:

<dependency>
    <groupId>com.googlecode.json-simple</groupId>
    <artifactId>json-simple</artifactId>
    <version>1.1</version>
</dependency>

<dependency>
    <groupId>edu.stanford.nlp</groupId>
    <artifactId>stanford-corenlp</artifactId>
    <version>3.9.2</version>
</dependency>

<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>8.0.13</version>
</dependency>

<dependency>
    <groupId>edu.stanford.nlp</groupId>
    <artifactId>stanford-corenlp</artifactId>
    <version>3.9.2</version>
    <classifier>models</classifier>
</dependency>

The servlet runs fine with only the Spark dependency:

<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.6.0</version>
</dependency>

But as soon as I add any of the others back, the exception occurs.
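That pattern points to a classic Maven conflict: one of the re-added artifacts probably drags in an older javax.servlet-api that shadows the Servlet 3.1 classes Jetty needs. If dependency:tree confirms the culprit (stanford-corenlp is a plausible suspect, though that is an assumption to verify), an exclusion on that artifact keeps the old API out:

<dependency>
    <groupId>edu.stanford.nlp</groupId>
    <artifactId>stanford-corenlp</artifactId>
    <version>3.9.2</version>
    <exclusions>
        <!-- Assumption: this artifact transitively supplies a pre-3.1 servlet API;
             excluding it lets Jetty's Servlet 3.1 dependency win -->
        <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>javax.servlet-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>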

java apache-spark classnotfoundexception noclassdeffounderror deploying
1 Answer

0 votes

Solved it! The sparkjava dependency needs five or more compile-scope dependencies alongside it, which can be found here: https://mvnrepository.com/artifact/com.sparkjava/spark-core/2.9.1
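An alternative to hand-copying all of those into the pom is to pin the Servlet API once at the top level, so Maven's nearest-wins resolution prefers it over anything older arriving transitively; HttpSessionIdListener lives in Servlet 3.1, the version Jetty 9.4 (and therefore spark-core) builds against. A minimal sketch; verify the version against the page linked above:

<dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>javax.servlet-api</artifactId>
    <version>3.1.0</version>
</dependency>

After editing the pom, mvn dependency:tree -Dincludes=javax.servlet should show only the 3.1.0 artifact being selected.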
