spark-submit throws an error: java.lang.ClassNotFoundException: scala.runtime.java8.JFunction2$mcIII$sp

Problem description (1 vote, 1 answer)

I wrote a word-count program, but when I try to run it from the cmd window with the following command, it throws an exception.

spark-submit --class com.sample.WordCount --master local file:///E:/WordCountSample/target/WordCountSample-0.0.1-SNAPSHOT.jar file:///C:/Users/siddh/OneDrive/Desktop/sample.txt
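The word-count code itself is not included in the question. For context, a minimal hypothetical sketch of what it likely does is below (names and structure are assumptions); reduceByKey(_ + _) is exactly the kind of specialized (Int, Int) => Int lambda that Scala 2.12 compiles against the JFunction2$mcIII$sp class named in the stack trace:

package com.sample

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCountSample")
    val sc = new SparkContext(conf)

    // Split each line on whitespace, pair each word with 1, then sum the counts.
    val counts = sc.textFile(args(0))
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _) // (Int, Int) => Int lambda -- the JFunction2$mcIII$sp in the error

    counts.collect().foreach(println)
    sc.stop()
  }
}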

pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>SparkSampleInScala</groupId>
    <artifactId>WordCountSample</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>WordCountSample</name>
    <url>http://maven.apache.org</url>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <!--<build> <plugins> <plugin> <groupId>org.apache.maven.plugins</groupId> 
        <artifactId>maven-surefire-plugin</artifactId> <version>3.0.0-M1</version> 
        </plugin> </plugins> </build> -->
    <dependencies>

        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>2.4.0</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.12.8</version>
        </dependency>


    </dependencies>
</project>  

However, when I run spark-submit it throws the following error:

Exception in thread "main" java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction2$mcIII$sp
        at com.sample.WordCount$.main(WordCount.scala:22)
        at com.sample.WordCount.main(WordCount.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction2$mcIII$sp
        ... 14 more
Caused by: java.lang.ClassNotFoundException: scala.runtime.java8.JFunction2$mcIII$sp

So what needs to be done in this case? I am using Spark version 2.4.0 and have Scala version 2.12.8 installed on my Windows machine. Please help me, as I have been stuck on this for days. Thanks in advance :)

java scala apache-spark hadoop
1 Answer

2 votes

It looks like you are using Spark 2.4.x with Scala 2.12, which is likely a compatibility problem. From the Spark documentation: Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).
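Applied to the pom.xml above, that means moving both the Spark artifact suffix and the scala-library dependency to the 2.11 line. A minimal sketch of the two dependency blocks (2.11.12 is one compatible 2.11.x choice, not the only one):

        <dependency>
            <groupId>org.apache.spark</groupId>
            <!-- _2.11 suffix: the Scala binary version Spark 2.4.0 ships with -->
            <artifactId>spark-core_2.11</artifactId>
            <version>2.4.0</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <!-- a 2.11.x release compatible with Spark 2.4.0 -->
            <version>2.11.12</version>
        </dependency>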


0 votes

I had the same problem and solved it by changing the Scala version I was using during development to match the version that ships with Spark.

When I started Spark with ./spark-shell, it said Using Scala version 2.11.12, so I changed the Scala version in build.sbt from 2.12.8 to 2.11.12 and everything worked. I was using Spark version 2.4.3.
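For reference, the resulting build.sbt might look like the following minimal sketch (Scala and Spark versions taken from this answer; the %% operator appends the matching Scala binary suffix to the artifact name automatically):

// build.sbt -- match the Scala version reported by spark-shell
scalaVersion := "2.11.12"

// %% resolves to spark-core_2.11 given the scalaVersion above
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3"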
