Running a Scala jar throws NoSuchMethodError: scala.Predef$.refArrayOps

Problem description · Votes: 0 · Answers: 2

My code runs fine in local mode in IDEA, but when I package it into a jar and upload it to my deployed Spark server, NoSuchMethodError: scala.Predef$.refArrayOps appears. The failing line of code is:

val expectArray = expectVertex.take(2).toArray.sortBy(it => it._1)
expectVertex is a Scala Map whose key type is graphx.VertexId and whose value type is Int.
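For reference, a minimal plain-Scala sketch of that line, using a small hypothetical Map in place of expectVertex (graphx.VertexId is an alias for Long); note the tuple accessor is `._1` with a dot, not `it_1`:

```scala
object SortByExample {
  def main(args: Array[String]): Unit = {
    // Hypothetical stand-in for expectVertex: Long keys play the role of graphx.VertexId.
    val expectVertex: Map[Long, Int] = Map(3L -> 30, 1L -> 10, 2L -> 20)
    // take(2) picks two entries, toArray triggers the Predef.refArrayOps implicit,
    // and sortBy orders the resulting pairs by their key.
    val expectArray = expectVertex.take(2).toArray.sortBy(it => it._1)
    expectArray.foreach(println)
  }
}
```

The `toArray`/`sortBy` chain is exactly where the `refArrayOps` implicit conversion is compiled in, which is why this line is the one that fails at runtime when the Scala versions disagree.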

I also ran into this problem with simple Spark code; the error occurred on a line using an Array method. The code is as follows:

package org.example
import org.apache.spark.graphx.{Edge, Graph}
import org.apache.spark.{SparkConf, SparkContext}

import java.util.logging.{Level, Logger}

/**
 * Hello world!
 *
 */
class App{
  def run(): Unit ={
    Logger.getLogger("org.apache.spark").setLevel(Level.WARNING)
    Logger.getLogger("org.eclipse.jetty.server").setLevel(Level.OFF)
    val conf = new SparkConf().setAppName("AXU test")
      .setMaster("local")
    val sc = new SparkContext(conf)
    val vertices = sc.parallelize(Array((1L, "A"), (2L, "B"), (3L, "C"), (4L, "D")))
    val edges = sc.parallelize(Array(Edge(1L, 2L, "friend"), Edge(2L, 3L, "follow"), Edge(3L, 4L, "friend")))
    val graph = Graph(vertices, edges)
    val inDegrees = graph.inDegrees
    inDegrees.collect().foreach(println)
    val deg = inDegrees.collect()
    for( i <- 0 to deg.length-1){
      print("this is no." + (i+1) + " point indegree:")
      println("id: " + deg(i)._1 + " value: " + deg(i)._2)
    }
    sc.stop()
  }
}

The log is:

Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:65)
    at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
    at org.example.App.run(App.scala:23)
    at org.example.Main$.main(Main.scala:6)
    at org.example.Main.main(Main.scala)

If I delete the code at line 23, which is

inDegrees.collect().foreach(println)
it works fine. The Scala version I both compile and run with is 2.12.7. It looks like I cannot use methods like Array[T].foreach or Array[T].sortBy(it => it._1) in jar packages (I package the jar with Maven). The Maven configuration is as follows:

    <properties>
        <scala.version>2.12.7</scala.version>
        <spark.version>2.4.4</spark.version>
    </properties>


    <build>
        <plugins>
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.2.2</version>
                <executions>
                    <execution>
                        <id>compile-scala</id>
                        <phase>compile</phase>
                        <goals>
                            <goal>add-source</goal>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                    <execution>
                        <id>test-compile-scala</id>
                        <phase>test-compile</phase>
                        <goals>
                            <goal>add-source</goal>
                            <goal>testCompile</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <scalaVersion>${scala.version}</scalaVersion>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.0</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <archive>
                        <manifest>
                            <mainClass>org.example.Main</mainClass>
                        </manifest>
                    </archive>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>assembly</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>

            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>exec-maven-plugin</artifactId>
                <version>1.6.0</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>exec</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <executable>java</executable>
                    <includeProjectDependencies>true</includeProjectDependencies>
                    <includePluginDependencies>false</includePluginDependencies>
                    <classpathScope>compile</classpathScope>
                    <mainClass>org.example.Main</mainClass>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>

Can someone tell me why this problem occurs? Thanks in advance.

scala maven apache-spark intellij-idea jar
2 Answers

0 votes
classloader: scala.reflect.internal.util.ScalaClassLoader$URLClassLoader
classloader urls:
file:/home/hadoop/Spark/jdk/jre/lib/resources.jar
file:/home/hadoop/Spark/jdk/jre/lib/rt.jar
file:/home/hadoop/Spark/jdk/jre/lib/jsse.jar
file:/home/hadoop/Spark/jdk/jre/lib/jce.jar
file:/home/hadoop/Spark/jdk/jre/lib/charsets.jar
file:/home/hadoop/Spark/jdk/jre/lib/jfr.jar
file:/home/hadoop/Spark/scala/lib/jline-2.14.6.jar
file:/home/hadoop/Spark/scala/lib/scala-compiler.jar
file:/home/hadoop/Spark/scala/lib/scala-library.jar
file:/home/hadoop/Spark/scala/lib/scalap-2.12.7.jar
file:/home/hadoop/Spark/scala/lib/scala-parser-combinators_2.12-1.0.7.jar
file:/home/hadoop/Spark/scala/lib/scala-reflect.jar
file:/home/hadoop/Spark/scala/lib/scala-swing_2.12-2.0.3.jar
file:/home/hadoop/Spark/scala/lib/scala-xml_2.12-1.0.6.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/sunjce_provider.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/dnsns.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/jaccess.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/zipfs.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/nashorn.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/sunec.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/sunpkcs11.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/jfxrt.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/localedata.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/cldrdata.jar
file:/home/hadoop/./

0 votes

Most likely you compile the code locally with Scala 2.12, but the server is running Scala 2.13 (or 2.11?).

Try to recompile the code with the Scala version used on the server.

Scala 2.11, 2.12, and 2.13 are binary-incompatible.

The signature of refArrayOps differs between these versions (in a binary-incompatible way):

  • In Scala 2.13

def refArrayOps(scala.Array[scala.Any]): scala.Any
(scalap)

public <T> T[] refArrayOps(T[])
(javap; scalap and javap show different method signatures)

implicit def refArrayOps[T <: AnyRef](xs: Array[T]): ArrayOps[T] @inline()
(API)

  • In Scala 2.12

def refArrayOps(scala.Array[scala.Any]): scala.Array[scala.Any]
(scalap)

public <T> T[] refArrayOps(T[])
(javap)

implicit def refArrayOps[T <: AnyRef](xs: Array[T]): ofRef[T]
(API)

  • In Scala 2.11

def refArrayOps(scala.Array[scala.Any]): scala.collection.mutable.ArrayOps
(scalap)

public <T> scala.collection.mutable.ArrayOps<T> refArrayOps(T[])
(javap)

implicit def refArrayOps[T <: AnyRef](xs: Array[T]): ArrayOps[T]
(API)
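Concretely, recompiling against the server's Scala usually means changing the version properties in the pom and rebuilding. A sketch, assuming the server runs a stock Spark 2.4.4 distribution bundled with Scala 2.11.x (the exact version should be checked on the server, e.g. with `scala -version`):

```xml
<properties>
    <!-- Hypothetical value: must match the Scala version installed on the Spark server. -->
    <scala.version>2.11.12</scala.version>
    <spark.version>2.4.4</spark.version>
</properties>
```

Any Spark dependency artifact IDs would also need the matching binary suffix (e.g. `spark-core_2.11` instead of `spark-core_2.12`), since Scala libraries are published per binary version.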


