Databricks error java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;

Question (votes: 1, answers: 2)

I am trying to run some example code from this link: https://databricks-prod-cloudfront.cloud.databricks.com/public/4027ec902e239c93eaaa8714f173bcfc/5537430417240233/312903576646278/3506802399907740/latest.html

I am running it in a Databricks notebook on a cluster with Runtime 6.3 (which includes Apache Spark 2.4.4 and Scala 2.11). I first create a DataFrame with:
import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder.getOrCreate
import spark.implicits._

val df = Seq(
    ("one", 2.0),
    ("two", 1.5),
    ("three", 8.0)
  ).toDF("id", "val")

Then I try to get a list of strings by running df.select("id").map(_.getString(0)).collect.toList

and I get the error below:

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;

at line3700fe51392b4abe9744f6b3a059dbfa46.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1275538363433250:2)
at line3700fe51392b4abe9744f6b3a059dbfa46.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1275538363433250:53)
at line3700fe51392b4abe9744f6b3a059dbfa46.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1275538363433250:55)
at line3700fe51392b4abe9744f6b3a059dbfa46.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1275538363433250:57)
at line3700fe51392b4abe9744f6b3a059dbfa46.$read$$iw$$iw$$iw$$iw.<init>(command-1275538363433250:59)
at line3700fe51392b4abe9744f6b3a059dbfa46.$read$$iw$$iw$$iw.<init>(command-1275538363433250:61)
at line3700fe51392b4abe9744f6b3a059dbfa46.$read$$iw$$iw.<init>(command-1275538363433250:63)
at line3700fe51392b4abe9744f6b3a059dbfa46.$read$$iw.<init>(command-1275538363433250:65)
at line3700fe51392b4abe9744f6b3a059dbfa46.$read.<init>(command-1275538363433250:67)
at line3700fe51392b4abe9744f6b3a059dbfa46.$read$.<init>(command-1275538363433250:71)
at line3700fe51392b4abe9744f6b3a059dbfa46.$read$.<clinit>(command-1275538363433250)
at line3700fe51392b4abe9744f6b3a059dbfa46.$eval$.$print$lzycompute(<notebook>:7)
at line3700fe51392b4abe9744f6b3a059dbfa46.$eval$.$print(<notebook>:6)
at line3700fe51392b4abe9744f6b3a059dbfa46.$eval.$print(<notebook>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:699)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:652)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:385)
at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:362)
at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:251)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:246)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:49)
at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:288)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:49)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:362)
at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
at scala.util.Try$.apply(Try.scala:192)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)
at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
at java.lang.Thread.run(Thread.java:748)

I get the same error when running df.select("id").collect().map(_(0)).toList

but not when running df.select("id").rdd.map(_(0)).collect.toList

The command above runs successfully and returns a List[Any], but I need a List[String].
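For reference, here is a minimal sketch (assuming the df and the spark.implicits._ import defined above) of two ways one could end up with a List[String] rather than a List[Any]. Neither is claimed to fix the NoSuchMethodError itself, and the Dataset variant may hit the same error if the root cause is a Scala binary mismatch:

import spark.implicits._

// Via the RDD API: read column 0 of each Row as a String before collecting.
val idsFromRdd: List[String] = df.select("id").rdd.map(_.getString(0)).collect.toList

// Via the Dataset API: view the single-column DataFrame as a Dataset[String].
val idsFromDs: List[String] = df.select("id").as[String].collect.toList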

Can someone advise? I suspect a Spark/Scala version mismatch, but I cannot work out what is wrong.
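As a quick sanity check on the version-mismatch theory, one could print the versions the driver actually reports (a small sketch using standard Spark/Scala APIs) and compare them with the cluster's advertised Runtime 6.3 (Spark 2.4.4, Scala 2.11):

// Versions as seen by the running notebook, for comparison with the cluster spec.
println(s"Spark version: ${spark.version}")
println(s"Scala version: ${scala.util.Properties.versionString}")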

scala apache-spark azure-databricks nosuchmethoderror
2 Answers