Error when running spark-shell on my system; pyspark runs fine

Problem description

I recently installed Spark on my system, but I am unable to run spark-shell.

These are the steps I took:

  • Installed spark-3.5.1-bin-hadoop3-scala2.13
  • Removed the older JDK and installed 21.0.2.0
  • Placed the corresponding winutils in the Hadoop directory
  • Created the entries in the environment variables (a rough sketch follows this list)
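
For reference, the environment variable entries were roughly along these lines; the paths below are placeholders, not the exact directories on my system:

    :: Placeholder install locations; adjust to the actual directories used.
    setx JAVA_HOME "C:\Program Files\Java\jdk-21"
    setx SPARK_HOME "C:\spark\spark-3.5.1-bin-hadoop3-scala2.13"
    setx HADOOP_HOME "C:\hadoop"
    :: winutils.exe sits under %HADOOP_HOME%\bin
    setx PATH "%PATH%;%JAVA_HOME%\bin;%SPARK_HOME%\bin;%HADOOP_HOME%\bin"
    :: Quick check that the new JDK is the one picked up on PATH
    java -version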

Now the pyspark shell works fine and the Spark UI is also running.

When I try to run spark-shell, it throws a "java.nio.file.NoSuchFileException". Below is the output I get.

Output

Using Scala version 2.13.8 (Java HotSpot(TM) 64-Bit Server VM, Java 21.0.2)
Type in expressions to have them evaluated.
Type :help for more information.
ReplGlobal.abort: bad constant pool index: 0 at pos: 48461
ReplGlobal.abort: bad constant pool index: 0 at pos: 48461
Exception in thread "main" scala.reflect.internal.FatalError:
  bad constant pool index: 0 at pos: 48461
     while compiling: <no file>
        during phase: globalPhase=<no phase>, enteringPhase=<some phase>
     library version: version 2.13.8
    compiler version: version 2.13.8
  reconstructed args: -classpath  -Yrepl-class-based -Yrepl-outdir C:\Users\xyz\AppData\Local\Temp\spark-07dbe56e-21ca-43d8-a131-ab0117e9c5f7\repl-90ddff5b-0bb5-4869-89d1-d9783e201afd

  last tree to typer: EmptyTree
       tree position: <unknown>
            tree tpe: <notype>
              symbol: null
           call site: <none> in <none>

== Source file context for tree position ==


        at scala.reflect.internal.Reporting.abort(Reporting.scala:69)
        at scala.reflect.internal.Reporting.abort$(Reporting.scala:65)
        at scala.tools.nsc.interpreter.IMain$$anon$1.scala$tools$nsc$interpreter$ReplGlobal$$super$abort(IMain.scala:149)
        at scala.tools.nsc.interpreter.ReplGlobal.abort(ReplGlobal.scala:27)
        at scala.tools.nsc.interpreter.ReplGlobal.abort$(ReplGlobal.scala:24)
        at scala.tools.nsc.interpreter.IMain$$anon$1.abort(IMain.scala:149)
        at scala.tools.nsc.symtab.classfile.ClassfileParser$ConstantPool.errorBadIndex(ClassfileParser.scala:407)
        at scala.tools.nsc.symtab.classfile.ClassfileParser$ConstantPool.getExternalName(ClassfileParser.scala:262)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.readParamNames$1(ClassfileParser.scala:853)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.parseAttribute$1(ClassfileParser.scala:859)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.$anonfun$parseAttributes$6(ClassfileParser.scala:936)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.parseAttributes(ClassfileParser.scala:936)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.parseMethod(ClassfileParser.scala:635)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.parseClass(ClassfileParser.scala:548)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.$anonfun$parse$2(ClassfileParser.scala:174)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.$anonfun$parse$1(ClassfileParser.scala:159)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.parse(ClassfileParser.scala:142)
        at scala.tools.nsc.symtab.SymbolLoaders$ClassfileLoader.doComplete(SymbolLoaders.scala:342)
        at scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.$anonfun$complete$2(SymbolLoaders.scala:249)
        at scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.complete(SymbolLoaders.scala:247)
        at scala.reflect.internal.Symbols$Symbol.completeInfo(Symbols.scala:1561)
        at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1533)
        at scala.reflect.internal.Definitions.scala$reflect$internal$Definitions$$enterNewMethod(Definitions.scala:47)
        at scala.reflect.internal.Definitions$DefinitionsClass.String_$plus$lzycompute(Definitions.scala:1256)
        at scala.reflect.internal.Definitions$DefinitionsClass.String_$plus(Definitions.scala:1256)
        at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreMethods$lzycompute(Definitions.scala:1577)
        at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreMethods(Definitions.scala:1559)
        at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1590)
        at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1590)
        at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1646)
        at scala.tools.nsc.Global$Run.<init>(Global.scala:1226)
        at scala.tools.nsc.interpreter.IMain.liftedTree1$1(IMain.scala:152)
        at scala.tools.nsc.interpreter.IMain.global$lzycompute(IMain.scala:151)
        at scala.tools.nsc.interpreter.IMain.global(IMain.scala:142)
        at scala.tools.nsc.interpreter.IMain.withSuppressedSettings(IMain.scala:106)
        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$run$1(ILoop.scala:954)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
        at scala.tools.nsc.interpreter.shell.ReplReporterImpl.withoutPrintingResults(Reporter.scala:64)
        at scala.tools.nsc.interpreter.shell.ILoop.run(ILoop.scala:954)
        at org.apache.spark.repl.Main$.doMain(Main.scala:84)
        at org.apache.spark.repl.Main$.main(Main.scala:59)
        at org.apache.spark.repl.Main.main(Main.scala)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:75)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:52)
        at java.base/java.lang.reflect.Method.invoke(Method.java:580)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1029)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:194)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:217)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1120)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1129)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
24/04/01 01:12:33 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\xyz\AppData\Local\Temp\spark-07dbe56e-21ca-43d8-a131-ab0117e9c5f7\repl-90ddff5b-0bb5-4869-89d1-d9783e201afd
java.nio.file.NoSuchFileException: C:\Users\xyz\AppData\Local\Temp\spark-07dbe56e-21ca-43d8-a131-ab0117e9c5f7\repl-90ddff5b-0bb5-4869-89d1-d9783e201afd
        at java.base/sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:85)
        at java.base/sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:103)
        at java.base/sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:108)
        at java.base/sun.nio.fs.WindowsFileAttributeViews$Basic.readAttributes(WindowsFileAttributeViews.java:53)
        at java.base/sun.nio.fs.WindowsFileAttributeViews$Basic.readAttributes(WindowsFileAttributeViews.java:38)
        at java.base/sun.nio.fs.WindowsFileSystemProvider.readAttributes(WindowsFileSystemProvider.java:197)
        at java.base/java.nio.file.Files.readAttributes(Files.java:1853)
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:124)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:117)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:90)
        at org.apache.spark.util.SparkFileUtils.deleteRecursively(SparkFileUtils.scala:121)
        at org.apache.spark.util.SparkFileUtils.deleteRecursively$(SparkFileUtils.scala:120)
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1126)
        at org.apache.spark.util.ShutdownHookManager$.$anonfun$new$4(ShutdownHookManager.scala:65)
        at org.apache.spark.util.ShutdownHookManager$.$anonfun$new$4$adapted(ShutdownHookManager.scala:62)
        at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1328)
        at org.apache.spark.util.ShutdownHookManager$.$anonfun$new$2(ShutdownHookManager.scala:62)
        at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
        at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928)
        at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
        at scala.util.Try$.apply(Try.scala:210)
        at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
        at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        at java.base/java.lang.Thread.run(Thread.java:1583)

Can someone help? Any help would be greatly appreciated. Thanks.

windows apache-spark
1 Answer

I am facing the same issue; spark-shell does not run for me either. Any hint would be of great help.
