I'm getting an error while installing pyspark — how do I fix it?


I want to install and practice pyspark. However, during installation and when launching the pyspark shell, the following error appears.

C:\Windows\System32>spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
ReplGlobal.abort: bad constant pool index: 0 at pos: 48445
[init] error:
  bad constant pool index: 0 at pos: 48445
     while compiling: <no file>
        during phase: globalPhase=<no phase>, enteringPhase=<some phase>
     library version: version 2.12.15
    compiler version: version 2.12.15
  reconstructed args: -classpath  -Yrepl-class-based -Yrepl-outdir C:\Users\Ebi Kare Ganranwei\AppData\Local\Temp\spark-b51c6ad4-8d78-4aff-804a-c31d869d79e7\repl-8a94e77c-5fc5-4f91-b040-e50cc0dca2c7

  last tree to typer: EmptyTree
       tree position: <unknown>
            tree tpe: <notype>
              symbol: null
           call site: <none> in <none>

== Source file context for tree position ==


Exception in thread "main" scala.reflect.internal.FatalError:
  bad constant pool index: 0 at pos: 48445
     while compiling: <no file>
        during phase: globalPhase=<no phase>, enteringPhase=<some phase>
     library version: version 2.12.15
    compiler version: version 2.12.15
  reconstructed args: -classpath  -Yrepl-class-based -Yrepl-outdir C:\Users\Ebi Kare Ganranwei\AppData\Local\Temp\spark-b51c6ad4-8d78-4aff-804a-c31d869d79e7\repl-8a94e77c-5fc5-4f91-b040-e50cc0dca2c7

  last tree to typer: EmptyTree
       tree position: <unknown>
            tree tpe: <notype>
              symbol: null
           call site: <none> in <none>

== Source file context for tree position ==


        at scala.reflect.internal.Reporting.abort(Reporting.scala:69)
        at scala.reflect.internal.Reporting.abort$(Reporting.scala:65)
        at scala.tools.nsc.interpreter.IMain$$anon$2.scala$tools$nsc$interpreter$ReplGlobal$$super$abort(IMain.scala:239)
        at scala.tools.nsc.interpreter.ReplGlobal.abort(ReplGlobal.scala:31)
        at scala.tools.nsc.interpreter.ReplGlobal.abort$(ReplGlobal.scala:29)
        at scala.tools.nsc.interpreter.IMain$$anon$2.abort(IMain.scala:239)
        at scala.tools.nsc.symtab.classfile.ClassfileParser$ConstantPool.errorBadIndex(ClassfileParser.scala:386)
        at scala.tools.nsc.symtab.classfile.ClassfileParser$ConstantPool.getExternalName(ClassfileParser.scala:250)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.readParamNames$1(ClassfileParser.scala:841)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.parseAttribute$1(ClassfileParser.scala:847)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.$anonfun$parseAttributes$7(ClassfileParser.scala:921)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.parseAttributes(ClassfileParser.scala:921)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.parseMethod(ClassfileParser.scala:623)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.$anonfun$parseClass$4(ClassfileParser.scala:536)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.parseClass(ClassfileParser.scala:536)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.$anonfun$parse$2(ClassfileParser.scala:161)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.$anonfun$parse$1(ClassfileParser.scala:147)
        at scala.tools.nsc.symtab.classfile.ClassfileParser.parse(ClassfileParser.scala:130)
        at scala.tools.nsc.symtab.SymbolLoaders$ClassfileLoader.doComplete(SymbolLoaders.scala:343)
        at scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.complete(SymbolLoaders.scala:250)
        at scala.reflect.internal.Symbols$Symbol.completeInfo(Symbols.scala:1542)
        at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1514)
        at scala.reflect.internal.Definitions.scala$reflect$internal$Definitions$$enterNewMethod(Definitions.scala:49)
        at scala.reflect.internal.Definitions$DefinitionsClass.String_$plus$lzycompute(Definitions.scala:1134)
        at scala.reflect.internal.Definitions$DefinitionsClass.String_$plus(Definitions.scala:1134)
        at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreMethods$lzycompute(Definitions.scala:1438)
        at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreMethods(Definitions.scala:1420)
        at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1450)
        at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1450)
        at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1506)
        at scala.tools.nsc.Global$Run.<init>(Global.scala:1213)
        at scala.tools.nsc.interpreter.IMain._initialize(IMain.scala:124)
        at scala.tools.nsc.interpreter.IMain.initializeSynchronous(IMain.scala:146)
        at org.apache.spark.repl.SparkILoop.$anonfun$process$10(SparkILoop.scala:211)
        at org.apache.spark.repl.SparkILoop.withSuppressedSettings$1(SparkILoop.scala:189)
        at org.apache.spark.repl.SparkILoop.startup$1(SparkILoop.scala:201)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:236)
        at org.apache.spark.repl.Main$.doMain(Main.scala:78)
        at org.apache.spark.repl.Main$.main(Main.scala:58)
        at org.apache.spark.repl.Main.main(Main.scala)
        at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)
        at java.base/java.lang.reflect.Method.invoke(Method.java:580)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:984)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:191)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:214)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1072)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1081)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Tags: python, pyspark
1 Answer

I recommend using a Docker image to start Jupyter Notebook or Lab. It is by far the easiest approach.

  1. Install the Docker engine for your operating system.
  2. Once Docker is installed, run the following from the command line:

docker run -it --rm -p 8888:8888 jupyter/pyspark-notebook
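
If you want the notebooks you create to survive container restarts, the Jupyter Docker Stacks images also support mounting a local directory, e.g. adding -v "${PWD}":/home/jovyan/work to the command above (this extra flag is not part of the original answer, just a commonly used option).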

The command above starts all the necessary components; you only need to point your browser to http://localhost:8888.
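
Once the notebook is open, you can sanity-check that Spark works with a small snippet like the following. This is only a minimal sketch (the app name and sample data are arbitrary), but since the jupyter/pyspark-notebook image ships with pyspark preinstalled, it should run in a fresh notebook cell:

from pyspark.sql import SparkSession

# Create (or reuse) a local SparkSession inside the container.
spark = SparkSession.builder.appName("smoke-test").getOrCreate()

# Build a tiny DataFrame and print it to confirm the Spark runtime is working.
df = spark.createDataFrame([(1, "foo"), (2, "bar")], ["id", "value"])
df.show()

spark.stop()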

