Problems installing Spark on Windows

Windows 11

    Spark -> C:\spark-3.0.0-bin-hadoop2.7 
    Winutils -> C:\winutils 
    Java -> C:\Program Files\Java\jdk-22.0.1

I downloaded the versions above and created the following environment variables:

HADOOP_HOME = 'C:\winutils'

JAVA_HOME = 'C:\Program Files\Java\jdk-22.0.1'

SPARK_HOME = 'C:\spark-3.0.0-bin-hadoop2.7'

and added these paths to both the system and user variables.
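
For reference, the equivalent setup from a Command Prompt would look roughly like this (a sketch using the paths listed above; `setx` writes user variables and needs `/M` from an elevated prompt for system-wide ones):

    REM Point each variable at the matching install directory.
    setx JAVA_HOME "C:\Program Files\Java\jdk-22.0.1"
    setx SPARK_HOME "C:\spark-3.0.0-bin-hadoop2.7"
    setx HADOOP_HOME "C:\winutils"
    REM setx only affects terminals opened after this point.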

I have checked the environment and confirmed the variables are available on the PATH (both system and user); a sketch of such checks follows the trace. Running the Spark shell then fails as follows:

C:\spark-3.0.0-bin-hadoop2.7\bin>spark-shell
    Exception in thread "main" java.lang.ExceptionInInitializerError
        at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
        at org.apache.spark.internal.config.package$.<init>(package.scala:1006)
        at org.apache.spark.internal.config.package$.<clinit>(package.scala)
        at org.apache.spark.deploy.SparkSubmitArguments.$anonfun$loadEnvironmentArguments$3(SparkSubmitArguments.scala:157)
        at scala.Option.orElse(Option.scala:447)
        at org.apache.spark.deploy.SparkSubmitArguments.loadEnvironmentArguments(SparkSubmitArguments.scala:157)
        at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:115)
        at org.apache.spark.deploy.SparkSubmit$$anon$2$$anon$3.<init>(SparkSubmit.scala:990)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.parseArguments(SparkSubmit.scala:990)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:85)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalStateException: java.lang.NoSuchMethodException: java.nio.DirectByteBuffer.<init>(long,int)
        at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:62)
        ... 13 more
Caused by: java.lang.NoSuchMethodException: java.nio.DirectByteBuffer.<init>(long,int)
        at java.base/java.lang.Class.getConstructor0(Class.java:3784)
        at java.base/java.lang.Class.getDeclaredConstructor(Class.java:2955)
        at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:55)
        ... 13 more
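
For reference, a quick way to sanity-check which JDK and directories the shell actually resolves (run in a freshly opened terminal so environment changes are visible):

    REM Show the three variables as the shell sees them.
    echo %JAVA_HOME%
    echo %SPARK_HOME%
    echo %HADOOP_HOME%
    REM Show which java.exe wins on the PATH, and its version.
    where java
    java -version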
Tags: java, apache-spark, directory, dependencies, spark-shell
1 Answer

I think the Spark version you are using does not support Java 22. The `NoSuchMethodException: java.nio.DirectByteBuffer.<init>(long,int)` in the trace is the telltale sign: Spark 3.0.x looks up that internal JDK constructor via reflection, and its signature changed in recent JDKs.

Check the official Spark documentation for the Java versions your release supports: the Spark 3.0.x docs list Java 8 and 11. Either point JAVA_HOME at a supported JDK, or move to a newer Spark release whose docs list your JDK.
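
Under that assumption, a minimal sketch of the fix (the JDK 11 install path below is hypothetical; adjust it to wherever a supported JDK is installed on your machine):

    REM Repoint JAVA_HOME at a JDK supported by Spark 3.0.x (Java 8 or 11).
    setx JAVA_HOME "C:\Program Files\Java\jdk-11"
    REM Open a NEW terminal so the change is picked up, then retry:
    cd /d C:\spark-3.0.0-bin-hadoop2.7\bin
    spark-shell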
