Error when starting Spark with start-all.sh

Problem description

When I try to start Spark with the start-all.sh script, it throws an error:

> localhost: failed to launch: nice -n 0 bin/spark-class
> org.apache.spark.deploy.worker.Worker --webui-port 8081
> spark://dev-pipeline-west-eu.jwn4tgenexauzewylryxtm545b.ax.internal.cloudapp.net:7077
> localhost:       at
> sun.launcher.LauncherHelper.loadMainClass(java.base@9-internal/LauncherHelper.java:585)
> localhost:       at
> sun.launcher.LauncherHelper.checkAndLoadMain(java.base@9-internal/LauncherHelper.java:497)
> localhost: full log in
> /spark-2.1.0-bin-hadoop2.7/logs/spark-shankar-org.apache.spark.deploy.worker.Worker-1-dev-pipeline-west-eu.out

When I look at the log file at /spark-2.1.0-bin-hadoop2.7/logs/spark-shankar-org.apache.spark.deploy.worker.Worker-1-dev-pipeline-west-eu.out, it contains the following error:

> Error: A JNI error has occurred, please check your installation and
> try again Exception in thread "main"
> java.lang.ArrayIndexOutOfBoundsException: 64
>     at java.util.jar.JarFile.match(java.base@9-internal/JarFile.java:983)
>     at java.util.jar.JarFile.checkForSpecialAttributes(java.base@9-internal/JarFile.java:1017)
>     at java.util.jar.JarFile.isMultiRelease(java.base@9-internal/JarFile.java:399)
>     at java.util.jar.JarFile.getEntry(java.base@9-internal/JarFile.java:524)
>     at java.util.jar.JarFile.getJarEntry(java.base@9-internal/JarFile.java:480)
>     at jdk.internal.util.jar.JarIndex.getJarIndex(java.base@9-internal/JarIndex.java:114)

Any idea what is causing this error?

scala apache-spark spark-dataframe
2 Answers

3 votes

I had the same problem on Ubuntu 16.04. Updating Java fixed it:

sudo apt-add-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java8-installer

java -version

java version "1.8.0_131"
Java(TM) SE Runtime Environment (build 1.8.0_131-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.131-b11, mixed mode)
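The point of the `java -version` check above is the major version: Spark 2.1 runs on Java 8, while the `java.base@9-internal` frames in the stack trace show Java 9 was active. A small sketch for extracting the major version from the version string (the `java_major` helper is ours, not part of any tool):

```shell
# Hedged sketch: extract the major Java version from the string printed
# by `java -version`, e.g. "1.8.0_131" -> 8, "9-internal" -> 9.
java_major() {
  local v="$1"
  case "$v" in
    1.*) echo "${v#1.}" | cut -d. -f1 ;;   # legacy scheme: 1.8.0_131 -> 8
    *)   echo "${v%%[.-]*}" ;;             # new scheme: 9-internal -> 9
  esac
}

# In practice you would feed it the real version string, e.g.:
#   v=$(java -version 2>&1 | awk -F'"' '/version/ {print $2}')
java_major "1.8.0_131"   # prints 8: works with Spark 2.1
java_major "9-internal"  # prints 9: triggers the JNI error above
```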

-1 votes

Solution: use Java version 8 instead of version 9.

Option 1: Uninstall Java (version 9) and reinstall Java (version 8). (You can follow this post to install Java; just make sure to adapt the steps so that version 8 is installed.)

Option 2: If Java 8 is already installed and you are on Ubuntu, you can run:

sudo update-alternatives --config java

You will see a prompt; enter the number associated with Java 8 and press Enter.
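If you do not want to change the system-wide default, Spark also reads JAVA_HOME from conf/spark-env.sh at startup, so you can pin Java 8 for Spark alone. A sketch (the JDK path below is an example; adjust it to your installation):

```shell
# Sketch: pin Java 8 for Spark only, leaving the system default untouched.
# /usr/lib/jvm/java-8-oracle is an assumed path; check with:
#   update-alternatives --list java
echo 'export JAVA_HOME=/usr/lib/jvm/java-8-oracle' >> /spark-2.1.0-bin-hadoop2.7/conf/spark-env.sh
```

After this, rerun start-all.sh and the worker should launch with Java 8.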

From: http://continualintegration.com/miscellaneous-articles/how-do-you-troubleshoot-the-spark-shell-error-a-jni-error-has-occurred
