Why does start-all.sh run as root fail with "failed to launch org.apache.spark.deploy.master.Master: JAVA_HOME is not set"?


I am trying to run a Spark application built in Scala IDE against the standalone Spark service running on the Cloudera QuickStart VM 5.3.0.

JAVA_HOME for my cloudera account is /usr/java/default.

However, when I execute the start-all.sh command as the cloudera user, I get the error messages shown below:

[cloudera@localhost sbin]$ pwd
/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin
[cloudera@localhost sbin]$ ./start-all.sh
chown: changing ownership of `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs': Operation not permitted
starting org.apache.spark.deploy.master.Master, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/spark-daemon.sh: line 151: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out: Permission denied
failed to launch org.apache.spark.deploy.master.Master:
tail: cannot open `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out' for reading: No such file or directory
full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
cloudera@localhost's password: 
localhost: chown: changing ownership of `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs': Operation not permitted
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out
localhost: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/spark-daemon.sh: line 151: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out: Permission denied
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost: tail: cannot open `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out' for reading: No such file or directory
localhost: full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out

I added export CMF_AGENT_JAVA_HOME=/usr/java/default to /etc/default/cloudera-scm-agent and ran sudo service cloudera-scm-agent restart. See How to set CMF_AGENT_JAVA_HOME.
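
For reference, the change amounted to roughly the following (the file path and restart command are taken from the question; the exact layout of the agent config file can vary between CDH releases, so this is only a sketch):

# append the agent JAVA_HOME override (assumes /usr/java/default points at the installed JDK)
echo 'export CMF_AGENT_JAVA_HOME=/usr/java/default' | sudo tee -a /etc/default/cloudera-scm-agent

# restart the Cloudera Manager agent so it picks up the new environment
sudo service cloudera-scm-agent restart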

I also added export JAVA_HOME=/usr/java/default inside the locate_java_home function definition in the file /usr/share/cmf/bin/cmf-server, then restarted the cluster and the standalone Spark service.
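
Roughly, the edit looked like the sketch below. The real locate_java_home in cmf-server probes several candidate JDK directories; here the body is reduced to the single export that was added, so treat this as an illustration of where the line went rather than the actual script contents:

locate_java_home() {
  # force the JDK location instead of letting the script probe for one
  export JAVA_HOME=/usr/java/default
}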

But the following error keeps appearing when starting the Spark service as the root user:
[root@localhost spark]# sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
failed to launch org.apache.spark.deploy.master.Master:
  JAVA_HOME is not set
full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
root@localhost's password: 
localhost: Connection closed by UNKNOWN

Could someone suggest how to set JAVA_HOME so that the Spark standalone service starts under Cloudera Manager?

java scala apache-spark cloudera
3 Answers

5 votes

The solution was quite simple and straightforward. I just added export JAVA_HOME=/usr/java/default to /root/.bashrc, and the Spark services started successfully from the root user without the "JAVA_HOME is not set" error. Hope it helps someone facing the same problem.
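
Concretely, the fix was along these lines (run as root; /usr/java/default and the parcel path are taken from the question):

# make JAVA_HOME available to root's shell so the Spark start scripts inherit it
echo 'export JAVA_HOME=/usr/java/default' >> /root/.bashrc
source /root/.bashrc

# relaunch the standalone master and workers
cd /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark
sbin/start-all.sh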


0 votes

Set the JAVA_HOME variable in ~/.bashrc as follows:

sudo gedit ~/.bashrc

Add this line to the file (using the path of the JDK you installed; export it so child processes can see it):

export JAVA_HOME="/usr/lib/jvm/java-11-openjdk-amd64"

Then run:

source ~/.bashrc
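
A quick way to check that the setting is actually visible to child processes such as spark-daemon.sh (this verification step is my addition, not part of the original answer):

# confirm the variable is set and points at a working JDK
echo "$JAVA_HOME"
"$JAVA_HOME/bin/java" -version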

0 votes

This did not work for me; the path added for the jvm install is still /root.
