spark-submit "Service 'Driver' could not bind on port" error

Problem description (votes: 6, answers: 7)

I am using the following command to run the Spark Java word-count example:

time spark-submit --deploy-mode cluster --master spark://192.168.0.7:6066 --class org.apache.spark.examples.JavaWordCount /home/pi/Desktop/example/new/target/javaword.jar /books_50.txt 

When I run it, the output is:

Running Spark using the REST application submission protocol.
16/07/18 03:55:41 INFO rest.RestSubmissionClient: Submitting a request to launch an application in spark://192.168.0.7:6066.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Submission successfully created as driver-20160718035543-0000. Polling submission state...
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Submitting a request for the status of submission driver-20160718035543-0000 in spark://192.168.0.7:6066.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: State of driver driver-20160718035543-0000 is now RUNNING.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Driver is running on worker worker-20160718041005-192.168.0.12-42405 at 192.168.0.12:42405.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Server responded with CreateSubmissionResponse:
{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20160718035543-0000",
  "serverSparkVersion" : "1.6.2",
  "submissionId" : "driver-20160718035543-0000",
  "success" : true
}

I checked the log of that particular worker (192.168.0.12), and it says:

Launch Command: "/usr/lib/jvm/jdk-8-oracle-arm32-vfp-hflt/jre/bin/java" "-cp" "/opt/spark/conf/:/opt/spark/lib/spark-assembly-1.6.2-hadoop2.6.0.jar:/opt/spark/lib/datanucleus-api-jdo-3.2.6.jar:/opt/spark/lib/datanucleus-core-3.2.10.jar:/opt/spark/lib/datanucleus-rdbms-3.2.9.jar" "-Xms1024M" "-Xmx1024M" "-Dspark.driver.supervise=false" "-Dspark.app.name=org.apache.spark.examples.JavaWordCount" "-Dspark.submit.deployMode=cluster" "-Dspark.jars=file:/home/pi/Desktop/example/new/target/javaword.jar" "-Dspark.master=spark://192.168.0.7:7077" "-Dspark.executor.memory=10M" "org.apache.spark.deploy.worker.DriverWrapper" "spark://[email protected]:42405" "/opt/spark/work/driver-20160718035543-0000/javaword.jar" "org.apache.spark.examples.JavaWordCount" "/books_50.txt"
========================================

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/07/18 04:10:58 INFO SecurityManager: Changing view acls to: pi
16/07/18 04:10:58 INFO SecurityManager: Changing modify acls to: pi
16/07/18 04:10:58 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(pi); users with modify permissions: Set(pi)
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'Driver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'Driver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)

My spark-env.sh file (for the master) contains:

export SPARK_MASTER_WEBUI_PORT="8080"
export SPARK_MASTER_IP="192.168.0.7"
export SPARK_EXECUTOR_MEMORY="10M"

My spark-env.sh file (for the worker) contains:

export SPARK_WORKER_WEBUI_PORT="8080"
export SPARK_MASTER_IP="192.168.0.7"
export SPARK_EXECUTOR_MEMORY="10M"

Please help!

Tags: apache-spark, word-count
7 Answers

13 votes

I hit the same issue when trying to run the shell, and was able to get it working by setting the SPARK_LOCAL_IP environment variable. When running the shell, you can assign it from the command line:

SPARK_LOCAL_IP=127.0.0.1 ./bin/spark-shell

For a more permanent solution, create a spark-env.sh file in the conf directory of your Spark root. Add the following line:

SPARK_LOCAL_IP=127.0.0.1

Give the script execute permission with chmod +x ./conf/spark-env.sh, and this environment variable will then be set by default.
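
As a minimal sketch of those steps, assuming the conf/spark-env.sh.template that ships with a standard Spark distribution and that you are in the Spark root directory:

# create spark-env.sh from the bundled template, append the variable, and make it executable
cp ./conf/spark-env.sh.template ./conf/spark-env.sh
echo 'export SPARK_LOCAL_IP=127.0.0.1' >> ./conf/spark-env.sh
chmod +x ./conf/spark-env.sh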


7 votes

I use Maven/SBT to manage dependencies, and the Spark core is included in my jar file.

You can override SPARK_LOCAL_IP at runtime by setting "spark.driver.bindAddress" (here in Scala):

import org.apache.spark.{SparkConf, SparkContext}

val config = new SparkConf()
config.setMaster("local[*]")
config.setAppName("Test App")
config.set("spark.driver.bindAddress", "127.0.0.1")
val sc = new SparkContext(config)
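
The same property can also be passed without changing application code, for example via spark-submit. This is only a sketch reusing the jar and class from the question; note that spark.driver.bindAddress was introduced in later Spark releases (2.1+), so it may not take effect on the 1.6.2 cluster shown above:

spark-submit --conf spark.driver.bindAddress=127.0.0.1 --class org.apache.spark.examples.JavaWordCount javaword.jar /books_50.txt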

2 votes

I had this problem too.

The reason (in my case) was that my local system's IP was not reachable from my local system. I know that sentence sounds nonsensical, but please read on.

My system hostname (uname -n) shows that my system is named "sparkmaster". In my /etc/hosts file I have assigned a fixed IP address of 192.168.1.70 to the sparkmaster system, with additional fixed IP addresses for sparknode01 and sparknode02 at ...1.71 and ...1.72 respectively.

Due to some other problems I was having, I needed to switch all of my network adapters to DHCP. This meant they got addresses such as 192.168.90.123. The DHCP addresses were on a different network from the ...1.70 range, and no route between them was configured.

When Spark starts, it apparently tries to connect to the host named after the machine's hostname (sparkmaster in my case). That resolves to IP 192.168.1.70, but there was no way to connect to it, because that address was on an unreachable network.

My solution was to change one of my Ethernet adapters back to a fixed static address (i.e., 192.168.1.70), which resolved the problem.

So the problem seems to be that when Spark starts in "local mode", it tries to connect to a system named after the machine's hostname (rather than localhost). I suppose this makes sense if you want to set up a cluster (as I did), but it can lead to the confusing message above. Putting your system's hostname on the 127.0.0.1 entry in /etc/hosts might also solve the problem, but I have not tried it.
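
As a minimal sketch of that untried workaround, assuming the machine's hostname is sparkmaster as above, the /etc/hosts entry would look like:

# /etc/hosts - map the machine's own hostname to the loopback address
127.0.0.1   localhost sparkmaster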


1 vote

You need to add your hostname to the /etc/hosts file. Something like the following, where "hostname" stands for your machine's actual hostname:

127.0.0.1   localhost "hostname"
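
To check how the hostname actually resolves before and after editing /etc/hosts, a quick sketch using standard Linux tools (exact availability may vary by distribution):

# show the address the local hostname currently resolves to
getent hosts $(hostname)
ping -c 1 $(hostname)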

0 votes

This is probably a duplicate of Spark 1.2.1 standalone cluster mode spark-submit is not working.

I tried the same steps and was able to run the job. Please post your complete spark-env.sh and spark-defaults if possible.


0 votes

I had this problem; it was because the real IP in /etc/hosts had been changed to my IP.


-2 votes

I solved this problem by modifying the slaves file. If you need to check your configuration, look at spark-2.4.0-bin-hadoop2.7/conf/slaves.
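
As a sketch, assuming a single worker at the address from the question, conf/slaves simply lists one worker hostname or IP per line:

# conf/slaves - one worker host per line
192.168.0.12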
