Py4JNetworkError: An error occurred while trying to connect to the Java server (127.0.0.1:43184)

Question · Votes: 0 · Answers: 1

When trying to run PySpark in a Jupyter notebook, I frequently run into this common connection error. Restarting the kernel, and even my PuTTY terminal, does not fix it. I understand the cause is that Java and PySpark are referring to two different servers. How can I resolve this so that the servers match?

 404 GET /nbextensions/widgets/notebook/js/extension.js?v=20190912013347 (127.0.0.1) 5.23ms referer=http://localhost:12345/notebooks/QuadID%20prepare%20indexes.ipynb
[I 01:34:06.143 NotebookApp] Kernel started: 835a6227-4699-49fd-b954-c627657b863c
log4j:ERROR Could not read configuration file from URL [file:/home/e079494/log4j.properties].
java.io.FileNotFoundException: /home/e079494/log4j.properties (No such file or directory)
        at java.io.FileInputStream.open0(Native Method)
        at java.io.FileInputStream.open(FileInputStream.java:195)
        at java.io.FileInputStream.<init>(FileInputStream.java:138)
        at java.io.FileInputStream.<init>(FileInputStream.java:93)
        at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
        at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:557)
        at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
        at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
        at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:66)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:270)
        at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:156)
        at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
        at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:274)
        at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:181)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:315)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
log4j:ERROR Ignoring configuration file [file:/home/e079494/log4j.properties].
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
19/09/12 01:34:08 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
19/09/12 01:34:08 WARN Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
19/09/12 01:34:08 WARN Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
19/09/12 01:34:08 WARN Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
19/09/12 01:34:08 WARN Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
19/09/12 01:34:08 WARN Utils: Service 'SparkUI' could not bind on port 4045. Attempting port 4046.
19/09/12 01:34:08 WARN Utils: Service 'SparkUI' could not bind on port 4046. Attempting port 4047.
19/09/12 01:34:08 WARN Utils: Service 'SparkUI' could not bind on port 4047. Attempting port 4048.
19/09/12 01:34:08 WARN Utils: Service 'SparkUI' could not bind on port 4048. Attempting port 4049.
19/09/12 01:34:08 WARN Utils: Service 'SparkUI' could not bind on port 4049. Attempting port 4050.
19/09/12 01:34:08 WARN Utils: Service 'SparkUI' could not bind on port 4050. Attempting port 4051.
19/09/12 01:34:09 WARN Utils: Service 'SparkUI' could not bind on port 4051. Attempting port 4052.
19/09/12 01:34:09 WARN Utils: Service 'SparkUI' could not bind on port 4052. Attempting port 4053.
19/09/12 01:34:09 WARN Utils: Service 'SparkUI' could not bind on port 4053. Attempting port 4054.
19/09/12 01:34:09 WARN Utils: Service 'SparkUI' could not bind on port 4054. Attempting port 4055.
19/09/12 01:34:09 WARN Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.
19/09/12 01:34:09 ERROR SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
        at sun.nio.ch.Net.bind(Net.java:425)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:317)
        at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
        at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:235)
        at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
        at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$newConnector$1(JettyUtils.scala:333)
        at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$httpConnect$1(JettyUtils.scala:365)
        at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:368)
        at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:368)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2237)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2229)
        at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:368)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
        at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:463)
        at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:463)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:463)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:236)
        at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Thread.java:748)
ERROR:root:Exception while sending command.
Traceback (most recent call last):
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1035, in send_c                                                                                        ommand
    raise Py4JNetworkError("Answer from Java side is empty")
py4j.protocol.Py4JNetworkError: Answer from Java side is empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 883, in send_co                                                                                        mmand
    response = connection.send_command(command)
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1040, in send_c                                                                                        ommand
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
py4j.protocol.Py4JNetworkError: Error while receiving
ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java se                                                                                        rver (127.0.0.1:32844)
Traceback (most recent call last):
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/pyspark/shell.py", line 47, in <module>
    .getOrCreate()
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/pyspark/sql/session.py", line 169, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/pyspark/context.py", line 334, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/pyspark/context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/pyspark/context.py", line 180, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/pyspark/context.py", line 273, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1401, in __call                                                                                        __
    answer, self._gateway_client, None, self._fqn)
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 327, in get_return_                                                                                        value
    format(target_id, ".", name))
py4j.protocol.Py4JError: An error occurred while calling None.org.apache.spark.a                                                                                        pi.java.JavaSparkContext

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 827, in _get_co                                                                                        nnection
    connection = self.deque.pop()
IndexError: pop from an empty deque

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/s                                                                                        park2/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 963, in start
    self.socket.connect((self.address, self.port))
ConnectionRefusedError: [Errno 111] Connection refused
[IPKernelApp] WARNING | Unknown error in handling PYTHONSTARTUP file /opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/spark2/python/pyspark/shell.py:
apache-spark pyspark jupyter-notebook jupyter
1 Answer

0 votes

If you have already restarted the kernel and that does not solve the problem, you may need to close the browser and reopen it. That worked for me.
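
A related note: the BindException in the log already points to a configuration-level fix. Since 'SparkUI' gave up after 16 retries starting from port 4040, you can either pin spark.ui.port to a port you know is free or raise spark.port.maxRetries so Spark probes further. Below is a minimal sketch of how those options could be passed when the SparkSession is created in the notebook; the app name and port number are placeholders, not values taken from the question.

    from pyspark.sql import SparkSession

    # Build (or reuse) a session, pinning the UI to a known-free port and
    # letting Spark retry more ports before giving up.
    spark = (
        SparkSession.builder
        .appName("notebook-session")            # placeholder name
        .config("spark.ui.port", "4060")        # any port that is actually free
        .config("spark.port.maxRetries", "32")  # default is 16, as the log shows
        .getOrCreate()
    )

Stopping any session you no longer need (spark.stop()) before starting a new one also releases the ports that stale kernels are holding, which is what the long run of "could not bind" warnings points to.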
