Python worker exited unexpectedly (crashed)

Problem description

At the command prompt, after running

rdd = sc.parallelize([1,2,3])

I get the following error when I run

rdd.first()

py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (LAPTOP-6IIGK4P5 executor driver): org.apache.spark.SparkException: Python worker exited unexpectedly (crashed)

Screenshot attached: https://i.stack.imgur.com/A4hOa.png
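The question does not state the root cause, but on Windows this crash frequently comes from the driver and the Python workers using different interpreters. A minimal, self-contained sketch of the same two steps, with PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON pinned to the current interpreter before the SparkContext is created (the environment-variable fix is an assumption, not confirmed by the question), looks like this:

    import os
    import sys
    from pyspark import SparkContext

    # Assumed mitigation: point driver and workers at the same Python
    # interpreter, since a version mismatch is a common cause of
    # "Python worker exited unexpectedly (crashed)" on Windows.
    os.environ["PYSPARK_PYTHON"] = sys.executable
    os.environ["PYSPARK_DRIVER_PYTHON"] = sys.executable

    sc = SparkContext("local[*]", "parallelize-first-repro")

    rdd = sc.parallelize([1, 2, 3])
    print(rdd.first())  # expected output: 1

    sc.stop()

If the environment variables are the issue, they can also be set in the shell before launching pyspark instead of inside the script.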

Tags: python pyspark command-prompt rdd
1 Answer

Did you manage to fix this error, @K Neelakanta?
