I am getting this error when I run rdd.first() at the command prompt, after running rdd = sc.parallelize([1,2,3]).
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (LAPTOP-6IIGK4P5 executor driver): org.apache.spark.SparkException: Python worker exited unexpectedly (crashed)
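A "Python worker exited unexpectedly (crashed)" failure on Windows is often caused by a mismatch between the Python interpreter the Spark driver uses and the one it spawns for worker processes. A minimal sketch of a commonly suggested workaround, assuming the crash is interpreter-related, is to point both at the same interpreter before creating the SparkContext:

```python
import os
import sys

# Assumption: the worker crash stems from a driver/worker Python mismatch.
# PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are the environment variables
# PySpark reads to choose the interpreter; set them before SparkContext
# is created so both driver and workers use the same Python.
os.environ["PYSPARK_PYTHON"] = sys.executable
os.environ["PYSPARK_DRIVER_PYTHON"] = sys.executable

# Then create the context and retry the failing calls:
# from pyspark import SparkContext
# sc = SparkContext("local[*]", "test")
# rdd = sc.parallelize([1, 2, 3])
# print(rdd.first())
```

This is only a sketch of one common cause; checking the worker-side stderr in the Spark logs would confirm whether the crash is actually an interpreter issue.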
Did you manage to fix this error, @K Neelakanta?