Sparkling Water frequently throws java.lang.ArrayIndexOutOfBoundsException: 65535


H2O Sparkling Water frequently throws the exception below, and when it happens we rerun the job manually. The problem is that when this exception occurs the Spark jobs do not exit and do not return an exit status, so we cannot automate the recovery (see the sketch after the stack trace for one way to force a clean failure).

App > Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 316 in stage 22.0 failed 4 times, most recent failure: Lost task 316.3 in stage 22.0 (TID 9470, ip-**-***-***-**.ec2.internal): java.lang.ArrayIndexOutOfBoundsException: 65535
App > at water.DKV.get(DKV.java:202)
App > at water.DKV.get(DKV.java:175)
App > at water.Key.get(Key.java:83)
App > at water.fvec.Frame.createNewChunks(Frame.java:896)
App > at water.fvec.FrameUtils$class.createNewChunks(FrameUtils.scala:43)
App > at water.fvec.FrameUtils$.createNewChunks(FrameUtils.scala:70)
App > at org.apache.spark.h2o.backends.internal.InternalWriteConverterContext.createChunks(InternalWriteConverterContext.scala:28)
App > at org.apache.spark.h2o.converters.SparkDataFrameConverter$class.org$apache$spark$h2o$converters$SparkDataFrameConverter$$perSQLPartition(SparkDataFrameConverter.scala:86)
App > at org.apache.spark.h2o.converters.SparkDataFrameConverter$$anonfun$toH2OFrame$1$$anonfun$apply$2.apply(SparkDataFrameConverter.scala:67)
App > at org.apache.spark.h2o.converters.SparkDataFrameConverter$$anonfun$toH2OFrame$1$$anonfun$apply$2.apply(SparkDataFrameConverter.scala:67)
App > at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
App > at org.apache.spark.scheduler.Task.run(Task.scala:85)
App > at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
App > at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
App > at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
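Since the job hangs instead of exiting after this failure, one workaround is to catch the failure in the driver and force a nonzero exit so an external scheduler can detect it and rerun the job. Below is a minimal sketch, assuming the conversion is driven from a Scala main; the ConvertJob name, the parquet input path, and the exact H2OContext factory signature are assumptions and may differ across Sparkling Water versions.

import org.apache.spark.sql.SparkSession
import org.apache.spark.h2o.H2OContext

object ConvertJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("df-to-h2o").getOrCreate()
    val hc = H2OContext.getOrCreate(spark)
    try {
      val df = spark.read.parquet(args(0)) // hypothetical input; substitute your own source
      val hf = hc.asH2OFrame(df)           // the conversion that fails with AIOOBE 65535
      println(s"converted ${hf.numRows()} rows")
    } catch {
      case e: Throwable =>
        e.printStackTrace()
        // Non-daemon H2O threads can keep the JVM alive after the SparkException,
        // so force a nonzero exit status that a scheduler can detect and act on.
        sys.exit(1)
    } finally {
      spark.stop()
    }
  }
}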
apache-spark apache-spark-mllib h2o apache-spark-ml sparkling-water
1 Answer

The problem is being investigated in the following issue on the Sparkling Water project:

It seems to be related in some way to the size of the data.

It happens when we try to pull a huge Spark DataFrame into an H2O frame: 63M records x 6,300 columns. The H2O/Sparkling Water cluster itself is sized appropriately (40 executors x 17 GB each, with 4 threads/cores per Spark executor), i.e. 680 GB of memory in total.

We have never hit this error on smaller datasets.
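Since 65535 is the maximum unsigned 16-bit value, the error looks like an internal index overflowing somewhere in H2O's key handling on very wide/tall conversions. One mitigation sometimes worth trying (an assumption on my part, not confirmed in the linked issue) is to repartition the DataFrame down to fewer, larger partitions before the conversion, so the resulting frame is built from fewer chunks. Reusing df and hc from the sketch above:

// Hypothetical mitigation: fewer partitions means fewer chunks per vector.
// The target of 2000 is a guess; tune it to your cluster and data size.
val repartitioned = df.repartition(2000)
val h2oFrame = hc.asH2OFrame(repartitioned)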
