Exception in thread "main" java.lang.NoClassDefFoundError: com/fasterxml/jackson/datatype/jsr310/JavaTimeModule


I am following this tutorial to run the Spark-Pi application with kubectl commands: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/docs/quick-start-guide.md#running-the-examples

When I submit the application with kubectl apply -f spark-pi.yaml and check the logs with kubectl logs spark-pi-driver -f, I see this exception:

20/03/20 01:47:45 INFO SparkEnv: Registering OutputCommitCoordinator
20/03/20 01:47:46 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/03/20 01:47:46 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://spark-pi-1584668857472-driver-svc.default.svc:4040
20/03/20 01:47:46 INFO SparkContext: Added JAR file:///opt/spark/examples/jars/spark-examples_2.11-2.4.3.jar at spark://spark-pi-1584668857472-driver-svc.default.svc:7078/jars/spark-examples_2.11-2.4.3.jar with timestamp 1584668866199
Exception in thread "main" java.lang.NoClassDefFoundError: com/fasterxml/jackson/datatype/jsr310/JavaTimeModule
    at io.fabric8.kubernetes.client.dsl.base.OperationSupport.<clinit>(OperationSupport.java:59)
    at io.fabric8.kubernetes.client.DefaultKubernetesClient.pods(DefaultKubernetesClient.java:204)
    at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator$$anonfun$1.apply(ExecutorPodsAllocator.scala:55)
    at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator$$anonfun$1.apply(ExecutorPodsAllocator.scala:55)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.<init>(ExecutorPodsAllocator.scala:55)
    at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterManager.createSchedulerBackend(KubernetesClusterManager.scala:89)
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2788)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:493)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.fasterxml.jackson.datatype.jsr310.JavaTimeModule
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 28 more
20/03/20 01:47:47 INFO DiskBlockManager: Shutdown hook called
20/03/20 01:47:47 INFO ShutdownHookManager: Shutdown hook called
20/03/20 01:47:47 INFO ShutdownHookManager: Deleting directory /var/data/spark-97c7e689-9506-42a1-b3b1-578270832f75/spark-e99b532d-3c81-4a76-a05c-a4a753627db2/userFiles-7d18e8e4-74d6-4dbc-b967-31cd2c6d96d3
20/03/20 01:47:47 INFO ShutdownHookManager: Deleting directory /var/data/spark-97c7e689-9506-42a1-b3b1-578270832f75/spark-e99b532d-3c81-4a76-a05c-a4a753627db2
20/03/20 01:47:47 INFO ShutdownHookManager: Deleting directory /tmp/spark-72a95874-127a-4f0e-b32a-22d1aec74e1c

Any help on how to resolve this?


The jackson-annotations jar ships in the jars folder of spark-2.4.3-bin-hadoop2.7, so I am not sure why it is not being picked up on the classpath. Any help would be appreciated.
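A quick way to see the gap is to compare the Jackson artifacts the distribution ships against the one fabric8 needs at class-initialization time. The sketch below simulates that check against a typical jar list from spark-2.4.3-bin-hadoop2.7 (the versions shown are illustrative; on a live driver image you would run ls "$SPARK_HOME/jars" instead of using a hard-coded list):

```shell
# Illustrative subset of the jars shipped in spark-2.4.3-bin-hadoop2.7;
# on a real image, replace this list with the output of: ls "$SPARK_HOME/jars"
jars='jackson-annotations-2.6.7.jar
jackson-core-2.6.7.jar
jackson-databind-2.6.7.1.jar
jackson-module-scala_2.11-2.6.7.1.jar'

# fabric8's OperationSupport references JavaTimeModule in its static
# initializer, so this artifact must be on the classpath before any
# Kubernetes API call is made.
if printf '%s\n' "$jars" | grep -q 'jackson-datatype-jsr310'; then
  echo "jsr310 present"
else
  echo "jsr310 missing"
fi
```

The core Jackson jars are present, which is why only the jsr310 datatype module triggers the NoClassDefFoundError.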
Tags: java, apache-spark, kubernetes, jackson, classnotfoundexception
1 Answer

As @Andreas pointed out, ${SPARK_HOME}/jars does not contain jackson-datatype-jsr310.

You can try modifying spark-docker/Dockerfile as follows and see whether it works:

    . . .

    ADD https://repo1.maven.org/maven2/com/fasterxml/jackson/datatype/jackson-datatype-jsr310/2.9.10/jackson-datatype-jsr310-2.9.10.jar $SPARK_HOME/jars/
    # Files ADDed from a URL are created with mode 600; make the jar readable
    # for the non-root user the Spark image runs as.
    RUN chmod 644 $SPARK_HOME/jars/jackson-datatype-jsr310-2.9.10.jar
    . . .
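After rebuilding the image, the SparkApplication manifest has to point at it. A minimal sketch of the relevant spark-pi.yaml fields, where the image tag my-repo/spark:v2.4.3-jsr310 is hypothetical and should match whatever tag you push:

```yaml
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: spark-pi
spec:
  type: Scala
  mode: cluster
  # The image rebuilt with the extra Jackson jar (tag is illustrative).
  image: "my-repo/spark:v2.4.3-jsr310"
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.3.jar"
```

Re-submit with kubectl apply -f spark-pi.yaml and tail the driver logs again to confirm the exception is gone.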

This does look like a bug, though; if the workaround helps, please consider opening an issue on the repository.
