ClassNotFound with Oozie, Azure HDInsight, and Spark2

Question · votes: 0 · answers: 1

After a week of research, I have to ask this:

  • Environment: Azure HDInsight
  • Oozie version: "Oozie client build version: 4.2.0.2.6.5.3004-13"
  • Spark: Spark2
  • My program: a simple Scala program that reads a file i.csv and writes it back out as o.csv
  • Tested with spark-submit: yes

job.properties

nameNode=wasb://[email protected]
jobTracker=hn0-something.internal.cloudapp.net:8050
master=yarn-cluster
queueName=default
deployed_loc=zs_app
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/${deployed_loc}

workflow.xml:

<workflow-app xmlns='uri:oozie:workflow:0.3' name='zs-wf'>
    <start to="Loader" />
    <action name="Loader">
        <spark xmlns="uri:oozie:spark-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
               <delete path="${nameNode}/${deployed_loc}/output-data"/>
            </prepare>
            <configuration>
                <property>
                    <name>mapred.compress.map.output</name>
                    <value>true</value>
                </property>
            </configuration>
            <master>${master}</master>
            <mode>cluster</mode>
            <name>Spark-Loader</name>
            <class>zs.test</class>
            <jar>${nameNode}/${deployed_loc}/zs_app.jar</jar>                        
            <arg>--testId=1</arg>            
        </spark>
        <ok to="end" />
        <error to="fail" />
    </action>
    <kill name="fail">
        <message>Workflow failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name='end' />
</workflow-app>
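For reference, this is roughly how I deploy and launch the workflow (the Oozie server URL and storage paths here are assumptions for illustration, based on the properties above):

```shell
# Copy the application jar and workflow.xml into the deployment
# directory on the cluster's default storage (deployed_loc=zs_app).
hdfs dfs -mkdir -p /zs_app
hdfs dfs -put -f zs_app.jar workflow.xml /zs_app/

# Submit and start the workflow; -oozie (or OOZIE_URL) must point
# at the Oozie server (hostname assumed here).
oozie job -oozie http://hn0-something.internal.cloudapp.net:11000/oozie \
    -config job.properties -run

# Check the workflow status with the job id that -run prints.
oozie job -oozie http://hn0-something.internal.cloudapp.net:11000/oozie \
    -info 0000115-181216154825160-oozie-oozi-W
```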

I get the following exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SparkSession
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
        at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
        at java.lang.Class.getMethod0(Class.java:3018)
        at java.lang.Class.getMethod(Class.java:1784)
        at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:556)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:338)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:204)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:674)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:68)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
        at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:67)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:672)
        at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SparkSession
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 17 more

From this I conclude:

  • Somehow it is picking up a Spark version < 2, since SparkSession was only introduced in later Spark releases.
  • Also, Oozie is able to submit the job, because I pulled this error with "yarn logs -applicationId appid", using the appid I found in the Oozie logs.

Now, if I add this line to job.properties:

oozie.action.sharelib.for.spark=spark2
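One way to check what the Oozie server can actually resolve this against is the Oozie admin CLI (the server URL is an assumption); if `spark2` does not appear in the sharelib list, the property points at nothing:

```shell
# List the sharelibs the Oozie server has loaded; "spark2" must
# appear here for oozie.action.sharelib.for.spark=spark2 to work.
oozie admin -oozie http://hn0-something.internal.cloudapp.net:11000/oozie \
    -shareliblist

# Show the jars inside the spark2 sharelib; it must contain the
# Oozie action jars (the ones providing SparkMain), not just Spark.
oozie admin -oozie http://hn0-something.internal.cloudapp.net:11000/oozie \
    -shareliblist spark2

# After changing the sharelib on HDFS, make the server pick up
# the new jars without a restart.
oozie admin -oozie http://hn0-something.internal.cloudapp.net:11000/oozie \
    -sharelibupdate
```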

I get the following exception:

JOB[0000115-181216154825160-oozie-oozi-W] ACTION[0000115-181216154825160-oozie-oozi-W@Loader] Launcher exception: java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.SparkMain not found
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.SparkMain not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2308)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:229)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.SparkMain not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2214)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2306)
    ... 9 more

From this I conclude:

  • Oozie is not able to submit the job, because this time I found the error in the Oozie logs themselves.

I don't understand why this has to be so complicated. If Microsoft packages Azure HDInsight with Spark2 and Oozie, this should work out of the box, or with minor changes, and there should be clean documentation for it somewhere.

classnotfoundexception oozie hdinsight apache-spark-2.2
1 Answer
0 votes

Try setting the Oozie shared library path in job.properties. For example, mine is:

oozie.libpath=/user/oozie/share/lib/lib_20180312160954
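In case it helps, the timestamped sharelib directory can usually be located on HDFS like this (path conventions assumed; the exact location may differ on HDInsight):

```shell
# The share lib root typically lives under the oozie user's home;
# each deployment gets a timestamped lib_<yyyyMMddHHmmss> directory.
hdfs dfs -ls /user/oozie/share/lib

# Pick the newest lib_* directory and check that it contains a
# spark (or spark2) subdirectory with the action jars.
hdfs dfs -ls /user/oozie/share/lib/lib_20180312160954
```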

Not sure where it lives in an Azure environment.


0 votes

Assuming you have been using HDInsight 3.6, try Oozie with Spark2 in an HDInsight 4.0 environment instead. Earlier versions don't seem to work with Spark2 out of the box when using Oozie.

HDInsight 4.0 uses HDP 3.0, so this may help: Spark2 with Oozie in HDP 3.0
