Split class org.apache.hadoop.hive.ql.io.orc.OrcSplit not found

Problem description (1 vote, 2 answers)

I'm trying to use ORC as the input format for a Hadoop Streaming job.

This is how I run it:

export HADOOP_CLASSPATH=/opt/cloudera/parcels/CDH/lib/hive/lib/hive-exec.jar
hadoop jar /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-streaming.jar \
    -file /home/mr/mapper.py -mapper /home/mr/mapper.py \
    -file /home/mr/reducer.py -reducer /home/mr/reducer.py \
    -input /user/cloudera/input/users/orc \
    -output /user/cloudera/output/simple \
    -inputformat org.apache.hadoop.hive.ql.io.orc.OrcInputFormat

But I get this error:

Error: java.io.IOException: Split class org.apache.hadoop.hive.ql.io.orc.OrcSplit not found
    at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:363)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:426)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.ql.io.orc.OrcSplit not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2018)
    at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:361)
    ... 7 more

It looks like the OrcSplit class should be in hive-exec.jar.
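For context, mapper.py and reducer.py are ordinary streaming scripts that read records from stdin. The real scripts aren't shown in the question, so the following is only a hypothetical sketch of what such a mapper might look like (the tab-separated field layout is an assumption):

    #!/usr/bin/env python
    # Hypothetical streaming mapper: reads one record per line from stdin
    # and emits a tab-separated key/value pair per record.
    import sys

    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if fields and fields[0]:
            # emit the first column as the key with a count of 1
            print("%s\t%d" % (fields[0], 1))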

Tags: hadoop, hadoop-streaming
2 Answers
1 vote

The simpler solution is to let hadoop-streaming distribute the library jar for you with the -libjars argument. This argument takes a comma-separated list of jars. As an example, you can do this:

hadoop jar /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-streaming.jar \
    -libjars /opt/cloudera/parcels/CDH/lib/hive/lib/hive-exec.jar \
    -file /home/mr/mapper.py -mapper /home/mr/mapper.py \
    -file /home/mr/reducer.py -reducer /home/mr/reducer.py \
    -input /user/cloudera/input/users/orc \
    -output /user/cloudera/output/simple \
    -inputformat org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
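
-libjars ships the listed jars through the distributed cache, so every task container gets hive-exec.jar on its classpath without any per-node configuration. It is still worth keeping the HADOOP_CLASSPATH export from the question on the submitting host, since the input format class is also loaded client-side when the job computes its splits.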

0 votes

I found the answer. My problem was that I had set the HADOOP_CLASSPATH variable on only one node, so I should either set it on every node or use the distributed cache.
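
For the first option, a minimal sketch of the per-node fix, assuming the same CDH parcel layout as above and that hadoop-env.sh is where your distribution picks up environment variables:

    # On every node, e.g. in /etc/hadoop/conf/hadoop-env.sh:
    export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/opt/cloudera/parcels/CDH/lib/hive/lib/hive-exec.jar

The distributed-cache route is the -libjars approach from the other answer, which avoids touching every node.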
