Getting the file name in a Hadoop mapper


My mapper is organized as follows:

#!/usr/bin/python

import os
import sys

# Look up the name of the current input file once, then echo it for every line.
filename = os.environ["map_input_file"]

for line in sys.stdin:
    print(filename)

I simply want to get the input file name inside the mapper, but the `map_input_file` lookup fails with the following error:

"File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/UserDict.py", line 23, in __getitem__
     raise KeyError(key)
KeyError: 'map_input_file'"

I am confused about what could be wrong, since I have tried several variants, for example:

try:
    filename = os.environ["mapreduce.map.input.file"]
except KeyError:
    filename = os.environ["map.input.file"]

or

try:
    filename = os.environ["mapreduce_map_input_file"]
except KeyError:
    filename = os.environ["map_input_file"]
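As an aside (this is an observation, not part of the original post): Hadoop Streaming passes job configuration to the child process as environment variables with every `.` replaced by `_`, so `mapreduce.map.input.file` arrives as `mapreduce_map_input_file` (and the deprecated pre-2.x name `map.input.file` as `map_input_file`). The dotted keys used in the snippets above can therefore never appear in `os.environ`, and outside a Streaming job neither variable exists at all. A minimal sketch of a lookup that tries both underscore forms and degrades gracefully:

```python
import os

def input_filename(default="unknown"):
    """Return the current input file name under Hadoop Streaming.

    Hadoop flattens configuration keys into environment variables by
    replacing '.' with '_', so only the underscore forms can be present.
    Outside a Streaming job neither variable is set, so fall back to a
    default instead of raising KeyError.
    """
    return (os.environ.get("mapreduce_map_input_file")   # Hadoop 2.x+
            or os.environ.get("map_input_file")          # deprecated alias
            or default)

print(input_filename())
```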

It always raises a KeyError, and the file name cannot be retrieved.

Any solution for getting the file name inside the mapper would be greatly appreciated.

For information, I run this code locally by piping: `cat text.txt | ./mapper.py`. Running it on the cluster, which is my end goal, does not work either, probably because of the same error.
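Worth separating the two failure modes (a sketch, not from the post): a local `cat text.txt | ./mapper.py` pipe sets none of the Hadoop variables, so a KeyError there proves nothing about the cluster. A mapper whose core is a testable function and which falls back to a placeholder when no Hadoop variable is present might look like this (the `local-test` fallback and the key/value output format are illustrative choices, not anything the question mandates):

```python
#!/usr/bin/python
import os
import sys

def run_mapper(lines, env=os.environ):
    """Prefix each input line with the current input file name.

    Looks the variable up once, not per line; inside a Streaming task
    it holds the path of the split currently being processed, while a
    plain local pipe falls back to the "local-test" placeholder.
    """
    filename = (env.get("mapreduce_map_input_file")
                or env.get("map_input_file")
                or "local-test")
    # Emit tab-separated key/value pairs, the usual Streaming convention.
    return ["%s\t%s" % (filename, line.rstrip("\n")) for line in lines]

if __name__ == "__main__":
    for record in run_mapper(sys.stdin):
        print(record)
```

To simulate the Streaming environment locally you can set the variable for the mapper process only, e.g. `cat text.txt | mapreduce_map_input_file=text.txt ./mapper.py`.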

python hadoop filenames mapper keyerror
1 Answer

@Ilko Indeed, trying to run it on the cluster gives me the following error:

INFO mapreduce.Job: Task Id : attempt_1550240953895_0001_m_000005_2, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:325)
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:538)
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:455)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:344)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:175)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:169)

19/02/15 14:41:47 INFO mapreduce.Job:  map 6% reduce 0%
19/02/15 14:41:48 INFO mapreduce.Job:  map 100% reduce 100%
19/02/15 14:41:49 INFO mapreduce.Job: Job job_1550240953895_0001 failed with state FAILED due to: Task failed task_1550240953895_0001_m_000008
Job failed as tasks failed. failedMaps:1 failedReduces:0

For someone as inexperienced as me, this does not help much.
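For what it's worth (an interpretation, not from the answer): `PipeMapRed.waitOutputThreads(): subprocess failed with code 1` means the streaming mapper process exited non-zero, which is exactly what an uncaught Python exception such as the KeyError above produces. Assuming that is the cause here, the exit code can be reproduced locally by running an unguarded lookup in an empty environment:

```python
import subprocess
import sys

# Run a one-line mapper with an empty environment, so the Hadoop
# variable is missing and the unguarded lookup raises KeyError.
proc = subprocess.run(
    [sys.executable, "-c",
     "import os; print(os.environ['map_input_file'])"],
    env={}, capture_output=True, text=True)

print(proc.returncode)              # non-zero: the KeyError killed the process
print("KeyError" in proc.stderr)
```

The real diagnostic lives in the per-task stderr, which YARN collects; the Java stack trace above only reports that the child process died.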
