Sqoop import runs, but the part-m-00000 file is empty?

Problem description

When importing data from an Oracle database into HDFS with Apache Sqoop, the job completes but the output file is empty.

sqoop import --connect jdbc:oracle:thin:@192.168.0.15:1521:XE --username system --password system --table EMP -m 1 --target-dir /user/sinha

After running the import, a part-m-00000 file is created, but it contains no data:

    Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/03/05 09:43:57 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.12.0
18/03/05 09:43:57 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/03/05 09:44:00 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
18/03/05 09:44:58 INFO mapreduce.JobSubmitter: number of splits:1
18/03/05 09:45:01 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1520229051986_0016
18/03/05 09:45:03 INFO impl.YarnClientImpl: Submitted application application_1520229051986_0016
18/03/05 09:45:03 INFO mapreduce.Job: The url to track the job: http://quickstart.cloudera:8088/proxy/application_1520229051986_0016/
18/03/05 09:45:03 INFO mapreduce.Job: Running job: job_1520229051986_0016
18/03/05 09:45:54 INFO mapreduce.Job: Job job_1520229051986_0016 running in uber mode : false
18/03/05 09:45:54 INFO mapreduce.Job:  map 0% reduce 0%
18/03/05 09:46:35 INFO mapreduce.Job:  map 100% reduce 0%
18/03/05 09:46:36 INFO mapreduce.Job: Job job_1520229051986_0016 completed successfully
18/03/05 09:46:36 INFO mapreduce.Job: Counters: 30
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=151209
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=87
        HDFS: Number of bytes written=0
        HDFS: Number of read operations=4
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters 
        Launched map tasks=1
        Other local map tasks=1
        Total time spent by all maps in occupied slots (ms)=37383
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=37383
        Total vcore-milliseconds taken by all map tasks=37383
        Total megabyte-milliseconds taken by all map tasks=38280192
    Map-Reduce Framework
        Map input records=0
        Map output records=0
        Input split bytes=87
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=546
        CPU time spent (ms)=5110
        Physical memory (bytes) snapshot=143175680
        Virtual memory (bytes) snapshot=1509150720
        Total committed heap usage (bytes)=74973184
    File Input Format Counters 
        Bytes Read=0
    File Output Format Counters 
        Bytes Written=0
18/03/05 09:46:36 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 108.9264 seconds (0 bytes/sec)
18/03/05 09:46:36 INFO mapreduce.ImportJobBase: Retrieved 0 records

I can't work out what the problem is. Even when I check the table with the `sqoop eval` command, it only shows the table's column names.

Tags: hadoop, import, hdfs, sqoop, hadoop2
1 Answer

Judging from the log (`Map input records=0` and `Retrieved 0 records`), your source table simply has no rows. Run a `SELECT *` against the Oracle table to verify this. Add some records to the table (and commit them), then retry the Sqoop import; you should then get data.
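One way to check the row count without leaving the Hadoop host is `sqoop eval`, which runs an arbitrary SQL statement against the source database. A minimal sketch, reusing the connection details from the question:

```shell
# Count the rows in EMP directly from the Sqoop host.
# Connection string, username, and password are copied from the
# original import command in the question.
sqoop eval \
  --connect jdbc:oracle:thin:@192.168.0.15:1521:XE \
  --username system --password system \
  --query "SELECT COUNT(*) FROM EMP"

# If the count is 0, insert a test row in SQL*Plus and COMMIT it --
# uncommitted rows are invisible to Sqoop's separate session:
#   INSERT INTO EMP (EMPNO, ENAME) VALUES (1, 'SMITH');
#   COMMIT;
```

Note that `sqoop eval` printing only column headers (as described in the question) is exactly what an empty result set looks like, which is consistent with the empty-table diagnosis.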
