Accessing HDFS from Eclipse using Java

Problem description

Below is the code for accessing HDFS:

package myDefaultPackage;

import java.io.*;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.*;

public class Testing_HDFS_File {

public static void main(String[] args) throws Exception {
    try {
        Configuration config = new Configuration();
        config.set("fs.defaultFS", "hdfs://192.168.28.153:9000/");
        // Load the cluster settings before obtaining the FileSystem
        config.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
        FileSystem dfs = FileSystem.get(config);
        Path pt = new Path("hdfs://192.168.28.153:9000/user/hduser/wordcountinput/input.txt");
        BufferedReader br = new BufferedReader(new InputStreamReader(dfs.open(pt)));
        // Read and print the file line by line
        String line;
        while ((line = br.readLine()) != null) {
            System.out.println(line);
        }
        br.close();
    }
    }
    catch (Exception e) {
        System.out.println(e.getMessage());
        e.printStackTrace();
    }
}

}

And I am getting this exception:

WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the

log4j.properties files.

No FileSystem for scheme: hdfs
java.io.IOException: No FileSystem for scheme: hdfs

at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2138)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2145)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:80)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2184)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2166)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:302)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:158)
at myDefaultPackage.Testing_HDFS_File.main(Testing_HDFS_File.java:15)

java hadoop hdfs hadoop2
1 Answer

Just use the specific jars that match your Hadoop version. There are a lot of jars out there that do not fit.
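For the "No FileSystem for scheme: hdfs" error in particular, a common workaround (a minimal sketch, assuming the hadoop-common and hadoop-hdfs jars matching your cluster version are already on the Eclipse build path) is to register the FileSystem implementations on the Configuration explicitly:

Configuration config = new Configuration();
config.set("fs.defaultFS", "hdfs://192.168.28.153:9000/");
// Explicitly register the scheme handlers; this helps when the
// META-INF/services entries inside the Hadoop jars are not picked up
// from the classpath that Eclipse builds.
config.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
config.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
FileSystem dfs = FileSystem.get(config);

If the right jars (hadoop-common, hadoop-hdfs and their dependencies) are on the classpath, the explicit fs.hdfs.impl setting is normally not needed.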
