HBase Exception


I am getting the following exception when using HBase in pseudo-distributed mode. It would be great if someone could help me figure out how to fix it.

org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=10, exceptions:
Wed Feb 06 15:22:23 IST 2013, org.apache.hadoop.hbase.client.ScannerCallable@29422384, java.io.IOException: java.io.IOException: Could not iterate StoreFileScanner[HFileScanner for reader reader=file:/home/688697/hbase/test/c28d92322c97364af59b09d4f4b4a95f/cf/c5de203afb5647c0b90c6c18d58319e9, compression=none, cacheConf=CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false], firstKey=0deptempname0/cf:email/1360143938898/Put, lastKey=4191151deptempname4191151/cf:place/1360143938898/Put, avgKeyLen=45, avgValueLen=7, entries=17860666, length=1093021429, cur=10275517deptempname10275517/cf:place/1360143938898/Put/vlen=4]
    at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:104)
    at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:106)
    at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:289)
    at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:138)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:3004)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2951)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2968)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2155)
    at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1345)
Caused by: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/home/688697/hbase/test/c28d92322c97364af59b09d4f4b4a95f/cf/c5de203afb5647c0b90c6c18d58319e9 at 37837312 exp: -819174049 got: 1765448374
    at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:320)
    at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:276)
    at org.apache.hadoop.fs.FSInputChecker.fill(FSInputChecker.java:211)
    at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:229)
    at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:193)
    at org.apache.hadoop.fs.FSInputChecker.readFully(FSInputChecker.java:431)
    at org.apache.hadoop.fs.FSInputChecker.seek(FSInputChecker.java:412)
    at org.apache.hadoop.fs.FSDataInputStream.seek(FSDataInputStream.java:48)
    at org.apache.hadoop.fs.ChecksumFileSystem$FSDataBoundedInputStream.seek(ChecksumFileSystem.java:318)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.readAtOffset(HFileBlock.java:1047)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockData(HFileBlock.java:1318)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:266)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.readNextDataBlock(HFileReaderV2.java:452)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:416)
    at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:99)
    ... 12 more

Wed Feb 06 15:22:24 IST 2013, org.apache.hadoop.hbase.client.ScannerCallable@29422384, java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException
    at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:1079)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:1068)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2182)
    at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1345)
Caused by: java.lang.IllegalArgumentException
    at java.nio.Buffer.position(Buffer.java:216)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:395)
    at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:99)
    at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:106)
    at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:326)
    at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:138)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:3004)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2951)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2968)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2155)
    ... 5 more
Tags: hadoop, hbase, cloudera
1 Answer

The root cause of this issue is in the /etc/hosts file. If you check your /etc/hosts file, you will find an entry similar to the one below (in my case, my machine is named domainnameyouwanttogive):

127.0.0.1 localhost
127.0.1.1 domainnameyouwanttogive

For IPv6-capable hosts, the following lines are desirable:

::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

The root cause is that domainnameyouwanttogive resolves to 127.0.1.1, which is incorrect: it should resolve to 127.0.0.1 (or to your external IP). Since my external IP is 192.168.43.3, I created the following /etc/hosts configuration:

127.0.0.1 localhost
192.168.43.3 domainnameyouwanttogive

For IPv6-capable hosts, the following lines are desirable:

::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

This ensures that hostname resolution on the local machine is done correctly and that the HBase installation can start properly on your development system.
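To double-check the fix, a minimal resolution check along these lines can be run on the machine (the class name and the placeholder hostname are mine, not part of the original setup):

import java.net.InetAddress;

public class HostnameCheck {
    public static void main(String[] args) throws Exception {
        // The hostname Java sees for this machine and the address it maps to.
        InetAddress local = InetAddress.getLocalHost();
        System.out.println("Local hostname : " + local.getHostName());
        System.out.println("Resolves to    : " + local.getHostAddress());

        // Explicitly resolve the placeholder name used in /etc/hosts above.
        InetAddress byName = InetAddress.getByName("domainnameyouwanttogive");
        System.out.println("domainnameyouwanttogive -> " + byName.getHostAddress());

        // If either address is still 127.0.1.1, the /etc/hosts change has not taken effect.
    }
}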

Also, make sure your Hadoop NameNode is running with the same domain name that HBase uses.
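As a rough way to compare the two, a sketch like the one below can print both URIs side by side. It assumes core-site.xml and hbase-site.xml are on the classpath and uses the Hadoop 1.x property name fs.default.name; the port shown in the comment is only an example:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;

public class ConfigCheck {
    public static void main(String[] args) {
        // HBaseConfiguration.create() layers hbase-site.xml on top of the Hadoop
        // configuration, so core-site.xml is picked up too if it is on the classpath.
        Configuration conf = HBaseConfiguration.create();
        System.out.println("fs.default.name = " + conf.get("fs.default.name"));
        System.out.println("hbase.rootdir   = " + conf.get("hbase.rootdir"));
        // Both values should point at the same hostname, e.g.
        //   hdfs://domainnameyouwanttogive:9000 and hdfs://domainnameyouwanttogive:9000/hbase
    }
}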
