Hadoop connection timed out

Question

I am setting up a single-node Hadoop cluster. I am running Red Hat on Google Cloud, on an x86-64 E2-medium instance. To rule out network issues, I have disabled the firewall rules.

Hadoop version (2.9.2 is required for my work):

hadoop-2.9.2

Java version (I had no problems setting up the JDK):

java -version
openjdk version "11.0.20" 2023-07-18 LTS
OpenJDK Runtime Environment (Red_Hat-11.0.20.0.8-1) (build 11.0.20+8-LTS)
OpenJDK 64-Bit Server VM (Red_Hat-11.0.20.0.8-1) (build 11.0.20+8-LTS, mixed mode, sharing)

I tried to start HDFS and watch the NameNode initialize, but it turned out that only the NameNode manager came up:

[hadoop@rootvm ~]$ start-dfs.sh
Starting namenodes on [hadoop.tecadmin.com]
hadoop.tecadmin.com: ssh: connect to host hadoop.tecadmin.com port 22: Connection timed out
localhost: chown: changing ownership of '/usr/local/hadoop/logs': Operation not permitted
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hadoop-datanode-cuhk-ssh-e2.asia-east2-a.c.root.internal.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-hadoop-datanode-cuhk-ssh-e2.asia-east2-a.c.root.internal.out: Permission denied
localhost: head: cannot open '/usr/local/hadoop/logs/hadoop-hadoop-datanode-cuhk-ssh-e2.asia-east2-a.c.root.internal.out' for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-hadoop-datanode-cuhk-ssh-e2.asia-east2-a.c.root.internal.out: Permission denied
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-hadoop-datanode-cuhk-ssh-e2.asia-east2-a.c.root.internal.out: Permission denied
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ED25519 key fingerprint is SHA256:FlOH6JIX6jQuQe0+SJP6ovh6ZFYCmRFK4zMxyz1aOx8.
This host key is known by the following other names/addresses:
    ~/.ssh/known_hosts:1: localhost
    ~/.ssh/known_hosts:4: 10.XXX.0.2
Are you sure you want to continue connecting (yes/no/[fingerprint])? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (ED25519) to the list of known hosts.
0.0.0.0: chown: changing ownership of '/usr/local/hadoop/logs': Operation not permitted
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hadoop-secondarynamenode-cuhk-ssh-e2.asia-east2-a.c.root.internal.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-hadoop-secondarynamenode-cuhk-ssh-e2.asia-east2-a.c.root.internal.out: Permission denied
0.0.0.0: head: cannot open '/usr/local/hadoop/logs/hadoop-hadoop-secondarynamenode-cuhk-ssh-e2.asia-east2-a.c.root.internal.out' for reading: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-hadoop-secondarynamenode-cuhk-ssh-e2.asia-east2-a.c.root.internal.out: Permission denied
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-hadoop-secondarynamenode-cuhk-ssh-e2.asia-east2-a.c.root.internal.out: Permission denied

I would like to get this single-node Hadoop cluster working.

google-cloud-platform hadoop ssh redhat
1 Answer

If you want to store data under /usr/local/hadoop, you need to change the ownership of that directory to the user you want to run Hadoop as (which looks to be hadoop). The command you can use is:

chown -R hadoop /usr/local/hadoop

The command above needs to be run as the root user.
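A minimal sketch of the full sequence, assuming the install path /usr/local/hadoop and the hadoop user shown in the logs above (adjust both if your layout differs):

```shell
# Run as root (or via sudo): give the hadoop user ownership of the
# whole installation tree, so the start scripts can chown and write
# the logs directory without "Operation not permitted" errors.
sudo chown -R hadoop:hadoop /usr/local/hadoop

# Then retry the start script as the hadoop user, not as root.
sudo -u hadoop /usr/local/hadoop/sbin/start-dfs.sh
```

Note this only fixes the "Permission denied" errors on the logs directory; the "Connection timed out" on port 22 for hadoop.tecadmin.com is a separate issue with how that hostname resolves from the VM.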
