Airflow EC2-Instance socket.getfqdn() Bug


I am using Airflow version 1.9, and there is a bug in their software. You can read about it here on my previous Stackoverflow post, here on another one of my Stackoverflow posts, and here on Airflow's Github where the bug is reported and discussed.

In short, there are a few places in Airflow's code where it needs to get the IP address of the server. It does this by running this command:

socket.getfqdn()

The problem is that on Amazon EC2 instances (Amazon Linux 1) this command does not return the IP address; it returns the hostname, like this:

ip-1-2-3-4

when what it actually needs is an IP address like this:

1.2.3.4

To get that IP value, I found from here that I can use this command:

socket.gethostbyname(socket.gethostname())
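
For comparison, here is roughly what the two calls return in a Python shell on one of these instances (the exact values shown are illustrative):

import socket

# On an Amazon Linux 1 EC2 instance, getfqdn() returns the internal hostname
print(socket.getfqdn())                            # e.g. ip-1-2-3-4

# while gethostbyname(gethostname()) resolves that hostname to its IP address
print(socket.gethostbyname(socket.gethostname()))  # e.g. 1.2.3.4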

I have tested the command in a Python shell and it returns the correct value. So I searched the Airflow package for every occurrence of socket.getfqdn(), and this is what I got:

[airflow@ip-1-2-3-4 site-packages]$ cd airflow/
[airflow@ip-1-2-3-4 airflow]$ grep -r "fqdn" .

./security/utils.py:    fqdn = host
./security/utils.py:    if not fqdn or fqdn == '0.0.0.0':
./security/utils.py:        fqdn = get_localhost_name()
./security/utils.py:    return '%s/%s@%s' % (components[0], fqdn.lower(), components[2])
./security/utils.py:    return socket.getfqdn()
./security/utils.py:def get_fqdn(hostname_or_ip=None):
./security/utils.py:            fqdn = socket.gethostbyaddr(hostname_or_ip)[0]
./security/utils.py:            fqdn = get_localhost_name()
./security/utils.py:        fqdn = hostname_or_ip
./security/utils.py:    if fqdn == 'localhost':
./security/utils.py:        fqdn = get_localhost_name()
./security/utils.py:    return fqdn

Binary file ./security/__pycache__/utils.cpython-36.pyc matches
Binary file ./security/__pycache__/kerberos.cpython-36.pyc matches

./security/kerberos.py:    principal = configuration.get('kerberos', 'principal').replace("_HOST", socket.getfqdn())
./security/kerberos.py:        principal = "%s/%s" % (configuration.get('kerberos', 'principal'), socket.getfqdn())

Binary file ./contrib/auth/backends/__pycache__/kerberos_auth.cpython-36.pyc matches

./contrib/auth/backends/kerberos_auth.py:        service_principal = "%s/%s" % (configuration.get('kerberos', 'principal'), utils.get_fqdn())

./www/views.py:            'airflow/circles.html', hostname=socket.getfqdn()), 404
./www/views.py:            hostname=socket.getfqdn(),

Binary file ./www/__pycache__/app.cpython-36.pyc matches
Binary file ./www/__pycache__/views.cpython-36.pyc matches

./www/app.py:                'hostname': socket.getfqdn(),

Binary file ./__pycache__/jobs.cpython-36.pyc matches
Binary file ./__pycache__/models.cpython-36.pyc matches

./bin/cli.py:    hostname = socket.getfqdn()

Binary file ./bin/__pycache__/cli.cpython-36.pyc matches

./config_templates/default_airflow.cfg:# gets augmented with fqdn

./jobs.py:        self.hostname = socket.getfqdn()
./jobs.py:        fqdn = socket.getfqdn()
./jobs.py:        same_hostname = fqdn == ti.hostname
./jobs.py:                                "{fqdn}".format(**locals()))

Binary file ./api/auth/backend/__pycache__/kerberos_auth.cpython-36.pyc matches

./api/auth/backend/kerberos_auth.py:from socket import getfqdn
./api/auth/backend/kerberos_auth.py:        hostname = getfqdn()

./models.py:        self.hostname = socket.getfqdn()
./models.py:        self.hostname = socket.getfqdn()

I am not sure whether I should just replace every occurrence of socket.getfqdn() with socket.gethostbyname(socket.gethostname()). For one thing, it would be cumbersome to maintain, because I would no longer be running the Airflow package as installed from Pip. I tried upgrading to Airflow version 1.10, but it was very buggy and I couldn't get it up and running, so for now it looks like I'm stuck with Airflow version 1.9. I still need to correct this Airflow bug, though, because it is causing my tasks to sporadically fail.

amazon-ec2 ip airflow hostname airflow-scheduler
1 Answer

Simply replace every occurrence of the faulty function call with one that works. Here are the steps I ran. If you are running an Airflow cluster, make sure you do this on all of your Airflow servers (Masters and Workers).

[ec2-user@ip-1-2-3-4 ~]$ cd /usr/local/lib/python3.6/site-packages/airflow

[ec2-user@ip-1-2-3-4 airflow]$ grep -r "socket.getfqdn()" .
./security/utils.py:    return socket.getfqdn()
./security/kerberos.py:    principal = configuration.get('kerberos', 'principal').replace("_HOST", socket.getfqdn())
./security/kerberos.py:        principal = "%s/%s" % (configuration.get('kerberos', 'principal'), socket.getfqdn())
./www/views.py:            'airflow/circles.html', hostname=socket.getfqdn()), 404
./www/views.py:            hostname=socket.getfqdn(),
./www/app.py:                'hostname': socket.getfqdn(),
./bin/cli.py:    hostname = socket.getfqdn()
./jobs.py:        self.hostname = socket.getfqdn()
./jobs.py:        fqdn = socket.getfqdn()
./models.py:        self.hostname = socket.getfqdn()
./models.py:        self.hostname = socket.getfqdn()

[ec2-user@ip-1-2-3-4 airflow]$ sudo find . -type f -exec sed -i 's/socket.getfqdn()/socket.gethostbyname(socket.gethostname())/g' {} +

[ec2-user@ip-1-2-3-4 airflow]$ grep -r "socket.getfqdn()" .

[ec2-user@ip-1-2-3-4 airflow]$ grep -r "socket.gethostbyname(socket.gethostname())" .

./security/utils.py:    return socket.gethostbyname(socket.gethostname())
./security/kerberos.py:    principal = configuration.get('kerberos', 'principal').replace("_HOST", socket.gethostbyname(socket.gethostname()))
./security/kerberos.py:        principal = "%s/%s" % (configuration.get('kerberos', 'principal'), socket.gethostbyname(socket.gethostname()))
./www/views.py:            'airflow/circles.html', hostname=socket.gethostbyname(socket.gethostname())), 404
./www/views.py:            hostname=socket.gethostbyname(socket.gethostname()),
./www/app.py:                'hostname': socket.gethostbyname(socket.gethostname()),
./bin/cli.py:    hostname = socket.gethostbyname(socket.gethostname())
./jobs.py:        self.hostname = socket.gethostbyname(socket.gethostname())
./jobs.py:        fqdn = socket.gethostbyname(socket.gethostname())
./models.py:        self.hostname = socket.gethostbyname(socket.gethostname())
./models.py:        self.hostname = socket.gethostbyname(socket.gethostname())

Once you have finished the updates, simply restart the Airflow Webserver, Scheduler, and Worker processes and you should be all set. Note that I am running the Airflow Python package on Python 3.6; some of you may be on, say, 3.7, so your path may need to be adjusted to something like /usr/local/lib/python3.7/site-packages/airflow. Just cd into /usr/local/lib and see which python folder you need to go into. I don't think Airflow would normally live there, but sometimes Python packages are also installed under /usr/local/lib64/python3.6/site-packages, the only difference in the path being lib64 instead of lib. Also keep in mind that this has been fixed in Airflow version 1.10, so you will not need to make these changes on the latest versions of Airflow.
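
If you are unsure which site-packages directory your Airflow install actually lives under, one quick way to check (assuming the airflow package is importable by the same interpreter you run it with) is:

import os
import airflow

# Prints the directory of the installed airflow package,
# e.g. /usr/local/lib/python3.6/site-packages/airflow
print(os.path.dirname(airflow.__file__))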
