Airflow SFTPHook transport.py authentication with private key fails (password)

Problem description

I am running Airflow v2.3.2 / Python 3.10 from the Docker image below.

apache/airflow:2.3.2-python3.10

The Docker image has

paramiko==2.7.2

pinned to resolve an authentication issue that came up during testing.

When calling SFTP, I use the following:

sftp = SFTPHook("connection|sftp")
sftp.look_for_keys = False
sftp.get_conn()

I have also tried it without the

sftp.look_for_keys

line.
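One thing that may explain why toggling that attribute makes no difference: depending on the SSH/SFTP provider version, the hook builds its paramiko client from the connection's Extra at get_conn() time, so an instance attribute set after construction may simply be ignored. If the provider version supports it, the flag can instead go in the Extra itself (a sketch; field support varies by version, so treat the extra key as an assumption):

```json
{
    "private_key": "privatekeyinfo",
    "no_host_key_check": true,
    "look_for_keys": false
}
```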

In the connection in the Airflow UI, I configured the

Extra
section as follows:

{
    "private_key": "privatekeyinfo", 
    "no_host_key_check": true
}

"privatekeyinfo"
is the key as a string ("-----BEGIN OPENSSH PRIVATE KEY----- ...") with '\n' newline characters written in.
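Since the key is stored as an escaped one-line JSON string, one easy mistake is ending up with literal backslash-n sequences that never turn back into real newlines, which makes the key unparseable. A minimal stdlib sketch (the key material here is a placeholder) that lets json.dumps do the escaping and verifies the round trip:

```python
import json

# Hypothetical multi-line key; real key material goes here.
key_text = (
    "-----BEGIN OPENSSH PRIVATE KEY-----\n"
    "base64keymaterial\n"
    "-----END OPENSSH PRIVATE KEY-----\n"
)

# Let json.dumps handle the newline escaping instead of hand-writing "\n".
extra = json.dumps({"private_key": key_text, "no_host_key_check": True})
print(extra)

# Round-trip: the value read back must contain real newlines again.
assert json.loads(extra)["private_key"] == key_text
```

Pasting the printed string into the Extra field guarantees the escaping is exactly what json.loads will undo on the Airflow side.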

When I test the connection in the UI, it reports "Connection successfully tested". However, when the script that calls the hook runs, I receive the following:

[TIMESTAMP] {transport.py:1819} INFO - Connected (version 2.0, client dropbear)
[TIMESTAMP] {transport.py:1819} INFO - Authentication (password) failed.

I have also tried passing the host key in the

"host_key"
field of
Extras
but get the same authentication error.

To be explicit, I have tried the following -

  1. sftp.look_for_keys = False with "no_host_key_check": true
  2. sftp.look_for_keys = False with "host_key": "host_key_value"
  3. #sftp.look_for_keys = False (line commented out) with "no_host_key_check": true
  4. #sftp.look_for_keys = False (line commented out) with "host_key": "host_key_value"
  5. Test of the connection in Airflow's Connections with "no_host_key_check": true in Extras - successful
  6. Test of the connection in Airflow's Connections with "host_key": "host_key_value" in Extras - successful
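Since (as far as I understand the fallback behavior) password auth is attempted whenever no usable key reaches paramiko, I also sanity-checked the stored private_key value. A stdlib-only sketch with illustrative names:

```python
def private_key_problems(key_str: str) -> list[str]:
    """Return a list of likely reasons paramiko would reject this key string."""
    problems = []
    if "\\n" in key_str:
        problems.append("contains literal backslash-n instead of real newlines")
    if "\n" not in key_str.strip():
        problems.append("key is a single line; PEM/OpenSSH keys need line breaks")
    if not key_str.lstrip().startswith("-----BEGIN"):
        problems.append("missing '-----BEGIN ... PRIVATE KEY-----' header")
    if not key_str.rstrip().endswith("PRIVATE KEY-----"):
        problems.append("missing '-----END ... PRIVATE KEY-----' footer")
    return problems
```

Called as private_key_problems(conn.extra_dejson["private_key"]) from a task (assuming the Connection object is fetched, e.g. via BaseHook.get_connection), an empty list means the stored string at least looks like a well-formed PEM/OpenSSH key.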

Referenced SO questions -

Paramiko's extra logging -

[TIMESTAMP] {transport.py:1819} DEBUG - starting thread (client mode): 0x9e33d000
[TIMESTAMP] {transport.py:1819} DEBUG - Local version/idstring: SSH-2.0-paramiko_2.7.2
[TIMESTAMP] {transport.py:1819} DEBUG - Remote version/idstring: SSH-2.0-dropbear [SERVER]
[TIMESTAMP] {transport.py:1819} INFO - Connected (version 2.0, client dropbear)
[TIMESTAMP] {transport.py:1819} DEBUG - kex algos:['diffie-hellman-group1-sha1', 'diffie-hellman-group14-sha256', 'diffie-hellman-group14-sha1'] server key:['ssh-dss', 'ssh-rsa'] client encrypt:['blowfish-cbc', 'aes128-ctr', 'aes128-cbc', '3des-cbc'] server encrypt:['blowfish-cbc', 'aes128-ctr', 'aes128-cbc', '3des-cbc'] client mac:['hmac-sha1', 'hmac-md5-96', 'hmac-sha1-96', 'hmac-md5'] server mac:['hmac-sha1', 'hmac-md5-96', 'hmac-sha1-96', 'hmac-md5'] client compress:['none'] server compress:['none'] client lang:[''] server lang:[''] kex follows?False
[TIMESTAMP] {transport.py:1819} DEBUG - Kex agreed: diffie-hellman-group14-sha256
[TIMESTAMP] {transport.py:1819} DEBUG - HostKey agreed: ssh-rsa
[TIMESTAMP] {transport.py:1819} DEBUG - Cipher agreed: aes128-ctr
[TIMESTAMP] {transport.py:1819} DEBUG - MAC agreed: hmac-sha1
[TIMESTAMP] {transport.py:1819} DEBUG - Compression agreed: none
[TIMESTAMP] {transport.py:1819} DEBUG - kex engine KexGroup14SHA256 specified hash_algo <built-in function openssl_sha256>
[TIMESTAMP] {transport.py:1819} DEBUG - Switch to new keys ...
[TIMESTAMP] {transport.py:1819} DEBUG - Attempting password auth...
[TIMESTAMP] {transport.py:1819} DEBUG - userauth is OK
[TIMESTAMP] {transport.py:1819} INFO - Authentication (password) failed.
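For what it's worth, in this trace paramiko goes from key exchange straight to "Attempting password auth..." without (as far as I can tell from paramiko's client logging) ever logging a key-auth attempt, which suggests connect() was never handed a pkey at all. To confirm this from inside the DAG, the paramiko loggers can be raised to DEBUG in the task (stdlib only):

```python
import logging

# Raise paramiko's loggers to DEBUG inside the task so the worker log shows
# whether any key-auth attempt happens before the password fallback.
# Purely stdlib; paramiko emits its records under the "paramiko" namespace.
logging.getLogger("paramiko").setLevel(logging.DEBUG)
logging.getLogger("paramiko.transport").setLevel(logging.DEBUG)
```

Comparing this task-side trace against the UI "Test" button's behavior should show whether the key is only being loaded in one of the two code paths.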

Also - the SFTP server already has the public key, and the private key can be used to connect (verified with CyberDuck and with a locally running version of Airflow).

Even on the managed version of Airflow, when I go to the

Connections
page under the
Admin
dropdown, open the sftp connection, and select
Test
, it returns
Connection successfully tested
. The problem only occurs in the DAG, since it appears to be trying to authenticate with a password instead of the private key supplied for that connection.

Airflow GH discussion link - https://github.com/apache/airflow/discussions/31318

authentication airflow sftp private-key