How to use the pysftp module in an Azure Databricks notebook

Problem description (votes: 0, answers: 1)

I am trying to establish an SFTP connection from a Databricks notebook using the pysftp module. Here is the code I wrote:

import pysftp as sftp

# Placeholder values; substitute your real connection details
HOSTNAME = my_sftp_hostname
USERNAME = my_sftp_username
PASSWORD = my_sftp_password
FOLDER = dir_to_be_accessed_in

print('HOSTNAME : ' + HOSTNAME)
print('USERNAME : ' + USERNAME)
print('PASSWORD : ' + PASSWORD)
print('FOLDER : ' + FOLDER)


cnopts = sftp.CnOpts()
cnopts.hostkeys = None  # disable host-key checking
con = sftp.Connection(HOSTNAME, username=USERNAME, password=PASSWORD, cnopts=cnopts)

print(con)

But when I run this code, I get the error `TypeError: from_buffer() cannot return the address of the raw string within a bytes or unicode object`:

/databricks/python/lib/python3.5/site-packages/pysftp/__init__.py:61: UserWarning: Failed to load HostKeys from /root/.ssh/known_hosts.  You will need to explicitly load HostKeys (cnopts.hostkeys.load(filename)) or disableHostKey checking (cnopts.hostkeys = None).
  warnings.warn(wmsg, UserWarning)
Unknown exception: from_buffer() cannot return the address of the raw string within a bytes or unicode object
Traceback (most recent call last):
  File "/databricks/python/lib/python3.5/site-packages/paramiko/transport.py", line 2075, in run
    self.kex_engine.parse_next(ptype, m)
  File "/databricks/python/lib/python3.5/site-packages/paramiko/kex_ecdh_nist.py", line 53, in parse_next
    return self._parse_kexecdh_reply(m)
  File "/databricks/python/lib/python3.5/site-packages/paramiko/kex_ecdh_nist.py", line 136, in _parse_kexecdh_reply
    self.transport._verify_key(K_S, sig)
  File "/databricks/python/lib/python3.5/site-packages/paramiko/transport.py", line 1886, in _verify_key
    if not key.verify_ssh_sig(self.H, Message(sig)):
  File "/databricks/python/lib/python3.5/site-packages/paramiko/rsakey.py", line 134, in verify_ssh_sig
    msg.get_binary(), data, padding.PKCS1v15(), hashes.SHA1()
  File "/databricks/python/lib/python3.5/site-packages/cryptography/hazmat/backends/openssl/rsa.py", line 474, in verify
    self._backend, data, algorithm
  File "/databricks/python/lib/python3.5/site-packages/cryptography/hazmat/backends/openssl/utils.py", line 41, in _calculate_digest_and_algorithm
    hash_ctx.update(data)
  File "/databricks/python/lib/python3.5/site-packages/cryptography/hazmat/primitives/hashes.py", line 93, in update
    self._ctx.update(data)
  File "/databricks/python/lib/python3.5/site-packages/cryptography/hazmat/backends/openssl/hashes.py", line 50, in update
    data_ptr = self._backend._ffi.from_buffer(data)
TypeError: from_buffer() cannot return the address of the raw string within a bytes or unicode object

I have looked at a few blog posts but found nothing concrete. If anyone has any insight into this, please let me know.

My pysftp version is 0.2.9 and my paramiko version is 2.7.1.

python openssl azure-databricks pyopenssl pysftp
1 Answer (0 votes)

The following fix worked for me.

It looks like a later version of the cryptography package was installed by default alongside another PyPI library, and that cryptography version is incompatible with the version of pyOpenSSL included in the Databricks runtime.

You can try to resolve the package compatibility issue as described in the article below, and then try installing the latest version of pyOpenSSL:

https://kb.databricks.com/python/python-exec-display-cancelled.html#problem-module-lib-has-no-attribute-ssl_st_init
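As a minimal sketch of that step, assuming your cluster supports the Databricks `%pip` notebook magic (the exact package versions to pin, if any, depend on your runtime; see the linked KB article):

```shell
%pip install --upgrade pyOpenSSL
```

After the install completes, detach and re-attach the notebook (or restart the Python interpreter) so the upgraded package is actually picked up by the running session.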
