Unable to mount Azure ADLS Gen 2 from Databricks Community Edition: com.databricks.rpc.UnknownRemoteException: Remote exception occurred

Problem description (0 votes, 1 answer)

I'm trying to mount ADLS Gen 2 from my Databricks Community Edition, but when I run the following code:

test = spark.read.csv("/mnt/lake/RAW/csds.csv", inferSchema=True, header=True)

I get the following error:

com.databricks.rpc.UnknownRemoteException: Remote exception occurred:

I used the following code to mount ADLS Gen 2:

def check(mntPoint):
  # Return the number of existing mounts matching the given mount point (0 or 1)
  a = []
  for test in dbutils.fs.mounts():
    a.append(test.mountPoint)
  result = a.count(mntPoint)
  return result

mount = "/mnt/lake"

if check(mount)==1:
  resultMsg = "<div>%s is already mounted. </div>" % mount
else:
  dbutils.fs.mount(
  source = "wasbs://[email protected]",
  mount_point = mount,
  extra_configs = {"fs.azure.account.key.xxxxxxxx.blob.core.windows.net":""})
  resultMsg = "<div>%s was mounted. </div>" % mount

displayHTML(resultMsg)
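As a quick sanity check (a minimal sketch, not part of the original post), the active mount points can be listed to confirm that /mnt/lake now exists:

# List all current mount points; /mnt/lake should appear if the mount succeeded
display(dbutils.fs.mounts())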


ServicePrincipalID = 'xxxxxxxxxxx'
ServicePrincipalKey = 'xxxxxxxxxxxxxx'
DirectoryID =  'xxxxxxxxxxxxxxx'
Lake =  'adlsgen2'


# Build the OAuth 2.0 token endpoint URL from the AAD directory (tenant) ID
Directory = "https://login.microsoftonline.com/{}/oauth2/token".format(DirectoryID)

# Create configurations for our connection
configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           "fs.azure.account.oauth2.client.id": ServicePrincipalID,
           "fs.azure.account.oauth2.client.secret": ServicePrincipalKey,
           "fs.azure.account.oauth2.client.endpoint": Directory}



mount = "/mnt/lake"

if check(mount)==1:
  resultMsg = "<div>%s is already mounted. </div>" % mount
else:
  dbutils.fs.mount(
  source = f"abfss://root@{Lake}.dfs.core.windows.net/",
  mount_point = mount,
  extra_configs = configs)
  resultMsg = "<div>%s was mounted. </div>" % mount

I then try to read a dataframe from ADLS Gen 2 with the following commands:

dataPath = "/mnt/lake/RAW/DummyEventData/Tools/"

test = spark.read.csv("/mnt/lake/RAW/csds.csv", inferSchema=True, header=True)

which fails with the same error:

com.databricks.rpc.UnknownRemoteException: Remote exception occurred:

Any ideas?

apache-spark pyspark databricks azure-data-lake-gen2 databricks-community-edition
1 Answer

2 votes

Judging by the stack trace, the most likely cause of this error is that you haven't assigned the Storage Blob Data Contributor (or Storage Blob Data Reader) role to your service principal, as described in the documentation. Confusingly, this role is distinct from the usual "Contributor" role.
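For concreteness, a minimal sketch of that fix (the subscription, resource group, and scope below are placeholders, and the az command runs outside Databricks): grant the role to the service principal, then recreate the mount so a fresh OAuth token is issued:

# Assumed fix, run from a terminal with the Azure CLI (placeholder values):
#   az role assignment create \
#     --assignee <ServicePrincipalID> \
#     --role "Storage Blob Data Contributor" \
#     --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/adlsgen2"
# Back in the notebook, drop and recreate the mount so the new role takes effect
dbutils.fs.unmount("/mnt/lake")
dbutils.fs.mount(
  source = f"abfss://root@{Lake}.dfs.core.windows.net/",
  mount_point = "/mnt/lake",
  extra_configs = configs)

Note that role assignments can take a few minutes to propagate, so the read may not succeed immediately after the grant.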
