Unable to upload files to ADLS Gen2 using Microsoft Entra ID (service principal) authorization


I am trying to upload a file to ADLS Gen2 storage using Microsoft Entra ID (service principal) authorization.

I get the following error message when uploading a file to ADLS Gen2 with DataLakeServiceClient:

com.azure.storage.file.datalake.models.DataLakeStorageException: Status code 400, "<?xml version="1.0" encoding="utf-8"?>
<Error><Code>MissingRequiredHeader</Code><Message>An HTTP header that's mandatory for this request is not specified.
RequestId:adad-dads-d-20a2-xxdadad
Time:2024-05-07T17:18:58.0750289Z</Message><HeaderName>x-ms-type</HeaderName></Error>"
DataLakeServiceClient dataLakeServiceClient = new DataLakeServiceClientBuilder()
        .endpoint("https://<storageaccount>.file.core.windows.net/")
        .credential(new ClientSecretCredentialBuilder()
                .clientId("<client_id>")
                .clientSecret("<client_secret>")
                .tenantId("<tenantId>")
                .build())
        .buildClient();

DataLakeFileSystemClient container = dataLakeServiceClient.getFileSystemClient("test");

DataLakeDirectoryClient directoryClient = container.getDirectoryClient("upload").getSubdirectoryClient("customer_info");

InputStream stream = new ByteArrayInputStream("hello".getBytes(StandardCharsets.UTF_8));
long fileLength;
try {
    fileLength = stream.available();

    DataLakeFileClient fileClient = directoryClient.createFileIfNotExists("test.csv");
    fileClient.upload(stream, fileLength);
} catch (IOException e) {
    e.printStackTrace();
}

pom.xml:

<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-storage-file-datalake</artifactId>
    <version>12.18.3</version>
</dependency>
<dependency>
    <groupId>com.azure.spring</groupId>
    <artifactId>spring-cloud-azure-starter-keyvault-secrets</artifactId>
    <version>5.11.0</version>
</dependency>

Is there a way to resolve this issue?

java azure-data-lake-gen2 azure-rest-api azure-service-principal microsoft-entra-id
1 Answer

You get the MissingRequiredHeader error for x-ms-type because your client points at the Azure Files endpoint (file.core.windows.net), whose REST API requires that header; ADLS Gen2 calls must instead go through the Data Lake Storage endpoint (dfs.core.windows.net).
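A minimal sketch of the corrected builder, keeping the placeholders from your snippet; only the endpoint host changes:

// Point the client at the Data Lake Storage (dfs) endpoint,
// not the Azure Files (file) endpoint that caused the 400 error.
DataLakeServiceClient dataLakeServiceClient = new DataLakeServiceClientBuilder()
        .endpoint("https://<storageaccount>.dfs.core.windows.net/")
        .credential(new ClientSecretCredentialBuilder()
                .clientId("<client_id>")
                .clientSecret("<client_secret>")
                .tenantId("<tenantId>")
                .build())
        .buildClient();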

The full example below puts this together to upload a stream to Azure Data Lake Storage from Java.

Code:

import com.azure.identity.ClientSecretCredentialBuilder;
import com.azure.storage.file.datalake.DataLakeDirectoryClient;
import com.azure.storage.file.datalake.DataLakeFileClient;
import com.azure.storage.file.datalake.DataLakeFileSystemClient;
import com.azure.storage.file.datalake.DataLakeServiceClient;
import com.azure.storage.file.datalake.DataLakeServiceClientBuilder;

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class App {

    public static void main(String[] args) {

        // Authenticate with a service principal and use the Data Lake
        // Storage (dfs) endpoint of the storage account.
        DataLakeServiceClient dataLakeServiceClient = new DataLakeServiceClientBuilder()
            .endpoint("https://<storageaccount>.dfs.core.windows.net/")
            .credential(new ClientSecretCredentialBuilder()
                .clientId("<client_id>")
                .clientSecret("<client_secret>")
                .tenantId("<tenantId>")
                .build())
            .buildClient();

        // File system "test" and target directory upload/customer_info.
        DataLakeFileSystemClient container = dataLakeServiceClient.getFileSystemClient("test");
        DataLakeDirectoryClient directoryClient = container.getDirectoryClient("upload").getSubdirectoryClient("customer_info");

        // Sample CSV content to upload.
        String content = "Name,Age,City\n" +
                "John Doe,30,New York";
        byte[] bytes = content.getBytes(StandardCharsets.UTF_8);

        // Write the bytes to upload/customer_info/test.csv in one call.
        DataLakeFileClient fileClient = directoryClient.getFileClient("test.csv");
        try (InputStream stream = new ByteArrayInputStream(bytes)) {
            fileClient.upload(stream, bytes.length);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

In the code above, the key point is that uploads to a file system must go through the dfs endpoint (https://<storageaccount>.dfs.core.windows.net/), not the file endpoint used in your original code.
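If your data already lives on disk rather than in memory, DataLakeFileClient also exposes uploadFromFile; a minimal sketch, where the local path data/test.csv is a hypothetical example:

// Upload a local file; the boolean flag overwrites the remote
// file if it already exists. (The local path is illustrative only.)
DataLakeFileClient fileClient = directoryClient.getFileClient("test.csv");
fileClient.uploadFromFile("data/test.csv", true);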

Output:

The code above ran successfully and the file was uploaded to my file system as a CSV file.

(Screenshot: uploaded CSV file)
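To verify the upload programmatically rather than in the portal, one option (a sketch, reusing the directoryClient from the code above) is to list the paths in the target directory:

// List entries under upload/customer_info; test.csv should
// appear among them once the upload has succeeded.
directoryClient.listPaths().forEach(pathItem ->
        System.out.println(pathItem.getName()));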

Reference:

DataLakeFileClient class | Microsoft Learn
