Azure Data Factory fails when copying a large data file

Problem description

I am using Azure Data Factory to copy data from a REST API to Azure Data Lake Store. Here is the JSON of my Copy activity:

{
    "name": "CopyDataFromGraphAPI",
    "type": "Copy",
    "policy": {
        "timeout": "7.00:00:00",
        "retry": 0,
        "retryIntervalInSeconds": 30,
        "secureOutput": false
    },
    "typeProperties": {
        "source": {
            "type": "HttpSource",
            "httpRequestTimeout": "00:30:40"
        },
        "sink": {
            "type": "AzureDataLakeStoreSink"
        },
        "enableStaging": false,
        "cloudDataMovementUnits": 0,
        "translator": {
            "type": "TabularTranslator",
            "columnMappings": "id: id, name: name, email: email, administrator: administrator"
        }
    },
    "inputs": [
        {
            "referenceName": "MembersHttpFile",
            "type": "DatasetReference"
        }
    ],
    "outputs": [
        {
            "referenceName": "MembersDataLakeSink",
            "type": "DatasetReference"
        }
    ]
}

The REST API is one I created myself. For testing purposes it initially returned only 2,500 rows, and the pipeline ran fine: it copied the data from the REST API call into Azure Data Lake Store.

After that test, I updated the REST API so that it now returns 125,000 rows. I tested the API in a REST client and it works fine. But in Azure Data Factory's Copy activity, I get the following error when copying the data to Azure Data Lake Store:

{
    "errorCode": "2200",
    "message": "Failure happened on 'Sink' side. ErrorCode=UserErrorFailedToReadHttpFile,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to read data from http source file.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (500) Internal Server Error.,Source=System,'",
    "failureType": "UserError",
    "target": "CopyDataFromGraphAPI"
}

The sink side is Azure Data Lake Store. Is there a limit on the size of the content I can copy from a REST call into Azure Data Lake Store?

I also re-tested the pipeline against the 2,500-row version of the REST API call and it worked fine; as soon as I updated the API call to return 125,000 rows again, the pipeline started failing with the same error above.

The source dataset of my Copy activity is:

{
    "name": "MembersHttpFile",
    "properties": {
        "linkedServiceName": {
            "referenceName": "WM_GBS_LinikedService",
            "type": "LinkedServiceReference"
        },
        "type": "HttpFile",
        "structure": [
            {
                "name": "id",
                "type": "String"
            },
            {
                "name": "name",
                "type": "String"
            },
            {
                "name": "email",
                "type": "String"
            },
            {
                "name": "administrator",
                "type": "Boolean"
            }
        ],
        "typeProperties": {
            "format": {
                "type": "JsonFormat",
                "filePattern": "arrayOfObjects",
                "jsonPathDefinition": {
                    "id": "$.['id']",
                    "name": "$.['name']",
                    "email": "$.['email']",
                    "administrator": "$.['administrator']"
                }
            },
            "relativeUrl": "api/workplace/members",
            "requestMethod": "Get"
        }
    }
}

The sink dataset is:

{
    "name": "MembersDataLakeSink",
    "properties": {
        "linkedServiceName": {
            "referenceName": "DataLakeLinkService",
            "type": "LinkedServiceReference"
        },
        "type": "AzureDataLakeStoreFile",
        "structure": [
            {
                "name": "id",
                "type": "String"
            },
            {
                "name": "name",
                "type": "String"
            },
            {
                "name": "email",
                "type": "String"
            },
            {
                "name": "administrator",
                "type": "Boolean"
            }
        ],
        "typeProperties": {
            "format": {
                "type": "JsonFormat",
                "filePattern": "arrayOfObjects",
                "jsonPathDefinition": {
                    "id": "$.['id']",
                    "name": "$.['name']",
                    "email": "$.['email']",
                    "administrator": "$.['administrator']"
                }
            },
            "fileName": "WorkplaceMembers.json",
            "folderPath": "rawSources"
        }
    }
}
Tags: azure-data-factory, azure-data-lake, u-sql, azure-data-factory-2
1 Answer

As far as I know, there is no limit on file size. I have a 10 GB CSV with millions of rows and Data Lake doesn't care.

What I can see is that although the error says it happened on the "Sink" side, the error code is UserErrorFailedToReadHttpFile, so I think raising the httpRequestTimeout on the source may fix the problem; so far it is "00:30:40", and the row transfer may be getting cut off because of it. Thirty minutes is plenty of time for 2,500 rows, but maybe not for 125k. A sketch of what that change could look like is below.
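The only part that would need to change is the "source" block of your Copy activity. A minimal sketch, with a two-hour value as an arbitrary example to test with rather than a recommended setting (the format is the same d.hh:mm:ss TimeSpan used elsewhere in the pipeline, such as the policy timeout "7.00:00:00"):

"source": {
    "type": "HttpSource",
    "httpRequestTimeout": "02:00:00"
}

You can raise it further if needed, though the activity-level policy "timeout" would still cap the overall run.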

Hope this helps!
