Can a thread pool in Python be used to limit simultaneous requests to an Azure Function?

Problem description

I am setting up a process to send an array of objects, each containing a request body, to an Azure Function. However, even when using time.sleep, requests keep being sent to the Azure Function, causing it to crash. I'm doing this in a Synapse notebook. The idea is that, since the Azure Function accepts 200 requests at a time, the code should send 200 requests, pause until they finish, then continue with the next 200, and so on.

import time
import threading
from concurrent.futures import ThreadPoolExecutor

def readarray(array):
    try:
        # threading.current_thread().name replaces the deprecated
        # threading.currentThread().getName()
        print(f"array: {array}, thread: {threading.current_thread().name}")
        time.sleep(10)
        print(f"Array {array} processed")
    except Exception as e:
        print(f"Error array {array}: {str(e)}")

try:
    with ThreadPoolExecutor(max_workers=5) as executor:
        # Use the thread pool to process the array items
        executor.map(readarray, array_of_objects)
except Exception as e:
    print("Error:", str(e))
python azure-functions apache-synapse spark-notebook
1 Answer

You can split the data into batches of 200 and process each batch, not starting the next batch until the previous one has finished.

Since your question doesn't show the function being called, I'll assume you're calling a remote Azure Function at 'your_function_url' with a POST request:

import requests
import concurrent.futures


def read_array(data):
    # POST one payload to the Azure Function
    response = requests.post('your_function_url', json=data)
    print(response.status_code)

def send_requests_to_azure(batch):
    # The with-block does not exit until every mapped call has finished,
    # so each batch of 200 completes before the next batch starts.
    with concurrent.futures.ThreadPoolExecutor() as executor:
        executor.map(read_array, batch)

array_of_objects = [ ... ]  # Your array of data
batch_size = 200

for i in range(0, len(array_of_objects), batch_size):
    current_batch = array_of_objects[i:i+batch_size]
    send_requests_to_azure(current_batch)
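
To answer the title question directly: yes, a thread pool can itself cap the number of simultaneous requests, because max_workers bounds how many calls are in flight at once. Here is a minimal sketch of that alternative (assuming, as above, that 'your_function_url' is a placeholder for your Azure Function endpoint, and with placeholder payloads):

import requests
from concurrent.futures import ThreadPoolExecutor

def call_function(data):
    # POST one payload; 'your_function_url' is a placeholder endpoint
    response = requests.post('your_function_url', json=data)
    print(response.status_code)

array_of_objects = [{"id": 1}, {"id": 2}]  # placeholder payloads

# At most 200 requests run concurrently; remaining items queue inside
# the executor until a worker thread becomes free.
with ThreadPoolExecutor(max_workers=200) as executor:
    executor.map(call_function, array_of_objects)

Unlike fixed batches, a worker picks up the next item as soon as its current request finishes, so the function never sees more than 200 concurrent requests while idle time between batches is avoided.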