aiohttp ClientSession.get() method fails silently - Python 3.7

Question · 1 vote · 1 answer

I'm building a small application that tries to find company website URLs by searching for the companies' names on Bing. It takes a large list of company names, uses the Bing Search API to get the first URL for each, and saves those URLs back into the list.

I'm having a problem with aiohttp's ClientSession.get() method: specifically, it is failing silently and I can't figure out why.

Here is how I initialize the script. Pay attention to worker.perform_mission():

async def _execute(workers,*, loop=None):
    if not loop:
        loop = asyncio.get_event_loop()
    [asyncio.ensure_future(i.perform_mission(verbose=True), loop=loop) for i in workers]

def main():
    filepth = 'c:\\SOME\\FILE\\PATH.xlsx'
    cache = pd.read_excel(filepth)

    # CHANGE THE NUMBER IN range(<here>) TO ADD MORE WORKERS.
    workers = (Worker(cache) for i in range(1))
    loop = asyncio.get_event_loop()

    loop.run_until_complete(_execute(workers, loop=loop))

    ...<MORE STUFF>...

The worker.perform_mission() method does the following (scroll to the bottom and look at _split_up_request_like_they_do_in_the_docs()):

class Worker(object):
    def __init__(self, shared_cache):
        ...<MORE STUFF>...

    async def perform_mission(self, verbose=False):
        while not self.mission_complete:
            if not self.company_name:
                await self.find_company_name()
                if verbose:
                    print('Obtained Company Name')
            if self.company_name and not self.website:
                print('Company Name populated but no website found yet.')
                data = await self.call_bing() #<<<<< THIS IS SILENTLY FAILING.
                if self.website and ok_to_set_website(self.shared_cache, self):
                    await self.try_set_results(data)
                    self.mission_complete = True
                else:
                    print('{} worker failed at setting website.'.format(self.company_name))
                    pass
            else:
                print('{} worker failed at obtaining data from Bing.'.format(self.company_name))
                pass

    async def call_bing(self):
        async with aiohttp.ClientSession() as sesh:
            sesh.headers = self.headers
            sesh.params = self.params
            return await self._split_up_request_like_they_do_in_the_docs(sesh)

    async def _split_up_request_like_they_do_in_the_docs(self, session):
        print('_bing_request() successfully called.') #<<<THIS CATCHES
        async with session.get(self.search_url) as resp:
            print('Session.get() successfully called.') #<<<THIS DOES NOT.
            return await resp.json()

And finally, my output is:

Obtained Company Name
Company Name populated but no website found yet.
_bing_request() successfully called.

Process finished with exit code 0

Can anyone help me figure out why print('Session.get() successfully called.') doesn't fire? Or... help me ask this question better?

python asynchronous exception-handling python-asyncio aiohttp
1 Answer · 2 votes

Take a look at this part:

async def _execute(workers,*, loop=None):
    # ...
    [asyncio.ensure_future(i.perform_mission(verbose=True), loop=loop) for i in workers]

You create a bunch of tasks, but you never wait for those tasks to finish. That means _execute itself completes right after creating the tasks, not after they are done. And since you run the event loop only until _execute is complete, the loop stops shortly after it starts.
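This can be demonstrated in isolation with a minimal stand-in for the worker (a sketch, not the asker's actual code; the `worker` coroutine here just simulates a request with a short sleep):

```python
import asyncio

results = []

async def worker(i):
    # Simulate an async request with a tiny sleep.
    await asyncio.sleep(0.01)
    results.append(i)

async def _execute_broken():
    # Tasks are scheduled but never awaited, mirroring the question.
    [asyncio.ensure_future(worker(i)) for i in range(3)]

loop = asyncio.new_event_loop()
loop.run_until_complete(_execute_broken())
loop.close()
print(results)  # [] -- the loop stopped before any worker finished its sleep
```

Each task gets to run only up to its first `await`; the loop then stops because `_execute_broken` is already done, so no worker ever reaches the `append`. This is exactly why `session.get()` appears to "silently fail" — the coroutine is suspended at that `await` when the loop shuts down.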

To fix this, use asyncio.gather to wait for multiple awaitables to finish:

async def _execute(workers,*, loop=None):
    # ...
    tasks = [asyncio.ensure_future(i.perform_mission(verbose=True), loop=loop) for i in workers]
    await asyncio.gather(*tasks)
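As a runnable usage sketch (with a trivial stand-in `Worker`, since the full class isn't shown in the question), the fixed `_execute` now blocks until every worker has finished:

```python
import asyncio

class Worker:
    def __init__(self, n):
        self.n = n

    async def perform_mission(self):
        # Stand-in for the real Bing request.
        await asyncio.sleep(0.01)
        return self.n

async def _execute(workers):
    # ensure_future schedules each coroutine as a task;
    # gather waits for all of them and returns results in order.
    tasks = [asyncio.ensure_future(w.perform_mission()) for w in workers]
    return await asyncio.gather(*tasks)

results = asyncio.run(_execute(Worker(i) for i in range(3)))
print(results)  # [0, 1, 2]
```

Note that `asyncio.gather` also accepts coroutines directly and wraps them in tasks itself, so the `ensure_future` step can be dropped if you don't need the task handles. (`asyncio.run` is available from Python 3.7, matching the question's version.)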