SQLAlchemy slow async execution

Problem description

I have an async FastAPI service running on kustomize behind nginx and an NLB. It has been working fine, but after doing some profiling locally I noticed that my database queries, which use SQLAlchemy AsyncSession and AsyncEngine, seem to be a bit of a bottleneck. I can't find much about this online, and I think the way it is set up is fine.

The query is as follows:

    async with AsyncSession(master_db) as session:
        result = (
            await session.execute(
                select(User, UserMapping, UserOrders, UserInvoices)
                .join(UserMapping, User.id == UserMapping.user_id)
                .outerjoin(
                    UserOrders,
                    and_(User.email == UserOrders.email),
                )
                .outerjoin(
                    UserInvoices,
                    and_(
                        UserInvoices.email == User.email,
                        UserInvoices.state == "paid",
                    ),
                )
                .where(UserMapping.uuid == uuid)
                .order_by(UserOrders.created_at.desc())
                .limit(3)
            )
        ).all()

As for the engine, I create it like this:

@functools.cache
def get_master_db(settings: Settings = Depends(get_settings)):
    master_db_engine: AsyncEngine = create_async_engine(
        settings.master_db_url,
        pool_recycle=3600,
        pool_pre_ping=True
    )
    return master_db_engine

It is wired up as a dependency in my FastAPI app, like this:

async def get_user_orders_and_invoices(
    user_uuid: int,
    master_db_engine: AsyncEngine = Depends(get_master_db),
):
    # make call to the method which has the above sqlalchemy call
    return await get_user_order_and_invoices(master_db_engine, user_uuid)

Profiling with pyinstrument shows that the .execute() call sometimes takes quite a while.
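
For context, a minimal sketch of how such profiling could be wired into FastAPI with pyinstrument (the middleware and the app object here are illustrative, not taken from the original post):

    from fastapi import FastAPI, Request
    from pyinstrument import Profiler

    app = FastAPI()

    @app.middleware("http")
    async def profile_request(request: Request, call_next):
        # async_mode="enabled" lets pyinstrument attribute time spent awaiting coroutines
        profiler = Profiler(async_mode="enabled")
        profiler.start()
        response = await call_next(request)
        profiler.stop()
        print(profiler.output_text(unicode=True))
        return response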

To tick off a few things I know people will bring up:

  • The query returns exactly what I need
  • Yes, the right fields have indexes
  • If I run the raw SQL directly in the MySQL CLI, the query takes minimal time
  • The table schemas cannot be changed; they are legacy and changing them is not possible right now

The bottleneck is 100% at the SQLAlchemy level, or at least somewhere in my Python code. Any help would be great, thanks!
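
One way to check whether the time is going to MySQL or to Python is to time the driver round trip with SQLAlchemy engine events. This is a sketch, not part of the original post; it assumes `engine` is the AsyncEngine returned by get_master_db (for an AsyncEngine the listeners attach to its .sync_engine):

    import logging
    import time

    from sqlalchemy import event

    @event.listens_for(engine.sync_engine, "before_cursor_execute")
    def _start_timer(conn, cursor, statement, parameters, context, executemany):
        conn.info.setdefault("query_start_time", []).append(time.perf_counter())

    @event.listens_for(engine.sync_engine, "after_cursor_execute")
    def _log_elapsed(conn, cursor, statement, parameters, context, executemany):
        elapsed = time.perf_counter() - conn.info["query_start_time"].pop()
        # If this stays small while .execute() is slow, the time is being spent in
        # Python-side work (result processing, pool waits) rather than in MySQL.
        logging.info("DB round trip: %.4fs", elapsed)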

For reference:

  • Python 3.11
  • latest FastAPI
  • aiomysql
  • SQLAlchemy 1.4
python-3.x sqlalchemy fastapi
1 Answer

Try creating the session like this:

from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine, AsyncEngine
from sqlalchemy.orm import sessionmaker
from sqlalchemy import select, and_
import functools

# Engine creation with caching and potential performance-tuning parameters

@functools.cache
def get_master_db(db_url: str) -> AsyncEngine:
    master_db_engine: AsyncEngine = create_async_engine(
        db_url,
        echo=False,  # Turn off echo to reduce logging overhead
        pool_recycle=3600,
        pool_pre_ping=True,
        pool_size=10,  # Adjust pool size to your needs
        max_overflow=20  # Adjust max overflow to your needs
    )
    return master_db_engine

# AsyncSession factory creation
# expire_on_commit=False avoids automatic refresh of instances upon commit
# (pass your database URL, e.g. settings.master_db_url, to get_master_db)
AsyncSessionFactory = sessionmaker(
    bind=get_master_db(settings.master_db_url),
    class_=AsyncSession,
    expire_on_commit=False
)
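
One possible way to consume this factory from a FastAPI dependency (a sketch; get_session and the route below are illustrative names, not part of the answer):

    from fastapi import Depends, FastAPI

    app = FastAPI()

    async def get_session():
        # One short-lived session per request, drawn from the shared engine pool
        async with AsyncSessionFactory() as session:
            yield session

    @app.get("/users/{user_uuid}")
    async def user_orders_and_invoices(
        user_uuid: str,
        session: AsyncSession = Depends(get_session),
    ):
        # run the select() from the question against this session
        ...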
