MySQL crashes while transferring data from a large CSV using LOAD DATA from Python


I have a large CSV file with 30 million rows (1.6 GB), and I am using pymysql to load the data from the CSV into a MySQL table. I have removed all constraints from the table schema to make the load faster, and I have also set the timeout values to large values.

import sys
from pprint import pprint


def setTimeOutLimit(connection):
    try:
        with connection.cursor() as cursor:
            # Raise the lock-wait and connection timeouts to 8 hours so the
            # long-running bulk load is not killed by the server
            cursor.execute("SET GLOBAL innodb_lock_wait_timeout = 28800")
            cursor.execute("SET innodb_lock_wait_timeout = 28800")
            cursor.execute("SET GLOBAL connect_timeout = 28800")
            cursor.execute("SET GLOBAL wait_timeout = 28800")
            cursor.execute("SET GLOBAL interactive_timeout = 28800")

            # Allow packets of up to 1 GB
            cursor.execute("SET GLOBAL max_allowed_packet = 1073741824")

    except Exception:
        connection.close()
        sys.exit(" Could not set timeout limit ")

The data gets inserted into the table, but I need one of the columns to be a primary key, so I create another table that makes that column the primary index by ignoring duplicate values (tableName_1 is the old table, tableName is the new table).

def createNewTableFromOld(connection, tableName):
    try:
        pprint(" Creating new table from old table with constraints ")

        with connection.cursor() as cursor:
            # Clone the structure of the staging table
            query = ("CREATE TABLE " + tableName +
                     " LIKE " + tableName + "_1")
            cursor.execute(query)

            # Make TimeStamp the primary key of the new table
            query2 = ("ALTER TABLE " + tableName +
                      " ADD PRIMARY KEY(TimeStamp)")
            cursor.execute(query2)

            # Copy the rows, silently dropping duplicate TimeStamp values
            query3 = ("INSERT IGNORE INTO " + tableName +
                      " SELECT * FROM " + tableName + "_1")
            cursor.execute(query3)

            # Remove the old staging table
            query4 = "DROP TABLE " + tableName + "_1"
            cursor.execute(query4)

            connection.commit()

    except Exception:
        connection.close()
        sys.exit(" Could not create table with Primary Key ")

During the execution of createNewTableFromOld, somewhere after 5-6 minutes, I get this error: pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query ([WinError 10054] An existing connection was forcibly closed by the remote host)')

When I check the services, MySQL80 has crashed and stopped on its own. I had also set max_allowed_packet to 1 GB in the my.ini file and manually set all the timeouts to 8 hours. What could be the problem?
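
For reference, the my.ini changes described above would look roughly like this (only the values stated in the question; the rest of the [mysqld] section is left untouched):

[mysqld]
max_allowed_packet = 1G
connect_timeout = 28800
wait_timeout = 28800
interactive_timeout = 28800
innodb_lock_wait_timeout = 28800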

The original table schema is:

query = ("CREATE TABLE IF NOT EXISTS " + table + " ("
                  " TimeStamp  DECIMAL(15, 3), " + 
                  " Value      DECIMAL(30, 11), " +
                  " Quality    INT, " +
                  " TagName    varchar(30) )"
                  )
1 Answer

I finally solved the problem by setting innodb_buffer_pool_size in the my.ini file to 2 GB (previously it was only 4M).
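
In my.ini that change looks roughly like this (the MySQL80 service has to be restarted for it to take effect):

[mysqld]
# Give InnoDB a buffer pool large enough for the bulk insert
innodb_buffer_pool_size = 2G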
