Reading large data from a CSV file in PHP [duplicate]

Question · Votes: 6 · Answers: 3

This question already has answers here:

I am reading a CSV file in PHP and checking with MySQL whether each record already exists in my table or not.

The CSV has around 25,000 records, and when I run my code it shows a "Service Unavailable" error after 2m 10s (onload: 2m 10s).

Here is the code I am using:

// raise the memory limit & max execution time
ini_set('memory_limit', '512M');
ini_set('max_execution_time', '180');

// function to read the whole CSV file into an array
function readCSV($csvFile)
{
    $line_of_text = array();
    $file_handle = fopen($csvFile, 'r');
    while (!feof($file_handle)) {

        set_time_limit(60); // resets the time limit on every row read

        $line_of_text[] = fgetcsv($file_handle, 1024);
    }
    fclose($file_handle);
    return $line_of_text;
}

// Set path to CSV file
$csvFile = 'my_records.csv';

$csv = readCSV($csvFile);

for ($i = 1; $i < count($csv); $i++)
{
    $user_email = $csv[$i][1];

    $qry = "SELECT u.user_id, u.user_email_id FROM tbl_user AS u WHERE u.user_email_id = '" . $user_email . "'";

    $result = @mysql_query($qry) or die("Couldn't execute query: " . mysql_error() . ' ' . mysql_errno());

    $rec = @mysql_fetch_row($result);

    if ($rec)
    {
        echo "Record exists";
    }
    else
    {
        echo "Record does not exist";
    }
}

Note: I only want to list the records that do not exist in my table.

Please suggest how I can solve this problem…

php csv large-data
3 Answers

10 votes

A good way to handle large files is described in this answer: https://stackoverflow.com/a/5249971/797620

That method is how http://www.cuddlycactus.com/knownpasswords/ (the page has since been taken down) searched through more than 1.7 million passwords in just a few milliseconds.
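The gist of that approach is to read the file in small pieces instead of loading it all into memory at once. As a minimal sketch of how that streaming pattern could be applied to this question (this is my adaptation, not the linked answer's code: it swaps the deprecated mysql_* API for PDO prepared statements, and findMissingEmails is a made-up helper name):

<?php
// Stream the CSV row by row so memory use stays flat regardless of file
// size, and collect the emails that are missing from tbl_user.
// Assumes the email is in the second column (index 1), as in the question.
function findMissingEmails($csvFile, PDO $pdo)
{
    $missing = array();
    $stmt = $pdo->prepare('SELECT user_id FROM tbl_user WHERE user_email_id = ?');

    if (($handle = fopen($csvFile, 'r')) === false) {
        return false;
    }
    fgetcsv($handle, 1024); // skip the header row

    while (($row = fgetcsv($handle, 1024)) !== false) {
        if ($row === array(null)) {
            continue; // skip blank lines
        }
        $stmt->execute(array($row[1]));
        if ($stmt->fetch() === false) {
            $missing[] = $row[1]; // record does not exist in the table
        }
    }
    fclose($handle);
    return $missing;
}

Because only one row is held in memory at a time, this avoids both the memory pressure and the long single-pass load of the question's readCSV().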


5 votes

After a lot of struggle, I finally found a solution that may help others too. I tested a 2,367 KB CSV file containing 18,226 rows against several PHP scripts: (1) the CsvImporter example from the php.net fgetcsv documentation, and (2) file_get_contents, which ended in PHP Fatal error: Allowed memory exhausted.

(1) took 0.92574405670166 s; (2) took 0.12543702125549 s to read the file as a string and 0.52903485298157 s to split it into an array. Note: these timings do not include inserting into MySQL.

The best solution I found takes 3.0644409656525 s in total, including inserting into the database and some conditional checks. Processing an 8 MB file took 11 seconds. The solution is:

$csvInfo = analyse_file($file, 5);
$lineSeperator = $csvInfo['line_ending']['value'];
$fieldSeperator = $csvInfo['delimiter']['value'];
$columns = getColumns($file);
echo '<br>========Details========<br>';
echo 'Line Sep: \t ' . $lineSeperator;
echo '<br>Field Sep:\t ' . $fieldSeperator;
echo '<br>Columns: '; print_r($columns);
echo '<br>========Details========<br>';
$ext = pathinfo($file, PATHINFO_EXTENSION);
$table = str_replace(' ', '_', basename($file, "." . $ext));
$rslt = table_insert($table, $columns);
if ($rslt) {
    $query = "LOAD DATA LOCAL INFILE '" . $file . "' INTO TABLE $table FIELDS TERMINATED BY '$fieldSeperator' ";

    var_dump(addToDb($query, false));
}
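Note: LOAD DATA LOCAL INFILE only works when the local_infile option is enabled on the MySQL server and permitted by the client connection, so check that setting if the import fails. The helper functions used above are: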


function addToDb($query, $getRec = true) {
    //echo '<br>Query : '.$query;
    $con = @mysql_connect('localhost', 'root', '');
    @mysql_select_db('rtest', $con);
    $result = mysql_query($query, $con);
    if ($result) {
        if ($getRec) {
            $data = array();
            while ($row = mysql_fetch_assoc($result)) {
                $data[] = $row;
            }
            return $data;
        } else {
            return true;
        }
    } else {
        var_dump(mysql_error());
        return false;
    }
}


function table_insert($table_name, $table_columns) {
    $queryString = "CREATE TABLE " . $table_name . " (";
    $values = '';

    foreach ($table_columns as $column) {
        $values .= (strtolower(str_replace(' ', '_', $column))) . " VARCHAR(2048), ";
    }
    $values = substr($values, 0, strlen($values) - 2); // drop the trailing ", "

    $queryString .= $values . ") ";

    //echo $queryString;

    return addToDb($queryString, false);
}


function getColumns($file) {
    $cols = array();
    if (($handle = fopen($file, 'r')) !== FALSE) {
        // the first non-empty row is treated as the header row
        while (($row = fgetcsv($handle)) !== FALSE) {
            $cols = $row;
            if (count($cols) > 0) {
                break;
            }
        }
        fclose($handle);
        return $cols;
    } else {
        return false;
    }
}

function analyse_file($file, $capture_limit_in_kb = 10) {
    // capture starting memory usage
    $output['peak_mem']['start'] = memory_get_peak_usage(true);

    // log how much of the file was sampled (in KB)
    $output['read_kb'] = $capture_limit_in_kb;

    // read in the sample
    $fh = fopen($file, 'r');
    $contents = fread($fh, ($capture_limit_in_kb * 1024));
    fclose($fh);

    // specify allowed field delimiters
    $delimiters = array(
        'comma'     => ',',
        'semicolon' => ';',
        'tab'       => "\t",
        'pipe'      => '|',
        'colon'     => ':'
    );

    // specify allowed line endings
    $line_endings = array(
        'rn' => "\r\n",
        'n'  => "\n",
        'r'  => "\r",
        'nr' => "\n\r"
    );

    // loop and count each line ending instance
    foreach ($line_endings as $key => $value) {
        $line_result[$key] = substr_count($contents, $value);
    }

    // sort so the most frequent line ending comes last
    asort($line_result);

    // log to the output array
    $output['line_ending']['results'] = $line_result;
    $output['line_ending']['count']   = end($line_result);
    $output['line_ending']['key']     = key($line_result);
    $output['line_ending']['value']   = $line_endings[$output['line_ending']['key']];
    $lines = explode($output['line_ending']['value'], $contents);

    // remove the last line of the array, as it may be incomplete
    array_pop($lines);

    // create a string from the complete lines
    $complete_lines = implode(' ', $lines);

    // log statistics to the output array
    $output['lines']['count']  = count($lines);
    $output['lines']['length'] = strlen($complete_lines);

    // loop and count each delimiter instance
    foreach ($delimiters as $delimiter_key => $delimiter) {
        $delimiter_result[$delimiter_key] = substr_count($complete_lines, $delimiter);
    }

    // sort so the most frequent delimiter comes last
    asort($delimiter_result);

    // log statistics to the output array, with the largest counts as the value
    $output['delimiter']['results'] = $delimiter_result;
    $output['delimiter']['count']   = end($delimiter_result);
    $output['delimiter']['key']     = key($delimiter_result);
    $output['delimiter']['value']   = $delimiters[$output['delimiter']['key']];

    // capture ending memory usage
    $output['peak_mem']['end'] = memory_get_peak_usage(true);
    return $output;
}
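Since the question only needs the rows that are missing from tbl_user, one way to finish this approach off after the import is a single LEFT JOIN instead of one SELECT per row. A sketch, where my_records is the table name the code above would derive from my_records.csv and user_email is an assumed column name taken from the CSV header:

$qry = "SELECT r.user_email
        FROM my_records AS r
        LEFT JOIN tbl_user AS u ON u.user_email_id = r.user_email
        WHERE u.user_id IS NULL";

// addToDb() is defined above; with $getRec left at its default of true,
// it returns the matching rows as an array
$missing = addToDb($qry);
print_r($missing);

Letting MySQL do the comparison in one indexed join is usually far faster than issuing 25,000 separate queries from PHP.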

1 vote

Usually a "Service Unavailable" message goes along with a 500 error, and I think in this case it happens because the execution time runs out. Check your logs / browser console; you will probably see the 500 error there.

First of all, keep set_time_limit(60) out of the loop.

Then make a few changes (a combined sketch follows this list):

  1. Add an INDEX on the user_email_id column, so your SELECT query can fetch rows much faster.
  2. Don't echo a message for every row; keep the output buffer free.
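
Putting those suggestions together, the question's loop might look something like this (a sketch only; it keeps the question's deprecated mysql_* API, and mysql_real_escape_string assumes a connection is already open):

// run once in MySQL; the index name idx_user_email is arbitrary:
// ALTER TABLE tbl_user ADD INDEX idx_user_email (user_email_id);

set_time_limit(180); // called once, outside the loop

$csv = readCSV('my_records.csv');
$missing = array();

for ($i = 1; $i < count($csv); $i++) {
    $user_email = mysql_real_escape_string($csv[$i][1]);
    $qry = "SELECT u.user_id FROM tbl_user AS u WHERE u.user_email_id = '" . $user_email . "'";
    $result = mysql_query($qry);
    if (!mysql_fetch_row($result)) {
        $missing[] = $csv[$i][1]; // record not in the table
    }
}

// one print at the end instead of an echo per row
print_r($missing);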

I have done this kind of import with an open-source program, which you can get here: http://sourceforge.net/projects/phpexcelreader/

Try this.
