I need to write a PHP script that imports data from a given CSV file into a MySQL database. The CSV file can contain up to 200,000 rows. I tried the following approach, but ran into problems:
Now I am considering a solution that splits the CSV file into smaller chunks and then inserts them asynchronously. I have already finished splitting the CSV, but I don't know how to insert the chunks into the database asynchronously in a way that is fast and safe. I have heard that Ajax could be used for this.
Can anyone recommend a solution? Thanks in advance!
Thanks to everyone who answered this question. I found a solution! I am sharing it in case anyone else needs to create a PHP script that imports a huge CSV file into a MySQL database (asynchronously and fast!). I tested it with 400,000 rows of data and the import finished within seconds. I believe it will also work for larger files; you just need to raise the maximum upload file size.
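For reference, the upload limits are set in php.ini. The values below are only an illustrative assumption for a file of this size; adjust them to your environment and restart the web server afterwards.

; php.ini -- illustrative values only, adjust to your actual file size
upload_max_filesize = 64M
post_max_size = 72M        ; should be at least as large as upload_max_filesize
max_execution_time = 300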
In this example, I import a CSV file with two columns (name, contact_number) into a MySQL table with the same columns.
Your CSV file should look like this:
Ana,0906123489
John,0908989199
Peter,0908298392
…
So, here is the solution.
First, create the table:
CREATE TABLE `testDB`.`table_test` (
    `id` INT NOT NULL AUTO_INCREMENT,
    `name` VARCHAR(100) NOT NULL,
    `contact_number` VARCHAR(100) NOT NULL,
    PRIMARY KEY (`id`)
) ENGINE = InnoDB;
Second, I have 4 PHP files. All you have to do is place them in a single folder. The PHP files are as follows:
index.php
<form action="upload.php" method="post" enctype="multipart/form-data"> <input type="file" name="csv" value="" /> <input type="submit" name="submit" value="Save" /></form>
connect.php
<?php
// Modify your connection settings here
$servername = "localhost";
$username = "root";
$password = "";
$dbname = "testDB";

$conn = new mysqli($servername, $username, $password, $dbname);
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}
?>
senddata.php
<?php
include('connect.php');

$data = basename($_POST['file']); // only accept files in the current folder

if (($handle = fopen($data, "r")) !== false) {
    $counter = 0;
    // Instead of executing queries one by one,
    // build a single multi-row INSERT for the whole batch
    $sql = "INSERT INTO table_test (name, contact_number) VALUES ";
    while (($line = fgets($handle)) !== false) {
        $sql .= "(" . trim($line) . "),";
        $counter++;
    }
    $sql = rtrim($sql, ","); // drop the trailing comma
    if ($conn->query($sql) === TRUE) {
        echo "Inserted $counter rows from $data<br/>";
    } else {
        echo "Error inserting $data: " . $conn->error . "<br/>";
    }
    fclose($handle);
} else {
    echo "Could not open $data<br/>";
}
// Delete the batch file once it has been imported, to clear the directory
unlink($data);
?>
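One caveat with the file above: the values are concatenated straight into the SQL string, so a name containing a quote would break the query (or open it to injection). If that matters for your data, here is a rough sketch (not part of the original answer) of a senddata.php variant that uses a prepared statement instead; it assumes upload.php writes plain name,contact_number lines without the surrounding quotes:

<?php
// Sketch of a safer senddata.php variant using a prepared statement.
// Assumes each batch file contains plain "name,contact_number" lines (no quotes).
include('connect.php');

$data = basename($_POST['file']);
$stmt = $conn->prepare("INSERT INTO table_test (name, contact_number) VALUES (?, ?)");
$stmt->bind_param("ss", $name, $contact_number);

if (($handle = fopen($data, "r")) !== false) {
    $conn->begin_transaction(); // one transaction per batch keeps the inserts fast
    while (($row = fgetcsv($handle)) !== false) {
        $name = $row[0];
        $contact_number = $row[1];
        $stmt->execute();
    }
    $conn->commit();
    fclose($handle);
}
unlink($data);
?>

It is slower per row than one multi-row INSERT, but the per-batch transaction keeps the difference small, and malformed values can no longer break the query.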
upload.php
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.js"></script> <script> //Declaration of function that will insert data into database function senddata(filename){ var file = filename; $.ajax({ type: "POST", url: "senddata.php", data: {file}, async: true, success: function(html){ $("#result").html(html); } }) } </script> <?php $csv = array(); $batchsize = 1000; //split huge CSV file by 1,000, you can modify this based on your needs if($_FILES['csv']['error'] == 0){ $name = $_FILES['csv']['name']; $ext = strtolower(end(explode('.', $_FILES['csv']['name']))); $tmpName = $_FILES['csv']['tmp_name']; if($ext === 'csv'){ //check if uploaded file is of CSV format if(($handle = fopen($tmpName, 'r')) !== FALSE) { set_time_limit(0); $row = 0; while(($data = fgetcsv($handle)) !== FALSE) { $col_count = count($data); //splitting of CSV file : if ($row % $batchsize == 0): $file = fopen("minpoints$row.csv","w"); endif; $csv[$row]['col1'] = $data[0]; $csv[$row]['col2'] = $data[1]; $min = $data[0]; $points = $data[1]; $json = "'$min', '$points'"; fwrite($file,$json.PHP_EOL); //sending the splitted CSV files, batch by batch... if ($row % $batchsize == 0): echo "<script> senddata('minpoints$row.csv'); </script>"; endif; $row++; } fclose($file); fclose($handle); } } else { echo "Only CSV files are allowed."; } //alert once done. echo "<script> alert('CSV imported!') </script>"; } ?>
That's it! You now have a pure PHP script that can import a huge number of rows in seconds! :) (Credit to my partner, who showed me the idea of how to use Ajax for this.)