Reputation: 36354
I'm importing a .CSV into an application with the following:
if (($handle = fopen($file, "r")) !== FALSE) {
    $row = 1; // must be initialized before use
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $num = count($data);
        echo "<p> $num fields in line $row: <br /></p>\n";
        $row++;
        for ($c = 0; $c < $num; $c++) {
            echo $data[$c] . "<br />\n";
        }
    }
    fclose($handle);
}
It sort of works, but it's far from perfect. I want to first pull the header row out into an array, and then loop over the remaining rows to read the data sets in.
It seems to have delimiting problems: the first row (the headers) also includes a few parts of the second row.
I exported the .csv file straight from Excel. I wonder if there are encoding tricks etc. that I might be missing.
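For reference, this is roughly the structure I'm trying to get to, sketched out (it assumes the first line holds the column names and every row has the same number of fields):
if (($handle = fopen($file, "r")) !== FALSE) {
    $headers = fgetcsv($handle, 1000, ","); // first row = column names
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // pair each value with its header; assumes the field count matches the header count
        $record = array_combine($headers, $data);
        // ... use $record ...
    }
    fclose($handle);
}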
Upvotes: 0
Views: 2143
Reputation: 55
I changed the limit to 8192 and got the same results:
while (($data = fgetcsv($handle, 8192, ",")) !== FALSE) {
...
}
and then tried this and got the same results:
ini_set('auto_detect_line_endings', true); // set before fopen so it takes effect
$handle = fopen($_FILES['filename']['tmp_name'], "r");
while (($data = fgetcsv($handle, 8192, ",")) !== FALSE) {
    $data = array_map('mysql_real_escape_string', $data); // requires an open MySQL connection
    // ...
}
fclose($handle);
Upvotes: 0
Reputation: 143
fgetcsv doesn't always properly detect line endings.
Try adding this before the fgetcsv call:
ini_set('auto_detect_line_endings', true);
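In context it would look something like this (a sketch; the setting has to be in effect before the file is read):
ini_set('auto_detect_line_endings', true); // handles \r-only (classic Mac) line endings

if (($handle = fopen($file, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 8192, ",")) !== FALSE) {
        // ... process $data ...
    }
    fclose($handle);
}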
Upvotes: 1
Reputation: 173542
It sounds like your 1000 limit is not long enough; you should set it to a value high enough for that file ... or set it to 0 for unlimited (not recommended, because it tends to be slower). Set it to 4096 or 8192 first and see how it goes.
// use 8kB buffer for reading comma delimited line
while (($data = fgetcsv($handle, 8192, ",")) !== FALSE) {
    // ...
}
Update
Okay, on second thought, perhaps you should inspect the file and confirm a few things:
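For instance, which line endings and which encoding Excel actually wrote. A minimal sketch for peeking at the raw bytes ($file is the path from the question):
$raw  = file_get_contents($file, false, null, 0, 512); // first 512 bytes
$crlf = substr_count($raw, "\r\n");                    // Windows-style endings
$cr   = substr_count($raw, "\r") - $crlf;              // bare \r (classic Mac) endings
$bom  = strncmp($raw, "\xEF\xBB\xBF", 3) === 0;        // UTF-8 BOM from Excel?
echo "CRLF: $crlf, bare CR: $cr, BOM: " . ($bom ? "yes" : "no") . "\n";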
Upvotes: 3
Reputation: 1163
Why are you not using file()?
$trimmed = file('somefile.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
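Note that file() only splits the input into lines; each line still needs CSV parsing, and a quoted field containing an embedded newline will be split across two array entries. A sketch of feeding the result through str_getcsv():
$trimmed = file('somefile.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$rows    = array_map('str_getcsv', $trimmed); // parse each line into fields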
Upvotes: -1