Large CSV files

Topics: Developer Forum, User Forum
Jul 29, 2009 at 11:03 AM

I have seen plenty of posts about problems reading large Excel files, but none on large CSV files. I have a CSV file with 30k rows (16MB) which fails to load because PHP runs out of memory.

I have tried both the newest PHPExcel source code and PHP 5.3. Is it ever possible to read such a large file with this library, or should I give up and go back to fgetcsv()?

I am already using PHPExcel in my project to get data from (small) Excel files, and it would be very nice if I could use the library for CSV files as well, since that would let me reuse much of the code already written for reading the Excel files.
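For reference, the fgetcsv() fallback I am talking about would look roughly like this; it reads the file row by row, so memory use stays flat no matter how big the file gets (the file name and importRecord() are just placeholders for my own code):

$handle = fopen('large-file.csv', 'r');
if ($handle === false) {
    die('Unable to open CSV file');
}

$header = fgetcsv($handle);                   // first row holds the column names
while (($row = fgetcsv($handle)) !== false) {
    $record = array_combine($header, $row);   // map column name => value
    importRecord($record);                    // placeholder for my own import logic
}
fclose($handle);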

Btw, really liking this lib, if I could just get it to work :)

Developer
Jul 29, 2009 at 6:21 PM

>> Is it ever possible to read such a large file with this library, or should I give up and
>> go back to fgetcsv()?

It will be possible one day, I feel sure about that. PHPExcel 1.6.7 needed approx. 10KB of PHP memory per cell. The latest source code and PHPExcel 1.7.0 need approx. 1KB of PHP memory per cell. That means PHPExcel 1.7.0 can handle workbooks around 10 times larger using the same amount of memory.
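To put some rough numbers on that (assuming your 30k-row CSV has around 10 columns): 30,000 rows x 10 cells x 1KB is roughly 300MB of PHP memory with 1.7.0, and around 3GB with 1.6.7's 10KB per cell. Either way that is far above a typical memory_limit of 128MB, which is why the read fails.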

But it is still not good enough. That is why we are looking at how to implement cell caching (disk / database). I believe it will solve essentially all memory problems with large-scale workbooks, including reading CSV files.
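Just to illustrate the concept (this is not the actual PHPExcel implementation, and all names below are made up for the example): instead of keeping every cell object in PHP memory, cells get serialized out to a temp file and are only pulled back in when a coordinate is actually requested, so only a small index stays in memory.

class DiskCellCache
{
    private $handle;           // temp file holding the serialized cells
    private $index = array();  // coordinate => array(offset, length)

    public function __construct()
    {
        $this->handle = tmpfile();
    }

    public function store($coordinate, $cellValue)
    {
        fseek($this->handle, 0, SEEK_END);
        $offset = ftell($this->handle);
        $data = serialize($cellValue);
        fwrite($this->handle, $data);
        $this->index[$coordinate] = array($offset, strlen($data));
    }

    public function fetch($coordinate)
    {
        if (!isset($this->index[$coordinate])) {
            return null;
        }
        list($offset, $length) = $this->index[$coordinate];
        fseek($this->handle, $offset);
        return unserialize(fread($this->handle, $length));
    }
}

// only the small index array stays in memory; the cell data lives on disk
$cache = new DiskCellCache();
$cache->store('A1', 'Hello');
$cache->store('B1', 42);
echo $cache->fetch('A1');   // "Hello"

The database variant would be the same idea, with a table keyed by cell coordinate instead of a temp file.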

Jul 30, 2009 at 11:10 AM

I am sorry to hear that, but thank you for the prompt reply. I will look out for those improvements in the future.