Reading in chunks not working... very slow and still getting memory errors

Apr 6, 2013 at 3:12 PM
PHPExcel 1.7.8
PHP 5.3
I've got a 2012 MacBook Air with 4 GB RAM, and a 512M PHP memory limit.

I'm trying to read a big Excel file of about 20 MB to import into MySQL.

I've searched across the internet and found the "chunked reading" solution. I checked the examples (11 and 12) and tried them along with other solutions found online. However, it's not working... or it's SO slow for me, and I'm not sure why.

This is what I'm doing:
// .....
// inside my MyReadFilter class... this is the key method; you already know it
// because it's almost the same in every version of the chunk-reading filters
public function readCell($column, $row, $worksheetName = '') {
    // Only read the heading row plus the rows and columns that were configured
    if (($row == 1) || ($row >= $this->_startRow && $row < $this->_endRow)) {
        if (in_array($column, $this->_columns)) {
            return true;
        }
    }
    return false;
}
// .....
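For completeness, here's a runnable sketch of the whole filter class I'm using, pieced together from Example 12 in the docs (the property and method names are my own; the interface stub is only there so the sketch runs standalone -- in the real code it comes from PHPExcel itself):

```php
<?php
// Stub of the PHPExcel interface so this sketch runs standalone;
// the real one ships with PHPExcel.
interface PHPExcel_Reader_IReadFilter
{
    public function readCell($column, $row, $worksheetName = '');
}

class MyReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $_startRow = 0;
    private $_endRow   = 0;
    private $_columns  = array();

    public function __construct($startRow, $chunkSize, $columns = array('A', 'B', 'C'))
    {
        $this->setRows($startRow, $chunkSize);
        $this->_columns = $columns;
    }

    // Move the "window" of rows that the next load() call will keep
    public function setRows($startRow, $chunkSize)
    {
        $this->_startRow = $startRow;
        $this->_endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Keep the heading row plus anything inside the current window,
        // but only for the configured columns
        if (($row == 1) || ($row >= $this->_startRow && $row < $this->_endRow)) {
            if (in_array($column, $this->_columns)) {
                return true;
            }
        }
        return false;
    }
}
```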

// now my code...
$filterSubset = new MyReadFilter(1, 22000);
$chunkSize = 10;

$objReader = PHPExcel_IOFactory::createReader($inputFileType);
$objReader->setReadDataOnly(false); // not sure if this should be true
$objReader->setReadFilter($filterSubset); // attach the filter to the reader

for ($startRow = 2; $startRow <= 65536; $startRow += $chunkSize) {
    echo "Reading";
    $filterSubset->setRows($startRow, $chunkSize);
    $objPHPExcel = $objReader->load($inputFileName); // this line takes about 40 seconds... for 10 rows?
    echo "chunk done! ";
}

However, inside the loop, $objReader->load() takes about 40 seconds, and in fact, after 2 iterations I still get a memory error.

If I unset() the $objReader inside the loop I can make it run about 20 times (although it takes around 10 minutes)... and then: memory error.

I'm wondering why load() seems to read the whole file even though I'm using a filter. Also, the filter strategy seems to parse every row and return false for the rows that aren't required, so from my point of view it's not efficient, since I have about 100k rows. Isn't it possible to just abort the read early, or truly read only the required rows?
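Just to put numbers on it (rough back-of-envelope arithmetic, assuming the load time per chunk stays constant, which matches what I'm seeing):

```php
<?php
// With chunked loading, every load() call re-parses the entire file,
// so total parse work grows with (rows / chunkSize) full passes.
$totalRows  = 100000; // my sheet size (approx)
$chunkSize  = 10;     // chunk size from my loop
$secPerLoad = 40;     // what I'm measuring per load() call

$loads = (int) ceil(($totalRows - 1) / $chunkSize); // data starts at row 2
$hours = $loads * $secPerLoad / 3600;

printf("%d load() calls, ~%.0f hours total\n", $loads, $hours);
// prints: 10000 load() calls, ~111 hours total
```

So even if memory held out, this approach could never finish in reasonable time at this chunk size.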

I've tried a couple of filter classes and code snippets but got the same results...
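For reference, one of the commonly suggested snippets is the cell-caching setup from the PHPExcel documentation (caching cells to php://temp instead of holding every cell object in memory) -- I'm including it here in case I'm applying it incorrectly:

```php
// Must run BEFORE the workbook is loaded; cache settings are global.
$cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '64MB');
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);
```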

So... suggestions?