free up memory

Topics: Developer Forum, Project Management Forum, User Forum
Jan 10, 2011 at 10:22 AM

Hi PHPExcel team,

First of all, Happy New Year to all. I have a question regarding an issue I'm running into while developing a project using PHPExcel.

Well, I am generating a report xls file that contains some 100-odd sheets, each sheet with a different design. It uses around 360MB of memory to generate a single report. So if I try to generate 4 reports one after the other, the memory usage grows like this:

Peak memory usage: 358.25 MB   report_one.xls

Peak memory usage: 686 MB   report_two.xls

Peak memory usage: 1031 MB   report_three.xls

Peak memory usage: 1344.75 MB   report_four.xls

(NOTE: the 1st column is the memory usage, the 2nd column is the file name.)

So my question is: is there a way to free up the memory before the process starts again for the next report? For example, after generating report_one.xls, free the memory, then generate report_two.xls.

Right now it looks like the memory usage just keeps adding up; nothing is being freed.

So is that possible? Kindly let me know.


With regards,


Jan 10, 2011 at 2:33 PM

Section 4.3 of the Developer Documentation: "Clearing a Workbook from Memory"
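
For reference, the pattern described in that section looks roughly like this (a sketch; `$objPHPExcel` stands for whichever workbook variable you hold, and the writer type/filenames are just examples):

```php
<?php
// Generate and save the first report as usual.
$objPHPExcel = new PHPExcel();
// ... build the 100-odd sheets for report_one ...
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel5');
$objWriter->save('report_one.xls');

// Break the cyclic references inside the workbook, then unset it.
// A plain unset() alone is not enough, because PHP's garbage collector
// cannot reclaim objects that still reference each other.
$objPHPExcel->disconnectWorksheets();
unset($objPHPExcel);

// The memory can now be reused for report_two.xls, and so on.
```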

Jan 10, 2011 at 5:30 PM

Does this function help me at all if I am creating 10 reports within the same Excel file?

I mean, I create the first sheet, write it, and free it up. When I restart the process, do I need to reopen the initial Excel file, using the same amount of memory all over again?

Is freeing up the memory only useful when creating separate Excel files?


Kim Steinhaug

Jan 10, 2011 at 7:46 PM

These special methods to clear workbooks and worksheets from memory were written because a simple unset() will not work on PHPExcel or PHPExcel_Worksheet objects: they contain cyclic references. The methods do not completely unset the object; they clear the cyclic references, allowing the object to be unset subsequently. How they are used is up to the individual developer.

One example of use might be to load a template file, then iterate through a database: clone the loaded template (using $activeWorkbook = $templateWorkbook->copy();), write (say) 50k rows to the clone, save it to file, then disconnect and unset it before iterating again and processing the next 50k rows.
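A sketch of that loop might look like this (hypothetical names: fetchNextRows() stands in for whatever paging query you run against your database, and the filenames are illustrative):

```php
<?php
// Load the template workbook once.
$templateWorkbook = PHPExcel_IOFactory::load('template.xls');

$batch = 1;
while ($rows = fetchNextRows(50000)) {   // hypothetical DB paging helper
    // Clone the template so the original stays pristine.
    $activeWorkbook = $templateWorkbook->copy();
    $sheet = $activeWorkbook->getActiveSheet();

    // Write this batch of rows to the clone.
    $rowNum = 1;
    foreach ($rows as $row) {
        $sheet->fromArray($row, null, 'A' . $rowNum++);
    }

    // Save, then break the cyclic references and free the clone
    // before processing the next batch.
    $writer = PHPExcel_IOFactory::createWriter($activeWorkbook, 'Excel5');
    $writer->save('report_batch_' . $batch++ . '.xls');
    $activeWorkbook->disconnectWorksheets();
    unset($activeWorkbook);
}
```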

Jan 10, 2011 at 8:14 PM

Just as a point of reference: when you say 50k rows, what kind of memory footprint do you usually expect for that kind of workbook? The reason I ask is that when I write a 7000-row Excel file with 29 columns, I get around 180MB RAM usage (with cell caching enabled). Reading the same file (as in the examples, Example 28) uses 300MB.
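For reference, this is the sort of cell-caching configuration I mean; it has to be set before any workbook is created or loaded (the 8MB threshold here is just the value I happen to use):

```php
<?php
// Cache cell objects in php://temp instead of keeping them all in memory.
// Must be called before instantiating (or loading) any workbook.
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '8MB');
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);

$objPHPExcel = new PHPExcel();
// ... build and write the 7000-row, 29-column sheet ...
```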

Is 1 GB of RAM a "recommended" setting when working with larger Excel files?

Jan 10, 2011 at 9:18 PM

We use PHPExcel to generate big xlsx reports containing up to 400k rows with up to 36 columns. The generation of these files may take up to 45 minutes; however, the DB queries to generate the data take far longer. I've seen peak usages of 24GB(!), but in the end I get the spreadsheet I wanted.


Mind you, we specifically upgraded our reporting server to 24GB to be able to generate the reports we need ;)

Jan 10, 2011 at 9:48 PM

> 24GB

OH LORD! I see the headache. Well, in a way I agree that this isn't really a problem as long as you control the server running the system. It is good to see, though, that huge amounts of RAM are required for this process, as it means I shouldn't waste too much time trying to optimize something that can't be optimized. Better to invest a few bucks in more RAM on the production server and require the customers using our software to host on our server if they need the export/import feature.

Thanks for your input Borf.