Hmm - same results from the server - the function just bombs without any PHP error when there are more than about 15,000 rows. (The use of the copied array adds very little overhead in the scheme of things.)
It seems to use a lot more memory than the PEAR Excel writer - so at the moment I can't even build one large worksheet, let alone the several worksheets I need.
I must say I'm surprised that processing a row at a time, as it's read from the database, uses as much memory as building an array of all the data from the database before passing it to PHPExcel to process. That just seems intrinsically wrong... not least because PHP simply needs more memory to hold a large 2D array than it does a single row array.
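For comparison, here's a minimal sketch of the row-at-a-time approach I'd expect to stay lean, assuming $db is a mysqli connection and the usual PHPExcel includes are in place (the table and column names are just placeholders):

$excel = new PHPExcel();
$sheet = $excel->getActiveSheet();
$result = $db->query('SELECT col_a, col_b, col_c FROM my_table');
$rowNum = 1;
while ($row = $result->fetch_row()) {
    // Only the current database row is ever held in PHP memory
    foreach ($row as $colNum => $value) {
        $sheet->setCellValueByColumnAndRow($colNum, $rowNum, $value);
    }
    ++$rowNum;
}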
PHPExcel does use a lot more memory than PEAR, and it isn't as fast... but it does a lot more than PEAR (reading and writing, multiple formats, rich text, conditional formatting, etc.). We've been striving for over a year now to reduce the memory overhead and to increase speed, and have made significant inroads in that time... but it is still slower and more memory-hungry than PEAR.
Is there a more efficient method I can use instead of setCellValueByColumnAndRow()?
$excel->getActiveSheet()->fromArray($row_array, NULL, 'A1'); should work, but you've already tried that... I don't know why it didn't work for you, although looking over the method, I can see how to boost its speed slightly.
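If you want to retry fromArray(), one variant worth testing is calling it once per fetched row rather than for the whole result set (again a sketch, with $result standing in for your query result):

$rowNum = 1;
while ($row = $result->fetch_row()) {
    // A 1D array is written as a single row, starting at the given cell
    $excel->getActiveSheet()->fromArray($row, NULL, 'A' . $rowNum);
    ++$rowNum;
}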
setCellValue() is fractionally faster than setCellValueByColumnAndRow(), but the difference is microseconds... one call to PHPExcel_Cell::stringFromColumnIndex() and a concatenation.
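To illustrate, these two calls write the same cell; the ...ByColumnAndRow variant just performs the column-letter lookup for you (column indexes are zero-based, so 2 maps to 'C'):

$sheet->setCellValueByColumnAndRow(2, 5, 'some value');
$sheet->setCellValue(PHPExcel_Cell::stringFromColumnIndex(2) . 5, 'some value');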
Your problem seems to be memory rather than simply speed: to conserve memory, look at using cell caching, though there's a cost in speed. My rule of thumb is 1kB/cell, doubled when you intend to write the workbook, plus between 10 and 25MB for the code itself. Add 60% to each figure if you're running 64-bit PHP. If you're using cell caching (particularly php://temp or discISAM), you can reduce that 1kB/cell to about 330 bytes/cell... you can reduce it even further with APC, memcache or wincache, but that requires adequate memory allocated to your cache (about 1.2kB/cell).
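Enabling cell caching looks something like this; note that the cache method has to be set before the PHPExcel object is instantiated. The 16MB memoryCacheSize is just an illustrative figure:

$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '16MB');
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);

$excel = new PHPExcel();   // create the workbook only after configuring the cache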
If you're working with larger workbooks, then running from the CLI is better than a web interface, to prevent timeouts.
What is the largest worksheet size that is possible, or has been done before?
I know that both borft and myself are working with large workbooks: in my case, 64k rows/worksheet for 2-3 worksheets with 100+ columns; but both of us have the available memory to throw at PHP and the time to process the workbooks.
Is there a way to build the spreadsheet a part at a time, so I can schedule 5,000 rows at a time in different scripts?
Not without a complete rewrite of PHPExcel that totally changes the way it works, and that would significantly reduce the library's ability to work with different file formats; or switching to a non-PHP solution... unless you're outputting to CSV (and in that case, I'd assume you'd use PHP's standard fputcsv()).
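For the CSV route, because each row is flushed straight to file, separate scripts can each append their own slice of rows (a sketch; output.csv and the 5,000-row query behind $result are placeholders):

$fp = fopen('output.csv', 'a');   // append mode, so each scheduled run adds its slice
while ($row = $result->fetch_row()) {
    fputcsv($fp, $row);           // one database row out to disk, then discarded
}
fclose($fp);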