Memory limit, cache extension limits, and an alternative to PHPExcel

Topics: User Forum
May 15, 2013 at 10:25 PM
Edited May 15, 2013 at 10:26 PM
I've used PHPExcel in other projects and it's proved great. Now I'm in a really low-budget scenario, needing to import more than 60k rows into a MySQL database from either .xls, .xlsx or .csv.

I've followed a bunch of other topics concerning memory limits and caching for PHPExcel. I've tested locally with similar specs plus APC and proved it would handle it all perfectly, but this hosting plan is very limited and does not provide any perl. The good thing is that in this scenario I don't need to do any fancy stuff with the data, just scrape cells and insert rows straight into the database.
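
For reference, this is roughly how I enabled APC cell caching in my local tests (a simplified sketch, not my exact code; the filename is just an example):

<?php
require_once 'Classes/PHPExcel.php';

// Use APC as the cell cache backend (works locally, but not on the client host)
$cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_apc;
$cacheSettings = array('cacheTime' => 600);

if (!PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings)) {
    die('APC cell caching is not available on this server' . PHP_EOL);
}

$objPHPExcel = PHPExcel_IOFactory::load('import.xlsx');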

Question: do I really need PHPExcel, or can this be accomplished with another, less featured and less memory-hungry tool I'm not aware of?
Coordinator
May 15, 2013 at 11:05 PM
Only you can answer whether you really need PHPExcel or not. Asking the developers of a library whether you should use an alternative is pretty crass, and I don't know what the relevance of perl is; but as I answered you on StackOverflow, you use whatever library works for you... the posting on StackOverflow gives you a list of every PHP library that I'm aware of that offers an alternative.
May 15, 2013 at 11:37 PM
Edited May 15, 2013 at 11:41 PM
Sorry for messing up about perl. I meant the client host does not provide the modules/extensions required by PHPExcel's caching alternatives.

It's me who is enforcing the use of PHPExcel, as I've successfully used it in other projects and I'm now out of time to try and test other libs. I was hoping you could stop me from doing that to myself by pointing me right at the perfect tool for the job. Yeah, sorry for that too.

I guess I'll stick with PHPExcel. For the record, it's now taking around half an hour to get 10k rows (4 columns) inserted into MySQL. I'm on a 256MB memory limit, using chunks of 500 and no caching feature.

I still have to try the required 60k, but just to clarify: when you split the file into chunks, does memory usage still increase as I loop through the chunks, or does each call to setRows() swap out the data from the previous chunk?
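
For reference, this is roughly the chunked loop I'm running (a simplified sketch following the documented read-filter pattern; the filename, table, columns and credentials are all illustrative, not my real ones):

<?php
require_once 'Classes/PHPExcel.php';

// Read filter that only lets cells from the current chunk of rows through
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    // Tell the filter which chunk of rows to read next
    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Always read the heading row, plus the rows in the current chunk
        return ($row == 1) || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$inputFile = 'import.xlsx';   // example filename
$chunkSize = 500;
$totalRows = 10000;           // the 10k test; will be ~60k for the real import

$reader = PHPExcel_IOFactory::createReaderForFile($inputFile);
$chunkFilter = new ChunkReadFilter();
$reader->setReadFilter($chunkFilter);

$pdo = new PDO('mysql:host=localhost;dbname=import_db', 'user', 'pass');   // example credentials
$insert = $pdo->prepare('INSERT INTO import_rows (col1, col2, col3, col4) VALUES (?, ?, ?, ?)');

for ($startRow = 2; $startRow <= $totalRows; $startRow += $chunkSize) {
    // Re-load the file, keeping only the current chunk of rows in memory
    $chunkFilter->setRows($startRow, $chunkSize);
    $objPHPExcel = $reader->load($inputFile);

    $rows = $objPHPExcel->getActiveSheet()->rangeToArray(
        'A' . $startRow . ':D' . ($startRow + $chunkSize - 1), null, true, false
    );
    foreach ($rows as $row) {
        if ($row[0] !== null) {
            $insert->execute($row);
        }
    }

    // Free the chunk before loading the next one
    $objPHPExcel->disconnectWorksheets();
    unset($objPHPExcel);
}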
Coordinator
May 16, 2013 at 8:46 AM
Chunking will always be slow: when you load a file in chunks, PHPExcel still parses the whole file, but only loads the cells that match the filter criteria into memory, so you still have the parsing overhead for the entire file on each iteration. It can also result in memory leaks unless you disconnect the workbook cleanly before unsetting it.
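
By "disconnect the workbook cleanly" I mean something like this at the end of every chunk iteration:

$objPHPExcel->disconnectWorksheets();   // break the workbook <-> worksheet cyclic references
unset($objPHPExcel);                    // so the garbage collector can actually reclaim the memory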

Caching is much more efficient in terms of speed than chunking, although with large files you may still need to combine the two methods if you're working with a limited amount of memory.
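
As a sketch, the phpTemp cell cache needs no additional PHP extensions (memoryCacheSize controls how much is held in memory before spilling over to php://temp), and it has to be set before the workbook is loaded:

// Must be called before any workbook is loaded
$cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '8MB');
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);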