First of all, I have to say that I've read a lot of posts about this problem, but I don't think I'm grasping the techniques for loading big files.
I have an Excel file of about 65,000 rows and 164 columns (EX is the last column), which gives roughly 10,660,000 cells.
I'm trying to load it just to read values, and I've tried a lot of alternatives, but none of them has worked (sometimes because of memory limits, sometimes because of execution time).
I'm running a local WAMP server with memory_limit set to 640M and max_execution_time set to 3000.
I've tried the following techniques:
- Basic loading: problems with memory usage.
- Basic loading with cache methods (disk, SQLite3, ...): problems with memory usage.
- Chunked loading (with a ReadFilter): problems with execution time.
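For reference, this is roughly how I set up the cache-method attempt. The file name, reader type, and SQLite3 cache choice here are just placeholders for my actual values; `setReadDataOnly(true)` is what I use since I only need cell values, not formatting:

```php
<?php
require_once 'Classes/PHPExcel.php';

// Store cell data in SQLite3 instead of PHP memory
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_sqlite3;
PHPExcel_Settings::setCacheStorageMethod($cacheMethod);

$reader = PHPExcel_IOFactory::createReader('Excel2007');
$reader->setReadDataOnly(true);   // skip styles/formatting, values only
$objPHPExcel = $reader->load('bigfile.xlsx');
```

Even with this, memory usage climbs until the limit is hit.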
I don't think I'm understanding the chunked loading method, because I've tried loading only 20 rows, or 50, and it takes the same amount of time, which makes sense to me because it's a single load operation (I wrote this attempt outside of a loop).
I've tried to load 1000 rows with this technique and the server goes down after two intensive memory spikes (I've seen that in the Performance panel of the Windows Task Manager).
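To show what I mean, here is my chunked-loading attempt, adapted from the ReadFilter recipe in the PHPExcel documentation (file name, chunk size, and the 65,000 row count are from my case; the processing step is elided). As I understand it, each chunk still parses the whole file from disk, it just keeps fewer cells in memory, which would explain why the time doesn't drop:

```php
<?php
require_once 'Classes/PHPExcel.php';

// Read filter that only admits one window of rows per load
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow = 0;

    public function setRows($startRow, $chunkSize) {
        $this->startRow = $startRow;
        $this->endRow = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '') {
        // Always read the heading row, plus the current chunk
        return ($row == 1) ||
               ($row >= $this->startRow && $row < $this->endRow);
    }
}

$inputFile = 'bigfile.xlsx';
$chunkSize = 1000;
$filter = new ChunkReadFilter();

for ($startRow = 2; $startRow <= 65000; $startRow += $chunkSize) {
    $reader = PHPExcel_IOFactory::createReader('Excel2007');
    $reader->setReadDataOnly(true);
    $filter->setRows($startRow, $chunkSize);
    $reader->setReadFilter($filter);

    // Each load() still re-parses the whole file; only the
    // filtered rows are kept in memory
    $objPHPExcel = $reader->load($inputFile);
    $sheet = $objPHPExcel->getActiveSheet();

    // ... read the values I need from $sheet here ...

    // Free memory before the next chunk
    $objPHPExcel->disconnectWorksheets();
    unset($objPHPExcel);
}
```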
I need to know whether my file is simply too big for the PHPExcel library, or whether there is a specific Excel format that performs better. Any info you can give me is much appreciated... :(
I'm using other optimization techniques like:
Thanks in advance,