Memory exhaustion during file parsing

Topics: User Forum
Aug 30, 2013 at 10:27 AM
Hi there!

I'm trying to extract some information from a large Excel file with roughly 40,852 rows by 38 columns.

Given the large amount of data, I'm already using several of the recommended techniques for this situation, such as cell caching and reading in chunks. However, even if I set a chunk size of 1, memory still runs out, probably during the parsing process.
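For reference, here is a minimal sketch of the chunked-read pattern I mean (this assumes PHPExcel, since that is where cell caching and read-filter chunking come from; the file name, row count, and chunk size are placeholders):

```php
<?php
require_once 'PHPExcel/IOFactory.php';

// Cache cell data in php://temp rather than in plain PHP memory.
PHPExcel_Settings::setCacheStorageMethod(
    PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp
);

// Read filter that only admits the rows of the current chunk.
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Keep the header row plus the rows of the current chunk.
        return $row == 1 || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$inputFileName = 'large.xlsx';   // placeholder
$totalRows     = 40852;
$chunkSize     = 1000;           // placeholder

$reader      = PHPExcel_IOFactory::createReader('Excel2007');
$chunkFilter = new ChunkReadFilter();
$reader->setReadFilter($chunkFilter);

for ($startRow = 2; $startRow <= $totalRows; $startRow += $chunkSize) {
    $chunkFilter->setRows($startRow, $chunkSize);
    $workbook = $reader->load($inputFileName);

    // ... extract what is needed from this chunk ...

    // Free memory before loading the next chunk.
    $workbook->disconnectWorksheets();
    unset($workbook);
}
```

Note that each load() still re-parses the workbook; the filter only limits which cells are kept, which is why I suspect the parsing step itself is what exhausts memory.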

Here are my questions:
  1. How can I be sure that the memory exhaustion is occurring during the parsing process?
  2. If it is, what can I do to prevent it?
Thanks in advance for all the help,
Best regards!
Aug 30, 2013 at 6:19 PM
As you read each row, open a file, append the row to it as CSV data, and then close the file.

When you get the out-of-memory error, check whether the CSV data stops partway through the sheet.

That will at least let you know whether you are running out of memory during parsing (see the sketch below).

Note: reading a CSV file requires less memory than reading a spreadsheet.
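A rough sketch of that logging idea; logRow() and progress.csv are placeholder names, not library APIs:

```php
<?php
// Append one parsed row to a CSV log, then close immediately so the
// row is on disk even if the script is killed by an OOM error.
function logRow(array $rowData, $logFile = 'progress.csv')
{
    $handle = fopen($logFile, 'a');
    fputcsv($handle, $rowData);
    fclose($handle);
}

// Call it for each row as you parse, e.g. with PHPExcel iterators:
// foreach ($worksheet->getRowIterator() as $row) {
//     $cells = array();
//     foreach ($row->getCellIterator() as $cell) {
//         $cells[] = $cell->getValue();
//     }
//     logRow($cells);
// }
```

After the crash, the row number of the last complete line in progress.csv tells you how far parsing got (compare it against the expected 40,852 rows).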
  • Christopher Mullins