TIL my simple Python loop was making a huge list in memory
I was trying to read a big text file, about 50,000 lines, and process each line. I wrote a for loop that appended each processed line to a new list. After running it, my computer got really slow and the program crashed. I looked it up and found out I was storing the entire processed file in memory twice. The fix was to use a generator expression instead, which processes one line at a time. Has anyone else run into memory problems with their first big data project? What was your fix?
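A minimal sketch of the before/after (the actual per-line processing isn't in the post, so `line.strip().upper()` here is just a stand-in transform):

```python
# Before: appends every processed line to a list, so the whole
# processed file sits in memory alongside whatever else you hold.
def process_all(path):
    processed = []
    with open(path) as f:
        for line in f:
            processed.append(line.strip().upper())  # stand-in transform
    return processed

# After: a generator function that yields one processed line at a
# time, so only the current line is in memory during iteration.
def process_lazy(path):
    with open(path) as f:
        for line in f:
            yield line.strip().upper()  # same stand-in transform

# Typical lazy usage: stream results straight to an output file
# instead of holding them all at once.
# with open("out.txt", "w") as out:
#     for item in process_lazy("big.txt"):
#         out.write(item + "\n")
```

The equivalent one-liner generator expression, if the file handle stays open while you consume it, would be `(line.strip().upper() for line in f)`.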
2 comments
jennyt83 · 4d ago
Lol, classic memory hog moment.
4
the_diana · 4d ago · Most Upvoted
A quick restart usually fixes it for me, @jennyt83.
6