Line 32 in 73c7dbb
```python
readed_text = data.read()
```
Think about how this code would behave if your file had millions of rows. Reading and processing can be done more efficiently: the simple solution here is to split the data into chunks, and Python supports this concept with generators.
https://stackoverflow.com/questions/519633/lazy-method-for-reading-big-file-in-python
https://campus.datacamp.com/courses/python-data-science-toolbox-part-2/bringing-it-all-together-3?ex=8#skiponboarding
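A minimal sketch of the lazy-reading idea from the first link above; the chunk size and the in-memory demo file are placeholders standing in for a real large file:

```python
import io

def read_in_chunks(file_obj, chunk_size=64 * 1024):
    """Yield successive fixed-size chunks so memory use stays bounded."""
    while True:
        chunk = file_obj.read(chunk_size)
        if not chunk:  # '' (or b'') signals end of file
            return
        yield chunk

# Demo: an in-memory file stands in for a file with millions of rows.
data = io.StringIO("row\n" * 1000)
total = sum(len(chunk) for chunk in read_in_chunks(data, chunk_size=256))
print(total)  # 4000 characters processed, at most 256 held at a time
```

For row-oriented data there is an even simpler option: a file object is itself a lazy iterator over lines, so `for line in data: ...` already avoids loading the whole file.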