Think about better file read and processing  #3

@kfigaj

Description

readed_text = data.read()

Think about how this code would behave if your file had millions of rows. Reading and processing can be done more efficiently; the simple solution here is to split the data into chunks. Python supports this concept with generators.

https://stackoverflow.com/questions/519633/lazy-method-for-reading-big-file-in-python
https://campus.datacamp.com/courses/python-data-science-toolbox-part-2/bringing-it-all-together-3?ex=8#skiponboarding
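The chunked-read idea from the links above can be sketched with a generator; note that the file name and the `process` function in the usage comment are placeholders for illustration, not names from this project:

```python
def read_in_chunks(file_obj, chunk_size=1024 * 1024):
    """Lazily yield successive chunks from an open file object.

    Only one chunk (chunk_size characters/bytes) is held in memory
    at a time, unlike file_obj.read(), which loads the whole file.
    """
    while True:
        chunk = file_obj.read(chunk_size)
        if not chunk:  # empty string/bytes means end of file
            break
        yield chunk


# Hypothetical usage, processing a large file chunk by chunk:
# with open("big_file.txt") as data:
#     for chunk in read_in_chunks(data):
#         process(chunk)
```

For line-oriented data, simply iterating over the file object (`for line in data:`) is also lazy and often the most idiomatic choice.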
