Conversation

@fabianp fabianp commented Jun 28, 2018

This should fix #195. The increment should reflect the largest increment with respect to the previous line, which is not what the previous code was doing. If run in a for loop, it was accumulating the increments across iterations, which explains why some people were seeing large negative increments (https://stackoverflow.com/questions/51077423/strange-negative-value-in-memory-profiler-while-reading-file-python)
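To illustrate the difference (this is a toy sketch of the two strategies, not the actual memory_profiler source; the function names and the memory samples are made up): when a line executes several times inside a loop, summing its per-execution deltas can drift negative as soon as a later iteration frees memory, whereas taking the largest single delta cannot.

```python
def increment_old(samples):
    """Buggy behaviour: accumulate the delta from every execution of the line."""
    total = 0.0
    for mem_before, mem_after in samples:
        total += mem_after - mem_before
    return total

def increment_fixed(samples):
    """Fixed behaviour: report the largest single delta seen for the line."""
    return max(mem_after - mem_before for mem_before, mem_after in samples)

# Three executions of the same line inside a loop; memory in MiB.
# The second iteration happens to free memory (e.g. a GC run).
samples = [(31.9, 32.0), (32.0, 31.5), (31.5, 31.6)]

print(round(increment_old(samples), 3))    # -0.3  (spuriously negative)
print(round(increment_fixed(samples), 3))  # 0.1
```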

I also disabled a test that assumed the increment of a function that deletes its temporaries needs to be zero; given the above, that assumption was wrong.
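Why that test assumption was wrong, sketched on a fake memory trace (hypothetical numbers and helper names, not the real test): a function that allocates a temporary and then deletes it ends at its starting memory, so its *net* change is zero, but under the fixed per-line semantics the allocating line still shows a positive increment.

```python
def net_change(trace):
    """Memory at function exit minus memory at entry."""
    return trace[-1] - trace[0]

def max_line_increment(trace):
    """Largest delta between two consecutive samples (the fixed semantics)."""
    return max(after - before for before, after in zip(trace, trace[1:]))

# Memory (MiB) sampled after each line: entry, allocate temp, use it, del temp.
trace = [31.9, 64.0, 64.0, 31.9]

print(net_change(trace))                    # 0.0  -> what the old test asserted
print(round(max_line_increment(trace), 3))  # 32.1 -> the allocation is still visible
```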

@Juanlu001 : I would appreciate your OK if you have time

@astrojuanlu
Collaborator

Hi @fabianp! I was trying to reproduce the behavior with this program:

$ cat test_file.py 
@profile
def get_file():
    filename = "/tmp/test.txt"
    f_out = "/tmp/out.txt"
    count = 0
    with open(filename, 'r') as f_in, open(f_out, 'w') as f_out:
        for line in f_in:
            f_out.write(line)

    print(str(count))


if __name__ == '__main__':
    get_file()

(that count variable does not make any sense as it's written, but I just transcribed what I saw in the image)

This is what I get with 0.52:

0
Filename: test_file.py

Line #    Mem usage    Increment   Line Contents
================================================
     1   31.941 MiB   31.941 MiB   @profile
     2                             def get_file():
     3   31.941 MiB    0.000 MiB       filename = "/tmp/test.txt"
     4   31.941 MiB    0.000 MiB       f_out = "/tmp/out.txt"
     5   31.941 MiB    0.000 MiB       count = 0
     6   31.941 MiB    0.000 MiB       with open(filename, 'r') as f_in, open(f_out, 'w') as f_out:
     7   31.941 MiB    0.000 MiB           for line in f_in:
     8   31.941 MiB    0.000 MiB               f_out.write(line)
     9                             
    10   31.941 MiB    0.000 MiB       print(str(count))

and this with your patch:

0
Filename: test_file.py

Line #    Mem usage    Increment   Line Contents
================================================
     1   31.914 MiB   31.914 MiB   @profile
     2                             def get_file():
     3   31.914 MiB    0.000 MiB       filename = "/tmp/test.txt"
     4   31.914 MiB    0.000 MiB       f_out = "/tmp/out.txt"
     5   31.914 MiB    0.000 MiB       count = 0
     6   31.914 MiB    0.000 MiB       with open(filename, 'r') as f_in, open(f_out, 'w') as f_out:
     7   32.016 MiB    0.000 MiB           for line in f_in:
     8   32.016 MiB    0.102 MiB               f_out.write(line)
     9                             
    10   32.016 MiB    0.000 MiB       print(str(count))

No one provided a script to reproduce in #195. Code-wise this patch looks good to me, but I would still like to understand the problem better.

@fabianp
Collaborator Author

fabianp commented Aug 16, 2018

@Juanlu001 would you be OK merging and closing #195 to see if it fixes the issue?

@astrojuanlu
Collaborator

Ok, let's merge them!


Development

Successfully merging this pull request may close these issues.

large negative increment values in line profiler
