
[request] print informative error if fwrite uses more memory than allowed #2612

@zachokeeffe

Description


This request follows my post on StackOverflow. I've found that fwrite does strange things in memory-constrained environments. If write.table uses more memory than it is allowed, my high-performance computing cluster kills the job. But if I use fwrite, the job is not killed; instead, fwrite writes nothing, writes an empty file, writes only part of the file, or writes a corrupted file, while the R code appears to finish successfully. It took me quite some time to figure all this out. It would be nice if there were a way to alert users that a write operation failed because of memory constraints (i.e., so they know they just need to increase their RAM allocation rather than hunt down errors in their code that don't exist). I don't imagine that's an easy task (I know this is a rather vague request), but I thought I would bring it to your attention, since I've spent the last few days wrangling with the issue.
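Until fwrite itself can report such failures, a minimal defensive sketch is to read the file back with fread and compare row counts against the in-memory table, so a silently truncated or empty file is caught at write time. (This is a hypothetical workaround, not part of the data.table API; for very large files the read-back itself costs memory and time.)

```r
library(data.table)

# Illustrative data; in the failing scenario this would be a much larger table.
dt <- data.table(x = seq_len(1e5), y = rnorm(1e5))
out <- tempfile(fileext = ".csv")

fwrite(dt, out)

# fwrite can fail silently under memory pressure, so verify the result:
# read the file back and check that all rows made it to disk.
written <- fread(out)
if (nrow(written) != nrow(dt)) {
  stop("fwrite appears to have written an incomplete file: ",
       nrow(written), " of ", nrow(dt), " rows found on disk")
}
```

A cheaper variant would only count lines in the output file (e.g. with a shell `wc -l`) instead of parsing it fully, at the cost of not detecting corrupted rows.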
