
fwrite crashes for tables with large columns #2974

@logworthy

Description


This is a bit of an edge case, but trying to `fwrite` a table containing a very large character column crashes R every time.

Potentially related to #2612.

Minimal reproducible example:

```r
huge_vector <- paste(rep('x', 30e6), collapse='')
foo <- data.table::data.table(x=huge_vector)
tf <- tempfile()

data.table::tables()
#    NAME NROW NCOL MB COLS KEY
# 1:  foo    1    1 29    x
# Total: 29MB

# this works fine
write.table(foo, file=tf)

# this reliably crashes my R session
data.table::fwrite(foo, file=tf)
```
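Since `fwrite` writes through fixed-size per-thread buffers (the `buffMB` argument, 8 MB per thread by default), and the single field here is ~29 MB, one untested guess at a workaround is to enlarge the buffer so the field fits and restrict to one thread. This is a sketch of a possible mitigation, not a confirmed fix:

```r
library(data.table)

# Same ~29 MB single-cell table as in the example above
huge_vector <- paste(rep('x', 30e6), collapse='')
foo <- data.table(x=huge_vector)
tf <- tempfile()

# buffMB raises fwrite's per-thread output buffer (max 1024 MB);
# nThread=1 keeps everything in a single buffer. Whether this avoids
# the crash on the affected version is an assumption, not verified.
fwrite(foo, file=tf, buffMB=1024, nThread=1)
```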

Output of `sessionInfo()`:

```
R version 3.4.3 (2017-11-30)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows >= 8 x64 (build 9200)

Matrix products: default

locale:
[1] LC_COLLATE=English_Australia.1252  LC_CTYPE=English_Australia.1252    LC_MONETARY=English_Australia.1252
[4] LC_NUMERIC=C                       LC_TIME=English_Australia.1252    

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

loaded via a namespace (and not attached):
[1] compiler_3.4.3    tools_3.4.3       yaml_2.1.19       data.table_1.11.0
```

For what it's worth, the system has 16 GB of RAM, with ~8 GB free before running the example above.
