Description
While working on ARROW-3308, I noticed that write_feather has a chunk_size argument, which by default writes batches of 64k rows into the file. In principle, a chunking strategy like this would remove the need to bump up to large_utf8 when ingesting a large character vector, because you'd end up with many chunks that each fit into a regular utf8 type. However, as the function currently works, the data.frame is first converted to a Table in which every ChunkedArray contains a single chunk, and that is where the large_utf8 type gets set. If Table$create() could be instructed to make multiple chunks, this would be resolved.
Reporter: Neal Richardson / @nealrichardson
Related issues:
- [Python][R] Expose incremental write API for Feather files (is related to)
- [R] Use Converter API to convert SEXP to Array/ChunkedArray (is related to)
Note: This issue was originally created as ARROW-9293. Please see the migration documentation for further details.