We are building a high-performance training system, and performance matters a lot to us. We store the training data in an Arrow IPC format file with, say, 100M rows and 1000 columns, but each training pass only needs to read 10 of those columns.
It seems that with the Arrow IPC format we have to read the whole file first just to get those 10 columns. We decided not to use Parquet, because it involves serialization and deserialization, and we expect the Arrow IPC format to be faster.
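Roughly, our current read path looks like the sketch below (Python with pyarrow; the file name and column names are placeholders, not our real ones):

```python
import pyarrow as pa
import pyarrow.ipc as ipc

# Placeholder column names; our real schema has 1000 feature columns.
wanted = [f"feature_{i}" for i in range(10)]

# Current approach: read the entire table into memory, then drop the
# ~990 columns we don't need for this training job.
with pa.OSFile("train.arrow", "rb") as f:
    table = ipc.open_file(f).read_all()   # loads all 1000 columns
    features = table.select(wanted)       # keep only the 10 we train on
```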
Does anyone have a suggestion for how we could read only those 10 columns to get better performance?
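Ideally we would like something along the lines of the sketch below, where we ask for just the 10 columns up front. This is only an illustration using the pyarrow Dataset API with the same placeholder names; we have not verified whether it actually avoids touching the other columns on disk:

```python
import pyarrow.dataset as ds

wanted = [f"feature_{i}" for i in range(10)]

# What we are hoping is possible: project the 10 columns at read time
# instead of materializing the full table and selecting afterwards.
dataset = ds.dataset("train.arrow", format="ipc")
features = dataset.to_table(columns=wanted)
```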