GH-15100: [C++][Parquet] Add benchmark for reading strings from Parquet #15101
pitrou merged 4 commits into apache:master from
Conversation
@ursabot please benchmark command=cpp-micro --suite-filter=parquet-arrow-reader-writer-benchmark

@ursabot please benchmark command=cpp-micro --suite-filter=parquet-arrow-reader-writer-benchmark

@ursabot please benchmark

Benchmark runs are scheduled for baseline = 6236dba and contender = 3c02495. Results will be available as each benchmark for each run completes.

['Python', 'R'] benchmarks have a high level of regressions.
      ::arrow::schema({::arrow::field("column", type, null_percentage > 0)}), {arr});
}

static void BM_WriteBinaryColumn(::benchmark::State& state) {
Does it use the PLAIN encoding? Add a comment?
I added a comment near the parameters of each benchmark, explaining that we use unique_values to trigger the code paths for the dictionary and plain encodings. I tried to add a check within the benchmark to validate that we are getting the expected encodings, but I found it was too complicated, since the encodings can change from page to page and also apply to the definition and repetition levels (IIUC).
I see. Can you just confirm that the expected encodings are used (and add a comment)?
Just saw the comment below, sorry. Please disregard. :-)
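For context, here is a minimal sketch (not the code from this PR) of how a write benchmark can steer the Parquet encoding via the number of unique values in the generated string column: a low-cardinality column stays dictionary-encoded, while a high-cardinality one makes the writer fall back to PLAIN. The testing helper `RandomArrayGenerator::StringWithRepeats` and the specific sizes and argument values below are illustrative assumptions, not what the PR actually uses.

```cpp
// Sketch of a Parquet string write benchmark where "unique_values" selects
// the encoding path (dictionary vs. PLAIN fallback). Assumed helpers are
// noted in comments; parameter values are arbitrary.
#include "arrow/api.h"
#include "arrow/io/memory.h"
#include "arrow/testing/random.h"   // assumed: RandomArrayGenerator::StringWithRepeats
#include "benchmark/benchmark.h"
#include "parquet/arrow/writer.h"

constexpr int64_t kNumValues = 1 << 20;  // arbitrary column length

static void BM_WriteBinaryColumn(::benchmark::State& state) {
  const int64_t unique_values = state.range(0);
  ::arrow::random::RandomArrayGenerator generator(/*seed=*/42);
  // Few unique values -> dictionary-encoded pages; many -> PLAIN fallback.
  auto values = generator.StringWithRepeats(kNumValues, unique_values,
                                            /*min_length=*/8, /*max_length=*/32);
  auto table = ::arrow::Table::Make(
      ::arrow::schema({::arrow::field("column", ::arrow::utf8())}), {values});

  for (auto _ : state) {
    // Write the whole table to an in-memory sink each iteration.
    auto sink = ::arrow::io::BufferOutputStream::Create().ValueOrDie();
    auto status = ::parquet::arrow::WriteTable(*table, ::arrow::default_memory_pool(),
                                               sink, /*chunk_size=*/kNumValues);
    if (!status.ok()) state.SkipWithError(status.ToString().c_str());
  }
  state.SetItemsProcessed(state.iterations() * kNumValues);
}

// Low cardinality exercises the dictionary path; full cardinality exercises PLAIN.
BENCHMARK(BM_WriteBinaryColumn)->ArgName("unique_values")->Arg(32)->Arg(kNumValues);

BENCHMARK_MAIN();
```

As discussed above, the expected encodings are not asserted inside the benchmark itself, since the chosen encoding can vary from page to page and also applies to the definition and repetition levels.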
… Parquet (apache#15101)

* Closes: apache#15100

Authored-by: Will Jones <willjones127@gmail.com>
Signed-off-by: Antoine Pitrou <antoine@python.org>
Benchmark runs are scheduled for baseline = 25b5093 and contender = 040310f. 040310f is a master commit associated with this PR. Results will be available as each benchmark for each run completes.

['Python', 'R'] benchmarks have a high level of regressions.