post-processing: avoid re-reading config files (#663)
Conversation
This did not quite work... the post-processing only processed [...]. I think I see the problem, now that I look at the code in `docs/ci/customization/customize_doc.py`, lines 282 to 286 at 0c0433a. I'm not sure why I didn't observe this early stopping when I tested locally, as I did try to run this over the entire set of API docs, for all libraries. Maybe I had other local modifications that led to this not happening. I'll put up a change shortly, and test it locally in a completely clean environment. Sorry.
Contributes to rapidsai/build-planning#197 and https://github.com/rapidsai/ops/issues/4044
In #657 (and previously in #654), I'm working on adding an "Inactive Projects" section to https://docs.rapids.ai/api/, and struggling with how long iteration on the API docs takes... both locally and via merging to `main` and waiting for deployments.

After a merge to `main`, it takes slightly over an hour for a deployment to complete, and almost all of that time is spent in "post-processing". (link to recent build)
This PR attempts to reduce that.
Today, `ci/customization/customize_doc.py` is invoked once per HTML file in all of the API docs. Here in 2025, that's close to 10,000 files.

To make the end-to-end time faster, this proposes the following:

* invoking `customize_doc.py` once and passing it a list of files (instead of `for`-looping around it and invoking it once per file)
* reading `_data/releases.json` once per post-processing run, instead of once per file

In local testing on my Mac, with these changes
`./ci/post-process.sh` run over all of the API docs completes in around 4 minutes. If we get anywhere close to that speedup in CI, it'd help with iteration time here AND reduce the lead time to publish updates like #662 for users.

Notes for Reviewers
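To make the batching idea concrete, here's a minimal sketch of the pattern. The function names, the `{{ stable_version }}` placeholder, and the substitution logic are illustrative assumptions, not the actual contents of `customize_doc.py`; the point is that the JSON read happens once per run and the script receives the full file list in one invocation.

```python
import json
import sys
from pathlib import Path


def customize(html_file: Path, releases: dict) -> None:
    """Hypothetical per-file rewrite using already-loaded release data."""
    text = html_file.read_text()
    # illustrative substitution; the real script's edits differ
    text = text.replace("{{ stable_version }}", releases["stable"]["version"])
    html_file.write_text(text)


def main(html_files: list[str]) -> None:
    # read the release metadata exactly once per run, not once per file
    releases = json.loads(Path("_data/releases.json").read_text())
    for path in html_files:
        customize(Path(path), releases)


if __name__ == "__main__" and sys.argv[1:]:
    # invoked once with the whole file list, e.g.:
    #   find api/ -name '*.html' -print0 | xargs -0 python customize_doc.py
    main(sys.argv[1:])
```

Compared to a shell `for` loop that starts a fresh Python interpreter (and re-reads `_data/releases.json`) for every one of the ~10,000 files, this amortizes both costs across the whole run.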
How I tested this
Using the steps added to the docs in #659
Beyond that, we'll have to merge this to find out if it worked.