
Adding compile option warmup_execute_after_compile to optionally run execute right after compilation to create command buffers. #15962

Merged
meta-codesync[bot] merged 1 commit intopytorch:mainfrom
trviv:export-D87781471
Nov 25, 2025

Conversation

@trviv
Contributor

@trviv trviv commented Nov 24, 2025

Summary: This diff introduces a new compile option, `warmup_execute_after_compile`, which allows the graph to be executed once immediately after it is compiled. The option is disabled by default and can be enabled by setting the `warmup_execute_after_compile` flag to `true` in the `GraphConfig` object. When enabled, the `optional_warmup_execute` method makes a dummy call to the `execute` function after the graph is prepacked, so that command buffers are created ahead of the first real inference.

Differential Revision: D87781471

@trviv trviv requested a review from SS-JIA as a code owner November 24, 2025 17:49
@pytorch-bot

pytorch-bot bot commented Nov 24, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/15962

Note: Links to docs will display an error until the docs builds have been completed.

⏳ No Failures, 114 Pending

As of commit 8ed7b28 with merge base a7d7db9:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Nov 24, 2025
@meta-codesync
Contributor

meta-codesync bot commented Nov 24, 2025

@trivedivivek has exported this pull request. If you are a Meta employee, you can view the originating Diff in D87781471.

trviv added a commit to trviv/executorch that referenced this pull request Nov 24, 2025
…execute right after compilation to create command buffers. (pytorch#15962)


Differential Revision: D87781471

@trviv trviv added the release notes: vulkan Changes to the Vulkan backend delegate label Nov 24, 2025
…execute right after compilation to create command buffers. (pytorch#15962)


Reviewed By: yipjustin

Differential Revision: D87781471
@meta-codesync meta-codesync bot merged commit 7329730 into pytorch:main Nov 25, 2025
142 checks passed
jirioc pushed a commit to nxp-upstream/executorch that referenced this pull request Dec 19, 2025
…execute right after compilation to create command buffers.

Differential Revision: D87781471

Pull Request resolved: pytorch#15962

Labels

CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. fb-exported meta-exported release notes: vulkan Changes to the Vulkan backend delegate


2 participants