Bump tauri-apps/tauri-action from 0.6.0 to 0.6.2#7
Merged
cryptopoly merged 1 commit into main on Apr 29, 2026
Conversation
Bumps [tauri-apps/tauri-action](https://github.com/tauri-apps/tauri-action) from 0.6.0 to 0.6.2.
- [Release notes](https://github.com/tauri-apps/tauri-action/releases)
- [Changelog](https://github.com/tauri-apps/tauri-action/blob/v0.6.2/CHANGELOG.md)
- [Commits](tauri-apps/tauri-action@v0.6.0...v0.6.2)

---
updated-dependencies:
- dependency-name: tauri-apps/tauri-action
  dependency-version: 0.6.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
cryptopoly added a commit that referenced this pull request on May 1, 2026:
Two related Windows-only bugs surfaced by the v0.7.2 smoke test on an RTX 4090 box:

Bug #6 — RTX 4090 reported as 12 GB total

GPUMonitor._snapshot_nvidia() shells out to nvidia-smi, and on Windows boxes without it on PATH (driver installed but no CUDA toolkit) it fell through to _fallback_psutil(), which returns psutil.virtual_memory().total — system RAM, not VRAM. The image/video safety estimators then read that as the GPU budget and produced "Likely to crash" warnings on a 24 GB card holding an 11 GB FLUX model.

Fix:
- Try torch.cuda.get_device_properties(0).total_memory first. When the GPU bundle is installed this is the most reliable source: it reads through the CUDA driver, so no PATH lookup is needed.
- Fall back to nvidia-smi as before.
- Drop the psutil fallback. When neither source answers we now return {'vram_total_gb': None}, which the TS estimators (utils/images.ts, utils/videos.ts) already treat as "unknown" via the DEFAULT_*_MEMORY_GB fallbacks. Better an honest "unknown" than a wrong 12 GB.

Bug #7 — Image gen produces gibberish placeholder after install

DiffusersImageEngine.probe() uses importlib.util.find_spec to decide between the placeholder engine and the real diffusers pipeline. Once the GPU bundle install lands new packages in the extras dir, importlib's negative-lookup cache still answers None for the new modules until invalidate_caches() is called. The probe kept reporting realGenerationAvailable=False, so the generation pipeline returned the SVG placeholder, which renders as a gibberish image when the frontend displays it as data:image/svg+xml.

Fix:
- probe() now calls importlib.invalidate_caches() before find_spec, so newly installed packages are picked up without a backend restart.
- The GPU bundle worker (_gpu_bundle_job_worker) now also calls invalidate_caches() and resets the VRAM total cache when it transitions to phase=done, so the immediately following capabilities snapshot reflects the freshly importable torch.
Tests

tests/test_gpu_detection.py — 9 unit tests covering torch.cuda detection, nvidia-smi precedence, the new no-system-RAM fallback path, and the process-lifetime cache. All pass; the existing pytest suite is still green.
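The probe-side fix for Bug #7 amounts to invalidating importlib's caches before re-checking importability. A minimal sketch, assuming the probe only needs to confirm that the GPU-bundle modules are now importable (the function name and module list are illustrative, not the actual DiffusersImageEngine.probe() body):

```python
import importlib
import importlib.util


def probe_real_generation() -> bool:
    """Sketch: re-check whether freshly installed packages are importable."""
    # Drop importlib's caches, including the negative-lookup entries, so
    # packages installed into the extras dir after interpreter start are
    # visible to find_spec without a backend restart.
    importlib.invalidate_caches()
    return all(
        importlib.util.find_spec(mod) is not None
        for mod in ("torch", "diffusers")
    )
```

Without the `invalidate_caches()` call, `find_spec` can keep answering `None` for the new modules for the lifetime of the process, which is exactly why the probe stayed at realGenerationAvailable=False after the install finished.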
Commits
- `84b9d35` Apply Version Updates From Current Changes (#1291)
- `73e111f` fix: check if tauri project is workspace root first (#1288)
- `73fb865` apply version updates (#1242)
- `ad5c271` fix: process multiple config args (#1241)
- `51d54a6` ci: sync workflow with dev branch
- `29a724f` chore(deps): update dependency @types/node to v24.10.0 (#1160)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)