
Bump tauri-apps/tauri-action from 0.6.0 to 0.6.2 #7

Merged
cryptopoly merged 1 commit into main from
dependabot/github_actions/tauri-apps/tauri-action-0.6.2
Apr 29, 2026
Conversation

@dependabot
Contributor

@dependabot dependabot Bot commented on behalf of github Apr 25, 2026

Bumps tauri-apps/tauri-action from 0.6.0 to 0.6.2.

Release notes

Sourced from tauri-apps/tauri-action's releases.

action v0.6.2

[0.6.2]

  • 73e111f (#1288) The action can now detect the workspace root correctly if the tauri project is configured as the cargo workspace root.

action v0.6.1

[0.6.1]

Changelog

Sourced from tauri-apps/tauri-action's changelog.

[0.6.2]

  • 73e111f (#1288) The action can now detect the workspace root correctly if the tauri project is configured as the cargo workspace root.

[0.6.1]

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [tauri-apps/tauri-action](https://github.com/tauri-apps/tauri-action) from 0.6.0 to 0.6.2.
- [Release notes](https://github.com/tauri-apps/tauri-action/releases)
- [Changelog](https://github.com/tauri-apps/tauri-action/blob/v0.6.2/CHANGELOG.md)
- [Commits](tauri-apps/tauri-action@v0.6.0...v0.6.2)

---
updated-dependencies:
- dependency-name: tauri-apps/tauri-action
  dependency-version: 0.6.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot Bot added the dependencies (Pull requests that update a dependency file) and github_actions (Pull requests that update GitHub Actions code) labels Apr 25, 2026
@cryptopoly cryptopoly merged commit c9934bd into main Apr 29, 2026
2 checks passed
@dependabot dependabot Bot deleted the dependabot/github_actions/tauri-apps/tauri-action-0.6.2 branch April 29, 2026 19:22
cryptopoly added a commit that referenced this pull request May 1, 2026
Two related Windows-only bugs surfaced by the v0.7.2 smoke test on
an RTX 4090 box:

Bug #6 — RTX 4090 reported as 12 GB total
  GPUMonitor._snapshot_nvidia() shells out to nvidia-smi, and on
  Windows boxes without it on PATH (driver installed but no CUDA
  toolkit) it fell through to _fallback_psutil() which returns
  psutil.virtual_memory().total — system RAM, not VRAM. The image /
  video safety estimators then read that as the GPU budget and
  produced 'Likely to crash' warnings on a 24 GB card holding an
  11 GB FLUX model.

  Fix:
  - Try torch.cuda.get_device_properties(0).total_memory first.
    When the GPU bundle is installed this is the most reliable
    source — it reads through the CUDA driver, no PATH needed.
  - Fall back to nvidia-smi as before.
  - Drop the psutil fallback. When neither answers we now return
    {'vram_total_gb': None}, which the TS estimators
    (utils/images.ts, utils/videos.ts) already treat as 'unknown'
    via the DEFAULT_*_MEMORY_GB fallbacks. Better an honest
    'unknown' than a wrong 12 GB.
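
  A minimal sketch of the detection order described above. The function
  and helper names here are illustrative assumptions, not the project's
  actual API:

```python
# Hypothetical sketch of the VRAM detection order: torch first,
# nvidia-smi second, honest "unknown" last. Names are illustrative.
import shutil
import subprocess


def detect_vram_total_gb():
    """Return total VRAM in GiB, or None when it cannot be determined."""
    # 1. Prefer the CUDA driver via torch; this works even when
    #    nvidia-smi is not on PATH.
    try:
        import torch
        if torch.cuda.is_available():
            total_bytes = torch.cuda.get_device_properties(0).total_memory
            return total_bytes / (1024 ** 3)
    except Exception:
        pass

    # 2. Fall back to nvidia-smi when it is available on PATH.
    if shutil.which("nvidia-smi"):
        try:
            out = subprocess.check_output(
                ["nvidia-smi", "--query-gpu=memory.total",
                 "--format=csv,noheader,nounits"],
                text=True,
            )
            return float(out.splitlines()[0]) / 1024  # MiB -> GiB
        except (subprocess.SubprocessError, ValueError, IndexError):
            pass

    # 3. No psutil fallback: report "unknown" rather than system RAM.
    return None
```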

Bug #7 — Image gen produces gibberish placeholder after install
  DiffusersImageEngine.probe() uses importlib.util.find_spec to
  decide between the placeholder engine and the real diffusers
  pipeline. Once the GPU bundle install lands new packages into
  the extras dir, importlib's negative-lookup cache still answers
  None for the new modules until invalidate_caches() is called.
  The probe kept reporting realGenerationAvailable=False and the
  generation pipeline returned the SVG placeholder, which lands as
  a gibberish image when the frontend renders it as data:image/svg+xml.

  Fix:
  - probe() now calls importlib.invalidate_caches() before
    find_spec so newly-installed packages are picked up without a
    backend restart.
  - The GPU bundle worker (_gpu_bundle_job_worker) now also calls
    invalidate_caches and resets the VRAM total cache when it
    transitions to phase=done, so the immediately-following
    capabilities snapshot reflects the freshly-importable torch.
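
  A rough sketch of the probe-side pattern, assuming hypothetical
  function and module names (only importlib.invalidate_caches and
  importlib.util.find_spec are real stdlib calls):

```python
# Sketch of the probe fix: invalidate importlib's path-finder caches
# before checking whether the GPU-bundle packages became importable.
# Names other than the importlib calls are illustrative.
import importlib
import importlib.util


def real_generation_available() -> bool:
    # Packages installed after interpreter start are invisible to
    # find_spec until the cached path-finder state is discarded.
    importlib.invalidate_caches()
    return all(
        importlib.util.find_spec(mod) is not None
        for mod in ("torch", "diffusers")
    )
```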

Tests
  tests/test_gpu_detection.py — 9 unit tests covering
  torch.cuda detection, nvidia-smi precedence, the new
  no-system-RAM fallback path, and the process-lifetime cache.
  All pass; existing pytest suite still green.
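
  For illustration only, one of those tests might look roughly like the
  sketch below; the module name gpu_detection and the function
  detect_vram_total_gb are assumptions carried over from the sketch
  above, not the repository's actual layout:

```python
# Illustrative test shape only; module and function names are assumed.
import builtins
import pytest


def test_unknown_vram_when_no_torch_and_no_nvidia_smi(monkeypatch):
    # Simulate a box with neither the torch GPU bundle nor nvidia-smi:
    # the detector should report None instead of system RAM.
    import gpu_detection  # hypothetical module under test

    real_import = builtins.__import__

    def fake_import(name, *args, **kwargs):
        if name == "torch":
            raise ImportError("torch not installed")
        return real_import(name, *args, **kwargs)

    monkeypatch.setattr(builtins, "__import__", fake_import)
    monkeypatch.setattr("shutil.which", lambda _cmd: None)

    assert gpu_detection.detect_vram_total_gb() is None
```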
