Commits (25)
- `b8313ea` rusk sdk v2 init (samuel100, Mar 9, 2026)
- `5a24e71` fix formatting (samuel100, Mar 9, 2026)
- `563d154` fix clippy error (samuel100, Mar 9, 2026)
- `2a05d30` fix clippy treating unused items as errors (samuel100, Mar 9, 2026)
- `c08c2fb` enable integration tests (samuel100, Mar 9, 2026)
- `7a9fd6b` added test data step to rust build step (samuel100, Mar 9, 2026)
- `bef92eb` ensure integration tests run and remove redundant workflow (samuel100, Mar 10, 2026)
- `759f83b` fix: consolidate integration tests into single binary (samuel100, Mar 10, 2026)
- `e45b405` fix: deterministic audio tests and correct error assertion (samuel100, Mar 10, 2026)
- `f7e5b14` fix: link ole32 on Windows for CoTaskMemFree (samuel100, Mar 10, 2026)
- `5dfb2bb` fix: unload models in integration tests to prevent OGA resource leaks (samuel100, Mar 10, 2026)
- `4dd7729` test: add model introspection integration tests (samuel100, Mar 10, 2026)
- `6ce7d5e` test: add web service integration tests (samuel100, Mar 10, 2026)
- `5897778` ci: add artifact upload for Rust SDK builds (samuel100, Mar 10, 2026)
- `c9b5c68` test: add verbose output logging to integration tests (samuel100, Mar 10, 2026)
- `93ed177` refactor(rust): apply Canonical Rust Best Practices to sdk_v2/rust (samuel100, Mar 10, 2026)
- `38649d9` update ordering and description (style guide) (samuel100, Mar 10, 2026)
- `553690f` update to sample README. (samuel100, Mar 10, 2026)
- `9183d44` Expand Rust SDK README with features and usage examples (samuel100, Mar 12, 2026)
- `06da7f0` refactor(rust): split integration tests into separate files (samuel100, Mar 12, 2026)
- `93ae47a` Address PR #500 review feedback (37 threads) and fix WinAppSDK bootst… (samuel100, Mar 12, 2026)
- `b3e4857` fix: propagate FFI errors from streaming JoinHandle instead of swallo… (samuel100, Mar 12, 2026)
- `92dd723` refactor: send FFI errors through channel instead of JoinHandle (samuel100, Mar 12, 2026)
- `2e06a0d` fix: remove close() calls from Rust samples (samuel100, Mar 12, 2026)
- `b0f3878` remove winml in sample (samuel100, Mar 12, 2026)
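Commits `b3e4857` and `92dd723` describe moving FFI error propagation out of the spawned task's `JoinHandle` and into the channel the stream consumer already reads. A minimal sketch of that pattern, with hypothetical names rather than the SDK's real API:

```rust
// Sketch of the channel-based error propagation described in commits
// b3e4857/92dd723: instead of surfacing errors only when the JoinHandle is
// joined (where they are easy to drop), the worker sends Result values
// through the same channel the consumer iterates, so errors arrive in-order
// with the data. Names here are illustrative, not the SDK's real types.

use std::sync::mpsc;
use std::thread;

fn spawn_worker() -> mpsc::Receiver<Result<String, String>> {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        // Each chunk (or failure) goes through the channel; the consumer
        // sees the error at the point it occurred, not at join() time.
        tx.send(Ok("chunk-1".to_string())).ok();
        tx.send(Err("ffi call failed".to_string())).ok();
        // Dropping tx here closes the channel and ends the consumer's loop.
    });
    rx
}

fn main() {
    for item in spawn_worker() {
        match item {
            Ok(chunk) => println!("got: {chunk}"),
            Err(e) => eprintln!("stream error: {e}"),
        }
    }
}
```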
112 changes: 112 additions & 0 deletions .github/workflows/build-rust-steps.yml
@@ -0,0 +1,112 @@
```yaml
name: Build Rust SDK

on:
  workflow_call:
    inputs:
      platform:
        required: false
        type: string
        default: 'ubuntu' # or 'windows' or 'macos'
      useWinML:
        required: false
        type: boolean
        default: false
      run-integration-tests:
```
**Contributor:** tests should run on all platforms

**Author:** Fixed. Changed `run-integration-tests` default to `true` so tests run on all platforms.
```yaml
        required: false
        type: boolean
        default: true

permissions:
  contents: read

jobs:
  build:
    runs-on: ${{ inputs.platform }}-latest

    defaults:
      run:
        working-directory: sdk_v2/rust

    env:
      CARGO_FEATURES: ${{ inputs.useWinML && '--features winml' || false }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          clean: true

      - name: Install Rust toolchain
        uses: dtolnay/rust-toolchain@stable
        with:
          components: clippy, rustfmt

      - name: Cache cargo dependencies
        uses: Swatinem/rust-cache@v2
        with:
          workspaces: sdk_v2/rust -> target

      - name: Checkout test-data-shared from Azure DevOps
        if: ${{ inputs.run-integration-tests }}
        shell: pwsh
        working-directory: ${{ github.workspace }}/..
        run: |
          $pat = "${{ secrets.AZURE_DEVOPS_PAT }}"
          $encodedPat = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))

          # Configure git to use the PAT
          git config --global http.https://dev.azure.com.extraheader "AUTHORIZATION: Basic $encodedPat"

          # Clone with LFS to parent directory
          git lfs install
          git clone --depth 1 https://dev.azure.com/microsoft/windows.ai.toolkit/_git/test-data-shared test-data-shared

          Write-Host "Clone completed successfully to ${{ github.workspace }}/../test-data-shared"

      - name: Checkout specific commit in test-data-shared
        if: ${{ inputs.run-integration-tests }}
        shell: pwsh
        working-directory: ${{ github.workspace }}/../test-data-shared
        run: |
          Write-Host "Current directory: $(Get-Location)"
          git checkout 231f820fe285145b7ea4a449b112c1228ce66a41
          if ($LASTEXITCODE -ne 0) {
            Write-Error "Git checkout failed."
            exit 1
          }
          Write-Host "`nDirectory contents:"
          Get-ChildItem -Recurse -Depth 2 | ForEach-Object { Write-Host "  $($_.FullName)" }

      - name: Check formatting
        run: cargo fmt --all -- --check

      # Run Clippy - Rust's official linter for catching common mistakes, enforcing idioms, and improving code quality
      - name: Run clippy
```
**Contributor:** what is clippy for? would be helpful to include a comment since it's new for Rust and not in js/cs

**Author:** Fixed. Added comment: Clippy is Rust's official linter for catching common mistakes, enforcing idioms, and improving code quality.
```yaml
        run: cargo clippy --all-targets ${{ env.CARGO_FEATURES }} -- -D warnings

      - name: Build
        run: cargo build ${{ env.CARGO_FEATURES }}

      - name: Run unit tests
        run: cargo test --lib ${{ env.CARGO_FEATURES }}

      - name: Run integration tests
        if: ${{ inputs.run-integration-tests }}
        run: cargo test --tests ${{ env.CARGO_FEATURES }} -- --include-ignored --test-threads=1 --nocapture
```
**Copilot AI (Mar 11, 2026), on lines +90 to +96:** PR description says integration tests are `#[ignore]` by default and enabled in CI with `--include-ignored`, but the current workflow runs `cargo test` without `--include-ignored`, and `sdk_v2/rust/tests/integration.rs` contains many non-ignored tests. Either update the workflow/commands to match the intended ignored-test model, or update the PR description (and/or mark the tests `#[ignore]`) to match the actual behavior.

**Author:** Fixed. Added `--include-ignored` to the CI test command to align with the `#[ignore]` test annotation strategy.

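The `#[ignore]` strategy debated in this thread can be sketched as follows; the function and test names are illustrative, not the SDK's real tests:

```rust
// Hypothetical sketch of the `#[ignore]` pattern: expensive integration
// tests are annotated `#[ignore]`, so a plain `cargo test` skips them,
// while CI opts in via `cargo test -- --include-ignored`.

pub fn add(a: i32, b: i32) -> i32 {
    a + b // stand-in for logic a real integration test would exercise
}

fn main() {
    println!("cargo test                      -> runs only fast_unit_test");
    println!("cargo test -- --include-ignored -> also runs slow_integration_test");
}

#[cfg(test)]
mod tests {
    use super::add;

    #[test]
    fn fast_unit_test() {
        assert_eq!(add(2, 2), 4);
    }

    // Skipped by default; `--include-ignored` (or `--ignored` alone) runs it.
    #[test]
    #[ignore = "requires a downloaded model; run with --include-ignored"]
    fn slow_integration_test() {
        assert_eq!(add(40, 2), 42);
    }
}
```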
```yaml
      # --allow-dirty allows publishing with uncommitted changes, needed because the build process modifies generated files
      - name: Package crate
        run: cargo package ${{ env.CARGO_FEATURES }} --allow-dirty
```
**Contributor:** what is `--allow-dirty`? It implies some sort of non-compliant build, which is OK for public pipelines but would be good to document for clarity.

**Author:** Fixed. Added a comment explaining the purpose of the `--allow-dirty` flag.

```yaml
      - name: Upload SDK artifact
        uses: actions/upload-artifact@v4
        with:
          name: rust-sdk-${{ inputs.platform }}${{ inputs.useWinML == true && '-winml' || '' }}
          path: sdk_v2/rust/target/package/*.crate

      - name: Upload flcore logs
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: rust-sdk-${{ inputs.platform }}${{ inputs.useWinML == true && '-winml' || '' }}-logs
          path: sdk_v2/rust/logs/**
```
21 changes: 20 additions & 1 deletion .github/workflows/foundry-local-sdk-build.yml
```yaml
@@ -29,6 +29,12 @@ jobs:
      version: '0.9.0.${{ github.run_number }}'
      platform: 'windows'
    secrets: inherit
  build-rust-windows:
    uses: ./.github/workflows/build-rust-steps.yml
    with:
      platform: 'windows'
      run-integration-tests: true
    secrets: inherit

  build-cs-windows-WinML:
    uses: ./.github/workflows/build-cs-steps.yml
@@ -44,7 +50,14 @@
      platform: 'windows'
      useWinML: true
    secrets: inherit

  build-rust-windows-WinML:
    uses: ./.github/workflows/build-rust-steps.yml
    with:
      platform: 'windows'
      useWinML: true
      run-integration-tests: true
    secrets: inherit

  build-cs-macos:
    uses: ./.github/workflows/build-cs-steps.yml
    with:
@@ -56,4 +69,10 @@ jobs:
      version: '0.9.0.${{ github.run_number }}'
      platform: 'macos'
    secrets: inherit
  build-rust-macos:
    uses: ./.github/workflows/build-rust-steps.yml
    with:
      platform: 'macos'
      run-integration-tests: true
    secrets: inherit
```
29 changes: 0 additions & 29 deletions .github/workflows/rustfmt.yml

This file was deleted.

5 changes: 4 additions & 1 deletion samples/rust/Cargo.toml
```diff
@@ -1,5 +1,8 @@
 [workspace]
 members = [
-    "hello-foundry-local"
+    "foundry-local-webserver",
+    "tool-calling-foundry-local",
+    "native-chat-completions",
+    "audio-transcription-example",
 ]
 resolver = "2"
```
21 changes: 14 additions & 7 deletions samples/rust/README.md
```diff
@@ -5,14 +5,21 @@ This directory contains samples demonstrating how to use the Foundry Local Rust
 ## Prerequisites
 
 - Rust 1.70.0 or later
+- Foundry Local installed and available on PATH
 
 ## Samples
 
-### [Hello Foundry Local](./hello-foundry-local)
+### [Foundry Local Web Server](./foundry-local-webserver)
 
-A simple example that demonstrates how to:
-- Start the Foundry Local service
-- Download and load a model
-- Send a prompt to the model using the OpenAI-compatible API
-- Display the response from the model
+Demonstrates how to start a local OpenAI-compatible web server using the SDK, then call it with a standard HTTP client.
+
+### [Native Chat Completions](./native-chat-completions)
+
+Shows both non-streaming and streaming chat completions using the SDK's native chat client.
+
+### [Tool Calling with Foundry Local](./tool-calling-foundry-local)
+
+Demonstrates tool calling with streaming responses, multi-turn conversation, and local tool execution.
+
+### [Audio Transcription](./audio-transcription-example)
+
+Demonstrates audio transcription (non-streaming and streaming) using the `whisper` model.
```
10 changes: 10 additions & 0 deletions samples/rust/audio-transcription-example/Cargo.toml
@@ -0,0 +1,10 @@
```toml
[package]
name = "audio-transcription-example"
version = "0.1.0"
edition = "2021"
description = "Audio transcription example using the Foundry Local Rust SDK"

[dependencies]
foundry-local-sdk = { path = "../../../sdk_v2/rust" }
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
tokio-stream = "0.1"
```
25 changes: 25 additions & 0 deletions samples/rust/audio-transcription-example/README.md
@@ -0,0 +1,25 @@
# Sample: Audio Transcription

This example demonstrates audio transcription (non-streaming and streaming) using the Foundry Local Rust SDK. It uses the `whisper` model to transcribe a WAV audio file.

The `foundry-local-sdk` dependency is referenced via a local path. No crates.io publish is required:

```toml
foundry-local-sdk = { path = "../../../sdk_v2/rust" }
```

Run the application with a path to a WAV file:

```bash
cargo run -- path/to/audio.wav
```

## Using WinML (Windows only)

To use the WinML backend, enable the `winml` feature in `Cargo.toml`:

```toml
foundry-local-sdk = { path = "../../../sdk_v2/rust", features = ["winml"] }
```

No code changes are needed — same API, different backend.
70 changes: 70 additions & 0 deletions samples/rust/audio-transcription-example/src/main.rs
@@ -0,0 +1,70 @@
```rust
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License.

use std::env;
use std::io::{self, Write};

use foundry_local_sdk::{FoundryLocalConfig, FoundryLocalManager};
use tokio_stream::StreamExt;

const ALIAS: &str = "whisper-tiny";

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    println!("Audio Transcription Example");
    println!("===========================\n");

    // Accept an audio file path as a CLI argument.
    let audio_path = env::args().nth(1).unwrap_or_else(|| {
        eprintln!("Usage: cargo run -- <path-to-audio.wav>");
        std::process::exit(1);
    });

    // ── 1. Initialise the manager ────────────────────────────────────────
    let manager = FoundryLocalManager::create(FoundryLocalConfig::new("foundry_local_samples"))?;

    // ── 2. Pick the whisper model and ensure it is downloaded ────────────
    let model = manager.catalog().get_model(ALIAS).await?;
    println!("Model: {} (id: {})", model.alias(), model.id());

    if !model.is_cached().await? {
        println!("Downloading model...");
        model
            .download(Some(|progress: &str| {
                print!("\r  {progress}%");
                io::stdout().flush().ok();
            }))
            .await?;
        println!();
    }

    println!("Loading model...");
    model.load().await?;
    println!("✓ Model loaded\n");

    // ── 3. Create an audio client ────────────────────────────────────────
    let audio_client = model.create_audio_client();

    // ── 4. Non-streaming transcription ───────────────────────────────────
    println!("--- Non-streaming transcription ---");
    let result = audio_client.transcribe(&audio_path).await?;
    println!("Transcription: {}", result.text);

    // ── 5. Streaming transcription ───────────────────────────────────────
    println!("--- Streaming transcription ---");
    print!("Transcription: ");
    let mut stream = audio_client.transcribe_streaming(&audio_path).await?;
    while let Some(chunk) = stream.next().await {
        let chunk = chunk?;
        print!("{}", chunk.text);
        io::stdout().flush().ok();
    }
    println!("\n");

    // ── 6. Unload the model ──────────────────────────────────────────────
    println!("Unloading model...");
    model.unload().await?;
    println!("Done.");

    Ok(())
}
```
11 changes: 11 additions & 0 deletions samples/rust/foundry-local-webserver/Cargo.toml
@@ -0,0 +1,11 @@
```toml
[package]
name = "foundry-local-webserver"
version = "0.1.0"
edition = "2021"
description = "Example of using the Foundry Local SDK with a local OpenAI-compatible web server"

[dependencies]
foundry-local-sdk = { path = "../../../sdk_v2/rust" }
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
serde_json = "1"
reqwest = { version = "0.12", features = ["json"] }
```
25 changes: 25 additions & 0 deletions samples/rust/foundry-local-webserver/README.md
@@ -0,0 +1,25 @@
# Sample: Foundry Local Web Server

This example demonstrates how to start a local OpenAI-compatible web server using the Foundry Local SDK, then call it with a standard HTTP client. This is useful when you want to use the OpenAI REST API directly or integrate with tools that expect an OpenAI-compatible endpoint.

The `foundry-local-sdk` dependency is referenced via a local path. No crates.io publish is required:

```toml
foundry-local-sdk = { path = "../../../sdk_v2/rust" }
```

Run the application:

```bash
cargo run
```
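As a hedged illustration of "call it with a standard HTTP client": the payload an OpenAI-compatible chat endpoint expects can be built by hand. The endpoint path and model name below are assumptions for illustration, not values the SDK guarantees; the running server reports its real base URL.

```rust
// Hypothetical sketch: the JSON body a standard HTTP client would POST to
// an OpenAI-compatible chat endpoint (e.g. {base_url}/v1/chat/completions).
// Model name and endpoint path are assumptions, not the SDK's real values.

fn chat_request_body(model: &str, prompt: &str) -> String {
    // Minimal non-streaming chat-completions payload.
    format!(r#"{{"model":"{model}","messages":[{{"role":"user","content":"{prompt}"}}]}}"#)
}

fn main() {
    let body = chat_request_body("qwen2.5-0.5b-instruct", "Say hello");
    // Send with any HTTP client, Content-Type: application/json.
    println!("POST /v1/chat/completions");
    println!("{body}");
}
```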

## Using WinML (Windows only)

To use the WinML backend, enable the `winml` feature in `Cargo.toml`:

```toml
foundry-local-sdk = { path = "../../../sdk_v2/rust", features = ["winml"] }
```

No code changes are needed — same API, different backend.