# rust sdk v2 init #500

Base branch: `main`
**New GitHub Actions workflow: Build Rust SDK** (+112 lines):

```yaml
name: Build Rust SDK

on:
  workflow_call:
    inputs:
      platform:
        required: false
        type: string
        default: 'ubuntu' # or 'windows' or 'macos'
      useWinML:
        required: false
        type: boolean
        default: false
      run-integration-tests:
        required: false
        type: boolean
        default: true

permissions:
  contents: read

jobs:
  build:
    runs-on: ${{ inputs.platform }}-latest

    defaults:
      run:
        working-directory: sdk_v2/rust

    env:
      # Empty string (not `false`) when WinML is off, so the flag expands to nothing.
      CARGO_FEATURES: ${{ inputs.useWinML && '--features winml' || '' }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          clean: true

      - name: Install Rust toolchain
        uses: dtolnay/rust-toolchain@stable
        with:
          components: clippy, rustfmt

      - name: Cache cargo dependencies
        uses: Swatinem/rust-cache@v2
        with:
          workspaces: sdk_v2/rust -> target

      - name: Checkout test-data-shared from Azure DevOps
        if: ${{ inputs.run-integration-tests }}
        shell: pwsh
        working-directory: ${{ github.workspace }}/..
        run: |
          $pat = "${{ secrets.AZURE_DEVOPS_PAT }}"
          $encodedPat = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))

          # Configure git to use the PAT
          git config --global http.https://dev.azure.com.extraheader "AUTHORIZATION: Basic $encodedPat"

          # Clone with LFS to parent directory
          git lfs install
          git clone --depth 1 https://dev.azure.com/microsoft/windows.ai.toolkit/_git/test-data-shared test-data-shared

          Write-Host "Clone completed successfully to ${{ github.workspace }}/../test-data-shared"

      - name: Checkout specific commit in test-data-shared
        if: ${{ inputs.run-integration-tests }}
        shell: pwsh
        working-directory: ${{ github.workspace }}/../test-data-shared
        run: |
          Write-Host "Current directory: $(Get-Location)"
          git checkout 231f820fe285145b7ea4a449b112c1228ce66a41
          if ($LASTEXITCODE -ne 0) {
            Write-Error "Git checkout failed."
            exit 1
          }
          Write-Host "`nDirectory contents:"
          Get-ChildItem -Recurse -Depth 2 | ForEach-Object { Write-Host "  $($_.FullName)" }

      - name: Check formatting
        run: cargo fmt --all -- --check

      # Run Clippy - Rust's official linter for catching common mistakes, enforcing idioms, and improving code quality
      - name: Run clippy
        run: cargo clippy --all-targets ${{ env.CARGO_FEATURES }} -- -D warnings

      - name: Build
        run: cargo build ${{ env.CARGO_FEATURES }}

      - name: Run unit tests
        run: cargo test --lib ${{ env.CARGO_FEATURES }}

      - name: Run integration tests
        if: ${{ inputs.run-integration-tests }}
        run: cargo test --tests ${{ env.CARGO_FEATURES }} -- --include-ignored --test-threads=1 --nocapture

      # --allow-dirty allows publishing with uncommitted changes, needed because the build process modifies generated files
      - name: Package crate
        run: cargo package ${{ env.CARGO_FEATURES }} --allow-dirty

      - name: Upload SDK artifact
        uses: actions/upload-artifact@v4
        with:
          name: rust-sdk-${{ inputs.platform }}${{ inputs.useWinML == true && '-winml' || '' }}
          path: sdk_v2/rust/target/package/*.crate

      - name: Upload flcore logs
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: rust-sdk-${{ inputs.platform }}${{ inputs.useWinML == true && '-winml' || '' }}-logs
          path: sdk_v2/rust/logs/**
```

Review thread on the clippy step:

> **Reviewer:** What is clippy for? It would be helpful to include a comment, since it's new for Rust and not in the JS/C# SDKs.
>
> **Author:** Fixed. Added the comment: "Clippy is Rust's official linter for catching common mistakes, enforcing idioms, and improving code quality."

Review thread on the `cargo package` step:

> **Reviewer:** What is `--allow-dirty`? It implies some sort of non-compliant build, which is OK for public pipelines, but it would be good to document for clarity.
>
> **Author:** Fixed. Added a comment explaining `--allow-dirty`.
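Because the workflow is triggered via `workflow_call`, it only runs when another workflow in the repository invokes it. A minimal caller could look like the following sketch (the filenames and job id here are assumptions, not part of this PR; `secrets: inherit` passes `AZURE_DEVOPS_PAT` through to the called workflow):

```yaml
# Illustrative caller; the reusable workflow's filename is an assumption.
name: CI
on: [push]

jobs:
  build-rust-sdk-windows:
    uses: ./.github/workflows/build-rust-sdk.yml
    with:
      platform: windows
      useWinML: true
      run-integration-tests: true
    secrets: inherit
```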
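The Azure DevOps checkout step authenticates by base64-encoding the string `:<PAT>` (empty username, PAT as the password) into a Basic auth header. The PowerShell encoding can be sketched equivalently in Python (the token below is a placeholder, not a real secret):

```python
import base64

def basic_auth_extraheader(pat: str) -> str:
    """Mirror the workflow's PowerShell: base64-encode ':<pat>' for Basic auth."""
    encoded = base64.b64encode(f":{pat}".encode("ascii")).decode("ascii")
    return f"AUTHORIZATION: Basic {encoded}"

# Placeholder token for illustration only.
print(basic_auth_extraheader("example-pat"))
```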
This file was deleted.
**Workspace `Cargo.toml`** (new sample crates added as workspace members; note the trailing comma after `"hello-foundry-local"`, which the array needs to parse):

```toml
[workspace]
members = [
    "hello-foundry-local",
    "foundry-local-webserver",
    "tool-calling-foundry-local",
    "native-chat-completions",
    "audio-transcription-example",
]
resolver = "2"
```
**`audio-transcription-example` crate manifest** (+10 lines):

```toml
[package]
name = "audio-transcription-example"
version = "0.1.0"
edition = "2021"
description = "Audio transcription example using the Foundry Local Rust SDK"

[dependencies]
foundry-local-sdk = { path = "../../../sdk_v2/rust" }
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
tokio-stream = "0.1"
```
**Audio transcription sample README** (+25 lines):

# Sample: Audio Transcription

This example demonstrates audio transcription (non-streaming and streaming) using the Foundry Local Rust SDK. It uses the `whisper` model to transcribe a WAV audio file.

The `foundry-local-sdk` dependency is referenced via a local path. No crates.io publish is required:

```toml
foundry-local-sdk = { path = "../../../sdk_v2/rust" }
```

Run the application with a path to a WAV file:

```bash
cargo run -- path/to/audio.wav
```

## Using WinML (Windows only)

To use the WinML backend, enable the `winml` feature in `Cargo.toml`:

```toml
foundry-local-sdk = { path = "../../../sdk_v2/rust", features = ["winml"] }
```

No code changes are needed — same API, different backend.
**Audio transcription sample, main source file** (+70 lines):

```rust
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License.

use std::env;
use std::io::{self, Write};

use foundry_local_sdk::{FoundryLocalConfig, FoundryLocalManager};
use tokio_stream::StreamExt;

const ALIAS: &str = "whisper-tiny";

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    println!("Audio Transcription Example");
    println!("===========================\n");

    // Accept an audio file path as a CLI argument.
    let audio_path = env::args().nth(1).unwrap_or_else(|| {
        eprintln!("Usage: cargo run -- <path-to-audio.wav>");
        std::process::exit(1);
    });

    // ── 1. Initialise the manager ──────────────────────────────────────
    let manager = FoundryLocalManager::create(FoundryLocalConfig::new("foundry_local_samples"))?;

    // ── 2. Pick the whisper model and ensure it is downloaded ──────────
    let model = manager.catalog().get_model(ALIAS).await?;
    println!("Model: {} (id: {})", model.alias(), model.id());

    if !model.is_cached().await? {
        println!("Downloading model...");
        model
            .download(Some(|progress: &str| {
                print!("\r  {progress}%");
                io::stdout().flush().ok();
            }))
            .await?;
        println!();
    }

    println!("Loading model...");
    model.load().await?;
    println!("✓ Model loaded\n");

    // ── 3. Create an audio client ──────────────────────────────────────
    let audio_client = model.create_audio_client();

    // ── 4. Non-streaming transcription ─────────────────────────────────
    println!("--- Non-streaming transcription ---");
    let result = audio_client.transcribe(&audio_path).await?;
    println!("Transcription: {}", result.text);

    // ── 5. Streaming transcription ─────────────────────────────────────
    println!("--- Streaming transcription ---");
    print!("Transcription: ");
    let mut stream = audio_client.transcribe_streaming(&audio_path).await?;
    while let Some(chunk) = stream.next().await {
        let chunk = chunk?;
        print!("{}", chunk.text);
        io::stdout().flush().ok();
    }
    println!("\n");

    // ── 6. Unload the model ────────────────────────────────────────────
    println!("Unloading model...");
    model.unload().await?;
    println!("Done.");

    Ok(())
}
```
**`foundry-local-webserver` crate manifest** (+11 lines):

```toml
[package]
name = "foundry-local-webserver"
version = "0.1.0"
edition = "2021"
description = "Example of using the Foundry Local SDK with a local OpenAI-compatible web server"

[dependencies]
foundry-local-sdk = { path = "../../../sdk_v2/rust" }
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
serde_json = "1"
reqwest = { version = "0.12", features = ["json"] }
```
**Web server sample README** (+25 lines):

# Sample: Foundry Local Web Server

This example demonstrates how to start a local OpenAI-compatible web server using the Foundry Local SDK, then call it with a standard HTTP client. This is useful when you want to use the OpenAI REST API directly or integrate with tools that expect an OpenAI-compatible endpoint.

The `foundry-local-sdk` dependency is referenced via a local path. No crates.io publish is required:

```toml
foundry-local-sdk = { path = "../../../sdk_v2/rust" }
```

Run the application:

```bash
cargo run
```

## Using WinML (Windows only)

To use the WinML backend, enable the `winml` feature in `Cargo.toml`:

```toml
foundry-local-sdk = { path = "../../../sdk_v2/rust", features = ["winml"] }
```

No code changes are needed — same API, different backend.
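Once the server is up, any OpenAI-compatible HTTP client can talk to it. As a sketch, the request body for a chat completion can be built like this (the base URL, endpoint path, and model id below are illustrative assumptions; this PR does not pin down the server's defaults):

```python
import json

# Hypothetical values for illustration only: the real base URL and model id
# depend on how the sample's FoundryLocalManager configures the local server.
BASE_URL = "http://localhost:8000/v1"

def chat_request_body(model_id: str, prompt: str) -> str:
    """Build an OpenAI-compatible chat-completions request body as JSON."""
    return json.dumps(
        {
            "model": model_id,
            "messages": [{"role": "user", "content": prompt}],
        }
    )

# Such a body would be POSTed to f"{BASE_URL}/chat/completions".
print(chat_request_body("example-model", "Hello!"))
```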
Review thread on the integration-tests input:

> **Reviewer:** Tests should run on all platforms.
>
> **Author:** Fixed. Changed the `run-integration-tests` default to `true` so tests run on all platforms.