feat: managed fork setup — README, FUNDING, crate rename to structured-zstd #2
Conversation
- Package name: ruzstd → structured-zstd
- Repository/homepage: point to structured-world fork
- Add SW Foundation as co-author
- Version: 0.0.1 (initial fork release)

Closes #1
Warning: Rate limit exceeded

⌛ How to resolve this issue? After the wait time has elapsed, a review can be triggered. We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work? CodeRabbit enforces hourly rate limits for each developer per organization. Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout. Please see our FAQ for further information.

ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
⛔ Files ignored due to path filters (136)
📒 Files selected for processing (149)
📝 Walkthrough

Adds managed-fork infrastructure and rebranding: CI/workflows, release automation, Dependabot and CodeRabbit configs, benchmarks and a benchmark runner, extended tests (roundtrip & cross-validation), README/NOTICE, license change to Apache-2.0, and crate metadata rename to `structured-zstd`.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Dev as Developer
    participant GH as GitHub (PR/Push)
    participant Actions as GitHub Actions
    participant Nextest as nextest runner
    participant BenchScript as run-benchmarks.sh
    participant Codecov as Codecov
    participant ReleasePlz as release-plz
    participant CratesIO as crates.io
    Dev->>GH: Push / open PR
    GH->>Actions: Trigger CI workflow
    Actions->>Actions: lint job (fmt/clippy)
    Actions->>Nextest: test matrix (ubuntu/macos/windows/msrv)
    Nextest->>Actions: test results (JUnit)
    Actions->>BenchScript: benchmark job (runs benches)
    BenchScript->>Actions: benchmark-results.json
    Actions->>Codecov: coverage upload
    GH->>ReleasePlz: push to main triggers release workflow
    ReleasePlz->>CratesIO: publish release (on release event)
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ Passed checks (5 passed)
Pull request overview
Sets up the repository as a maintained fork and rebrands the Rust crate for publication as structured-zstd, with updated public-facing metadata and funding links.
Changes:
- Renames the crate/package from `ruzstd` to `structured-zstd` and updates Cargo metadata (repo/homepage/authors/keywords/readme).
- Replaces the top-level README with a fork-focused README and adds a donation QR SVG asset.
- Adds GitHub Sponsors configuration via `.github/FUNDING.yml`.
Reviewed changes
Copilot reviewed 4 out of 5 changed files in this pull request and generated 5 comments.
Show a summary per file
| File | Description |
|---|---|
| `ruzstd/Cargo.toml` | Renames/publishes the package as `structured-zstd` and points crates.io metadata at the fork and new README. |
| `README.md` | New fork README content, status, usage snippets, and donation section. |
| `assets/usdt-qr.svg` | Adds a QR code image referenced by the README. |
| `.github/FUNDING.yml` | Enables GitHub Sponsors link for the org. |
| `Readme.md` | Removes the previous top-level README file. |
- CI: lint, test (3 OS × stable + i686 cross + MSRV 1.87), codecov, bench
- release-plz + cargo publish via OIDC
- Dependabot auto-merge for minor/patch
- nextest with ci profile
- License: MIT → Apache-2.0
- compare_ffi bench: pure Rust vs C FFI zstd decompression
- Baseline: pure Rust 6.02ms vs C FFI 2.80ms (2.15x slower)
- Fix cli/Cargo.toml dependency: ruzstd → structured-zstd
- Fix bench/cli imports for renamed crate
- decompress: pure Rust vs C FFI
- compress: pure Rust fastest vs C FFI level 1 vs C FFI level 3
- Regression alerts: 15% warn, 25% fail
- Results stored in gh-pages via SW Release Bot
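Wiring those alert thresholds into a `github-action-benchmark` step could look like the following sketch; the step name, output path, and token are assumptions, and only the 115%/125% ratios come from the settings above:

```yaml
- name: Track benchmark results
  uses: benchmark-action/github-action-benchmark@v1
  with:
    tool: cargo
    output-file-path: benchmark-results.json
    gh-pages-branch: gh-pages
    alert-threshold: "115%"   # warn once a run is 15% slower than baseline
    fail-threshold: "125%"    # fail the job at a 25% regression
    comment-on-alert: true
    github-token: ${{ secrets.GITHUB_TOKEN }}
```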
…-dep-of-std: rustc-dep-of-std pulls compiler_builtins, which requires nightly. Use `--features hash,std,dict_builder` for all CI jobs.
@coderabbitai re-review
✅ Actions performed: Full review triggered.
Welcome to Codecov 🎉

Once you merge this PR into your default branch, you're all set! Codecov will compare coverage reports and display results in all future pull requests. ℹ️ You can also turn on project coverage checks and project coverage reporting on pull request comments.

Thanks for integrating Codecov - We've got you covered ☂️
Pull request overview
Copilot reviewed 22 out of 23 changed files in this pull request and generated 7 comments.
- roundtrip_random_data: 1000 iterations, random data 0-64KB
- roundtrip_compressible_data: 1000 iterations, repeating patterns
- roundtrip_streaming_api: 1000 iterations via FrameCompressor
- cross_validation: 1000 iterations rust→ffi + 1000 ffi→rust
- edge cases: empty, single byte, all zeros, all 0xFF, ascending, 1MB RLE
- nextest ci profile with retries and JUnit XML
Actionable comments posted: 15
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
cli/Cargo.toml (1)
8-12: ⚠️ Potential issue | 🟠 Major: License and metadata inconsistencies.
Several issues with the package metadata:

- License mismatch: `license = "MIT"` conflicts with the repository's `LICENSE` file, which is now Apache-2.0. This will publish incorrect license metadata to crates.io.
- Stale readme path: `readme = "../Readme.md"` references the old filename. The README was renamed to `README.md` at the repository root.
- Repository URLs: `homepage` and `repository` still point to `KillingSpark/zstd-rs` (upstream) rather than the fork at `structured-world/structured-zstd`.

Proposed fix
```diff
-license = "MIT"
+license = "Apache-2.0"
-homepage = "https://github.com/KillingSpark/zstd-rs"
+homepage = "https://github.com/structured-world/structured-zstd"
-repository = "https://github.com/KillingSpark/zstd-rs"
+repository = "https://github.com/structured-world/structured-zstd"
 description = "A command line interface for the `ruzstd` zstd implementation"
-readme = "../Readme.md"
+readme = "../README.md"
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/Cargo.toml` around lines 8 - 12, Update the package metadata in Cargo.toml to reflect the fork: change the license value from "MIT" to "Apache-2.0" (the correct license), update readme from "../Readme.md" to "../README.md" (correct filename casing), and replace both homepage and repository values that reference "KillingSpark/zstd-rs" with the fork's URL "https://github.com/structured-world/structured-zstd"; ensure you edit the existing keys (license, readme, homepage, repository) rather than adding duplicates.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/FUNDING.yml:
- Line 1: The FUNDING.yml entry "github: structured-world" points to an account
without GitHub Sponsors; either remove that line, replace it with a valid
sponsoring account handle, or enable Sponsors on the structured-world account.
Edit the .github/FUNDING.yml file to delete the "github: structured-world" entry
or substitute it with a confirmed sponsor-enabled GitHub username, then commit
the change so the Sponsor button points to a working account.
In @.github/scripts/run-benchmarks.sh:
- Around line 33-36: The current run-benchmarks.sh fallback that sets results =
[{"name":"no_results","unit":"ms","value":0}] masks real failures; change the
block that checks results (the `if not results:` branch) to exit non‑zero
instead of injecting a dummy result—print the STDERR warning and call `exit 1`
so CI fails when no results are parsed; if you must keep a fallback for
compatibility, set the fallback `value` to a clearly failing non‑zero sentinel
(e.g., 999999) and document it, but prefer replacing the dummy with `exit 1` in
run-benchmarks.sh.
- Around line 4-9: Remove the stderr suppression and enable pipefail so failures
aren't hidden: delete the "2>/dev/null" on the cargo bench pipeline and add "set
-o pipefail" (in addition to the existing "set -e") so that if "cargo bench
--bench compare_ffi -p structured-zstd -- --output-format bencher | tee
/tmp/bench-raw.txt" fails the script exits with a non-zero status; keep using
tee to capture output but ensure stderr is preserved for diagnostics.
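The effect of `pipefail` is easy to demonstrate in isolation; this is a generic sketch, not the benchmark script itself:

```shell
# Without pipefail, the pipeline's exit status is tee's (0), so the upstream
# failure is hidden; with pipefail it is the first non-zero status.
status_without=$(bash -c 'false | tee /dev/null >/dev/null; echo $?')
status_with=$(bash -c 'set -o pipefail; false | tee /dev/null >/dev/null; echo $?')
echo "without pipefail: $status_without"   # 0, failure masked
echo "with pipefail:    $status_with"      # 1, failure propagated
```

Applied to `run-benchmarks.sh`, this means adding `set -o pipefail` next to the existing `set -e` and dropping `2>/dev/null`, so a failing `cargo bench` actually fails the job.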
In @.github/workflows/ci.yml:
- Around line 37-41: Change the GitHub Actions matrix strategy to report
failures across all OSes by setting the strategy.fail-fast flag to false (modify
the existing strategy block where "fail-fast: true" is defined); update the YAML
entry under the "strategy" section so it reads "fail-fast: false" while leaving
the matrix (os: [ubuntu-latest, windows-latest, macos-latest]) and runs-on: ${{
matrix.os }} intact.
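Concretely, the adjusted strategy block described above would read as follows (a sketch; only `fail-fast` changes):

```yaml
strategy:
  fail-fast: false   # report failures from every OS instead of cancelling siblings
  matrix:
    os: [ubuntu-latest, windows-latest, macos-latest]
runs-on: ${{ matrix.os }}
```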
In @.github/workflows/dependabot-auto-merge.yml:
- Around line 36-41: Update the gh CLI merge command to remove the merged
Dependabot branch by adding the --delete-branch flag to the existing gh pr merge
invocation (the command that currently reads gh pr merge --auto --squash
"$PR_NUMBER"); keep the existing GH_TOKEN and PR_NUMBER env usage intact so the
command can authenticate and target the same PR.
- Around line 17-18: Remove the unnecessary checkout step named "Checkout" that
uses actions/checkout@v6 since the job only invokes gh pr API commands; delete
the step (or comment it out) and verify no later steps reference the local
workspace or files from the repo—if any do, either restore minimal checkout or
adjust those steps to use the GitHub REST/CLI APIs instead and ensure
authentication via GITHUB_TOKEN is available to the gh commands.
In @.github/workflows/release-plz.yml:
- Around line 1-7: Add an explicit permissions block to the workflow named
"Release" so token privileges are limited and documented; update the top-level
YAML (in the file containing the workflow name "Release" and the trigger block
under on: push) to include a permissions: section with the minimal required
scopes (for example: contents: read or contents: write and id-token: write if
needed for signing/publishing) tailored to this workflow's actions.
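A minimal top-of-file sketch matching this advice; the exact scopes are an assumption (release-plz typically needs to push release commits/tags and open the release PR):

```yaml
name: Release

permissions:
  contents: write        # push release commits and tags
  pull-requests: write   # open/update the release PR

on:
  push:
    branches: [main]
```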
In @.github/workflows/release.yml:
- Around line 4-5: The release workflow uses the "release" job with types:
created which triggers too early; update the event filter for the release
workflow by replacing types: created with types: [published] (i.e., change the
release trigger under the "release" job from created to published) so the job
runs only once the release is finalized and visible.
- Around line 14-16: Pin actions/checkout to a commit SHA instead of a floating
tag: replace uses: actions/checkout@v6 with the v6 commit SHA (e.g., de0fac2 for
v6.0.2) to mitigate tag-mutation risks, leave dtolnay/rust-toolchain@stable
as-is if you want dynamic resolution but optionally pin to a specific Rust
version/date if reproducibility is required, and keep
rust-lang/crates-io-auth-action@v1 unchanged unless you also want to pin it to a
specific commit SHA.
In @.release-plz.toml:
- Line 2: The repository currently has semver_check disabled in
.release-plz.toml (semver_check), which is fine for initial setup; update the
file to re-enable semver checking by setting the semver_check key back to true
(or implement an environment-gated toggle around semver_check) once the crate
API stabilizes so release-plz will enforce semantic versioning on future
releases.
- Around line 8-21: Add explicit parsing for breaking changes to the
commit_parsers array: detect commit messages with the "!" conventional marker
(e.g., pattern like "^.*!:" or "^feat!") and detect footers containing "BREAKING
CHANGE" (e.g., message or footer match for "BREAKING CHANGE") and assign them to
a dedicated group name such as "Breaking Changes" (or a breaking flag if
supported). Update the commit_parsers configuration (the commit_parsers array)
to include entries that match "!:" and "BREAKING CHANGE" so
release-plz/git-cliff will surface a "Breaking Changes" section in the
changelog.
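In git-cliff/release-plz syntax, the additional parsers might look like this (group names are illustrative, and the existing entries stay in place):

```toml
commit_parsers = [
  # conventional-commit "!" marker, e.g. a hypothetical "feat!: drop old API"
  { message = "^.*!:", group = "Breaking Changes" },
  # footer form, e.g. a "BREAKING CHANGE: ..." paragraph in the commit body
  { body = ".*BREAKING CHANGE.*", group = "Breaking Changes" },
  # ...existing feat/fix/etc. parsers follow...
]
```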
In `@LICENSE`:
- Around line 1-202: Update the repo so the declared package licenses match the
new Apache-2.0 LICENSE: change the license field in cli/Cargo.toml from "MIT" to
"Apache-2.0" (or an explicit SPDX expression if multi-licensed) and verify any
other Cargo.toml/package manifests use the same value; add a NOTICE file at repo
root containing an attribution to the upstream ruzstd project and author Moritz
Borcherding per Apache-2.0 §4(d), and ensure the LICENSE file and source headers
retain the required copyright notices.
In `@README.md`:
- Around line 60-69: The example uses an undefined variable compressed_data;
define a sample compressed byte slice before creating the source and use it when
constructing StreamingDecoder (reference StreamingDecoder::new, variable
compressed_data, mutable source, decoder, and result) — e.g. declare
compressed_data: &[u8] = &[/* zstd-compressed bytes */]; then set let mut
source: &[u8] = compressed_data; and keep the rest (decoder.read_to_end(&mut
result).unwrap()) so the example is self-contained and compiles.
In `@ruzstd/benches/compare_ffi.rs`:
- Around line 14-31: The benchmark is asymmetric: FrameDecoder `fr` and `target`
are allocated once for "pure_rust" but "c_ffi" allocates a fresh output each
iteration via `zstd::decode_all`, and the hardcoded 200 MiB buffer is brittle.
Fix by computing `decompressed` once to get `expected_len`, then allocate
`target` to `expected_len` and reuse that buffer for both benchmarks; update the
"c_ffi" bench to decode into the preallocated buffer (or use the FFI API that
writes into a provided buffer) inside `group.bench_function("c_ffi", ...)` and
only verify length/assert without allocating each iteration, keeping
`fr.decode_all(..., target)` and the C-FFI call symmetric.
In `@ruzstd/src/lib.rs`:
- Line 13: The README doc-test refers to an undefined variable
`compressed_data`, causing test failures; update the decompression example in
README.md to use the actual variable from the preceding compression example
(`compressed`) or explicitly define `compressed_data` in that snippet, or
alternatively mark the entire code block as `no_run`/`ignore` if it’s not meant
to execute; locate the example referenced by the include (README.md included via
the crate root attribute in src/lib.rs) and change the decompression code to use
`&compressed` (or add a `let compressed_data = &compressed;`) or add the
appropriate docblock directive to skip running it.
---
Outside diff comments:
In `@cli/Cargo.toml`:
- Around line 8-12: Update the package metadata in Cargo.toml to reflect the
fork: change the license value from "MIT" to "Apache-2.0" (the correct license),
update readme from "../Readme.md" to "../README.md" (correct filename casing),
and replace both homepage and repository values that reference
"KillingSpark/zstd-rs" with the fork's URL
"https://github.com/structured-world/structured-zstd"; ensure you edit the
existing keys (license, readme, homepage, repository) rather than adding
duplicates.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: c97e9c83-a4c1-4631-bfb8-f2ecbaa64ec6
⛔ Files ignored due to path filters (1)
`assets/usdt-qr.svg` is excluded by `!**/*.svg`
📒 Files selected for processing (21)
- `.coderabbit.yaml`
- `.config/nextest.toml`
- `.github/FUNDING.yml`
- `.github/copilot-instructions.md`
- `.github/dependabot.yml`
- `.github/scripts/run-benchmarks.sh`
- `.github/workflows/ci.yml`
- `.github/workflows/cleanup-branches.yml`
- `.github/workflows/dependabot-auto-merge.yml`
- `.github/workflows/release-plz.yml`
- `.github/workflows/release.yml`
- `.release-plz.toml`
- `LICENSE`
- `README.md`
- `Readme.md`
- `cli/Cargo.toml`
- `cli/src/main.rs`
- `ruzstd/Cargo.toml`
- `ruzstd/benches/compare_ffi.rs`
- `ruzstd/benches/decode_all.rs`
- `ruzstd/src/lib.rs`
💤 Files with no reviewable changes (1)
- Readme.md
- NOTICE: retain upstream MIT license attribution (Moritz Borcherding)
- Benchmarks: wrap compress outputs in black_box to prevent elision
- Benchmark script: add pipefail, remove stderr suppression, fail on no results
- cli/Cargo.toml: fix readme path Readme.md → README.md
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
cli/Cargo.toml (1)
3-11: ⚠️ Potential issue | 🟡 Minor: CLI metadata still references upstream repository.
The package metadata appears inconsistent with the managed-fork setup objective:

| Field | Current Value | Expected |
|---|---|---|
| `name` | `ruzstd-cli` | Consider `structured-zstd-cli` |
| `homepage` | `KillingSpark/zstd-rs` | `structured-world/structured-zstd` |
| `repository` | `KillingSpark/zstd-rs` | `structured-world/structured-zstd` |
| `description` | References "ruzstd" | Reference "structured-zstd" |

If the CLI will be published to crates.io under this fork, these fields should point to the new repository.
🔧 Suggested metadata update
```diff
 [package]
-name = "ruzstd-cli"
+name = "structured-zstd-cli"
 version = "0.8.2"
 rust-version = "1.87.0"
 authors = ["Moritz Borcherding <moritz.borcherding@web.de>"]
 edition = "2018"
 license = "MIT"
-homepage = "https://github.com/KillingSpark/zstd-rs"
+homepage = "https://github.com/structured-world/structured-zstd"
-repository = "https://github.com/KillingSpark/zstd-rs"
+repository = "https://github.com/structured-world/structured-zstd"
-description = "A command line interface for the `ruzstd` zstd implementation"
+description = "A command line interface for the `structured-zstd` zstd implementation"
 readme = "../README.md"
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/Cargo.toml` around lines 3 - 11, The Cargo.toml metadata still points to the upstream project; update the package metadata fields in Cargo.toml (specifically the name, homepage, repository, and description entries) to reflect the forked project (e.g., change name from "ruzstd-cli" to "structured-zstd-cli", update homepage and repository to the structured-world/structured-zstd URLs, and replace "ruzstd" in the description with "structured-zstd") so the crate metadata matches the managed-fork repository and intended crate name.
♻️ Duplicate comments (1)
ruzstd/benches/compare_ffi.rs (1)
13-33: ⚠️ Potential issue | 🟠 Major: Decompression benchmark is still asymmetric and biases the Rust-vs-FFI ratio.
Line 14 and Line 15 allocate/reuse once for `pure_rust`, while Line 30 allocates every iteration for `c_ffi`. This measures different costs and skews the reported comparison.

Suggested patch (make per-iteration work symmetric)
```diff
 fn bench_decompress(c: &mut Criterion) {
     let mut group = c.benchmark_group("decompress");
+    let expected_len = zstd::decode_all(COMPRESSED_CORPUS).unwrap().len();

-    // Pure Rust decompression
-    let mut fr = structured_zstd::decoding::FrameDecoder::new();
-    let target = &mut vec![0u8; 1024 * 1024 * 200];
-
     group.bench_function("pure_rust", |b| {
         b.iter(|| {
-            fr.decode_all(COMPRESSED_CORPUS, target).unwrap();
+            let mut fr = structured_zstd::decoding::FrameDecoder::new();
+            let mut target = vec![0u8; expected_len];
+            let written = fr.decode_all(COMPRESSED_CORPUS, &mut target).unwrap();
+            assert_eq!(written, expected_len);
         })
     });

-    // C FFI decompression
-    let decompressed = zstd::decode_all(COMPRESSED_CORPUS).unwrap();
-    let expected_len = decompressed.len();
-    drop(decompressed);
-
     group.bench_function("c_ffi", |b| {
         b.iter(|| {
             let out = zstd::decode_all(COMPRESSED_CORPUS).unwrap();
             assert_eq!(out.len(), expected_len);
         })
     });
```

```bash
#!/bin/bash
# Verify decompression benchmark asymmetry in the current file.
cat -n ruzstd/benches/compare_ffi.rs | sed -n '10,35p'
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ruzstd/benches/compare_ffi.rs` around lines 13 - 33, The benchmark is asymmetric: FrameDecoder::decode_all is reusing a preallocated target (fr and target) while zstd::decode_all allocates every iteration; make the work symmetric by either moving the pure_rust allocation into the closure so it allocates per-iteration, or by changing the c_ffi bench to reuse a preallocated buffer and decode into it (i.e., mirror the FrameDecoder/target pattern); update references to fr, target, COMPRESSED_CORPUS and zstd::decode_all accordingly so both benches perform equivalent allocation/decoding semantics.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@ruzstd/src/dictionary/frequency.rs`:
- Around line 39-41: The rolling hash update uses the `% PRIME` operator on a
possibly negative signed window_hash, which can leave negative values and cause
mismatches against the non-negative pattern_hash; update the hash
canonicalization in the rolling hash calculation (the expression assigning
window_hash) to use rem_euclid(PRIME) so window_hash becomes the non-negative
Euclidean remainder matching pattern_hash semantics (modify the code that
updates window_hash in the rolling-hash loop where window_hash, ALPHABET_SIZE,
h, body and PRIME are used).
- Around line 19-23: The precomputation of h is incorrect: instead of a single
multiplication the code must set h = ALPHABET_SIZE.pow(pattern.len() as u32 - 1)
modulo PRIME (i.e. h = ALPHABET_SIZE^(pattern.len()-1) % PRIME) before the
rolling-hash loop; update the logic that initializes h (the variable h in
frequency.rs used for the rolling removal at the line referenced) to compute the
modular exponent either via a small loop or a modular exponentiation helper so h
equals ALPHABET_SIZE^(pattern.len()-1) % PRIME when pattern.len() > 1.
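A self-contained sketch combining the two fixes above (the `rem_euclid` canonicalization and the modular-exponent precomputation of `h`), plus an empty-pattern guard; the constants, function name, and return type are illustrative stand-ins, not the actual `frequency.rs` API:

```rust
// Illustrative constants; the real frequency.rs values may differ.
const ALPHABET_SIZE: i64 = 256;
const PRIME: i64 = 1_000_000_007;

/// Counts occurrences of `pattern` in `body` with a Rabin-Karp rolling hash.
fn count_occurrences(body: &[u8], pattern: &[u8]) -> usize {
    // Guard: never enter the rolling loop with an empty pattern.
    if pattern.is_empty() || body.len() < pattern.len() {
        return 0;
    }
    let m = pattern.len();

    // h = ALPHABET_SIZE^(m-1) mod PRIME, built by repeated modular
    // multiplication rather than a single multiply.
    let mut h: i64 = 1;
    for _ in 0..m - 1 {
        h = (h * ALPHABET_SIZE) % PRIME;
    }

    let mut pattern_hash: i64 = 0;
    let mut window_hash: i64 = 0;
    for i in 0..m {
        pattern_hash = (pattern_hash * ALPHABET_SIZE + i64::from(pattern[i])) % PRIME;
        window_hash = (window_hash * ALPHABET_SIZE + i64::from(body[i])) % PRIME;
    }

    let mut count = 0;
    for i in 0..=body.len() - m {
        // Verify byte-for-byte on a hash match to rule out collisions.
        if window_hash == pattern_hash && &body[i..i + m] == pattern {
            count += 1;
        }
        if i + m < body.len() {
            // rem_euclid keeps the hash non-negative after subtracting the
            // outgoing byte, matching pattern_hash's canonical form.
            window_hash = ((window_hash - i64::from(body[i]) * h) * ALPHABET_SIZE
                + i64::from(body[i + m]))
            .rem_euclid(PRIME);
        }
    }
    count
}
```

With plain `%`, the subtraction of the outgoing byte can leave `window_hash` negative, so genuine matches are missed; `rem_euclid` always returns the non-negative remainder.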
---
Outside diff comments:
In `@cli/Cargo.toml`:
- Around line 3-11: The Cargo.toml metadata still points to the upstream
project; update the package metadata fields in Cargo.toml (specifically the
name, homepage, repository, and description entries) to reflect the forked
project (e.g., change name from "ruzstd-cli" to "structured-zstd-cli", update
homepage and repository to the structured-world/structured-zstd URLs, and
replace "ruzstd" in the description with "structured-zstd") so the crate
metadata matches the managed-fork repository and intended crate name.
---
Duplicate comments:
In `@ruzstd/benches/compare_ffi.rs`:
- Around line 13-33: The benchmark is asymmetric: FrameDecoder::decode_all is
reusing a preallocated target (fr and target) while zstd::decode_all allocates
every iteration; make the work symmetric by either moving the pure_rust
allocation into the closure so it allocates per-iteration, or by changing the
c_ffi bench to reuse a preallocated buffer and decode into it (i.e., mirror the
FrameDecoder/target pattern); update references to fr, target, COMPRESSED_CORPUS
and zstd::decode_all accordingly so both benches perform equivalent
allocation/decoding semantics.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 08edf729-b2c1-4ad1-874b-d78c97364a51
📒 Files selected for processing (8)
- `.github/scripts/run-benchmarks.sh`
- `NOTICE`
- `cli/Cargo.toml`
- `ruzstd/benches/compare_ffi.rs`
- `ruzstd/src/dictionary/frequency.rs`
- `ruzstd/src/tests/mod.rs`
- `ruzstd/src/tests/roundtrip_integrity.rs`
- `ruzstd/tests/cross_validation.rs`
- 8 doc-test files updated: encoding, decoding, dictionary, lib
- README decompression example: add no_run + initialize compressed_data
- gh-pages branch created for benchmark-action data storage
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@README.md`:
- Line 76: The image URL in README.md uses the non-deterministic segment "HEAD"
which can change based on the repository default branch; update the raw asset
URL (the USDT TRC-20 Donation QR Code link) to pin to a stable reference such as
"main" or a specific tag/commit (e.g., replace ".../HEAD/.../assets/usdt-qr.svg"
with ".../main/.../assets/usdt-qr.svg" or a commit SHA) so the image renders
deterministically for crates.io/docs and other consumers.
- Around line 64-67: The example snippet creates an empty compressed_data and
calls StreamingDecoder::new(...).unwrap(), which can panic; update the example
to produce a valid compressed payload (e.g. use compress_to_vec with
CompressionLevel to compress a sample byte slice) and replace unwrap() with
expect("valid zstd payload") (or otherwise handle the Result) when constructing
the StreamingDecoder; reference StreamingDecoder::new, compressed_data,
compress_to_vec, and CompressionLevel in your change so the snippet is
self-contained and non-failing.
In `@ruzstd/src/dictionary/frequency.rs`:
- Around line 18-21: Add an explicit guard for empty pattern before the hash
setup/rolling loop: check pattern.is_empty() in the function that computes the
rolling hash (the block initializing h and looping over
pattern.len().saturating_sub(1)) and either assert!(!pattern.is_empty()) or
return a defined result (e.g., Err or false) for pattern.len() == 0; ensure the
guard appears before computing h and before using ALPHABET_SIZE and PRIME so the
rolling loop and subsequent matching logic never execute with an empty pattern.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: e3cbf8be-4648-4cef-bc13-af04d25c159f
📒 Files selected for processing (7)
- `README.md`
- `ruzstd/src/decoding/frame_decoder.rs`
- `ruzstd/src/decoding/streaming_decoder.rs`
- `ruzstd/src/dictionary/frequency.rs`
- `ruzstd/src/dictionary/mod.rs`
- `ruzstd/src/encoding/frame_compressor.rs`
- `ruzstd/src/encoding/mod.rs`
Pull request overview
Copilot reviewed 31 out of 32 changed files in this pull request and generated 3 comments.
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
cli/Cargo.toml (1)
3-12: 🧹 Nitpick | 🔵 Trivial: Consider renaming the CLI package to match the fork.
The package name remains `ruzstd-cli` while the description now references `structured-zstd`. Additionally, `homepage` and `repository` (lines 9-10) still point to the upstream `KillingSpark/zstd-rs` rather than this fork. If you intend to publish this CLI as part of the managed fork, consider aligning:

```diff
-name = "ruzstd-cli"
+name = "structured-zstd-cli"
 ...
-homepage = "https://github.com/KillingSpark/zstd-rs"
-repository = "https://github.com/KillingSpark/zstd-rs"
+homepage = "https://github.com/structured-world/structured-zstd"
+repository = "https://github.com/structured-world/structured-zstd"
```

If intentionally keeping `ruzstd-cli` for backward compatibility or upstream attribution, this is fine as-is.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/Cargo.toml` around lines 3 - 12, Update the Cargo.toml metadata to reflect the fork: change the package name in the name = "ruzstd-cli" field if you want it to match the fork, and update the homepage = and repository = values to point to this fork's URLs; also align the description = field to reference the forked project name (or add a comment in README/description explaining backward compatibility if you intentionally keep "ruzstd-cli"). Ensure you only modify the name, homepage, repository and description entries and keep authors, version, edition, and readme unchanged.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@README.md`:
- Around line 51-56: The example imports compress but never uses it; remove the
unused symbol from the import so the line reads only the needed items (keep
compress_to_vec and CompressionLevel) in the example import statement
referencing structured_zstd::encoding; update the import that currently includes
compress to avoid the unused-import warning while leaving compress_to_vec and
CompressionLevel intact.
---
Outside diff comments:
In `@cli/Cargo.toml`:
- Around line 3-12: Update the Cargo.toml metadata to reflect the fork: change
the package name in the name = "ruzstd-cli" field if you want it to match the
fork, and update the homepage = and repository = values to point to this fork's
URLs; also align the description = field to reference the forked project name
(or add a comment in README/description explaining backward compatibility if you
intentionally keep "ruzstd-cli"). Ensure you only modify the name, homepage,
repository and description entries and keep authors, version, edition, and
readme unchanged.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 9d517f83-280c-43df-a540-c4b631348098
📒 Files selected for processing (4)
- `.github/workflows/ci.yml`
- `README.md`
- `cli/Cargo.toml`
- `ruzstd/src/dictionary/frequency.rs`
- Directory: ruzstd/ → zstd/
- Workspace members: ["zstd", "cli"]
- CLI: ruzstd-cli → structured-zstd-cli, direct structured_zstd imports
- Fuzz targets: structured_zstd imports, renamed helper functions
- Internal test helpers: decode_ruzstd → decode_szstd, encode_ruzstd → encode_szstd
- CI: working-directory zstd/, lcov path updated
- Only "ruzstd" remaining: upstream attribution in README/description
Summary
Managed fork setup for structured-zstd (pure Rust zstd, fork of ruzstd):
Test plan
Closes #1
Summary by CodeRabbit
New Features
Documentation
Tests
Chores