Walkthrough

Removes reliance on remaining/merkle_tree account lists across compressed-token flows and consolidates TokenData hashing to a single little-endian scheme.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    actor Client
    participant Program as CompressedTokenProgram
    participant Sys as SystemProgram (feature-gated)
    participant Handler as InstructionHandler
    participant Xfer as TransferHelpers
    participant TokenData as TokenData.hash()
    Client->>Program: invoke instruction (transfer/mint/burn/freeze/delegate)
    Program->>Sys: (feature: deactivate-cpi-context?) check cpi_context
    alt feature enabled and cpi_context present
        Sys-->>Program: CpiContextDeactivated (error)
        Program-->>Client: error
    else normal flow
        Program->>Handler: process_* entry
        Handler->>Xfer: create_output_compressed_accounts(...)
        note right of Xfer: no remaining_accounts
        Xfer-->>Handler: outputs allocated (amount LE)
        Handler->>TokenData: token_data.hash()
        TokenData-->>Handler: data_hash
        Handler->>Xfer: add_data_hash_to_input_compressed_accounts(...)
        note right of Xfer: no discriminator branching
        Xfer-->>Handler: inputs finalized
        Handler-->>Program: Ok
        Program-->>Client: success
    end
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
Pre-merge checks and finishing touches: ✅ 3 checks passed
Actionable comments posted: 0
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
programs/compressed-token/src/process_mint.rs (1)
286-288: Length field is serialized as 1 byte; use the full u32.

Writing only the low byte of `len` will break if outputs > 255. Serialize the full little-endian u32.

```diff
- // lenght of output_compressed_accounts vec as u32
- inputs.extend_from_slice(&[(len as u8), 0, 0, 0]);
+ // length of output_compressed_accounts vec as u32 (LE)
+ inputs.extend_from_slice(&(len as u32).to_le_bytes());
```

programs/compressed-token/src/process_transfer.rs (1)
144-161: Potential underflow and wrong Merkle tree for lamports change account.
- change_lamports = input_lamports - output_lamports can underflow when outputs > inputs, yielding a huge value.
- You ignore lamports_change_account_merkle_tree_index even though the SDK provides it, and always pick the first output’s index.
Use checked_sub and honor the optional index.
```diff
-        let change_lamports = input_lamports - output_lamports;
-        if change_lamports > 0 {
+        let change_lamports = input_lamports
+            .checked_sub(output_lamports)
+            .ok_or(ErrorCode::ComputeOutputSumFailed)?;
+        if change_lamports > 0 {
             let new_len = output_compressed_accounts.len() + 1;
             output_compressed_accounts.resize(
                 new_len,
                 OutputCompressedAccountWithPackedContext {
                     compressed_account: CompressedAccount {
                         owner: ctx.accounts.authority.key().into(),
                         lamports: change_lamports,
                         data: None,
                         address: None,
                     },
-                    merkle_tree_index: inputs.output_compressed_accounts[0].merkle_tree_index,
+                    merkle_tree_index: inputs
+                        .lamports_change_account_merkle_tree_index
+                        .unwrap_or(inputs.output_compressed_accounts[0].merkle_tree_index),
                 },
             );
         }
```
🧹 Nitpick comments (9)
program-tests/compressed-token-test/tests/test.rs (1)
891-892: Nit: fix typo in comment. "tokal token supply" → "total token supply".

```diff
- // Make sure that the tokal token supply does not exceed `u64::MAX`.
+ // Make sure that the total token supply does not exceed `u64::MAX`.
```

programs/compressed-token/src/process_mint.rs (1)
113-125: Correctness: lamports vector construction is confusing and moves the Option unnecessarily.

The current code relies on capturing the outer Option and duplicates it per recipient. Prefer mapping the inner u64 to `Some(...)` to avoid accidental misuse and improve clarity.

```diff
- let lamports_vec = lamports.map(|_| vec![lamports; amounts.len()]);
+ let lamports_vec = lamports.map(|lam| vec![Some(lam); amounts.len()]);
```

programs/compressed-token/src/process_transfer.rs (1)
184-270: Performance/readability: avoid serializing TokenData when only the hash is required.

You serialize token_data into token_data_bytes and then compute the hash from field hashes. If possible, write the packed bytes directly (as you hinted) to reduce allocations. Non-blocking.
programs/compressed-token/src/token_data.rs (6)
33-44: Docstring is stale (mentions delegated_amount/is_native); update it to reflect the actual schema.

It currently references delegated_amount and is_native, which are no longer part of the hash inputs. Please align the comment with the implemented inputs and ordering.
Apply this diff to refresh the docs:
```diff
-/// Hashing schema: H(mint, owner, amount, delegate, delegated_amount,
-/// is_native, state)
-///
-/// delegate, delegated_amount, is_native and state have dynamic positions.
-/// Always hash mint, owner and amount If delegate hash delegate and
-/// delegated_amount together. If is native hash is_native else is omitted.
-/// If frozen hash AccountState::Frozen else is omitted.
-///
-/// Security: to prevent the possibility that different fields with the same
-/// value to result in the same hash we add a prefix to the delegated amount, is
-/// native and state fields. This way we can have a dynamic hashing schema and
-/// hash only used values.
+/// Hashing schema: Poseidon(mint_hash, owner_hash, amount_le_32, [delegate_hash], [frozen_tag])
+///
+/// - amount_le_32: u64::to_le_bytes() right-aligned in a 32-byte array (bytes[24..31]).
+/// - [delegate_hash] is present only if `delegate` is Some.
+/// - [frozen_tag] is present only if `state == AccountState::Frozen` and is a 32-byte array
+///   with its last byte set to `AccountState::Frozen as u8`.
+///
+/// Note: `delegated_amount` and `is_native` are not part of the hash inputs in this schema.
```
104-134: Consolidated hashing + little‑endian amount looks good; consider a helper for amount bytes and tighter typing.
- LGTM on unifying to LE and gating state by Initialized vs Frozen.
- Consider extracting amount packing to a single helper to remove duplication across prod/tests.
- Consider changing `hash_inputs_with_hashed_values` to take `&[u8; 32]` for `amount_bytes` to prevent misuse with non-padded inputs.

Please confirm that dropping `is_native` and `delegated_amount` from the hash is intentional for this revert and that no clients/circuits still depend on them.

Example helper:

```rust
impl TokenData {
    #[inline]
    pub(crate) fn amount_le_field_bytes(amount: u64) -> [u8; 32] {
        let mut bytes = [0u8; 32];
        bytes[24..].copy_from_slice(&amount.to_le_bytes());
        bytes
    }
}
```
191-208: Make legacy_hash's amount representation consistent (use a 32-byte right-aligned LE array).

Currently it passes an 8-byte slice; elsewhere we use a 32-byte field-sized array. Aligning removes ambiguity and avoids reliance on Poseidon input handling.

```diff
- let amount_bytes = self.amount.to_le_bytes();
+ let mut amount_bytes = [0u8; 32];
+ amount_bytes[24..].copy_from_slice(self.amount.to_le_bytes().as_slice());
```
1-1: Unnecessary import.

`use std::vec;` is not needed; `vec![]` works without it.

```diff
-use std::vec;
```
220-280: The 10k-iteration property test is heavy; consider gating or reducing it.

This can slow CI significantly. Suggest reducing to 1k or guarding behind a feature/env flag.

Example:

```diff
- equivalency_of_hash_functions_rnd_iters::<10_000>();
+ equivalency_of_hash_functions_rnd_iters::<1_000>();
```

Or gate with `#[cfg(feature = "long_tests")]`.
301-304: Comment contradicts code.

State is set to Initialized but the comment says "Using Frozen state".

```diff
- state: AccountState::Initialized, // Using Frozen state to match our circuit test
+ state: AccountState::Initialized, // Matches circuit test
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (7)
- program-tests/compressed-token-test/tests/test.rs (5 hunks)
- programs/compressed-token/src/burn.rs (0 hunks)
- programs/compressed-token/src/delegation.rs (0 hunks)
- programs/compressed-token/src/freeze.rs (4 hunks)
- programs/compressed-token/src/process_mint.rs (2 hunks)
- programs/compressed-token/src/process_transfer.rs (3 hunks)
- programs/compressed-token/src/token_data.rs (7 hunks)
💤 Files with no reviewable changes (2)
- programs/compressed-token/src/delegation.rs
- programs/compressed-token/src/burn.rs
🧰 Additional context used
🧬 Code graph analysis (2)
programs/compressed-token/src/freeze.rs (3)
- program-libs/compressed-account/src/compressed_account.rs (3): `from` (30-39), `from` (51-57), `from` (61-66)
- program-libs/compressed-account/src/instruction_data/with_readonly.rs (1): `from` (49-68)
- program-libs/compressed-account/src/instruction_data/zero_copy.rs (8): `from` (166-171), `from` (199-205), `from` (209-215), `from` (265-281), `from` (319-326), `from` (384-391), `from` (661-673), `from` (747-766)
programs/compressed-token/src/token_data.rs (2)
- program-libs/compressed-account/src/compressed_account.rs (3): `hash` (76-82), `hash` (325-338), `hash` (363-375)
- program-libs/hasher/src/hash_to_field_size.rs (1): `hash_to_bn254_field_size_be` (91-93)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (18)
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test", "cargo test -p light-syst...
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: cli-v1
- GitHub Check: stateless-js-v1
- GitHub Check: Forester e2e test
- GitHub Check: lint
- GitHub Check: system-programs (sdk-libs, light-macros light-sdk light-program-test light-client light-batched-m...
- GitHub Check: system-programs (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -...
- GitHub Check: stateless-js-v2
- GitHub Check: system-programs (sdk-test-program, ["cargo-test-sbf -p sdk-native-test"])
- GitHub Check: cli-v2
🔇 Additional comments (15)
program-tests/compressed-token-test/tests/test.rs (1)
1246-1247: LGTM: migrated to SystemProgramError variants.

The assertions now target SystemProgramError::{StateMerkleTreeAccountDiscriminatorMismatch, ProofVerificationFailed}. This aligns with the refactor and keeps tests readable.
Also applies to: 2305-2306, 2351-2353, 2784-2786, 3406-3408
programs/compressed-token/src/process_mint.rs (3)
170-197: Sanity: account_infos ordering duplicates light_system_program; verify this is intended.

light_system_program appears twice (indices 8 and 10). If that's required for two distinct meta positions (placeholders), add a brief comment to avoid regressions; otherwise drop the duplicate.

564-567: LGTM: unified hashing path in tests.

The switch to token_data.hash() is consistent with the repo-wide consolidation.

629-632: LGTM: randomized test uses the unified hash path.

Matches the production path and reduces legacy drift.
programs/compressed-token/src/freeze.rs (4)
16-17: Import path change looks right.

Using InputTokenDataWithContext from process_transfer keeps freeze/thaw aligned with transfer pipeline types.

166-172: LGTM: single hashing path for outputs.

Using token_data.hash() removes discriminator branching and keeps behavior uniform.

509-510: LGTM: tests expect the new hash values.

Expected hashes now use token_data.hash(), matching production.

568-573: LGTM: input account data_hash uses the unified path.

Keeps freeze/thaw proofs consistent with transfer hashing semantics.
programs/compressed-token/src/process_transfer.rs (3)
237-246: Endianness change: amount is now hashed as little-endian.

The shift to LE (amount.to_bytes_le()) is consistent with the PR objective. Confirm token_data.hash()/hash_with_hashed_values uses LE across all call sites to avoid mismatches.

288-312: Endianness update mirrored for inputs.

Inputs and outputs now agree on LE amount encoding; good.

1-2: Minor: imports are fine.

New imports reflect the added ProgramError mapping and AnchorDeserialize usage.
programs/compressed-token/src/token_data.rs (4)
145-171: Tests updated to call token_data.hash(): OK.

The change aligns with the new single-path hash.

172-189: Second test switched to token_data.hash(): OK.

Consistent with the new API.

343-345: Frozen equivalence test uses hash(): OK.

Asserts parity with the pre-hashed path; looks good.

415-421: Uniqueness checks now exercise Frozen paths via hash(): OK.

Covers state and delegate variants.
Actionable comments posted: 0
🧹 Nitpick comments (2)
programs/system/src/invoke_cpi/process_cpi_context.rs (1)
48-52: Guard correctly disables CPI context when the feature is enabled.

Behavior is clear and fail-fast with a dedicated error.
If you want “deactivated” to take precedence over other mismatch errors, move this guard above the check on Lines 44–47. Also consider aligning the log string casing with nearby msgs.
Example:
```diff
-    #[cfg(feature = "deactivate-cpi-context")]
-    if cpi_context.is_some() {
-        msg!("cpi context is deactivated.");
-        return Err(SystemProgramError::CpiContextDeactivated.into());
-    }
+    #[cfg(feature = "deactivate-cpi-context")]
+    if cpi_context.is_some() {
+        msg!("Cpi context is deactivated");
+        return Err(SystemProgramError::CpiContextDeactivated.into());
+    }
```

Add a feature-gated unit test to assert the new path:

```rust
#[cfg(all(test, feature = "deactivate-cpi-context"))]
#[test]
fn test_process_cpi_context_deactivated() {
    // Arrange minimal instruction_data with Some(cpi_context) and no account
    // Assert: process_cpi_context(...) returns SystemProgramError::CpiContextDeactivated
}
```

programs/system/Cargo.toml (1)
21-21: Add 'deactivate-cpi-context' to the [lints.rust.unexpected_cfgs] check-cfg list.

Found `deactivate-cpi-context = []` in programs/system/Cargo.toml (line 21). Add it to the check-cfg list so -Zcheck-cfg or warn/deny won't flag the new feature (repo search shows no occurrences of it being enabled in defaults).

Apply this diff:

```diff
 [lints.rust.unexpected_cfgs]
 level = "allow"
 check-cfg = [
     'cfg(target_os, values("solana"))',
-    'cfg(feature, values("frozen-abi", "no-entrypoint"))',
+    'cfg(feature, values("frozen-abi", "no-entrypoint", "deactivate-cpi-context"))',
 ]
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- programs/system/Cargo.toml (1 hunks)
- programs/system/src/errors.rs (2 hunks)
- programs/system/src/invoke_cpi/process_cpi_context.rs (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
programs/system/src/invoke_cpi/process_cpi_context.rs (5)
- program-libs/compressed-account/src/instruction_data/with_account_info.rs (1): `cpi_context` (341-351)
- program-libs/compressed-account/src/instruction_data/with_readonly.rs (1): `cpi_context` (297-307)
- program-libs/compressed-account/src/instruction_data/zero_copy.rs (2): `cpi_context` (486-488), `cpi_context` (610-618)
- programs/system/src/context.rs (1): `cpi_context` (379-381)
- program-libs/compressed-account/src/instruction_data/traits.rs (1): `cpi_context` (22-22)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (22)
- GitHub Check: Forester e2e test
- GitHub Check: lint
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test", "cargo test -p light-syst...
- GitHub Check: cli-v1
- GitHub Check: stateless-js-v1
- GitHub Check: system-programs (sdk-libs, light-macros light-sdk light-program-test light-client light-batched-m...
- GitHub Check: system-programs (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -...
- GitHub Check: system-programs (sdk-test-program, ["cargo-test-sbf -p sdk-native-test"])
- GitHub Check: stateless-js-v2
- GitHub Check: cli-v2
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: Test program-libs-slow
- GitHub Check: Test program-libs-fast
🔇 Additional comments (2)
programs/system/src/errors.rs (2)
119-121: New error variant is appropriate and keeps the ABI stable via the u32 mapping.

Unconditionally defining CpiContextDeactivated is the right call for stable error codes across builds.

192-192: Numeric mapping looks correct and sequential.

6054 follows 6053 without collisions. Consider documenting 6054 in any external error-code docs.
If you keep an external list, please add “CpiContextDeactivated (6054)”. Do you want a quick PR template snippet for error-code docs?
Force-pushed: 12d7588 to 5f240d6
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
programs/compressed-token/src/token_data.rs (1)
85-101: Add explicit domain separation for optional/variable inputs.

light-hasher's Poseidon::hashv only selects a Poseidon instance by arity (vals.len()) and relies on fixed 32-byte field inputs; it does not add per-field presence/length tags. Since amount_bytes is a &[u8] (variable length) and the delegate/state are optional (changing arity), encode explicit field tags or length prefixes (or canonicalize amount to a fixed 32-byte representation) before calling Poseidon::hashv to prevent extension/prepend collisions.
Location: programs/compressed-token/src/token_data.rs: lines 85–101.
🧹 Nitpick comments (3)
programs/compressed-token/src/token_data.rs (3)
33-45: Docs out of date with the implemented hashing schema.

The comment still mentions delegated_amount and is_native prefixes, which are not part of the current hash. Update it to reflect the actual inputs and LE encoding.
Apply:
```diff
-/// Hashing schema: H(mint, owner, amount, delegate, delegated_amount,
-/// is_native, state)
-///
-/// delegate, delegated_amount, is_native and state have dynamic positions.
-/// Always hash mint, owner and amount If delegate hash delegate and
-/// delegated_amount together. If is native hash is_native else is omitted.
-/// If frozen hash AccountState::Frozen else is omitted.
-///
-/// Security: to prevent the possibility that different fields with the same
-/// value to result in the same hash we add a prefix to the delegated amount, is
-/// native and state fields. This way we can have a dynamic hashing schema and
-/// hash only used values.
+/// Hashing schema: H(mint, owner, amount[, delegate][, state_tag])
+///
+/// - Always include `mint`, `owner`, `amount`.
+/// - Include `delegate` iff present.
+/// - Include a `state_tag` only when `state == Frozen`.
+/// - `amount` is encoded little-endian into the last 8 bytes of a 32-byte array.
+///
+/// Notes:
+/// - No `delegated_amount` or `is_native` bytes are part of the hash.
+/// - Use domain-separated inputs (see Poseidon::hashv) to avoid collisions across optional fields.
```
51-53: Broaden native-mint detection for Token-2022.

Consider treating the SPL Token-2022 native mint as native too.

```diff
 pub fn is_native(&self) -> bool {
-    self.mint == spl_token::native_mint::id()
+    self.mint == spl_token::native_mint::id()
+        || self.mint == spl_token_2022::native_mint::ID
 }
```
191-220: Test-only legacy_hash mirror is fine but redundant.

Consider inlining comparisons against `hash_inputs_with_hashed_values` only and dropping `legacy_hash` to reduce duplication.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- .github/workflows/light-system-programs-tests.yml (1 hunks)
- program-tests/compressed-token-test/tests/test.rs (6 hunks)
- programs/compressed-token/src/token_data.rs (8 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
programs/compressed-token/src/token_data.rs (2)
- program-libs/compressed-account/src/compressed_account.rs (3): `hash` (76-82), `hash` (325-338), `hash` (363-375)
- program-libs/hasher/src/hash_to_field_size.rs (1): `hash_to_bn254_field_size_be` (91-93)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (22)
- GitHub Check: Forester e2e test
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: Test program-libs-fast
- GitHub Check: Test program-libs-slow
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test", "cargo test -p light-syst...
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: lint
- GitHub Check: cli-v1
- GitHub Check: stateless-js-v2
- GitHub Check: cli-v2
- GitHub Check: system-programs (sdk-libs, light-macros light-sdk light-program-test light-client light-batched-m...
- GitHub Check: system-programs (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -...
- GitHub Check: system-programs (sdk-test-program, ["cargo-test-sbf -p sdk-native-test"])
- GitHub Check: stateless-js-v1
🔇 Additional comments (10)
program-tests/compressed-token-test/tests/test.rs (5)
1244-1248: LGTM: updated error expectation to SystemProgramError.

The switch to `SystemProgramError::StateMerkleTreeAccountDiscriminatorMismatch` aligns with the new error surface.
2303-2307: LGTM: approve() failing paths now use SystemProgramError.

Both delegated and change-tree mismatches assert the new `SystemProgramError` variant correctly.

Also applies to: 2349-2353
2783-2786: LGTM: revoke() invalid Merkle tree maps to SystemProgramError.

The expectation matches the updated program error.

3404-3408: LGTM: burn invalid change-tree assertion updated.

Asserting `StateMerkleTreeAccountDiscriminatorMismatch` is consistent with the program changes.

5252-5255: LGTM: transfer invalid Merkle tree pubkey assertion updated.

Matches the consolidated error path.
programs/compressed-token/src/token_data.rs (5)
145-189: Tests updated to exercise the new hash(): OK.

Equivalence and helper-path checks align with the new hashing.

221-276: Randomized equivalence tests: solid coverage.

10k iterations provide good confidence.

321-346: Frozen-equivalence test: OK.

Confirms parity between the helper and public path.

348-422: Negative hashing cases are well covered.

Distinctness checks across fields/state look correct.
106-134: Verify no legacy hashing callers remain.

Hash consolidation and the LE amount look good. ripgrep returned "No files were searched", so I couldn't confirm the absence of legacy callers; re-run from the repo root:

```shell
rg --debug -n -P 'hash_legacy\s*\(' || true
rg -n --hidden --no-ignore -S -P 'hash_legacy\s*\(' -g '!**/target/**'
rg -n --hidden --no-ignore -S -P '\b(?:hash_frozen_with_hashed_values|hash_with_hashed_values)\b' -g '!**/target/**'
```

If the results remain empty, inspect external crates/bindings/generated consumers for callers.
```diff
   sub-tests: '["cargo-test-sbf -p compressed-token-test"]'
 - program: system-cpi-test
-  sub-tests: '["cargo-test-sbf -p system-cpi-test", "cargo test -p light-system-program-pinocchio --all-features"]'
+  sub-tests: '["cargo-test-sbf -p system-cpi-test", "cargo test -p light-system-program-pinocchio"]'
```
🧩 Analysis chain
Ensure feature-gated CPI-context paths are exercised; add a run with the feature enabled.
Dropping --all-features means system-cpi-test and light-system-program-pinocchio now run only with default features. Given this PR introduces a feature-gated “deactivate CPI context” behavior and error, this change risks a silent coverage gap. Run tests once with defaults and once with --features deactivate-cpi-context to validate both paths.
Apply this minimal change to cover both variants in this job:
```diff
- sub-tests: '["cargo-test-sbf -p system-cpi-test", "cargo test -p light-system-program-pinocchio"]'
+ sub-tests: '[
+     "cargo-test-sbf -p system-cpi-test",
+     "cargo-test-sbf -p system-cpi-test --features deactivate-cpi-context",
+     "cargo test -p light-system-program-pinocchio",
+     "cargo test -p light-system-program-pinocchio --features deactivate-cpi-context"
+   ]'
```
+ ]'Run this quick scan to confirm there are tests/assertions hidden behind the feature and to spot any other gated code that might now be skipped:
🏁 Script executed:
```shell
#!/bin/bash
# Find feature-gated code/tests for CPI-context deactivation
rg -nC2 -S '\#\[\s*cfg\s*\(\s*feature\s*=\s*"deactivate-cpi-context"\s*\)\s*\]' --type=rust
# Also list tests mentioning the new error type to ensure both variants exercise it
rg -n 'CpiContextDeactivated|deactivate.?cpi.?context' --type=rust
```

Length of output: 940
Run tests with --features deactivate-cpi-context to exercise feature-gated CPI paths
programs/system/src/invoke_cpi/process_cpi_context.rs is guarded by #[cfg(feature = "deactivate-cpi-context")] and errors.rs defines CpiContextDeactivated, so the workflow (.github/workflows/light-system-programs-tests.yml — around line 67) must run the tests both with and without that feature.
```diff
- sub-tests: '["cargo-test-sbf -p system-cpi-test", "cargo test -p light-system-program-pinocchio"]'
+ sub-tests: '[
+     "cargo-test-sbf -p system-cpi-test",
+     "cargo-test-sbf -p system-cpi-test --features deactivate-cpi-context",
+     "cargo test -p light-system-program-pinocchio",
+     "cargo test -p light-system-program-pinocchio --features deactivate-cpi-context"
+   ]'
```
🤖 Prompt for AI Agents
```text
.github/workflows/light-system-programs-tests.yml around line 67: the workflow
currently runs cargo tests only without the deactivate-cpi-context feature, but
the file programs/system/src/invoke_cpi/process_cpi_context.rs is feature-gated
and errors.rs defines CpiContextDeactivated, so update the workflow so tests run
both with and without the feature; specifically add a second test invocation
that runs `cargo test -p light-system-program-pinocchio --features
deactivate-cpi-context` (or add the feature-flagged variant to the sub-tests
array alongside the existing entry) so both code paths are exercised.
```