chore: hasher and light hasher macro support for sha256 #1892
Conversation
Walkthrough

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Dev as Developer (derive)
    participant Macro as Proc Macro
    participant Generated as Generated Impl
    participant Hasher as H: Hasher
    participant Poseidon as Poseidon (ID=0)
    participant Sha256 as Sha256 (ID=1)
    Dev->>Macro: #[derive(LightHasher)] or #[derive(LightHasherSha)]
    Macro->>Generated: emit impl that calls H::hashv(data)
    Generated->>Hasher: H::hashv(data)
    alt H::ID == 0 (Poseidon)
        Hasher->>Poseidon: field-based Poseidon hash
        Poseidon-->>Generated: 32-byte hash (return)
    else H::ID != 0 (e.g., Sha256)
        Hasher->>Sha256: compute SHA256 over Borsh bytes
        Sha256-->>Generated: 32-byte hash
        Generated->>Generated: normalize -> result[0] = 0
    end
    Generated-->>Dev: to_byte_array / hash result
```
```mermaid
sequenceDiagram
    participant Caller
    participant LightAccount as LightAccountInner<'a,H,...>
    participant Hasher as H: Hasher
    Caller->>LightAccount: new_init / new_mut / to_account_info
    LightAccount->>Hasher: compute data hash (H::hashv)
    Hasher-->>LightAccount: 32-byte hash
    alt H::ID != 0
        LightAccount->>LightAccount: normalize/truncate -> result[0]=0
    end
    LightAccount-->>Caller: compressed account info
```
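The ID-gated normalization in both diagrams can be sketched in std-only Rust. The `Hasher` trait shape, the `ID` values (Poseidon=0, Sha256=1), and the `result[0] = 0` rule come from the PR; `MockPoseidon`/`MockSha256` and their toy mixing functions are stand-ins, not the real implementations:

```rust
// Sketch of the ID-based normalization the macros generate.
// MockPoseidon/MockSha256 are illustrative stand-ins; only the
// `ID` constant and the `result[0] = 0` rule mirror the PR.
type Hash = [u8; 32];

trait Hasher {
    const ID: u8;
    fn hashv(inputs: &[&[u8]]) -> Hash;
}

struct MockPoseidon; // ID = 0: field-based hash, no truncation
struct MockSha256; // ID = 1: byte-oriented hash, first byte zeroed

impl Hasher for MockPoseidon {
    const ID: u8 = 0;
    fn hashv(inputs: &[&[u8]]) -> Hash {
        // Toy mixing function standing in for Poseidon.
        let mut out = [0u8; 32];
        for (i, b) in inputs.iter().flat_map(|s| s.iter()).enumerate() {
            out[i % 32] ^= b.wrapping_mul(31);
        }
        out
    }
}

impl Hasher for MockSha256 {
    const ID: u8 = 1;
    fn hashv(inputs: &[&[u8]]) -> Hash {
        // Toy mixing function standing in for SHA-256.
        let mut out = [0u8; 32];
        for (i, b) in inputs.iter().flat_map(|s| s.iter()).enumerate() {
            out[i % 32] = out[i % 32].wrapping_add(b.wrapping_mul(7));
        }
        out
    }
}

// Shape of what the derive emits, generic over H.
fn hash_account<H: Hasher>(data: &[&[u8]]) -> Hash {
    let mut result = H::hashv(data);
    if H::ID != 0 {
        // Non-Poseidon hashers: zero the first byte so the digest
        // fits within the field size (the truncation step above).
        result[0] = 0;
    }
    result
}

fn main() {
    let data: [&[u8]; 2] = [b"owner", b"balance"];
    let sha = hash_account::<MockSha256>(&data);
    assert_eq!(sha[0], 0); // truncated
    let pos = hash_account::<MockPoseidon>(&data);
    println!("sha[0]={} pos prefix={:?}", sha[0], &pos[..4]);
}
```

Only the shape matters here: Poseidon's output is left untouched, any other `ID` gets its first byte zeroed.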
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
Actionable comments posted: 3
🧹 Nitpick comments (7)
program-libs/hasher/src/poseidon.rs (1)
81-82: LGTM - Correct trait implementation with minor formatting suggestion.

The `ID = 0` assignment for Poseidon correctly implements the required trait constant. Assigning 0 to Poseidon makes sense as the baseline hasher that doesn't require truncation in macro-generated code. Consider removing the extra blank line for consistency:

```diff
 impl Hasher for Poseidon {
     const ID: u8 = 0;
-
     fn hash(val: &[u8]) -> Result<Hash, HasherError> {
```

sdk-libs/macros/src/lib.rs (1)
259-283: Documentation needs correction.

The example imports from `light_sdk::sha::LightHasher`; based on the SDK file structure, users should use `light_sdk_macros::LightHasherSha` directly:

```diff
 /// ```ignore
-/// use light_sdk::sha::LightHasher;
+/// use light_sdk_macros::LightHasherSha;
 ///
-/// #[derive(LightHasher)]
+/// #[derive(LightHasherSha)]
 /// pub struct GameState {
```

sdk-libs/macros/src/hasher/data_hasher.rs (2)
40-48: Clarify the truncation comment.

The comment mentions "field size truncation" but the code is zeroing the first byte of the hash result. This should be clarified to explain why this truncation is necessary for non-Poseidon hashers.

```diff
- // Apply field size truncation for non-Poseidon hashers
+ // Zero the first byte for non-Poseidon hashers to ensure consistent hash output format
  if H::ID != 0 {
      result[0] = 0;
  }
```

Also applies to: 69-82
87-118: SHA256 implementation looks good, same comment clarification needed.

The approach of serializing the entire struct for SHA256 hashing is appropriate. Consider updating the truncation comment here as well for consistency.

```diff
- // Truncate field size for non-Poseidon hashers
+ // Zero the first byte for non-Poseidon hashers to ensure consistent hash output format
  if H::ID != 0 {
      result[0] = 0;
  }
```

sdk-libs/macros/src/hasher/to_byte_array.rs (1)
7-61: Good refactoring to support multiple hashers.

The parameterization with `hasher` makes the implementation flexible for different hashing algorithms. Consider clarifying the truncation comment as suggested in other files.

```diff
- // Truncate field size for non-Poseidon hashers
+ // Zero the first byte for non-Poseidon hashers to ensure consistent hash output format
  if #hasher::ID != 0 {
      result[0] = 0;
  }
```

sdk-libs/sdk/src/account.rs (1)
88-91: Remove unused `Size` trait.

The `Size` trait is defined but not implemented or used anywhere in this file. Consider removing it if it's not needed, or moving it to where it's actually used.

```diff
-pub trait Size {
-    fn size(&self) -> usize;
-}
```

sdk-libs/macros/src/hasher/light_hasher.rs (1)
441-563: Comprehensive test coverage for SHA256 variant.

Excellent tests that clearly demonstrate the differences between SHA256 and Poseidon hashers. The tests effectively validate:
- No field limit for SHA256
- Different hashing approaches
- Attribute support differences
Consider adding a test for the `#[skip]` attribute behavior with SHA256 to ensure complete attribute coverage documentation.
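The two derive strategies these comments compare (field-by-field hashing for Poseidon, serialize-then-hash-then-truncate for SHA256) can be contrasted in a std-only sketch. `toy_hashv` and the hand-rolled little-endian serialization are stand-ins for the real Poseidon/SHA-256 and Borsh, so the bytes produced here are illustrative only:

```rust
// Contrast of the two derive strategies, using a toy hash and
// hand-rolled serialization in place of the real light_hasher
// and Borsh APIs (assumptions, not the real implementations).
type Hash = [u8; 32];

fn toy_hashv(inputs: &[&[u8]]) -> Hash {
    let mut out = [0u8; 32];
    for (i, b) in inputs.iter().flat_map(|s| s.iter()).enumerate() {
        out[i % 32] = out[i % 32].wrapping_add(b.wrapping_mul(131));
    }
    out
}

struct GameState {
    score: u64,
    level: u32,
}

impl GameState {
    // Poseidon-style: hash each field's representation separately.
    fn hash_per_field(&self) -> Hash {
        let score = self.score.to_le_bytes();
        let level = self.level.to_le_bytes();
        toy_hashv(&[score.as_slice(), level.as_slice()])
    }

    // SHA256-style: serialize the whole struct, hash once, truncate.
    fn hash_serialized(&self) -> Hash {
        let mut bytes = Vec::new();
        bytes.extend_from_slice(&self.score.to_le_bytes());
        bytes.extend_from_slice(&self.level.to_le_bytes());
        let mut result = toy_hashv(&[bytes.as_slice()]);
        result[0] = 0; // truncation step for non-Poseidon hashers
        result
    }
}

fn main() {
    let gs = GameState { score: 42, level: 3 };
    assert_eq!(gs.hash_serialized()[0], 0);
    println!("per-field prefix: {:?}", &gs.hash_per_field()[..4]);
}
```

This is also why per-field attributes like `#[skip]` are natural in the Poseidon path but need explicit handling in the serialize-everything SHA path.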
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (12)
- program-libs/hasher/src/keccak.rs (1 hunks)
- program-libs/hasher/src/lib.rs (1 hunks)
- program-libs/hasher/src/poseidon.rs (1 hunks)
- program-libs/hasher/src/sha256.rs (1 hunks)
- sdk-libs/macros/src/hasher/data_hasher.rs (2 hunks)
- sdk-libs/macros/src/hasher/input_validator.rs (1 hunks)
- sdk-libs/macros/src/hasher/light_hasher.rs (4 hunks)
- sdk-libs/macros/src/hasher/mod.rs (1 hunks)
- sdk-libs/macros/src/hasher/to_byte_array.rs (2 hunks)
- sdk-libs/macros/src/lib.rs (2 hunks)
- sdk-libs/sdk/src/account.rs (10 hunks)
- sdk-libs/sdk/src/lib.rs (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (12)
- GitHub Check: stateless-js-v2
- GitHub Check: system-programs (sdk-test-program, ["cargo-test-sbf -p sdk-test"])
- GitHub Check: system-programs (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -...
- GitHub Check: cli-v1
- GitHub Check: cli-v2
- GitHub Check: stateless-js-v1
- GitHub Check: lint
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: Test program-libs-slow
- GitHub Check: Test sdk-libs
- GitHub Check: Test program-libs-fast
- GitHub Check: Test concurrent-merkle-tree
🔇 Additional comments (13)
program-libs/hasher/src/lib.rs (1)
27-27: LGTM - Clean trait extension for hasher identification.

The addition of the `ID` constant to the `Hasher` trait provides a clean way to identify different hasher implementations at compile time. This enables conditional logic in macro-generated code and supports the multi-hasher architecture.

program-libs/hasher/src/keccak.rs (1)
12-13: LGTM - Correct implementation of required trait constant.

The `ID = 2` assignment for Keccak properly implements the new `Hasher` trait requirement and maintains consistency with the numbering scheme (Poseidon=0, SHA256=1, Keccak=2).

program-libs/hasher/src/sha256.rs (1)
12-12: LGTM - Proper trait implementation.

The `ID = 1` assignment for SHA256 correctly implements the new `Hasher` trait requirement and fits well within the established numbering scheme.

sdk-libs/macros/src/hasher/mod.rs (1)
7-7: LGTM - Proper export of new SHA256 derive macro.

The addition of the `derive_light_hasher_sha` export enables the SHA256 variant of the light hasher derive macro to be used throughout the codebase, complementing the existing Poseidon-based `derive_light_hasher`.

sdk-libs/macros/src/hasher/input_validator.rs (1)
63-91: LGTM! Well-structured SHA256 validation function.

The relaxed constraints for SHA256 (no field count limit) appropriately reflect its ability to handle larger inputs compared to Poseidon. The error messages clearly indicate this is for the SHA256 hasher variant.
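The policy difference can be sketched standalone. The 12-field Poseidon cap is taken from this review's later remarks about ">12 fields"; the function names and signatures here are illustrative, not the real `validate_input`/`validate_input_sha`, which operate on `syn::ItemStruct`:

```rust
// Illustrative validators; the real ones work on syn::ItemStruct.
const MAX_POSEIDON_FIELDS: usize = 12; // cap referenced elsewhere in this review

fn validate_poseidon(field_count: usize) -> Result<(), String> {
    // Poseidon hashes field by field, so the field count is bounded.
    if field_count > MAX_POSEIDON_FIELDS {
        return Err(format!(
            "Poseidon hasher supports at most {} fields, got {}",
            MAX_POSEIDON_FIELDS, field_count
        ));
    }
    Ok(())
}

fn validate_sha(field_count: usize, has_flatten: bool) -> Result<(), String> {
    // No field-count limit: SHA256 hashes the serialized bytes as one input.
    let _ = field_count;
    if has_flatten {
        return Err("Flatten attribute is not supported in SHA256 hasher.".to_string());
    }
    Ok(())
}

fn main() {
    assert!(validate_poseidon(20).is_err());
    assert!(validate_sha(20, false).is_ok());
    assert!(validate_sha(3, true).is_err());
    println!("validation policy ok");
}
```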
sdk-libs/macros/src/lib.rs (1)
3-3: Import addition looks good.

The new `derive_light_hasher_sha` import is properly added alongside the existing hasher derive function.

sdk-libs/sdk/src/lib.rs (1)
106-113: Clean module structure for SHA256 variants.

The new `sha` submodule provides a clear namespace separation for SHA256-based implementations while maintaining backward compatibility. The re-export pattern is well-structured.

sdk-libs/sdk/src/account.rs (4)
92-99: Well-designed type alias pattern for backward compatibility.

The type alias approach maintains backward compatibility while introducing generic hasher support. The separate `sha` module provides a clean namespace for SHA256 variants.
109-110: Clarify the purpose and security implications of `should_remove_data`.

The `should_remove_data` field controls whether account data is cleared in `to_account_info()`. The TODO comment on line 271 suggests security concerns need review. Please document:
- When and why this flag should be set
- Security implications of removing vs. keeping data
- Any validation requirements
119-221: Consistent and correct generic hasher implementation.

All constructor methods properly updated to support generic hashers with consistent initialization of new fields.

288-308: LGTM!

Deref implementations correctly updated for the generic struct.

sdk-libs/macros/src/hasher/light_hasher.rs (2)
sdk-libs/macros/src/hasher/light_hasher.rs (2)
51-53: Clean refactoring to support multiple hashers.

Good use of the DRY principle by extracting common logic into `derive_light_hasher_with_hasher`.

78-120: Well-structured generic hasher implementation.

The helper function cleanly separates hasher-specific logic while reusing common code generation.
Force-pushed: e66e020 → 7f6196f
Actionable comments posted: 0
♻️ Duplicate comments (2)
sdk-libs/macros/src/hasher/to_byte_array.rs (1)
83-84: Fix inconsistent truncation logic in SHA256 implementation.

The SHA256 implementation always zeros the first byte without checking the hasher ID, while other implementations only do this conditionally. This should be consistent across all implementations. Apply this diff to fix the inconsistency:

```diff
- // Truncate field size for non-Poseidon hashers
- result[0] = 0;
+ // Zero the first byte for non-Poseidon hashers to ensure consistent hash output format
+ if ::light_hasher::Sha256::ID != 0 {
+     result[0] = 0;
+ }
```

sdk-libs/macros/src/hasher/light_hasher.rs (1)
55-76: Address incomplete field attribute handling in SHA256 implementation.

The SHA256 derive path only validates against `#[flatten]` and ignores other field attributes (`#[skip]` and `#[hash]`), causing them to be silently ignored rather than properly handled or explicitly rejected. The current implementation will treat fields with `#[skip]` or `#[hash]` attributes exactly like regular fields, which may surprise users. Please either:

1. Extend validation to explicitly reject unsupported attributes:

```rust
// In validate_input_sha, add checks for other attributes
if matches!(attr, FieldAttribute::Skip | FieldAttribute::Hash) {
    return Err(syn::Error::new_spanned(
        field,
        "Attribute not supported in SHA256 variant",
    ));
}
```
2. Implement proper handling in the serialization logic to respect these attributes
3. Document clearly which attributes are supported/ignored in the SHA256 variant
This ensures consistent behavior and prevents silent failures.
🧹 Nitpick comments (1)
sdk-libs/macros/src/discriminator.rs (1)
23-25: Remove unused variable and clarify discriminator calculation.

The `_variant_marker` variable is assigned but never used, creating dead code. Additionally, both discriminator variants use SHA256 hash calculation regardless of the `is_sha` parameter, which may be confusing. Apply this diff to remove the unused variable:

```diff
- // For SHA256 variant, we could add specific logic here if needed
- // Currently both variants work the same way since discriminator is just based on struct name
- let _variant_marker = if is_sha { "sha256" } else { "poseidon" };
```

Consider adding a comment to clarify that discriminators always use SHA256 hash of the struct name regardless of the hasher variant:

```diff
+ // Discriminators always use SHA256 hash of struct name for consistency
+ // regardless of the hasher variant used for the actual data hashing
  let mut discriminator = [0u8; 8];
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (13)
- program-libs/hasher/src/keccak.rs (1 hunks)
- program-libs/hasher/src/lib.rs (1 hunks)
- program-libs/hasher/src/poseidon.rs (1 hunks)
- program-libs/hasher/src/sha256.rs (1 hunks)
- sdk-libs/macros/src/discriminator.rs (3 hunks)
- sdk-libs/macros/src/hasher/data_hasher.rs (2 hunks)
- sdk-libs/macros/src/hasher/input_validator.rs (1 hunks)
- sdk-libs/macros/src/hasher/light_hasher.rs (5 hunks)
- sdk-libs/macros/src/hasher/mod.rs (1 hunks)
- sdk-libs/macros/src/hasher/to_byte_array.rs (2 hunks)
- sdk-libs/macros/src/lib.rs (3 hunks)
- sdk-libs/sdk/src/account.rs (10 hunks)
- sdk-libs/sdk/src/lib.rs (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (10)
- program-libs/hasher/src/keccak.rs
- program-libs/hasher/src/poseidon.rs
- program-libs/hasher/src/sha256.rs
- program-libs/hasher/src/lib.rs
- sdk-libs/macros/src/hasher/input_validator.rs
- sdk-libs/macros/src/hasher/mod.rs
- sdk-libs/macros/src/lib.rs
- sdk-libs/macros/src/hasher/data_hasher.rs
- sdk-libs/sdk/src/lib.rs
- sdk-libs/sdk/src/account.rs
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (12)
- GitHub Check: lint
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: Test program-libs-slow
- GitHub Check: Test sdk-libs
- GitHub Check: Test program-libs-fast
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: stateless-js-v1
- GitHub Check: cli-v2
- GitHub Check: system-programs (sdk-test-program, ["cargo-test-sbf -p sdk-test"])
- GitHub Check: system-programs (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -...
- GitHub Check: stateless-js-v2
- GitHub Check: cli-v1
🔇 Additional comments (7)
sdk-libs/macros/src/discriminator.rs (3)
6-8: LGTM! Clean refactoring to support multiple hasher variants.

The delegation to a shared helper maintains backward compatibility while enabling code reuse for the new SHA256 discriminator variant.

10-12: LGTM! Consistent API design for SHA256 discriminator.

The new function follows the same pattern as the original discriminator, providing a clean API for SHA256-based discriminators.

63-103: Excellent test coverage for SHA256 discriminator functionality.

The new tests comprehensively verify:
- Basic SHA256 discriminator functionality with expected output matching
- Ability to handle large structs that would challenge regular discriminators
- Proper code generation and trait implementation
The large struct test is particularly valuable as it demonstrates the key advantage of SHA256 discriminators for complex data structures.
sdk-libs/macros/src/hasher/to_byte_array.rs (1)
7-61: LGTM! Well-designed generalization to support multiple hashers.

The refactored function successfully:

- Generalizes ToByteArray implementation for arbitrary hashers via the `hasher` parameter
- Simplifies single-field case with direct code parsing
- Implements correct conditional truncation logic that checks hasher ID before zeroing the first byte
- Maintains backward compatibility while enabling new functionality
The truncation logic correctly follows the pattern: only truncate for non-Poseidon hashers (ID != 0).
sdk-libs/macros/src/hasher/light_hasher.rs (3)
51-53: LGTM! Clean refactoring maintains backward compatibility.

The delegation to a shared helper with the Poseidon hasher token preserves existing functionality while enabling code reuse for the new SHA256 variant.
78-120: LGTM! Comprehensive shared implementation with full attribute support.

The shared helper properly:

- Validates input structure
- Processes field attributes (`#[hash]`, `#[skip]`, `#[flatten]`)
- Generates implementations using the provided hasher parameter
- Maintains all existing functionality while enabling hasher flexibility
This demonstrates the full-featured approach that the SHA256 variant should also follow for attribute handling.
441-712: Excellent comprehensive test coverage for SHA256 variant.

The new test suite effectively demonstrates and validates:

- Large struct handling: SHA256's ability to process structs with >12 fields and multiple Pubkeys
- Different hashing approaches: Serialization-based (SHA256) vs field-by-field (Poseidon)
- Attribute validation: Proper rejection of the unsupported `#[flatten]` attribute
- No field limits: SHA256 handling 20+ fields without constraints
- Real-world scenarios: Complex game state struct that would be impossible with Poseidon
- Integration testing: Coordination between hasher and discriminator variants
These tests serve as excellent documentation of when and why to use each variant, and provide confidence in the implementation correctness.
what are the expected changes of this pr?
Force-pushed: 7f6196f → d0dda1c
Actionable comments posted: 4
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
sdk-libs/macros/src/discriminator.rs (1)
14-26: Avoid unwraps in proc-macro path; propagate hashing errors.

Unwrapping the result of Sha256::hash may panic at compile time. Return a syn::Error instead and drop the unused variant marker. Apply:

```diff
-fn discriminator_with_hasher(input: ItemStruct, is_sha: bool) -> Result<TokenStream> {
+fn discriminator_with_hasher(input: ItemStruct, _is_sha: bool) -> Result<TokenStream> {
     let account_name = &input.ident;
     let (impl_gen, type_gen, where_clause) = input.generics.split_for_impl();
-    let mut discriminator = [0u8; 8];
-    discriminator.copy_from_slice(&Sha256::hash(account_name.to_string().as_bytes()).unwrap()[..8]);
-    let discriminator: proc_macro2::TokenStream = format!("{discriminator:?}").parse().unwrap();
+    let hash = Sha256::hash(account_name.to_string().as_bytes())
+        .map_err(|e| syn::Error::new(proc_macro2::Span::call_site(), format!("Sha256::hash failed: {e}")))?;
+    let mut discriminator = [0u8; 8];
+    discriminator.copy_from_slice(&hash[..8]);
+    let discriminator: proc_macro2::TokenStream = format!("{discriminator:?}").parse().unwrap();
-    // For SHA256 variant, we could add specific logic here if needed
-    // Currently both variants work the same way since discriminator is just based on struct name
-    let _variant_marker = if is_sha { "sha256" } else { "poseidon" };
+    // For SHA256 variant, we could add specific logic here if needed. Currently both variants are identical.
```
🧹 Nitpick comments (6)
sdk-libs/macros/src/hasher/input_validator.rs (1)
63-91: Relaxed SHA validator is fine; consider deduping common checks.

Both validators perform the same "named fields only" and "no flatten" checks. Extracting a small helper reduces drift. Example:

```rust
fn ensure_named_no_flatten(input: &ItemStruct, ctx: &str) -> Result<()> { /* shared checks */ }
```

Do you want me to submit a small refactor PR to consolidate?
sdk-libs/macros/src/lib.rs (2)
144-170: Docs rely on re-exports; confirm they exist and match usage.

The example uses light_sdk::sha::{LightHasher, LightDiscriminator} while the macro here is LightDiscriminatorSha. Ensure the SDK re-exports alias LightDiscriminatorSha as LightDiscriminator under sha:: to keep the example valid.
If both discriminators are identical (they currently are), consider stating this explicitly in docs or aliasing rather than maintaining two doc blocks to reduce duplication.
288-313: SHA LightHasher macro added; interface consistent.

Macro input handling and error propagation mirror the Poseidon variant. Align attribute ordering with the non-SHA variant for consistency:

```diff
-#[proc_macro_derive(LightHasherSha, attributes(hash, skip))]
+#[proc_macro_derive(LightHasherSha, attributes(skip, hash))]
```

sdk-libs/macros/src/discriminator.rs (3)
10-13: SHA variant introduced but behavior is identical.

If this intentional parity is for future-proofing, consider noting it in docs/comments to avoid confusion. Can you confirm there is no intended behavioral difference between LightDiscriminator and LightDiscriminatorSha (both hash only the type name with SHA-256)?

63-79: Duplicated assertion between poseidon/sha variants.

Since both paths generate identical output, consider parameterizing one test to call both functions, or assert once and smoke-test the other for successful expansion only.

81-103: "Large struct" test doesn't validate a meaningful discriminator difference.

The discriminator is based solely on the type name; field count/size doesn't affect it. If the goal is to ensure macros tolerate large inputs, keep this as a smoke test but adjust the comment; otherwise consider moving "large struct" coverage to the LightHasherSha tests where payload size matters.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (14)
- program-libs/hasher/src/keccak.rs (1 hunks)
- program-libs/hasher/src/lib.rs (1 hunks)
- program-libs/hasher/src/poseidon.rs (1 hunks)
- program-libs/hasher/src/sha256.rs (1 hunks)
- sdk-libs/macros/src/discriminator.rs (3 hunks)
- sdk-libs/macros/src/hasher/data_hasher.rs (2 hunks)
- sdk-libs/macros/src/hasher/input_validator.rs (1 hunks)
- sdk-libs/macros/src/hasher/light_hasher.rs (5 hunks)
- sdk-libs/macros/src/hasher/mod.rs (1 hunks)
- sdk-libs/macros/src/hasher/to_byte_array.rs (2 hunks)
- sdk-libs/macros/src/lib.rs (3 hunks)
- sdk-libs/macros/tests/hasher.rs (1 hunks)
- sdk-libs/sdk/src/account.rs (10 hunks)
- sdk-libs/sdk/src/error.rs (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (4)
- program-libs/hasher/src/sha256.rs
- program-libs/hasher/src/lib.rs
- program-libs/hasher/src/keccak.rs
- program-libs/hasher/src/poseidon.rs
🧰 Additional context used
🧬 Code graph analysis (7)
sdk-libs/macros/src/hasher/mod.rs (2)
- sdk-libs/macros/src/lib.rs (1): light_hasher (281-286)
- sdk-libs/macros/src/hasher/light_hasher.rs (2): derive_light_hasher (51-53), derive_light_hasher_sha (55-76)

sdk-libs/macros/src/hasher/data_hasher.rs (4)
- program-libs/hasher/src/keccak.rs (2): hashv (18-20), hash (14-16)
- program-libs/hasher/src/lib.rs (2): hashv (29-29), hash (28-28)
- program-libs/hasher/src/poseidon.rs (2): hashv (87-127), hash (83-85)
- program-libs/hasher/src/sha256.rs (2): hashv (17-43), hash (13-15)

sdk-libs/macros/src/hasher/light_hasher.rs (6)
- sdk-libs/macros/src/lib.rs (2): data_hasher (316-321), light_hasher (281-286)
- sdk-libs/macros/src/hasher/data_hasher.rs (2): generate_data_hasher_impl (7-85), generate_data_hasher_impl_sha (88-118)
- sdk-libs/macros/src/hasher/field_processor.rs (1): process_field (43-61)
- sdk-libs/macros/src/hasher/input_validator.rs (3): get_field_attribute (94-108), validate_input (26-61), validate_input_sha (64-91)
- sdk-libs/macros/src/hasher/to_byte_array.rs (2): generate_to_byte_array_impl_sha (64-92), generate_to_byte_array_impl_with_hasher (7-61)
- sdk-libs/macros/src/discriminator.rs (2): discriminator (6-8), discriminator_sha (10-12)

sdk-libs/macros/src/hasher/to_byte_array.rs (5)
- program-libs/hasher/src/to_byte_array.rs (10): to_byte_array (6-6), to_byte_array (32-47), to_byte_array (55-59), to_byte_array (150-159), std (37-37), std (306-306), std (317-317), std (328-328), std (339-339), std (350-350)
- program-libs/hasher/src/hash_to_field_size.rs (2): hash_to_field_size (9-9), hash_to_field_size (16-35)
- program-libs/hasher/src/lib.rs (2): hashv (29-29), hash (28-28)
- program-libs/hasher/src/poseidon.rs (2): hashv (87-127), hash (83-85)
- program-libs/hasher/src/sha256.rs (2): hashv (17-43), hash (13-15)

sdk-libs/macros/src/lib.rs (4)
- sdk-libs/sdk/src/account.rs (1): discriminator (225-227)
- sdk-libs/macros/src/discriminator.rs (2): discriminator (6-8), discriminator_sha (10-12)
- sdk-libs/macros/src/hasher/light_hasher.rs (2): derive_light_hasher (51-53), derive_light_hasher_sha (55-76)
- program-libs/hasher/src/data_hasher.rs (2): hash (4-4), hash (78-83)

sdk-libs/sdk/src/account.rs (6)
- program-libs/hasher/src/lib.rs (1): hash (28-28)
- program-libs/hasher/src/poseidon.rs (1): hash (83-85)
- program-libs/hasher/src/sha256.rs (1): hash (13-15)
- program-libs/compressed-account/src/compressed_account.rs (3): hash (76-82), hash (325-338), hash (363-375)
- program-tests/system-cpi-test/src/create_pda.rs (1): hash (544-547)
- program-tests/create-address-test-program/src/create_pda.rs (1): hash (134-138)

sdk-libs/macros/tests/hasher.rs (4)
- program-libs/hasher/src/lib.rs (1): hash (28-28)
- program-tests/system-cpi-test/src/create_pda.rs (1): hash (544-547)
- program-tests/create-address-test-program/src/create_pda.rs (1): hash (134-138)
- program-libs/hasher/src/data_hasher.rs (2): hash (4-4), hash (78-83)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (14)
- GitHub Check: lint
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test", "cargo test -p light-syst...
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: Test program-libs-fast
- GitHub Check: Test program-libs-slow
🔇 Additional comments (11)
sdk-libs/sdk/src/error.rs (1)
79-80: New error variant looks fine.

Adding ExpectedNoData is reasonable and the message is clear.
sdk-libs/macros/src/hasher/data_hasher.rs (1)
69-79: Consistent truncation for non-Poseidon hashers.

Conditionally zeroing the first byte via H::ID != 0 aligns with the new hasher semantics.

sdk-libs/macros/src/hasher/mod.rs (1)

7-7: Re-export of derive_light_hasher_sha looks good.

Keeps the macro surface coherent next to derive_light_hasher.

sdk-libs/macros/src/hasher/to_byte_array.rs (2)

47-56: Hasher-parametrized truncation is correct and consistent.

Using #hasher::ID to conditionally zero the first byte matches the intended cross-hasher behavior.

79-88: Resolved: truncation now gated on Sha256::ID.

This addresses the earlier concern about unconditional zeroing in the SHA path.
sdk-libs/macros/tests/hasher.rs (1)
1308-1416: Approve SHA256 macro test
Covers Borsh serialization, manual Sha256 hashing with truncation, ToByteArray parity, and uniqueness; no duplicate test definitions found.sdk-libs/macros/src/hasher/light_hasher.rs (1)
288-312: LightHasherSha silently ignores `#[skip]`/`#[hash]`.

`derive_light_hasher_sha` just serializes the entire struct, so fields annotated with `#[skip]` still affect the digest and `#[hash]` never truncates. The derive advertises these attributes and the docs even rely on them, so this is a functional bug. Until we actually support the attributes, we should fail fast instead of producing an incorrect hash.

```diff
@@
     if flatten_field_exists {
         return Err(Error::new_spanned(
             input,
             "Flatten attribute is not supported in SHA256 hasher.",
         ));
     }
+
+    for field in input.fields.iter() {
+        match get_field_attribute(field) {
+            FieldAttribute::Hash | FieldAttribute::Skip => {
+                return Err(Error::new_spanned(
+                    field,
+                    "LightHasherSha does not yet support #[hash] or #[skip] attributes.",
+                ));
+            }
+            _ => {}
+        }
+    }
     Ok(())
 }
```
289-299: Avoid zeroing the SHA-256 digest here.

LightHasherSha's `DataHasher` already handles any truncation/field-size tweaks. Re-hashing raw bytes and forcing the first byte to zero here both duplicates work and weakens the digest (same concern raised earlier). Let's rely on the hasher implementation instead of masking bits a second time.

```diff
-        if (H::ID == 1 || H::ID == 2) && HASH_FLAT {
-            output.data_hash = Sha256::hash(output.data.as_slice())?;
-            output.data_hash[0] = 0;
-        } else {
-            output.data_hash = self.account.hash::<H>()?;
-        }
+        output.data_hash = self.account.hash::<H>()?;
```
3-4: New SHA imports wired correctly.

Brings in SHA variants for discriminator and hasher; looks consistent with module layout.

139-142: Good: propagate macro errors as compile errors.

Switching to unwrap_or_else(...).into() avoids panics in macro expansion.

sdk-libs/macros/src/discriminator.rs (1)

7-8: Wrapper keeps API intact.

Delegating through a shared helper is fine.
Flagged code (sdk-libs/macros/src/hasher/data_hasher.rs):

```rust
    let mut result = H::hashv(slices.as_slice())?;

    // Apply field size truncation for non-Poseidon hashers
    if H::ID != 0 {
        result[0] = 0;
    }

    Ok(result)
}
```
🧩 Analysis chain
Potentially incorrect slices population before hashing.
hashv(slices.as_slice()) assumes slices are fully populated. In the surrounding code, slices are filled via a loop using a constant index (num_flattned_fields) that never increments, so only index 0 is set repeatedly and others remain empty. This will corrupt inputs for flattened structs.
Suggested fix outside this hunk: populate slices using enumerate and remove the unused counter.
```rust
for (i, element) in field_array.iter().enumerate() {
    slices[i] = element.as_slice();
}
```

Also consider renaming num_flattned_fields -> num_flattened_fields (and removing it if unused).
Search for the current population pattern to patch all occurrences:
🏁 Script executed:
```bash
#!/bin/bash
rg -n 'for element in field_array\.iter\(\) \{[^}]*slices\[[^\]]+\]\s*=\s*element\.as_slice\(\);[^}]*\}' sdk-libs/macros/src/hasher
```

Length of output: 0
🏁 Script executed:
```bash
#!/bin/bash
sed -n '1,200p' sdk-libs/macros/src/hasher/data_hasher.rs
```

Length of output: 4363
Use enumerate() to properly populate slices and correct the index variable
In sdk-libs/macros/src/hasher/data_hasher.rs the loop
```rust
let mut num_flattned_fields = 0;
for element in field_array.iter() {
    slices[num_flattned_fields] = element.as_slice();
}
```

never increments the counter, so only slices[0] is set and the rest remain empty. Replace with:
```rust
for (i, element) in field_array.iter().enumerate() {
    slices[i] = element.as_slice();
}
```

and rename num_flattned_fields → num_flattened_fields (or remove it if unused).
🤖 Prompt for AI Agents
In sdk-libs/macros/src/hasher/data_hasher.rs around lines 40 to 48, the loop
that fills the slices array never increments its index variable so only
slices[0] is set; replace the loop with an enumerate-based iterator (for (i,
element) in field_array.iter().enumerate() { slices[i] = element.as_slice(); })
to correctly populate each slot, and rename num_flattned_fields to
num_flattened_fields (or remove that variable entirely if it's no longer used).
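The bug described above can be reproduced in isolation: a loop that writes through a never-incremented index populates only slot 0, while `enumerate()` fills every slot. The helper names here are illustrative, not the macro's actual code:

```rust
// Reproduction of the never-incremented-index bug vs. the fix.
fn populate_buggy(field_array: &[Vec<u8>]) -> Vec<&[u8]> {
    let mut slices: Vec<&[u8]> = vec![&[]; field_array.len()];
    let num_flattened_fields = 0; // never incremented, as in the review
    for element in field_array.iter() {
        slices[num_flattened_fields] = element.as_slice();
    }
    slices
}

fn populate_fixed(field_array: &[Vec<u8>]) -> Vec<&[u8]> {
    let mut slices: Vec<&[u8]> = vec![&[]; field_array.len()];
    // enumerate() yields the correct index for each element.
    for (i, element) in field_array.iter().enumerate() {
        slices[i] = element.as_slice();
    }
    slices
}

fn main() {
    let fields: Vec<Vec<u8>> = vec![vec![1], vec![2], vec![3]];

    let buggy = populate_buggy(&fields);
    assert_eq!(buggy[0], [3u8].as_slice()); // last write wins slot 0
    assert!(buggy[1].is_empty() && buggy[2].is_empty()); // corrupted input

    let fixed = populate_fixed(&fields);
    assert_eq!(fixed[0], [1u8].as_slice());
    assert_eq!(fixed[2], [3u8].as_slice());
    println!("bug reproduced, fix verified");
}
```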
Force-pushed: e13adb2 → d545811
Actionable comments posted: 1
🧹 Nitpick comments (7)
program-libs/hasher/src/sha256.rs (1)
8-18: Compile-time SHA256 guard: confirm toolchain supports const assert in traits.

This pattern relies on const-eval panics in associated consts. Verify the MSRV/toolchain used in CI supports this. If not, switch to a well-known const-assert pattern. Example alternative if needed:

```diff
-pub trait RequireSha256: Hasher {
-    const ASSERT: () = assert!(
-        Self::ID == 1,
-        "DataHasher for LightHasherSha only works with SHA256 (ID=1). Example: your_struct.hash::<Sha256>()?"
-    );
-}
+pub trait RequireSha256: Hasher {
+    const ASSERT: () = {
+        // Fallback const-assert pattern
+        ["RequireSha256: H::ID must be 1"][!(Self::ID == 1) as usize];
+    };
+}
```

sdk-libs/macros/tests/discriminator.rs (1)
1-2: Avoid name shadowing between trait and derive macro.

Using the same identifier LightDiscriminator for both the trait alias and the derive macro can confuse readers. Consider aliasing the trait as LightDiscriminatorTrait (or omit importing the macro).
8-13: Comment mentions different feature name.

The comment says "anchor-discriminator-compat" while the feature is "anchor-discriminator". Update for consistency.
396-399: Remove stray println! in tests.

Debug prints can clutter CI logs.

```diff
- println!("1");
  assert_eq!(with_none.to_byte_array().unwrap(), none_bytes[0]);
- println!("1");
```
68-70: Consider parity: the DataHasher SHA path enforces SHA256 via a trait hack; the ToByteArray SHA path doesn’t.
Optional: mirror the compile-time assertion used in DataHasher (RequireSha256) or document that the ToByteArray SHA path is intentionally fixed to Sha256.
143-163: Optional: avoid generating dead code when the hasher is Poseidon.
The generated `if Poseidon::ID != Poseidon::ID { … }` is always false. You could omit the block when the selected hasher is Poseidon to reduce noise in generated code and test fixtures.
Also applies to: 183-195
507-607: Add a test for attribute handling in the SHA256 path.
To prevent surprises, add a unit test that confirms `#[hash]` and `#[skip]` are either rejected or explicitly ignored under SHA256 (matching `validate_input_sha`’s policy).
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`Cargo.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (12)
- .github/workflows/rust.yml (1 hunks)
- .github/workflows/sdk-tests.yml (1 hunks)
- program-libs/hasher/src/sha256.rs (1 hunks)
- sdk-libs/macros/Cargo.toml (2 hunks)
- sdk-libs/macros/src/discriminator.rs (2 hunks)
- sdk-libs/macros/src/hasher/data_hasher.rs (2 hunks)
- sdk-libs/macros/src/hasher/light_hasher.rs (16 hunks)
- sdk-libs/macros/src/hasher/to_byte_array.rs (2 hunks)
- sdk-libs/macros/src/lib.rs (2 hunks)
- sdk-libs/macros/tests/discriminator.rs (1 hunks)
- sdk-libs/macros/tests/hasher.rs (8 hunks)
- sdk-libs/sdk/src/lib.rs (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (7)
sdk-libs/macros/tests/discriminator.rs (1)
sdk-libs/macros/src/discriminator.rs (1)
discriminator(6-31)
sdk-libs/macros/src/hasher/to_byte_array.rs (6)
- program-libs/hasher/src/to_byte_array.rs (10): to_byte_array (6-6, 32-47, 55-59, 150-159), std (37-37, 306-306, 317-317, 328-328, 339-339, 350-350)
- program-libs/hasher/src/hash_to_field_size.rs (2): hash_to_field_size (9-9, 16-35)
- program-libs/hasher/src/sha256.rs (2): hashv (28-54), hash (24-26)
- program-libs/hasher/src/keccak.rs (2): hashv (18-20), hash (14-16)
- program-libs/hasher/src/lib.rs (2): hashv (29-29), hash (28-28)
- program-libs/hasher/src/poseidon.rs (2): hashv (87-127), hash (83-85)
sdk-libs/macros/src/hasher/light_hasher.rs (6)
- sdk-libs/macros/src/lib.rs (2): data_hasher (287-292), light_hasher (252-257)
- sdk-libs/macros/src/hasher/data_hasher.rs (2): generate_data_hasher_impl (7-85), generate_data_hasher_impl_sha (88-118)
- sdk-libs/macros/src/hasher/field_processor.rs (1): process_field (43-61)
- sdk-libs/macros/src/hasher/input_validator.rs (3): get_field_attribute (94-108), validate_input (26-61), validate_input_sha (64-91)
- sdk-libs/macros/src/hasher/to_byte_array.rs (2): generate_to_byte_array_impl_sha (64-90), generate_to_byte_array_impl_with_hasher (7-61)
- sdk-libs/macros/src/discriminator.rs (1): discriminator (6-31)
sdk-libs/macros/src/hasher/data_hasher.rs (3)
- program-libs/hasher/src/sha256.rs (2): hashv (28-54), hash (24-26)
- program-libs/hasher/src/lib.rs (2): hashv (29-29), hash (28-28)
- program-libs/hasher/src/poseidon.rs (2): hashv (87-127), hash (83-85)
sdk-libs/macros/src/lib.rs (3)
- sdk-libs/macros/src/hasher/light_hasher.rs (2): derive_light_hasher (51-53), derive_light_hasher_sha (55-76)
- program-libs/hasher/src/lib.rs (1): hash (28-28)
- program-libs/hasher/src/data_hasher.rs (2): hash (4-4), hash (78-83)
sdk-libs/macros/src/discriminator.rs (3)
- sdk-libs/sdk/src/account.rs (1): discriminator (225-227)
- program-libs/account-checks/src/discriminator.rs (1): discriminator (6-8)
- program-libs/hasher/src/sha256.rs (1): hash (24-26)
sdk-libs/macros/tests/hasher.rs (8)
- program-libs/hasher/src/sha256.rs (1): hash (24-26)
- program-libs/hasher/src/lib.rs (1): hash (28-28)
- programs/compressed-token/src/token_data.rs (1): hash (106-134)
- program-libs/compressed-account/src/compressed_account.rs (3): hash (76-82), hash (325-338), hash (363-375)
- program-tests/system-cpi-test/src/create_pda.rs (1): hash (544-547)
- program-tests/create-address-test-program/src/create_pda.rs (1): hash (134-138)
- program-libs/hasher/src/data_hasher.rs (2): hash (4-4), hash (78-83)
- program-libs/hasher/src/hash_to_field_size.rs (1): hash_to_bn254_field_size_be (91-93)
🔇 Additional comments (23)
.github/workflows/sdk-tests.yml (1)
59-65: Extra SDK feature coverage looks good
Thanks for extending the matrix to the renamed light-sdk-macros crate and adding the --all-features runs for both the macros and SDK crates; this should keep the optional feature surface guarded in CI. No issues spotted here.
.github/workflows/rust.yml (1)
50-50: Nice catch adding the macro tests.
Great to see light-macros pulled into the fast lane—this keeps the new SHA-256 derive coverage exercised alongside the rest of the lightweight crates.
sdk-libs/macros/Cargo.toml (1)
10-12: Feature flag and dev-dep look good
The anchor-discriminator feature and light-account-checks dev-dep wiring look fine.
Also applies to: 29-29
program-libs/hasher/src/sha256.rs (2)
23-23: Hasher ID constant is sensible
ID = 1 for Sha256 matches the new ID scheme.
40-53: Double-check the syscall ABI for sol_sha256
Passing vals as a flat u8 pointer assumes the syscall expects a contiguous slice-of-slices encoding. Confirm the ABI signature and adjust the packing if required.
sdk-libs/macros/tests/discriminator.rs (1)
6-15: Test both feature configurations in CI
This test conditionally asserts different bytes. Ensure CI runs with and without the "anchor-discriminator" feature so both paths are exercised.
sdk-libs/macros/src/discriminator.rs (2)
18-19: OK to use SHA256 for discriminator derivation
Hashing “account:” vs “” behind the feature gate is clear and deterministic.
36-57: Test coverage scoped to non-anchor-discriminator only
Given there’s a separate integration test covering both modes, this local unit test scope is fine.
sdk-libs/sdk/src/lib.rs (2)
106-113: Nice, clear SHA256 namespace
The sha module re-exports make it easy to opt into the SHA256 variant without breaking existing imports.
135-137: Macro re-exports look consistent
Top-level exports include both LightHasher and LightHasherSha for flexibility. Good.
sdk-libs/macros/src/lib.rs (2)
3-3: Wiring in derive_light_hasher_sha
The import looks correct and localized.
259-284: LightHasherSha derive: API surface LGTM
Docs and the derive hook are clear; this aligns with the sdk::sha::LightHasher aliasing.
sdk-libs/macros/tests/hasher.rs (2)
1202-1211: Manual SHA truncation matches the production rule
Zeroing the first byte for non-Poseidon hashers aligns with the new normalization. Assertions look correct.
Consider a small helper to DRY the “if Sha256::ID != 0 { manual_hash[0] = 0; }” pattern in tests.
Also applies to: 1222-1227, 1252-1260, 1288-1296, 1313-1322, 1473-1482, 1493-1498, 1524-1533
1328-1436: Great end-to-end test for LightHasherSha
Covers a large struct, Borsh serialization, and output equivalence with ToByteArray.
sdk-libs/macros/src/hasher/data_hasher.rs (3)
36-39: Bug: slices population uses a fixed index (only index 0 is set)
Only slices[0] is assigned repeatedly; the rest stay empty, corrupting inputs for flattened fields.
Apply:
```diff
-    let mut num_flattned_fields = 0;
+    let mut num_flattned_fields = 0; // will be removed
     let mut field_array = [[0u8; 32]; #(#flattened_fields_added)*];
     let mut slices: [&[u8]; #(#flattened_fields_added)*] = [&[]; #(#flattened_fields_added)*];
-    for element in field_array.iter() {
-        slices[num_flattned_fields] = element.as_slice();
-    }
+    for (i, element) in field_array.iter().enumerate() {
+        slices[i] = element.as_slice();
+    }
```
Optionally remove the unused `num_flattned_fields` or fix the typo to `num_flattened_fields` if still needed elsewhere.
40-48: Non-Poseidon output normalization
Post-hash truncation via result[0] = 0 for non-Poseidon hashers is consistent with the new rules.
Ensure downstream consumers expect this normalization (e.g., fixtures/reference hashes updated).
Also applies to: 69-79
87-118: Prefer concrete Sha256::hash over generic H::hash in the SHA path
Even with the RequireSha256 guard, calling Sha256 directly avoids accidental divergence and mirrors ToByteArray’s SHA path.
```diff
-    use ::light_hasher::Hasher;
     use borsh::BorshSerialize;
     // Compile-time assertion that H must be SHA256 (ID = 1)
     use ::light_hasher::sha256::RequireSha256;
     let _ = <H as RequireSha256>::ASSERT;
     // For SHA256, we serialize the whole struct and hash it in one go
     let serialized = self.try_to_vec().map_err(|_| ::light_hasher::HasherError::BorshError)?;
-    let mut result = H::hash(&serialized)?;
-    // Truncate sha256 to 31 be bytes less than 254 bits bn254 field size.
-    result[0] = 0;
+    let mut result = ::light_hasher::Sha256::hash(&serialized)?;
+    // Normalize non-Poseidon outputs (Sha256::ID != Poseidon::ID)
+    if ::light_hasher::Sha256::ID != ::light_hasher::Poseidon::ID {
+        result[0] = 0;
+    }
     Ok(result)
```
sdk-libs/macros/src/hasher/to_byte_array.rs (4)
25-25: Don’t panic in proc-macro: emit compile_error! on parse failure.
Replace the `.expect(..)` with a non-panicking branch that surfaces a helpful compile-time error.
```diff
-    let content: TokenStream = str.parse().expect("Invalid generated code");
+    // Avoid panicking at macro expansion; emit a compile_error! with details.
+    let content: TokenStream = match str.parse() {
+        Ok(ts) => ts,
+        Err(e) => {
+            let msg = format!(
+                "light_hasher: failed to parse generated TokenStream in ToByteArray (single-field): {e}"
+            );
+            let lit = proc_macro2::Literal::string(&msg);
+            return Ok(quote! { compile_error!(#lit); });
+        }
+    };
```
51-56: Correct: conditional truncation for non-Poseidon hashers.
Gating the “first-byte zero” by hasher ID matches the rest of the codebase and avoids altering Poseidon outputs.
71-75: Confirm NUM_FIELDS semantics for the SHA path.
You set `NUM_FIELDS` to the struct’s field count while hashing the entire struct as one blob. Verify no downstream logic assumes `NUM_FIELDS` maps 1:1 to the number of per-field slices (since SHA256 doesn’t expose them here).
63-86: Make truncation consistent: gate zeroing by hasher ID, and drop the unused import.
Other paths truncate only when `ID != Poseidon::ID`. Align here and remove the unused `Hasher` import.
```diff
 fn to_byte_array(&self) -> ::std::result::Result<[u8; 32], ::light_hasher::HasherError> {
     use borsh::BorshSerialize;
-    use ::light_hasher::Hasher;
     // For SHA256, we can serialize the whole struct and hash it in one go
     let serialized = self.try_to_vec().map_err(|_| ::light_hasher::HasherError::BorshError)?;
     let mut result = ::light_hasher::Sha256::hash(&serialized)?;
-    // Truncate field size for SHA256
-    result[0] = 0;
+    // Truncate field size for non-Poseidon hashers (consistent with other impls)
+    if ::light_hasher::Sha256::ID != ::light_hasher::Poseidon::ID {
+        result[0] = 0;
+    }
     Ok(result)
 }
```
52-54: Good delegation: Poseidon as default via derive_light_hasher_with_hasher.
Keeps the public API stable while enabling alternative hashers behind the scenes.
104-110: Correct: parameterized hasher plumbed into codegen.
Passing `hasher` through to `generate_to_byte_array_impl_with_hasher` keeps the implementation generic and future-proof.
Actionable comments posted: 2
🧹 Nitpick comments (2)
sdk-tests/sdk-native-test/src/create_pda.rs (1)
13-13: Note: TODO is fine; ensure follow-up tracked.If v2 instruction migration is planned, consider opening a tracking issue.
sdk-libs/sdk/src/lib.rs (1)
24-33: Prefer top-level `LightAccount` import in docs examples
Replace `account::LightAccount` with `LightAccount` in the compile_fail blocks (sdk-libs/sdk/src/lib.rs lines 24–33, 96–101) to use the crate’s top-level re-export.
```diff
-//! account::LightAccount,
+//! LightAccount,
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (7)
- sdk-libs/sdk/Cargo.toml (1 hunks)
- sdk-libs/sdk/src/account.rs (2 hunks)
- sdk-libs/sdk/src/error.rs (2 hunks)
- sdk-libs/sdk/src/lib.rs (3 hunks)
- sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/src/lib.rs (1 hunks)
- sdk-tests/sdk-native-test/src/create_pda.rs (2 hunks)
- sdk-tests/sdk-native-test/src/update_pda.rs (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (3)
sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/src/lib.rs (1)
- sdk-libs/sdk/src/account.rs (2): account (381-381), address (254-256)
sdk-libs/sdk/src/lib.rs (3)
- sdk-libs/sdk/src/account.rs (1): account (381-381)
- sdk-libs/macros/tests/hasher.rs (2): account (54-54), account (181-181)
- program-libs/compressed-account/src/compressed_account.rs (1): account (762-763)
sdk-libs/sdk/src/account.rs (4)
- sdk-libs/macros/src/discriminator.rs (1): discriminator (6-31)
- program-libs/hasher/src/sha256.rs (1): hash (24-26)
- program-libs/hasher/src/lib.rs (1): hash (28-28)
- program-libs/hasher/src/poseidon.rs (1): hash (83-85)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (21)
- GitHub Check: stateless-js-v1
- GitHub Check: Test program-libs-fast
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: Test program-libs-slow
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test", "cargo test -p light-syst...
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: cli-v2
- GitHub Check: cli-v1
- GitHub Check: system-programs (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -...
- GitHub Check: lint
- GitHub Check: system-programs (sdk-libs, light-sdk-macros light-sdk light-program-test light-client light-batch...
- GitHub Check: system-programs (sdk-test-program, ["cargo-test-sbf -p sdk-native-test"])
- GitHub Check: stateless-js-v2
🔇 Additional comments (10)
sdk-libs/sdk/Cargo.toml (1)
50-50: Retain default features for dev-dependency anchor-lang
Tests and doc examples import anchor_lang APIs and derive macros; disabling default features will break them.
sdk-tests/sdk-native-test/src/update_pda.rs (1)
6-7: LGTM: updated import path to top-level LightAccount.
Matches the new re-export in light_sdk. No further changes needed here.
sdk-libs/sdk/src/error.rs (1)
79-81: LGTM: new ExpectedNoData variant and unique code (16035).
The error variant is clear, mapped uniquely, and wired into conversions.
Also applies to: 164-165
sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/src/lib.rs (2)
6-8: LGTM: use Poseidon LightAccount for Anchor tests.
Importing account::poseidon::LightAccount is consistent with the SHA vs Poseidon split.
12-14: LGTM: macro imports align with the new export surface.
LightDiscriminator and LightHasher are correctly imported for Poseidon hashing.
sdk-tests/sdk-native-test/src/create_pda.rs (2)
7-8: LGTM: import LightAccount from the crate root.
Matches the new public re-export.
75-75: LGTM: removed LightHasher derive for SHA flat hashing.
For sha::LightAccount, only LightDiscriminator is required.
sdk-libs/sdk/src/lib.rs (2)
105-113: LGTM: sha module and top-level LightAccount re-export.
Clear split between the sha and poseidon variants; the ergonomic crate-root LightAccount is helpful.
134-136: LGTM: macro re-exports include both LightHasher and LightHasherSha.
Good for backwards compatibility and explicit SHA usage.
sdk-libs/sdk/src/account.rs (1)
403-404: Remove zeroing of the first byte of the hash (security regression).
Zeroing introduces deterministic bias and reduces entropy; it’s weaker than truncation and unnecessary for SHA. This was flagged previously and remains.
Apply:
```diff
-        let mut input_data_hash = H::hash(data.as_slice())?;
-        input_data_hash[0] = 0;
+        let input_data_hash = H::hash(data.as_slice())?;
```
```diff
-        let mut input_data_hash = H::hash(data.as_slice())?;
-        input_data_hash[0] = 0;
+        let input_data_hash = H::hash(data.as_slice())?;
```
```diff
-        output.data_hash = H::hash(output.data.as_slice())?;
-        output.data_hash[0] = 0;
+        output.data_hash = H::hash(output.data.as_slice())?;
```
If field-size reduction is required, use a proper reduction (e.g., hash-to-field) rather than fixing bits.
Also applies to: 454-455, 498-499
```rust
pub fn new_close(
    owner: &'a Pubkey,
    input_account_meta: &impl CompressedAccountMetaTrait,
    input_account: A,
) -> Result<Self, LightSdkError> {
```
🧩 Analysis chain
SHA path new_close lacks “permanently” toggle and never produces output.
Parity with the Poseidon path is needed; otherwise “non‑permanent” close is unsupported for SHA.
Apply:
- pub fn new_close(
+ pub fn new_close(
owner: &'a Pubkey,
input_account_meta: &impl CompressedAccountMetaTrait,
input_account: A,
+ permanently: bool,
) -> Result<Self, LightSdkError> {
let input_account_info = {
// For HASH_FLAT = true, use direct serialization
let data = input_account
.try_to_vec()
.map_err(|_| LightSdkError::Borsh)?;
- let mut input_data_hash = H::hash(data.as_slice())?;
- input_data_hash[0] = 0;
+ let input_data_hash = H::hash(data.as_slice())?;
let tree_info = input_account_meta.get_tree_info();
InAccountInfo {
data_hash: input_data_hash,
lamports: input_account_meta.get_lamports().unwrap_or_default(),
merkle_context: PackedMerkleContext {
merkle_tree_pubkey_index: tree_info.merkle_tree_pubkey_index,
queue_pubkey_index: tree_info.queue_pubkey_index,
leaf_index: tree_info.leaf_index,
prove_by_index: tree_info.prove_by_index,
},
root_index: input_account_meta.get_root_index().unwrap_or_default(),
discriminator: A::LIGHT_DISCRIMINATOR,
}
};
+ let output = if !permanently {
+ let output_merkle_tree_index = input_account_meta
+ .get_output_state_tree_index()
+ .ok_or(LightSdkError::OutputStateTreeIndexIsNone)?;
+ Some(OutAccountInfo {
+ lamports: input_account_meta.get_lamports().unwrap_or_default(),
+ output_merkle_tree_index,
+ discriminator: A::LIGHT_DISCRIMINATOR,
+ ..Default::default()
+ })
+ } else {
+ None
+ };
+
Ok(Self {
owner,
account: input_account,
account_info: CompressedAccountInfo {
address: input_account_meta.get_address(),
input: Some(input_account_info),
- output: None,
+ output,
},
- should_remove_data: false,
+ should_remove_data: !permanently,
_hasher: PhantomData,
})
}
Ensure call sites pass the new parameter.
Also applies to: 470-481
🏁 Script executed:
#!/bin/bash
# Find call sites that need updating to pass the 'permanently' flag:
rg -nP -C2 '\bnew_close\s*\(' --type=rust
Length of output: 1423
Add permanently: bool toggle to both SHA and Poseidon new_close methods
- Include `permanently: bool`, adjust `output` and `should_remove_data` accordingly (SHA: lines 443–447; Poseidon: lines 470–481 in sdk/src/account.rs and lines 91–93 in sdk-pinocchio/src/account.rs).
- Update all call sites, including the example at line 107, to pass the new flag.
a74a7b2 to bf73d2f
Actionable comments posted: 2
🧹 Nitpick comments (4)
sdk-libs/sdk/src/lib.rs (2)
107-113: Module sha: good exposure; consider adding a matching poseidon module for clarity
Providing `sha::{LightAccount, LightHasher}` is helpful. Adding a symmetric `poseidon::{LightAccount, LightHasher}` reduces confusion and answers “duplicate changes” concerns about where to import from.
Suggested addition:
```diff
 pub mod sha {
     pub use light_sdk_macros::LightHasherSha as LightHasher;
     pub use crate::account::sha::LightAccount;
 }
+
+/// Poseidon-based variants
+pub mod poseidon {
+    pub use light_sdk_macros::LightHasher as LightHasher;
+    pub use crate::account::poseidon::LightAccount;
+}
```
24-24: Doc tweaks look fine; ensure examples reflect the new routing
Switching to a compile_fail fence and removing LightHasher from the example struct keeps docs accurate, but consider adding a short note pointing to `sha::LightAccount` and `poseidon::LightAccount` to prevent confusion post-change.
Also applies to: 96-101
sdk-libs/macros/src/lib.rs (2)
145-160: Doc wording and references: minor polish
- “convertable” → “convertible”.
- The BN254/Keccak truncation note applies to Poseidon. Consider clarifying that SHA256 variant uses SHA-specific rules (see below), to avoid confusion.
```diff
-/// - [`ToByteArray`](light_hasher::to_byte_array::ToByteArray), which makes the struct
-/// convertable to a 2D byte vector.
+/// - [`ToByteArray`](light_hasher::to_byte_array::ToByteArray), which makes the struct
+/// convertible to a 2D byte vector.
```
233-258: LightHasherSha derive: solid addition; please clarify attribute semantics in docs
The SHA256 variant is implemented correctly. Add a note on how `#[hash]` behaves under SHA (e.g., whether it still truncates to 31 bytes, zeroes the MSB, etc.), since current Poseidon docs reference BN254/Keccak.
Example doc addition:
```diff
 /// SHA256 variant of the LightHasher derive macro.
 ///
-/// This derive macro automatically implements the `DataHasher` and `ToByteArray` traits
-/// for structs, using SHA256 as the hashing algorithm instead of Poseidon.
+/// This derive macro automatically implements the `DataHasher` and `ToByteArray` traits
+/// for structs, using SHA256 instead of Poseidon.
+///
+/// Attribute semantics:
+/// - `#[hash]`: <describe SHA-specific byte handling here (e.g., SHA256 then truncate/normalize)>;
+/// - `#[skip]`: skip field from hashing (same as Poseidon variant).
```
Can you confirm the intended normalization for non-Poseidon hashers (e.g., “zeroing the first byte”) applies here? If yes, reference it explicitly to prevent misuse.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (8)
- sdk-libs/macros/src/lib.rs (4 hunks)
- sdk-libs/sdk/Cargo.toml (1 hunks)
- sdk-libs/sdk/src/account.rs (2 hunks)
- sdk-libs/sdk/src/error.rs (2 hunks)
- sdk-libs/sdk/src/lib.rs (3 hunks)
- sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/src/lib.rs (1 hunks)
- sdk-tests/sdk-native-test/src/create_pda.rs (2 hunks)
- sdk-tests/sdk-native-test/src/update_pda.rs (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (3)
- sdk-tests/sdk-native-test/src/update_pda.rs
- sdk-libs/sdk/src/error.rs
- sdk-libs/sdk/Cargo.toml
🧰 Additional context used
🧬 Code graph analysis (4)
sdk-libs/sdk/src/lib.rs (3)
- sdk-libs/sdk/src/account.rs (1): account (379-379)
- sdk-libs/macros/tests/hasher.rs (2): account (54-54), account (181-181)
- program-libs/compressed-account/src/compressed_account.rs (1): account (762-763)
sdk-libs/macros/src/lib.rs (3)
- sdk-libs/macros/src/hasher/light_hasher.rs (2): derive_light_hasher (51-53), derive_light_hasher_sha (55-76)
- program-libs/hasher/src/lib.rs (1): hash (28-28)
- program-libs/hasher/src/data_hasher.rs (2): hash (4-4), hash (78-83)
sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/src/lib.rs (1)
- sdk-libs/sdk/src/account.rs (2): account (379-379), address (252-254)
sdk-libs/sdk/src/account.rs (7)
- sdk-libs/macros/src/discriminator.rs (1): discriminator (6-31)
- program-libs/hasher/src/sha256.rs (1): hash (24-26)
- program-libs/hasher/src/keccak.rs (1): hash (14-16)
- program-libs/hasher/src/lib.rs (1): hash (28-28)
- program-libs/hasher/src/poseidon.rs (1): hash (83-85)
- program-libs/compressed-account/src/compressed_account.rs (3): hash (76-82), hash (325-338), hash (363-375)
- program-tests/system-cpi-test/src/create_pda.rs (1): hash (544-547)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (21)
- GitHub Check: system-programs (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -...
- GitHub Check: system-programs (sdk-libs, light-sdk-macros light-sdk light-program-test light-client light-batch...
- GitHub Check: system-programs (sdk-test-program, ["cargo-test-sbf -p sdk-native-test"])
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test", "cargo test -p light-syst...
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: Test program-libs-fast
- GitHub Check: Test program-libs-slow
- GitHub Check: cli-v2
- GitHub Check: lint
- GitHub Check: stateless-js-v1
- GitHub Check: cli-v1
- GitHub Check: stateless-js-v2
🔇 Additional comments (8)
sdk-libs/sdk/src/account.rs (3)
400-405: Stop zeroing the first byte of SHA-256 outputs
We already have truncation handled at the hasher layer; forcibly zeroing the MSB here further shrinks the output space, makes the digest non-uniform, and weakens collision resistance. Drop these assignments in `new_mut`, `new_close`, and `to_account_info` and rely on the hasher result as-is (or perform a proper truncation that preserves uniformity if still required).
```diff
-        let mut input_data_hash = H::hash(data.as_slice())?;
-        input_data_hash[0] = 0;
+        let input_data_hash = H::hash(data.as_slice())?;
 …
-        let mut input_data_hash = H::hash(data.as_slice())?;
-        input_data_hash[0] = 0;
+        let input_data_hash = H::hash(data.as_slice())?;
 …
-        output.data_hash = H::hash(output.data.as_slice())?;
-        output.data_hash[0] = 0;
+        output.data_hash = H::hash(output.data.as_slice())?;
```
Also applies to: 451-453, 494-497
329-362: Fix non-permanent close semantics for the Poseidon path
`permantently` is misspelled, and more importantly the method never produces an `output` leaf when `permanently` is false; `should_remove_data` becomes a no-op and the documented “non-permanent” close flow can’t work. Please plumb a correctly spelled `permanently: bool`, build an `OutAccountInfo` when `!permanently`, and keep `should_remove_data` in sync so `to_account_info` enforces the empty-data requirement.
```diff
-    pub fn new_close(
+    pub fn new_close(
         owner: &'a Pubkey,
         input_account_meta: &impl CompressedAccountMetaTrait,
         input_account: A,
-        permantently: bool,
+        permanently: bool,
     ) -> Result<Self, LightSdkError> {
 …
-        Ok(Self {
+        let output = if !permanently {
+            let output_merkle_tree_index = input_account_meta
+                .get_output_state_tree_index()
+                .ok_or(LightSdkError::OutputStateTreeIndexIsNone)?;
+            Some(OutAccountInfo {
+                lamports: input_account_meta.get_lamports().unwrap_or_default(),
+                output_merkle_tree_index,
+                discriminator: A::LIGHT_DISCRIMINATOR,
+                ..Default::default()
+            })
+        } else {
+            None
+        };
+
+        Ok(Self {
             owner,
             account: input_account,
             account_info: CompressedAccountInfo {
                 address: input_account_meta.get_address(),
                 input: Some(input_account_info),
-                output: None,
+                output,
             },
-            should_remove_data: !permantently,
+            should_remove_data: !permanently,
             _hasher: PhantomData,
         })
```
441-478: Add the `permanently` toggle and empty-output handling to the SHA path
The SHA path still can’t express a non-permanent close: there’s no `permanently` flag, `should_remove_data` is stuck `false`, and we don’t emit an output leaf. Mirror the Poseidon fix here: accept `permanently: bool`, build an `OutAccountInfo` when `!permanently`, set `should_remove_data` accordingly, and drop the manual byte-zeroing while you’re here.
```diff
-    pub fn new_close(
+    pub fn new_close(
         owner: &'a Pubkey,
         input_account_meta: &impl CompressedAccountMetaTrait,
         input_account: A,
+        permanently: bool,
     ) -> Result<Self, LightSdkError> {
 …
-        let mut input_data_hash = H::hash(data.as_slice())?;
-        input_data_hash[0] = 0;
+        let input_data_hash = H::hash(data.as_slice())?;
 …
-        Ok(Self {
+        let output = if !permanently {
+            let output_merkle_tree_index = input_account_meta
+                .get_output_state_tree_index()
+                .ok_or(LightSdkError::OutputStateTreeIndexIsNone)?;
+            Some(OutAccountInfo {
+                lamports: input_account_meta.get_lamports().unwrap_or_default(),
+                output_merkle_tree_index,
+                discriminator: A::LIGHT_DISCRIMINATOR,
+                ..Default::default()
+            })
+        } else {
+            None
+        };
+
+        Ok(Self {
             owner,
             account: input_account,
             account_info: CompressedAccountInfo {
                 address: input_account_meta.get_address(),
                 input: Some(input_account_info),
-                output: None,
+                output,
             },
-            should_remove_data: false,
+            should_remove_data: !permanently,
             _hasher: PhantomData,
         })
```
sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/src/lib.rs (2)
6-8: Importing Poseidon LightAccount explicitly here is appropriate
Anchor tests targeting Poseidon while native tests target SHA256 is clear and avoids ambiguity.
12-14: Deriving the Poseidon hasher in this test module is consistent with the Poseidon LightAccount
No issues. This keeps test intent explicit.
sdk-libs/sdk/src/lib.rs (1)
134-136: Macro re-exports including LightHasherSha look correct
The public surface provides both Poseidon and SHA variants. No issues.
sdk-libs/macros/src/lib.rs (2)
3-3: Importing derive_light_hasher_sha is correct and scoped appropriately
No issues with the import expansion.
225-231: Poseidon LightHasher derive entry point is clean
Parsing as ItemStruct and delegating to `derive_light_hasher` is straightforward. No issues.
sdk-libs/sdk/src/account.rs
Outdated
```rust
        }
    }
}
} // End of __internal module
```
bf73d2f to 02f1611
- lint: remove unused _output_account_info
- update lightdiscriminator macro
- chore: add sha flat macro test
- perf: sha flat hash in LightAccount, avoid double borsh serialization
- chore: cleanup tests, add new discriminator test, add anchor discriminator compatibility
- test: LightAccount close
02f1611 to ecdd47f
Actionable comments posted: 5
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
sdk-libs/macros/src/hasher/data_hasher.rs (1)
31-38: Bug: slices population uses a non-incremented index; only slices[0] is set.
This breaks hashing for flattened structs (the others remain empty). Use enumerate or track the index correctly. Apply this diff:
```diff
-    let mut num_flattned_fields = 0;
+    // Populate slices correctly for all flattened fields
     let mut field_array = [[0u8; 32]; #(#flattened_fields_added)*];
     let mut slices: [&[u8]; #(#flattened_fields_added)*] = [&[]; #(#flattened_fields_added)*];
-
-    for element in field_array.iter() {
-        slices[num_flattned_fields] = element.as_slice();
-    }
+    for (i, element) in field_array.iter().enumerate() {
+        slices[i] = element.as_slice();
+    }
```
Optional clean-up: drop the misspelled num_flattned_fields variable entirely.
🧹 Nitpick comments (16)
sdk-libs/macros/src/discriminator.rs (3)
8-13: Feature flag path looks right; fix the name in the comment.
The code uses the feature "anchor-discriminator" but the comment says "anchor-discriminator-compat". Align the comment to prevent confusion.
```diff
-// When anchor-discriminator-compat feature is enabled, use "account:" prefix like Anchor does
+// When the "anchor-discriminator" feature is enabled, use the "account:" prefix like Anchor does
```
18-18: Avoid unwrap in macro code; prefer explicit UFCS + expect.
Safer compile-time failure with a clear message and no reliance on inherent methods.
```diff
-    discriminator.copy_from_slice(&Sha256::hash(hash_input.as_bytes()).unwrap()[..8]);
+    discriminator.copy_from_slice(
+        &<Sha256 as Hasher>::hash(hash_input.as_bytes())
+            .expect("LIGHT_DISCRIMINATOR: Sha256::hash failed")[..8],
+    );
```
36-57: Fragile assertion on TokenStream formatting; reduce brittleness or drop duplicate check.
`TokenStream::to_string()` spacing is not stable. Either assert only on the impl type name or compute expected bytes and compare after parsing tokens. Given the new integration test covers the exact bytes, consider removing the byte-string check here.

```diff
-        assert!(output.contains("impl LightDiscriminator for MyAccount"));
-        assert!(output.contains("[181 , 255 , 112 , 42 , 17 , 188 , 66 , 199]"));
+        assert!(output.contains("impl LightDiscriminator for MyAccount"));
```

Would you prefer keeping only the integration test for exact-byte validation to avoid duplication?
sdk-libs/macros/tests/discriminator.rs (1)
1-16: Make expected bytes constants and compare arrays directly. Use `const EXPECTED` for each feature-flagged discriminator and assert on the array directly to avoid slice coercions and clarify intent.

```diff
-    #[cfg(feature = "anchor-discriminator")]
-    let protocol_config_discriminator = &[96, 176, 239, 146, 1, 254, 99, 146];
-    #[cfg(not(feature = "anchor-discriminator"))]
-    let protocol_config_discriminator = &[254, 235, 147, 47, 205, 77, 97, 201];
+    #[cfg(feature = "anchor-discriminator")]
+    const EXPECTED: [u8; 8] = [96, 176, 239, 146, 1, 254, 99, 146];
+    #[cfg(not(feature = "anchor-discriminator"))]
+    const EXPECTED: [u8; 8] = [254, 235, 147, 47, 205, 77, 97, 201];
@@
-    assert_eq!(
-        protocol_config_discriminator,
-        &ProtocolConfigPda::LIGHT_DISCRIMINATOR
-    );
+    assert_eq!(EXPECTED, ProtocolConfigPda::LIGHT_DISCRIMINATOR);
```

program-libs/hasher/src/lib.rs (1)
27-27: Make `Hasher::ID` safer and documented. Consider documenting reserved IDs and add a lightweight uniqueness test to prevent collisions. Optionally, replace raw `u8` with an enum/newtype. Apply this non-breaking test to guard IDs:
```diff
+#[cfg(test)]
+mod id_tests {
+    use super::*;
+    #[test]
+    fn ids_are_unique() {
+        // Ensure collisions fail CI if new hashers get added.
+        assert_ne!(Sha256::ID, Poseidon::ID);
+        assert_ne!(Sha256::ID, Keccak::ID);
+        assert_ne!(Poseidon::ID, Keccak::ID);
+    }
+}
```

sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/src/lib.rs (1)
111-206: Anchor flows LGTM; small DRY opportunity. Create/close/reinit entrypoints are consistent with the new `LightAccount` lifecycle. Consider extracting a tiny helper to build `CpiAccounts` and `CpiInputs` to reduce repetition.

sdk-libs/macros/tests/hasher.rs (2)
1202-1211: DRY the SHA256 normalization (zeroing first byte) into a helper to reduce duplication and mistakes. You repeat the “if Sha256::ID != 0 { manual_hash[0] = 0; }” pattern in many places. Centralize it to keep tests consistent with spec.
Example refactor:
```diff
+fn normalize_non_poseidon<H: Hasher>(mut h: [u8; 32]) -> [u8; 32] {
+    if H::ID != 0 {
+        h[0] = 0;
+    }
+    h
+}
```

Then replace blocks like:
```diff
-    let mut manual_hash = Sha256::hash(bytes.as_slice()).unwrap();
-    if Sha256::ID != 0 {
-        manual_hash[0] = 0;
-    }
+    let manual_hash = normalize_non_poseidon::<Sha256>(Sha256::hash(bytes.as_slice()).unwrap());
```

Apply similarly to all occurrences listed in these ranges.
Also applies to: 1222-1226, 1251-1259, 1287-1295, 1313-1321, 1378-1381, 1421-1423, 1473-1481, 1493-1497, 1524-1532
1328-1436: Solid end-to-end test for LightHasherSha; add a negative compile-time test to lock the contract. Consider a UI/trybuild test ensuring a `LightHasherSha`-derived type cannot be hashed with non-SHA256 (e.g., Poseidon). This guards against future regressions if gating weakens. I can add a trybuild-based compile-fail test if you’d like.

sdk-libs/macros/src/hasher/light_hasher.rs (2)
55-76: Avoid unnecessary clones and unreachable; use references and keep errors ergonomic.
No need to clone generics/fields; also prefer returning a compile-time error over `unreachable!` (even if validation precedes it). Apply this diff:

```diff
 use syn::{Fields, ItemStruct, Result};
+use syn::Error;

 pub(crate) fn derive_light_hasher_sha(input: ItemStruct) -> Result<TokenStream> {
     // Use SHA256-specific validation (no field count limits)
     validate_input_sha(&input)?;

-    let generics = input.generics.clone();
-
-    let fields = match &input.fields {
-        Fields::Named(fields) => fields.clone(),
-        _ => unreachable!("Validation should have caught this"),
-    };
+    let generics = &input.generics;
+    let fields = match &input.fields {
+        Fields::Named(fields) => fields,
+        _ => return Err(Error::new_spanned(&input, "Only structs with named fields are supported")),
+    };

     let field_count = fields.named.len();

-    let to_byte_array_impl = generate_to_byte_array_impl_sha(&input.ident, &generics, field_count)?;
-    let data_hasher_impl = generate_data_hasher_impl_sha(&input.ident, &generics)?;
+    let to_byte_array_impl = generate_to_byte_array_impl_sha(&input.ident, generics, field_count)?;
+    let data_hasher_impl = generate_data_hasher_impl_sha(&input.ident, generics)?;
```
12-51: Docs: call out the new SHA256 derive and its requirements.
Add a short note that LightHasherSha serializes via Borsh, zeros the first byte (bn254 fit), and requires BorshSerialize on the struct.

sdk-libs/macros/src/hasher/to_byte_array.rs (2)
75-87: Minor: unused import and explicit requirement note.
`Hasher` import isn’t used in this function; also consider documenting the implicit `BorshSerialize` requirement. Apply this diff:

```diff
-        use ::light_hasher::Hasher;
         // For SHA256, we can serialize the whole struct and hash it in one go
         let serialized = self.try_to_vec().map_err(|_| ::light_hasher::HasherError::BorshError)?;
         let mut result = ::light_hasher::Sha256::hash(&serialized)?;
```
71-89: Optional: factor “first-byte zeroing” into a small helper to avoid duplication across generators.
You perform identical truncation in multiple emitters; a shared helper would keep behavior in sync.

sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/tests/test.rs (4)
279-283: Avoid panics; add context and use a local bytes ref for deserialize. Replace chained `unwrap()`s with `expect(...)` and deserialize via a local `&[u8]` binding for clarity.

```diff
-    let my_compressed_account = MyCompressedAccount::deserialize(
-        &mut compressed_account.data.as_mut().unwrap().data.as_slice(),
-    )
-    .unwrap();
+    let data = compressed_account
+        .data
+        .as_ref()
+        .expect("compressed account must contain data before close");
+    let mut bytes: &[u8] = &data.data;
+    let my_compressed_account = MyCompressedAccount::deserialize(&mut bytes)
+        .expect("failed to deserialize MyCompressedAccount");
```
320-327: Prefer `expect` with context over bare `unwrap` when fetching the closed account. Improves test failure diagnostics without changing behavior.

```diff
-    let closed_account = rpc
-        .get_compressed_account(address, None)
-        .await
-        .unwrap()
-        .value;
+    let closed_account = rpc
+        .get_compressed_account(address, None)
+        .await
+        .expect("fetch closed account from RPC")
+        .value;
```
389-391: Hoist the `CompressedAccountMetaClose` import to module scope. Keeps imports consistent with the rest of the file and avoids repeating `use` inside a function.

```diff
-    // Import CompressedAccountMetaClose
-    use light_sdk::instruction::account_meta::CompressedAccountMetaClose;
+    // CompressedAccountMetaClose imported at module scope
```

Add near the other `light_sdk` imports at the top: `use light_sdk::instruction::account_meta::CompressedAccountMetaClose;`
256-308: DRY up repeated proof/account-meta boilerplate in helpers.
`remaining_accounts` setup, `get_validity_proof`, `pack_tree_infos`, and `to_account_metas` repeat across three helpers. Extract a small helper returning `(remaining_accounts, packed_tree_accounts, proof)` to reduce duplication and future drift. Also applies to: 310-364, 366-414
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`Cargo.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (24)
- .github/workflows/rust.yml (1 hunks)
- .github/workflows/sdk-tests.yml (1 hunks)
- program-libs/hasher/src/keccak.rs (1 hunks)
- program-libs/hasher/src/lib.rs (1 hunks)
- program-libs/hasher/src/poseidon.rs (1 hunks)
- program-libs/hasher/src/sha256.rs (1 hunks)
- sdk-libs/macros/Cargo.toml (2 hunks)
- sdk-libs/macros/src/discriminator.rs (2 hunks)
- sdk-libs/macros/src/hasher/data_hasher.rs (2 hunks)
- sdk-libs/macros/src/hasher/input_validator.rs (1 hunks)
- sdk-libs/macros/src/hasher/light_hasher.rs (16 hunks)
- sdk-libs/macros/src/hasher/mod.rs (1 hunks)
- sdk-libs/macros/src/hasher/to_byte_array.rs (2 hunks)
- sdk-libs/macros/src/lib.rs (4 hunks)
- sdk-libs/macros/tests/discriminator.rs (1 hunks)
- sdk-libs/macros/tests/hasher.rs (8 hunks)
- sdk-libs/sdk/Cargo.toml (1 hunks)
- sdk-libs/sdk/src/account.rs (2 hunks)
- sdk-libs/sdk/src/error.rs (2 hunks)
- sdk-libs/sdk/src/lib.rs (3 hunks)
- sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/src/lib.rs (2 hunks)
- sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/tests/test.rs (3 hunks)
- sdk-tests/sdk-native-test/src/create_pda.rs (2 hunks)
- sdk-tests/sdk-native-test/src/update_pda.rs (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (7)
- sdk-libs/sdk/Cargo.toml
- sdk-libs/sdk/src/lib.rs
- program-libs/hasher/src/keccak.rs
- sdk-libs/macros/src/hasher/input_validator.rs
- sdk-tests/sdk-native-test/src/create_pda.rs
- .github/workflows/rust.yml
- sdk-libs/macros/Cargo.toml
🧰 Additional context used
🧬 Code graph analysis (11)
sdk-libs/macros/tests/discriminator.rs (2)
sdk-libs/macros/src/discriminator.rs (1)
discriminator (6-31)
sdk-libs/sdk/src/account.rs (1)
discriminator(230-232)
sdk-libs/macros/src/hasher/mod.rs (2)
sdk-libs/macros/src/lib.rs (1)
light_hasher (226-231)
sdk-libs/macros/src/hasher/light_hasher.rs (2)
derive_light_hasher (51-53), derive_light_hasher_sha (55-76)
sdk-libs/macros/src/hasher/to_byte_array.rs (5)
program-libs/hasher/src/to_byte_array.rs (10)
to_byte_array (6-6, 32-47, 55-59, 150-159), std (37-37, 306-306, 317-317, 328-328, 339-339, 350-350)
program-libs/hasher/src/hash_to_field_size.rs (2)
hash_to_field_size (9-9, 16-35)
program-libs/hasher/src/lib.rs (2)
hashv (29-29), hash (28-28)
program-libs/hasher/src/poseidon.rs (2)
hashv (87-127), hash (83-85)
program-libs/hasher/src/sha256.rs (2)
hashv (28-54), hash (24-26)
sdk-libs/macros/src/hasher/data_hasher.rs (4)
program-libs/hasher/src/keccak.rs (2)
hashv (18-20), hash (14-16)
program-libs/hasher/src/lib.rs (2)
hashv (29-29), hash (28-28)
program-libs/hasher/src/poseidon.rs (2)
hashv (87-127), hash (83-85)
program-libs/hasher/src/sha256.rs (2)
hashv (28-54), hash (24-26)
sdk-libs/macros/src/hasher/light_hasher.rs (6)
sdk-libs/macros/src/lib.rs (2)
data_hasher (262-267), light_hasher (226-231)
sdk-libs/macros/src/hasher/data_hasher.rs (2)
generate_data_hasher_impl (7-85), generate_data_hasher_impl_sha (88-118)
sdk-libs/macros/src/hasher/field_processor.rs (1)
process_field (43-61)
sdk-libs/macros/src/hasher/input_validator.rs (3)
get_field_attribute (94-108), validate_input (26-61), validate_input_sha (64-91)
sdk-libs/macros/src/hasher/to_byte_array.rs (2)
generate_to_byte_array_impl_sha (64-90), generate_to_byte_array_impl_with_hasher (7-61)
sdk-libs/macros/src/discriminator.rs (1)
discriminator(6-31)
sdk-libs/sdk/src/account.rs (6)
program-libs/compressed-account/src/instruction_data/with_account_info.rs (9)
owner (71-73, 140-142, 329-331), address (82-84, 132-134), lamports (79-81, 128-130), data (98-104, 152-158)
sdk-libs/macros/src/discriminator.rs (1)
discriminator (6-31)
program-libs/compressed-account/src/compressed_account.rs (4)
account (762-763), hash (76-82, 325-338, 363-375)
program-libs/hasher/src/lib.rs (1)
hash (28-28)
program-libs/hasher/src/poseidon.rs (1)
hash (83-85)
program-libs/hasher/src/sha256.rs (1)
hash(24-26)
sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/src/lib.rs (1)
sdk-libs/sdk/src/account.rs (10)
account (438-438), address (254-256), new_close (371-380, 546-554), to_account_info (423-442, 601-621), new_empty (325-369, 500-544), new_close_permanent (387-421, 561-599)
sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/tests/test.rs (1)
sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/src/lib.rs (4)
close_compressed_account (111-142), reinit_closed_account (144-174), close_compressed_account_permanent (176-206), default (243-258)
sdk-libs/macros/src/discriminator.rs (2)
sdk-libs/sdk/src/account.rs (1)
discriminator (230-232)
program-libs/account-checks/src/discriminator.rs (1)
discriminator(6-8)
sdk-libs/macros/tests/hasher.rs (4)
program-libs/hasher/src/lib.rs (1)
hash (28-28)
program-libs/hasher/src/sha256.rs (1)
hash (24-26)
program-tests/system-cpi-test/src/create_pda.rs (1)
hash (544-547)
program-libs/hasher/src/hash_to_field_size.rs (1)
hash_to_bn254_field_size_be(91-93)
sdk-libs/macros/src/lib.rs (2)
sdk-libs/macros/src/hasher/light_hasher.rs (2)
derive_light_hasher (51-53), derive_light_hasher_sha (55-76)
program-libs/hasher/src/data_hasher.rs (2)
hash(4-4)hash(78-83)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (21)
- GitHub Check: stateless-js-v2
- GitHub Check: system-programs (sdk-libs, light-sdk-macros light-sdk light-program-test light-client light-batch...
- GitHub Check: system-programs (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -...
- GitHub Check: system-programs (sdk-test-program, ["cargo-test-sbf -p sdk-native-test"])
- GitHub Check: stateless-js-v1
- GitHub Check: lint
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test", "cargo test -p light-syst...
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: Test program-libs-fast
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: Test program-libs-slow
- GitHub Check: cli-v2
- GitHub Check: cli-v1
🔇 Additional comments (16)
sdk-libs/sdk/src/error.rs (2)
79-80: New variant covers LightAccount closure edge case. Thanks for adding a dedicated error for the “no data” closure path; it keeps the diagnostic message focused and stays consistent with the existing `Error` derive usage.
164-164: Unique numeric code assigned. Mapping the new variant to 16035 maintains one-to-one error↔code conversion and avoids the earlier collision with `InvalidCpiAccountsOffset`.

sdk-tests/sdk-native-test/src/update_pda.rs (1)
6-7: Confirm which LightAccount variant this import resolves to (sha vs poseidon). If root re-exports default to SHA256, this is fine for sdk-native-test; otherwise, prefer explicit `light_sdk::sha::LightAccount` / `light_sdk::poseidon::LightAccount` to avoid ambiguity across tests.

program-libs/hasher/src/poseidon.rs (1)
81-82: ID addition looks good; verify uniqueness contract.
`Poseidon::ID = 0` aligns with “non-SHA path” semantics. Ensure IDs across all hashers are unique and documented (e.g., 0: Poseidon, 1: SHA256, 2: Keccak).

sdk-libs/sdk/src/account.rs (2)
137-145: Hasher-specific type aliases are clear and helpful. The `sha::LightAccount` and `poseidon::LightAccount` aliases make call sites explicit and reduce accidental mixing. Consider re-exporting a default `LightAccount` at crate root with docs stating it maps to SHA256 to match examples. Also applies to: 146-170
459-461: Security: stop zeroing the first byte of SHA-256 digests. Zeroing introduces deterministic bias and reduces entropy beyond simple truncation. Remove the `[..][0] = 0` steps; if field-size reduction is required, use a proper mapping (e.g., hash-to-field) rather than forcing leading zero bits. Apply:

```diff
-        let mut input_data_hash = H::hash(data.as_slice())?;
-        input_data_hash[0] = 0;
+        let input_data_hash = H::hash(data.as_slice())?;
```

```diff
-        let mut input_data_hash = H::hash(data.as_slice())?;
-        input_data_hash[0] = 0;
+        let input_data_hash = H::hash(data.as_slice())?;
```

```diff
-        output.data_hash = H::hash(output.data.as_slice())?;
-        output.data_hash[0] = 0;
+        output.data_hash = H::hash(output.data.as_slice())?;
```

If bn254 compatibility is the goal, replace these with a proper “hash to field size” routine rather than manual zeroing.
Also applies to: 571-573, 616-618
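To make the trade-off concrete, here is a minimal standalone sketch of the first-byte-zeroing step under discussion (the helper name `normalize_for_bn254` is hypothetical, not part of the SDK):

```rust
// Hypothetical helper illustrating the truncation discussed above.
// Zeroing byte 0 of a big-endian 32-byte digest caps the value at
// 2^248 - 1, which is below the ~2^254 bn254 scalar field modulus,
// at the cost of 8 digest bits and an always-zero leading byte.
fn normalize_for_bn254(mut digest: [u8; 32]) -> [u8; 32] {
    digest[0] = 0;
    digest
}

fn main() {
    let digest = [0xffu8; 32]; // worst case: every bit set
    let normalized = normalize_for_bn254(digest);
    assert_eq!(normalized[0], 0);
    assert_eq!(&normalized[1..], &digest[1..]); // remaining 31 bytes unchanged
}
```

This guarantees the digest fits the field, but a hash-to-field reduction (modular reduction of the full digest) would avoid the fixed zero prefix.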
sdk-libs/macros/src/hasher/mod.rs (1)
7-7: Re-export looks correct; confirm crate root exposes LightHasherSha. Ensure `sdk-libs/macros/src/lib.rs` publicly exposes `LightHasherSha` so downstream crates can `use light_sdk_macros::LightHasherSha` without reaching into module internals.

program-libs/hasher/src/sha256.rs (1)
23-23: ID constant addition LGTM.
`const ID: u8 = 1;` is clear and consistent for dispatch.

sdk-libs/macros/src/hasher/light_hasher.rs (2)
52-53: LGTM: generic hasher dispatch for default Poseidon path.
Keeps the default derive path stable while enabling alt hashers via the helper.
507-545: Solid test coverage for the SHA256 path and Poseidon parity.
Good assertions on serialization usage, truncation, field-limit behavior, and discriminator integration. Also applies to: 548-590, 592-607, 610-628, 630-688, 690-778
sdk-libs/macros/src/hasher/data_hasher.rs (2)
40-48: Conditional truncation logic looks correct.
Hash result is truncated for non-Poseidon hashers only. Matches the stated normalization rule. Also applies to: 69-79
94-113: Align SHA DataHasher with ToByteArray: call Sha256 directly (same semantics, clearer intent).
Currently uses `H::hash` with a compile-time constraint; calling `Sha256::hash` mirrors the SHA ToByteArray path and avoids ambiguity. Apply this diff:

```diff
-        use ::light_hasher::Hasher;
         use borsh::BorshSerialize;
         // Compile-time assertion that H must be SHA256 (ID = 1)
         use ::light_hasher::sha256::RequireSha256;
         let _ = <H as RequireSha256>::ASSERT;
         // For SHA256, we serialize the whole struct and hash it in one go
         let serialized = self.try_to_vec().map_err(|_| ::light_hasher::HasherError::BorshError)?;
-        let mut result = H::hash(&serialized)?;
+        let mut result = ::light_hasher::Sha256::hash(&serialized)?;
         // Truncate sha256 to 31 be bytes less than 254 bits bn254 field size.
         result[0] = 0;
```
47-56: LGTM: generalized hasher with conditional truncation.
Parameterizing over the hasher and guarding truncation by ID is consistent with the DataHasher path.
25-25: Don’t panic in proc-macro: replace parse().expect with a compile_error! fallback (or build via quote!).
A panic at expansion time degrades DX; emit a clear compile-time error instead.Apply this diff:
- let content: TokenStream = str.parse().expect("Invalid generated code"); + let content: TokenStream = match str.parse() { + Ok(ts) => ts, + Err(e) => { + let msg = format!("LightHasher codegen failed: {}", e); + return Ok(quote! { compile_error!(#msg); }); + } + };Or better: avoid parsing strings altogether by constructing content with quote!.
sdk-libs/macros/src/lib.rs (1)
225-231: Macro wrapper aligns with existing pattern. Thanks for routing the derive through `derive_light_hasher`; it keeps the macro surface consistent with the rest of the file.

sdk-tests/sdk-anchor-test/programs/sdk-anchor-test/tests/test.rs (1)
138-142: Incorrect assumption about API behavior; indexer returns an error for missing accounts
The `PhotonIndexer::get_compressed_account` implementation calls `.ok_or(IndexerError::AccountNotFound)` when the account is absent, so a missing account results in `Err(IndexerError::AccountNotFound)`, not an `Ok` with a `None` value. The existing `assert!(result.is_err())` is correct; update the test comment to reflect that the RPC call fails with `AccountNotFound`. Likely an incorrect or invalid review comment.
```rust
/// Compile-time assertion trait that ensures a generic Hasher type is SHA256.
/// Used by LightHasherSha macro to enforce SHA256-only implementation at compile time.
pub trait RequireSha256: Hasher {
    const ASSERT: () = assert!(
        Self::ID == 1,
        "DataHasher for LightHasherSha only works with SHA256 (ID=1). Example: your_struct.hash::<Sha256>()?"
    );
}

impl<T: Hasher> RequireSha256 for T {}
```
🛠️ Refactor suggestion | 🟠 Major
❓ Verification inconclusive
Compile-time gating via RequireSha256 is brittle; blanket impl negates the constraint.
With impl<T: Hasher> RequireSha256 for T {}, any Hasher satisfies the bound unless the associated const is actually referenced. The assert! in an associated const may also depend on MSRV/const‑panic semantics and won’t fire unless evaluated. Prefer a simple marker trait implemented only for Sha256, or a const‑generic assertion pattern.
Minimal, robust fix:
```diff
-pub trait RequireSha256: Hasher {
-    const ASSERT: () = assert!(
-        Self::ID == 1,
-        "DataHasher for LightHasherSha only works with SHA256 (ID=1). Example: your_struct.hash::<Sha256>()?"
-    );
-}
-
-impl<T: Hasher> RequireSha256 for T {}
+pub trait RequireSha256: Hasher {}
+impl RequireSha256 for Sha256 {}
```

If you need a generic proof, use a const-generic helper type instead of `assert!` in a const.
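For illustration, the marker-trait gating behaves as intended in a reduced, self-contained model; `Hasher`, `Sha256`, and `Poseidon` below are local stand-ins for the crate's real types, not its actual definitions:

```rust
// Reduced model of the marker-trait gating suggested above.
trait Hasher {
    const ID: u8;
}
struct Sha256;
struct Poseidon;
impl Hasher for Sha256 {
    const ID: u8 = 1;
}
impl Hasher for Poseidon {
    const ID: u8 = 0;
}

// Only Sha256 implements the marker, so an `H: RequireSha256` bound
// rejects every other hasher at compile time instead of relying on a
// const assertion that may never be evaluated.
trait RequireSha256: Hasher {}
impl RequireSha256 for Sha256 {}

fn hash_id<H: RequireSha256>() -> u8 {
    H::ID
}

fn main() {
    assert_eq!(hash_id::<Sha256>(), 1);
    // `hash_id::<Poseidon>()` would not compile: Poseidon lacks the marker.
    let _ = <Poseidon as Hasher>::ID; // Poseidon still implements Hasher itself
}
```

The bound moves the check from const-evaluation time to trait resolution, which always runs.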
Replace brittle RequireSha256 implementation with a marker trait
In program-libs/hasher/src/sha256.rs remove the const ASSERT and blanket impl, and instead use:
```rust
pub trait RequireSha256: Hasher {}
impl RequireSha256 for Sha256 {}
```
```rust
/// ```rust
/// use light_sdk_macros::LightHasherSha;
/// use borsh::{BorshSerialize, BorshDeserialize};
/// use solana_pubkey::Pubkey;
///
/// #[derive(LightHasherSha, BorshSerialize, BorshDeserialize)]
/// pub struct GameState {
///     pub player: Pubkey, // Will be hashed to 31 bytes
///     pub level: u32,
/// }
```
Doc example should be marked ignore
Because the fence is ```rust, cargo test --doc will try to compile this snippet and require borsh plus solana_pubkey, which aren’t declared dev-dependencies here. That will break doctests. Please mark the block as ignore (like the LightHasher example above) or add the missing dev-deps.
```diff
-/// ```rust
+/// ```ignore
```
```rust
//! ```rust
//! use light_sdk::{LightAccount, LightDiscriminator};
//! use borsh::{BorshSerialize, BorshDeserialize};
//! use solana_pubkey::Pubkey;
//!
//! #[derive(Clone, Debug, Default, LightDiscriminator, BorshSerialize, BorshDeserialize)]
//! pub struct CounterAccount {
//!     pub owner: Pubkey,
//!     pub counter: u64,
//! };
//!
//! let program_id = Pubkey::new_unique();
//! let address = [0u8; 32];
//! let output_tree_index = 0u8;
//! let owner = Pubkey::new_unique();
//!
//! let mut my_compressed_account = LightAccount::<'_, CounterAccount>::new_init(
//!     &crate::ID,
//!     &program_id,
//!     // Address
//!     Some(address),
//!     output_tree_index,
//! );
//! // Set data:
//! my_compressed_account.owner = ctx.accounts.signer.key();
//! my_compressed_account.owner = owner;
//! ```
```
❓ Verification inconclusive
Docs: verify re-exports and fix close example types.
- If `LightAccount` is not re-exported at crate root, update snippets to `light_sdk::sha::LightAccount` / `light_sdk::poseidon::LightAccount`.
- The “Close compressed account” snippet uses `CompressedAccountMetaClose` with `new_close` (non-permanent). For non-permanent close, use `CompressedAccountMeta`; reserve `CompressedAccountMetaClose` for `new_close_permanent`.
Apply this minimal doc fix:
```diff
-//! use light_sdk::instruction::account_meta::CompressedAccountMetaClose;
+//! use light_sdk::instruction::account_meta::CompressedAccountMeta;
 ...
-//! let account_meta_close = CompressedAccountMetaClose::default();
+//! let account_meta = CompressedAccountMeta::default();
 ...
-//! let _my_compressed_account = LightAccount::<'_, CounterAccount>::new_close(
+//! let _my_compressed_account = LightAccount::<'_, CounterAccount>::new_close(
 //!     &program_id,
-//!     &account_meta_close,
+//!     &account_meta,
 //!     compressed_account_data,
 //! ).unwrap();
```

Also applies to: 65-87, 90-110
Docs: fix import path and close example types
- change the import to `use light_sdk::account::LightAccount;` (the crate root doesn’t re-export `LightAccount`) (docs.rs)
- in the “Close compressed account” example, use `CompressedAccountMeta` with `new_close`, and reserve `CompressedAccountMetaClose` for `new_close_permanent` (docs.rs)
```rust
/// Closes the compressed account.
/// Define whether to close the account permanently or not.
/// The address of an account that is closed permanently cannot be created again.
/// For accounts that are not closed permanently the accounts address
/// continues to exist in an account with discriminator and without data.
pub fn new_close_permanent(
```
🧩 Analysis chain
Doc/behavior mismatch on non‑permanent close discriminator.
Docs say the closed account “continues to exist with discriminator,” but to_account_info sets output.discriminator = [0u8; 8] on close. Decide on the intended state and align docs or code accordingly.
🏁 Script executed:
```bash
#!/bin/bash
set -e
# Locate account.rs
account_file=$(find . -type f -name account.rs)
echo "account.rs path: $account_file"
# Show code around new_close_permanent (lines ~350–420)
echo "---- new_close_permanent snippet ----"
sed -n '350,420p' "$account_file"
# Find to_account_info definition and usage
echo "---- to_account_info definition ----"
grep -R -n "fn to_account_info" "$account_file"
echo "---- to_account_info calls ----"
grep -R -n "to_account_info" "$account_file"
# Find all discriminator assignments
echo "---- discriminator assignments ----"
grep -R -n "discriminator" "$account_file"
```

Length of output: 479
🏁 Script executed:
```bash
#!/bin/bash
set -e
# Inspect the specific file from the review comment
file="sdk-libs/sdk/src/account.rs"
echo "---- Showing lines 360–410 of $file ----"
sed -n '360,410p' "$file"
# Locate to_account_info definition and calls within this file
echo "---- to_account_info definition in $file ----"
grep -R -n "fn to_account_info" "$file" || echo "No definition found in $file"
echo "---- to_account_info calls in $file ----"
grep -R -n "to_account_info" "$file" || echo "No calls found in $file"
# Find all assignments to discriminator within this file
echo "---- discriminator assignments in $file ----"
grep -R -n "discriminator" "$file"
```

Length of output: 3927
🏁 Script executed:
```bash
#!/bin/bash
set -e
file="sdk-libs/sdk/src/account.rs"
echo "---- to_account_info (first) lines 418–500 ----"
sed -n '418,500p' "$file"
echo "---- to_account_info (second) lines 590–650 ----"
sed -n '590,650p' "$file"
```

Length of output: 5694
Fix doc/behavior mismatch for account closing
The doc comment above new_close_permanent describes “non-permanent” closes as retaining the discriminator, but in to_account_info when should_remove_data is true (used by new_close) we unconditionally zero out output.discriminator. Decide whether non-permanent closes should clear only data (keep the discriminator) and permanent closes should also clear the discriminator, then update the doc in new_close_permanent (or move it to new_close) and/or adjust the should_remove_data flag and discriminator assignment in to_account_info to match.
```rust
// Test close_compressed_account (non-permanent close - data should be None)
// Get the account fresh from RPC for the correct type
```
Fix comment vs assertion mismatch (None vs default).
Comments say “data should be None,” but the assertion expects Some(CompressedAccountData::default()). Align the comments with the asserted behavior (zeroed/default data after non‑permanent close).
```diff
-    // Test close_compressed_account (non-permanent close - data should be None)
+    // Test close_compressed_account (non-permanent close - data should be zeroed/default)
 ...
-    // Data should be None after close
+    // Data should be default/zeroed after close
```

Also applies to: 110-115
Summary by CodeRabbit
New Features
Improvements
Bug Fixes
Tests / CI