Merged
2 changes: 1 addition & 1 deletion .github/workflows/light-system-programs-tests.yml
@@ -64,7 +64,7 @@ jobs:
- program: light-compressed-token
sub-tests: '["cargo-test-sbf -p compressed-token-test"]'
- program: system-cpi-test
sub-tests: '["cargo-test-sbf -p system-cpi-test", "cargo test -p light-system-program-pinocchio --all-features"]'
sub-tests: '["cargo-test-sbf -p system-cpi-test", "cargo test -p light-system-program-pinocchio"]'

⚠️ Potential issue

🧩 Analysis chain

Ensure feature-gated CPI-context paths are exercised; add a run with the feature enabled.

Dropping --all-features means system-cpi-test and light-system-program-pinocchio now run only with default features. Given this PR introduces a feature-gated “deactivate CPI context” behavior and error, this change risks a silent coverage gap. Run tests once with defaults and once with --features deactivate-cpi-context to validate both paths.


Run this quick scan to confirm there are tests/assertions hidden behind the feature and to spot any other gated code that might now be skipped:


🏁 Script executed:

#!/bin/bash
# Find feature-gated code/tests for CPI-context deactivation
rg -nC2 -S '\#\[\s*cfg\s*\(\s*feature\s*=\s*"deactivate-cpi-context"\s*\)\s*\]' --type=rust

# Also list tests mentioning the new error type to ensure both variants exercise it
rg -n 'CpiContextDeactivated|deactivate.?cpi.?context' --type=rust



Run tests with --features deactivate-cpi-context to exercise feature-gated CPI paths

programs/system/src/invoke_cpi/process_cpi_context.rs is guarded by #[cfg(feature = "deactivate-cpi-context")] and errors.rs defines CpiContextDeactivated, so the workflow (.github/workflows/light-system-programs-tests.yml — around line 67) must run the tests both with and without that feature.

-            sub-tests: '["cargo-test-sbf -p system-cpi-test", "cargo test -p light-system-program-pinocchio"]'
+            sub-tests: '[
+              "cargo-test-sbf -p system-cpi-test",
+              "cargo-test-sbf -p system-cpi-test --features deactivate-cpi-context",
+              "cargo test -p light-system-program-pinocchio",
+              "cargo test -p light-system-program-pinocchio --features deactivate-cpi-context"
+            ]'

- program: system-cpi-test-v2-event
sub-tests: '["cargo-test-sbf -p system-cpi-v2-test -- event::parse"]'
- program: system-cpi-test-v2-functional
13 changes: 6 additions & 7 deletions program-tests/compressed-token-test/tests/test.rs
@@ -2,7 +2,6 @@

use std::{assert_eq, str::FromStr};

use account_compression::errors::AccountCompressionErrorCode;
use anchor_lang::{
prelude::AccountMeta, system_program, AccountDeserialize, AnchorDeserialize, AnchorSerialize,
InstructionData, ToAccountMetas,
@@ -1244,7 +1243,7 @@ async fn test_mint_to_failing() {
assert_rpc_error(
result,
0,
AccountCompressionErrorCode::StateMerkleTreeAccountDiscriminatorMismatch.into(),
SystemProgramError::StateMerkleTreeAccountDiscriminatorMismatch.into(),
)
.unwrap();
}
@@ -2303,7 +2302,7 @@ async fn test_approve_failing() {
assert_rpc_error(
result,
0,
AccountCompressionErrorCode::StateMerkleTreeAccountDiscriminatorMismatch.into(),
SystemProgramError::StateMerkleTreeAccountDiscriminatorMismatch.into(),
)
.unwrap();
}
@@ -2349,7 +2348,7 @@ async fn test_approve_failing() {
assert_rpc_error(
result,
0,
AccountCompressionErrorCode::StateMerkleTreeAccountDiscriminatorMismatch.into(),
SystemProgramError::StateMerkleTreeAccountDiscriminatorMismatch.into(),
)
.unwrap();
}
@@ -2782,7 +2781,7 @@ async fn test_revoke_failing() {
assert_rpc_error(
result,
0,
AccountCompressionErrorCode::StateMerkleTreeAccountDiscriminatorMismatch.into(),
SystemProgramError::StateMerkleTreeAccountDiscriminatorMismatch.into(),
)
.unwrap();
}
@@ -3404,7 +3403,7 @@ async fn failing_tests_burn() {
assert_rpc_error(
res,
0,
anchor_lang::error::ErrorCode::AccountDiscriminatorMismatch.into(),
SystemProgramError::StateMerkleTreeAccountDiscriminatorMismatch.into(),
)
.unwrap();
}
@@ -5251,7 +5250,7 @@ async fn test_invalid_inputs() {

assert_custom_error_or_program_error(
res,
anchor_lang::error::ErrorCode::AccountDiscriminatorMismatch.into(),
SystemProgramError::StateMerkleTreeAccountDiscriminatorMismatch.into(),
)
.unwrap();
}
2 changes: 0 additions & 2 deletions programs/compressed-token/src/burn.rs
@@ -171,7 +171,6 @@ pub fn create_input_and_output_accounts_burn(
lamports,
&hashed_mint,
&[inputs.change_account_merkle_tree_index],
remaining_accounts,
)?;
output_compressed_accounts
} else {
@@ -181,7 +180,6 @@
&mut compressed_input_accounts,
input_token_data.as_slice(),
&hashed_mint,
remaining_accounts,
)?;
Ok((compressed_input_accounts, output_compressed_accounts))
}
4 changes: 0 additions & 4 deletions programs/compressed-token/src/delegation.rs
@@ -158,13 +158,11 @@ pub fn create_input_and_output_accounts_approve(
lamports,
&hashed_mint,
&merkle_tree_indices,
remaining_accounts,
)?;
add_data_hash_to_input_compressed_accounts::<NOT_FROZEN>(
&mut compressed_input_accounts,
input_token_data.as_slice(),
&hashed_mint,
remaining_accounts,
)?;
Ok((compressed_input_accounts, output_compressed_accounts))
}
@@ -250,13 +248,11 @@ pub fn create_input_and_output_accounts_revoke(
lamports,
&hashed_mint,
&[inputs.output_account_merkle_tree_index],
remaining_accounts,
)?;
add_data_hash_to_input_compressed_accounts::<NOT_FROZEN>(
&mut compressed_input_accounts,
input_token_data.as_slice(),
&hashed_mint,
remaining_accounts,
)?;
Ok((compressed_input_accounts, output_compressed_accounts))
}
21 changes: 4 additions & 17 deletions programs/compressed-token/src/freeze.rs
@@ -1,4 +1,3 @@
use account_compression::StateMerkleTreeAccount;
use anchor_lang::prelude::*;
use light_compressed_account::{
compressed_account::{CompressedAccount, CompressedAccountData},
@@ -14,7 +13,7 @@ use crate::{
process_transfer::{
add_data_hash_to_input_compressed_accounts, cpi_execute_compressed_transaction_transfer,
get_input_compressed_accounts_with_merkle_context_and_check_signer,
InputTokenDataWithContext, BATCHED_DISCRIMINATOR,
InputTokenDataWithContext,
},
token_data::{AccountState, TokenData},
FreezeInstruction,
@@ -116,7 +115,6 @@ pub fn create_input_and_output_accounts_freeze_or_thaw<
&mut compressed_input_accounts,
input_token_data.as_slice(),
&hashed_mint,
remaining_accounts,
)?;
Ok((compressed_input_accounts, output_compressed_accounts))
}
@@ -165,18 +163,7 @@ fn create_token_output_accounts<const IS_FROZEN: bool>(
};
token_data.serialize(&mut token_data_bytes)?;

let discriminator_bytes = &remaining_accounts[token_data_with_context
.merkle_context
.merkle_tree_pubkey_index
as usize]
.try_borrow_data()?[0..8];
use anchor_lang::Discriminator;
let data_hash = match discriminator_bytes {
StateMerkleTreeAccount::DISCRIMINATOR => token_data.hash_legacy(),
BATCHED_DISCRIMINATOR => token_data.hash(),
_ => panic!(),
}
.map_err(ProgramError::from)?;
let data_hash = token_data.hash().map_err(ProgramError::from)?;

let data: CompressedAccountData = CompressedAccountData {
discriminator: TOKEN_COMPRESSED_ACCOUNT_DISCRIMINATOR,
@@ -519,7 +506,7 @@ pub mod test_freeze {
let change_data_struct = CompressedAccountData {
discriminator: TOKEN_COMPRESSED_ACCOUNT_DISCRIMINATOR,
data: serialized_expected_token_data.clone(),
data_hash: token_data.hash_legacy().unwrap(),
data_hash: token_data.hash().unwrap(),
};
expected_compressed_output_accounts.push(OutputCompressedAccountWithPackedContext {
compressed_account: CompressedAccount {
@@ -578,7 +565,7 @@
};
let mut data = Vec::new();
token_data.serialize(&mut data).unwrap();
let data_hash = token_data.hash_legacy().unwrap();
let data_hash = token_data.hash().unwrap();
InAccount {
lamports: 0,
address: None,
5 changes: 2 additions & 3 deletions programs/compressed-token/src/process_mint.rs
@@ -122,7 +122,6 @@ pub fn process_mint_to_or_compress<'info, const IS_MINT_TO: bool>(
// We ensure that the Merkle tree account is the first
// remaining account in the cpi to the system program.
&vec![0; amounts.len()],
&[ctx.accounts.merkle_tree.to_account_info()],
)?;
bench_sbf_end!("tm_output_compressed_accounts");

@@ -564,7 +563,7 @@ mod test {
let data = CompressedAccountData {
discriminator: TOKEN_COMPRESSED_ACCOUNT_DISCRIMINATOR,
data: token_data_bytes,
data_hash: token_data.hash_legacy().unwrap(),
data_hash: token_data.hash().unwrap(),
};
let lamports = 0;

@@ -629,7 +628,7 @@ mod test {
let data = CompressedAccountData {
discriminator: TOKEN_COMPRESSED_ACCOUNT_DISCRIMINATOR,
data: token_data_bytes,
data_hash: token_data.hash_legacy().unwrap(),
data_hash: token_data.hash().unwrap(),
};
let lamports = rng.gen_range(0..1_000_000_000_000);

69 changes: 4 additions & 65 deletions programs/compressed-token/src/process_transfer.rs
@@ -1,7 +1,5 @@
use account_compression::{utils::constants::CPI_AUTHORITY_PDA_SEED, StateMerkleTreeAccount};
use anchor_lang::{
prelude::*, solana_program::program_error::ProgramError, AnchorDeserialize, Discriminator,
};
use account_compression::utils::constants::CPI_AUTHORITY_PDA_SEED;
use anchor_lang::{prelude::*, solana_program::program_error::ProgramError, AnchorDeserialize};
use light_compressed_account::{
compressed_account::{CompressedAccount, CompressedAccountData, PackedMerkleContext},
hash_to_bn254_field_size_be,
@@ -128,7 +126,6 @@ pub fn process_transfer<'a, 'b, 'c, 'info: 'b + 'c>(
.iter()
.map(|data| data.merkle_tree_index)
.collect::<Vec<u8>>(),
ctx.remaining_accounts,
)?;
bench_sbf_end!("t_create_output_compressed_accounts");

@@ -138,7 +135,6 @@
&mut compressed_input_accounts,
input_token_data.as_slice(),
&hashed_mint,
ctx.remaining_accounts,
)?;
}
bench_sbf_end!("t_add_token_data_to_input_compressed_accounts");
@@ -177,8 +173,6 @@ pub fn process_transfer<'a, 'b, 'c, 'info: 'b + 'c>(
ctx.remaining_accounts,
)
}
pub const BATCHED_DISCRIMINATOR: &[u8] = b"BatchMta";
pub const OUTPUT_QUEUE_DISCRIMINATOR: &[u8] = b"queueacc";

/// Creates output compressed accounts.
/// Steps:
@@ -197,7 +191,6 @@ pub fn create_output_compressed_accounts(
lamports: Option<Vec<Option<impl ZeroCopyNumTrait>>>,
hashed_mint: &[u8; 32],
merkle_tree_indices: &[u8],
remaining_accounts: &[AccountInfo<'_>],
) -> Result<u64> {
let mut sum_lamports = 0;
let hashed_delegate_store = if let Some(delegate) = delegate {
@@ -242,29 +235,7 @@
let hashed_owner = hash_to_bn254_field_size_be(owner.to_pubkey_bytes().as_slice());

let mut amount_bytes = [0u8; 32];
let discriminator_bytes =
&remaining_accounts[merkle_tree_indices[i] as usize].try_borrow_data()?[0..8];
match discriminator_bytes {
StateMerkleTreeAccount::DISCRIMINATOR => {
amount_bytes[24..].copy_from_slice(amount.to_bytes_le().as_slice());
Ok(())
}
BATCHED_DISCRIMINATOR => {
amount_bytes[24..].copy_from_slice(amount.to_bytes_be().as_slice());
Ok(())
}
OUTPUT_QUEUE_DISCRIMINATOR => {
amount_bytes[24..].copy_from_slice(amount.to_bytes_be().as_slice());
Ok(())
}
_ => {
msg!(
"{} is no Merkle tree or output queue account. ",
remaining_accounts[merkle_tree_indices[i] as usize].key()
);
err!(anchor_lang::error::ErrorCode::AccountDiscriminatorMismatch)
}
}?;
amount_bytes[24..].copy_from_slice(amount.to_bytes_le().as_slice());

let data_hash = TokenData::hash_with_hashed_values(
hashed_mint,
@@ -306,7 +277,6 @@ pub fn add_data_hash_to_input_compressed_accounts<const FROZEN_INPUTS: bool>(
input_compressed_accounts_with_merkle_context: &mut [InAccount],
input_token_data: &[TokenData],
hashed_mint: &[u8; 32],
remaining_accounts: &[AccountInfo<'_>],
) -> Result<()> {
for (i, compressed_account_with_context) in input_compressed_accounts_with_merkle_context
.iter_mut()
@@ -315,38 +285,7 @@ pub fn add_data_hash_to_input_compressed_accounts<const FROZEN_INPUTS: bool>(
let hashed_owner = hash_to_bn254_field_size_be(&input_token_data[i].owner.to_bytes());

let mut amount_bytes = [0u8; 32];
let discriminator_bytes = &remaining_accounts[compressed_account_with_context
.merkle_context
.merkle_tree_pubkey_index
as usize]
.try_borrow_data()?[0..8];
match discriminator_bytes {
StateMerkleTreeAccount::DISCRIMINATOR => {
amount_bytes[24..]
.copy_from_slice(input_token_data[i].amount.to_le_bytes().as_slice());
Ok(())
}
BATCHED_DISCRIMINATOR => {
amount_bytes[24..]
.copy_from_slice(input_token_data[i].amount.to_be_bytes().as_slice());
Ok(())
}
OUTPUT_QUEUE_DISCRIMINATOR => {
amount_bytes[24..]
.copy_from_slice(input_token_data[i].amount.to_be_bytes().as_slice());
Ok(())
}
_ => {
msg!(
"{} is no Merkle tree or output queue account. ",
remaining_accounts[compressed_account_with_context
.merkle_context
.merkle_tree_pubkey_index as usize]
.key()
);
err!(anchor_lang::error::ErrorCode::AccountDiscriminatorMismatch)
}
}?;
amount_bytes[24..].copy_from_slice(input_token_data[i].amount.to_le_bytes().as_slice());
let delegate_store;
let hashed_delegate = if let Some(delegate) = input_token_data[i].delegate {
delegate_store = hash_to_bn254_field_size_be(&delegate.to_bytes());