
fix: resolve merge conflict with master; clean up token visibility and check_op#57

Merged
ashyanSpada merged 4 commits into master from bolt/optimize-token-comparison-9775148041958261230
Apr 8, 2026

Conversation

@ashyanSpada (Owner) commented Apr 5, 2026

Branch had diverged from origin/master after consolidation PR #58 landed overlapping changes to src/token.rs, causing a merge conflict.

Conflict resolutions

  • as_str() visibility: changed from pub to pub(crate) — avoids unnecessary public API expansion
  • string() attribute: removed #[cfg(not(tarpaulin_include))] — was flagged by clippy as an unknown cfg condition
  • check_op: preserved this PR's Comma/Semicolon matching (unique contribution not in master), rewritten in master's concise one-expression-per-arm style:
```rust
pub fn check_op(token: Token, expected: &str) -> bool {
    match token {
        Token::Delim(op, _) => op.as_str() == expected,
        Token::Operator(op, _) => op == expected,
        Token::Comma(c, _) => c == expected,
        Token::Semicolon(s, _) => s == expected,
        _ => false,
    }
}
```
  • Tests: merged both sets — master's comprehensive test_check_op table + test_delim_token_type_as_str + close-delim tests, plus this branch's test_check_op_comma, test_check_op_semicolon, test_is_question_mark, test_is_colon, test_is_eof
  • src/context.rs, src/tokenizer.rs, src/value.rs from master applied cleanly with no conflicts

…token matching by introducing a zero-allocation `as_str()` method on `DelimTokenType` and using string references (`&str`) during tokenizer checks. This prevents creating new `String` instances whenever matching a parenthesis, bracket, or brace in the parser.

Also centralized check logic for Comma and Semicolon into `check_op`.

Co-authored-by: ashyanSpada <22587148+ashyanSpada@users.noreply.github.com>
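The allocation saving described in the commit message can be sketched with a stripped-down stand-in for the real type — a minimal sketch only, assuming a reduced variant set (the actual `DelimTokenType` in `src/token.rs` has more variants):

```rust
// Simplified stand-in for DelimTokenType from src/token.rs (reduced variant set).
#[derive(Clone, Copy)]
enum DelimTokenType {
    OpenParen,
    CloseParen,
    Unknown,
}

impl DelimTokenType {
    // Zero-allocation lookup: returns a &'static str instead of building a String,
    // so comparing against a literal like "(" allocates nothing.
    fn as_str(&self) -> &'static str {
        match self {
            DelimTokenType::OpenParen => "(",
            DelimTokenType::CloseParen => ")",
            DelimTokenType::Unknown => "??",
        }
    }

    // The old allocating API, now expressible in terms of as_str();
    // every call still allocates a fresh String.
    fn string(&self) -> String {
        self.as_str().to_string()
    }
}

fn main() {
    let t = DelimTokenType::OpenParen;
    // Comparison against a &str literal needs no allocation at all.
    assert_eq!(t.as_str(), "(");
    assert_eq!(DelimTokenType::Unknown.string(), "??");
    println!("zero-allocation comparison ok");
}
```

The key design point is the `&'static str` return type: the delimiter strings are literals baked into the binary, so no heap traffic occurs on the tokenizer hot path.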
@google-labs-jules
Contributor

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!

New to Jules? Learn more at jules.google/docs.


For security, I will only act on instructions from the user who triggered this task.

Copilot AI review requested due to automatic review settings April 5, 2026 21:30
@codecov

codecov bot commented Apr 5, 2026

Codecov Report

❌ Patch coverage is 92.85714% with 1 line in your changes missing coverage. Please review.
✅ Project coverage is 89.67%. Comparing base (9a4a6cc) to head (461ba25).

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/token.rs | 92.30% | 1 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##           master      #57      +/-   ##
==========================================
+ Coverage   89.55%   89.67%   +0.11%     
==========================================
  Files          11       11              
  Lines        1063     1065       +2     
==========================================
+ Hits          952      955       +3     
+ Misses        111      110       -1     

☔ View full report in Codecov by Sentry.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request refactors DelimTokenType to use a zero-allocation as_str method and updates the check_op function to support Comma and Semicolon tokens. A critical logic bug was identified in the expect function within src/tokenizer.rs, where a token mismatch could incorrectly result in a successful return; it is recommended to refactor this function to use check_op for better consistency and correctness.

Comment on lines 146 to 151
```diff
 match token {
     Token::Delim(bracket, _) => {
-        if bracket.string() == op {
+        if bracket.as_str() == op {
             return Ok(());
         }
     }
```


critical

The expect function contains a logic bug. If the token variant matches (e.g., Token::Delim) but the inner value does not match the expected op, the function currently falls through the match and returns Ok(()) at the end (line 166). This causes the parser to incorrectly assume the expectation was met when it actually failed.

Additionally, expect is missing a case for Token::Semicolon, which was added to check_op in this PR. To ensure consistency and correctness, consider refactoring expect to leverage the check_op function from src/token.rs.
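A minimal sketch of the refactor suggested here, assuming simplified stand-ins for `Token`, `check_op`, and the error type (the real definitions live in src/token.rs and src/tokenizer.rs, and the error message below is illustrative only):

```rust
// Simplified stand-in for the real Token enum in src/token.rs.
#[derive(Clone)]
enum Token {
    Operator(String, usize),
    Comma(String, usize),
    Semicolon(String, usize),
    Eof,
}

// Mirrors the shape of check_op from src/token.rs (reduced variant set).
fn check_op(token: Token, expected: &str) -> bool {
    match token {
        Token::Operator(op, _) => op == expected,
        Token::Comma(c, _) => c == expected,
        Token::Semicolon(s, _) => s == expected,
        Token::Eof => false,
    }
}

// expect delegates to check_op, so a mismatch produces an error instead of
// falling through and returning Ok(()) — the bug the review describes.
fn expect(token: Token, op: &str) -> Result<(), String> {
    if check_op(token, op) {
        Ok(())
    } else {
        // Stands in for the reviewer's Err(Error::ExpectedOpNotExist(op.to_string())).
        Err(format!("expected op `{}` not found", op))
    }
}

fn main() {
    assert!(expect(Token::Comma(",".to_string(), 0), ",").is_ok());
    // A mismatch now fails instead of silently succeeding.
    assert!(expect(Token::Operator("+".to_string(), 0), "-").is_err());
    println!("expect rejects mismatches");
}
```

Routing all op-like variants (including `Semicolon`) through one predicate keeps `expect` and `check_op` from drifting apart again.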


Copilot AI left a comment


Pull request overview

This PR targets tokenizer/parser hot paths by avoiding heap allocations when comparing delimiter tokens, introducing a zero-allocation DelimTokenType::as_str() and switching delimiter comparisons to use it.

Changes:

  • Added DelimTokenType::as_str() -> &'static str and reimplemented string() in terms of it.
  • Updated delimiter comparisons in Tokenizer::expect and check_op to use as_str() instead of allocating via string().
  • Expanded check_op to also treat Token::Comma and Token::Semicolon as comparable op tokens.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

| File | Description |
|---|---|
| src/tokenizer.rs | Switches delimiter comparison in expect() to DelimTokenType::as_str() to avoid allocations. |
| src/token.rs | Adds as_str() for delimiter tokens, updates check_op to avoid allocations, and broadens check_op matching to comma/semicolon. |


Comment on lines 145 to 149
```diff
         self.next()?;
         match token {
             Token::Delim(bracket, _) => {
-                if bracket.string() == op {
+                if bracket.as_str() == op {
                     return Ok(());
```

Copilot AI Apr 5, 2026


In Tokenizer::expect, when the token is a Delim (and similarly for other op-like tokens), a mismatch with op does not produce an error; it falls through and the function ultimately returns Ok(()) after advancing. This can silently accept invalid syntax. Consider returning Err(Error::ExpectedOpNotExist(op.to_string())) (or similar) whenever the token value doesn't equal op.

Copilot uses AI. Check for mistakes.
src/token.rs Outdated
Comment on lines +55 to +59
```diff
 pub fn as_str(&self) -> &'static str {
     use DelimTokenType::*;
     match self {
-        OpenParen => "(".to_string(),
-        CloseParen => ")".to_string(),
-        OpenBracket => "[".to_string(),
-        CloseBracket => "]".to_string(),
-        OpenBrace => "{".to_string(),
-        CloseBrace => "}".to_string(),
-        Unknown => "??".to_string(),
+        OpenParen => "(",
+        CloseParen => ")",
```

Copilot AI Apr 5, 2026


New DelimTokenType::as_str() and the expanded check_op behavior (now matching Comma/Semicolon) are not covered by unit tests in this module. Adding small table-driven tests for as_str() outputs and check_op on comma/semicolon would help prevent regressions in this tokenizer hot path.
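A table-driven test along the lines suggested here might look like the following — a sketch only, assuming a reduced `DelimTokenType` (the real enum has more variants, and the same table pattern applies to `check_op` on comma/semicolon):

```rust
// Reduced stand-in for DelimTokenType from src/token.rs, for illustration.
#[derive(Clone, Copy)]
enum DelimTokenType {
    OpenParen,
    CloseParen,
    OpenBracket,
    CloseBracket,
}

impl DelimTokenType {
    fn as_str(&self) -> &'static str {
        match self {
            DelimTokenType::OpenParen => "(",
            DelimTokenType::CloseParen => ")",
            DelimTokenType::OpenBracket => "[",
            DelimTokenType::CloseBracket => "]",
        }
    }
}

fn main() {
    // Table of (input, expected) pairs, checked in one loop; adding a
    // variant later only requires adding a row, not a new test body.
    let cases = [
        (DelimTokenType::OpenParen, "("),
        (DelimTokenType::CloseParen, ")"),
        (DelimTokenType::OpenBracket, "["),
        (DelimTokenType::CloseBracket, "]"),
    ];
    for (ty, want) in cases {
        assert_eq!(ty.as_str(), want);
    }
    println!("all as_str cases pass");
}
```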

google-labs-jules bot and others added 2 commits April 5, 2026 21:38
Co-authored-by: ashyanSpada <22587148+ashyanSpada@users.noreply.github.com>
Co-authored-by: ashyanSpada <22587148+ashyanSpada@users.noreply.github.com>
src/token.rs Outdated
```rust
    }
}

#[cfg(not(tarpaulin_include))]
```
@ashyanSpada
Owner Author

@copilot fix merge conflict

…-comparison-9775148041958261230

# Conflicts:
#	src/token.rs

Co-authored-by: ashyanSpada <22587148+ashyanSpada@users.noreply.github.com>
Contributor

Copilot AI commented Apr 8, 2026

> @copilot fix merge conflict

Merged origin/master into the branch and resolved the conflict in src/token.rs (commit 5ea03d2):

  • Used pub(crate) for as_str() (avoids expanding the public API)
  • Removed #[cfg(not(tarpaulin_include))] from string() (fixes the clippy code-scanning warning)
  • Kept check_op extended with Comma/Semicolon support using master's cleaner expression style
  • Combined both test sets: master's comprehensive test_check_op / test_delim_token_type_as_str / close-delim tests, plus our new test_check_op_comma, test_check_op_semicolon, and token predicate tests

All 237 tests pass.

Copilot AI changed the title from "⚡ Bolt: [performance improvement] eliminate token string allocations" to "fix: resolve merge conflict with master; clean up token visibility and check_op" Apr 8, 2026
@ashyanSpada ashyanSpada merged commit 257b200 into master Apr 8, 2026
4 participants