Conversation
## Reviewer's Guide

This PR refactors the `parse_utils` module by extracting parsing helpers into dedicated submodules (`type_parsing`, `delimiter`, `errors`, `token_utils`), adjusting function visibility and imports, cleaning up redundant comments, introducing enhanced parameter parsing with error handling, and updating tests to use clippy `expect` annotations.

### Class diagram for the refactored `parse_utils` module structure

```mermaid
classDiagram
    class parse_utils {
        <<module>>
    }
    class delimiter {
        +extract_parenthesized()
        +paren_block_span()
    }
    class errors {
        +Delim
        +DelimStack
        +DelimiterError
        +ParseError
    }
    class token_utils {
        <<module>>
    }
    class type_parsing {
        +parse_name_type_pairs()
        +parse_output_list()
        +parse_type_after_colon()
        +parse_type_expr()
    }
    parse_utils o-- delimiter
    parse_utils o-- errors
    parse_utils o-- token_utils
    parse_utils o-- type_parsing
    delimiter ..> errors : uses
    type_parsing ..> errors : uses
    type_parsing ..> delimiter : uses
    delimiter ..> token_utils : uses (via lexer_helpers)
```
### Class diagram for the new error handling types in `errors.rs`

```mermaid
classDiagram
    class Delim {
        <<enum>>
        Paren
        Angle
        Bracket
        Brace
    }
    class DelimStack {
        +open()
        +close()
        +is_empty()
        -Vec~Delim~
    }
    class DelimiterError {
        +expected: Delim
        +found: SyntaxKind
        +span: TextRange
    }
    class ParseError {
        <<enum>>
        Delimiter(DelimiterError)
        UnclosedDelimiter
        MissingColon
        MissingName
        MissingType
    }
    DelimStack *-- Delim
    DelimiterError *-- Delim
    ParseError *-- DelimiterError
```
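Read as code, the diagram above corresponds roughly to the following self-contained sketch. This is an illustrative model only: it drops the rowan-specific `SyntaxKind`/`TextRange` fields and the `DelimiterError`/`ParseError` wrappers, keeping just the `Delim` enum and the open/close semantics the `DelimStack` method names imply.

```rust
/// Kinds of delimiter tracked by the parser (mirrors the diagram's enum).
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum Delim {
    Paren,
    Angle,
    Bracket,
    Brace,
}

/// Stack of currently open delimiters.
#[derive(Default, Debug)]
pub struct DelimStack(Vec<Delim>);

impl DelimStack {
    /// Push `count` copies of `delim` (e.g. a `<<` token opens two angles).
    pub fn open(&mut self, delim: Delim, count: usize) {
        for _ in 0..count {
            self.0.push(delim);
        }
    }

    /// Pop up to `count` matching delimiters, stopping early at a mismatch.
    /// Returns how many were actually closed.
    pub fn close(&mut self, delim: Delim, count: usize) -> usize {
        let mut closed = 0;
        while closed < count {
            match self.0.last() {
                Some(&top) if top == delim => {
                    self.0.pop();
                    closed += 1;
                }
                _ => break,
            }
        }
        closed
    }

    pub fn is_empty(&self) -> bool {
        self.0.is_empty()
    }
}

fn main() {
    let mut stack = DelimStack::default();
    stack.open(Delim::Angle, 2); // a `<<` token
    assert_eq!(stack.close(Delim::Angle, 1), 1); // one `>` closes one level
    assert_eq!(stack.close(Delim::Paren, 1), 0); // mismatch: nothing closed
    assert!(!stack.is_empty());
    println!("remaining open: {:?}", stack);
}
```

The `count` parameter is what lets a single shift token (`<<` or `>>`) open or close two angle-bracket levels at once.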
File-Level Changes
Possibly linked issues
## Summary by CodeRabbit

### Walkthrough

Introduce new modules for delimiter and error handling utilities within the parser, centralise delimiter-aware parsing logic, and refactor type parsing to delegate delimiter and error management to shared helpers. Update module exports and visibility to enable reuse across AST parsing components, and remove redundant local implementations. Modify error handling in relation key extraction and preserve partial parsing output on errors in syntax utilities.

### Changes
### Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Parser
    participant DelimiterUtils
    participant TokenUtils
    participant ErrorUtils
    Parser->>DelimiterUtils: extract_parenthesized(iter, open, close)
    DelimiterUtils->>TokenUtils: advance_to_delimiter
    DelimiterUtils->>DelimiterUtils: collect_balanced_content (track nesting)
    DelimiterUtils->>TokenUtils: extract_element_text
    DelimiterUtils-->>Parser: Option<String> (enclosed content)
    Parser->>ErrorUtils: Report delimiter or parse errors as needed
```
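The depth-tracking loop behind `extract_parenthesized` and `collect_balanced_content` can be sketched on a plain character stream. This is a simplified stand-in: the real functions iterate `SyntaxElement<DdlogLanguage>` values and compare `SyntaxKind`s, whereas this version matches `(` and `)` characters directly.

```rust
/// Returns the text between the first balanced `(`...`)` pair,
/// or None when the stream ends before the delimiter closes.
fn extract_parenthesized(input: &str) -> Option<String> {
    let mut chars = input.chars();
    // Skip until the first opening delimiter.
    for c in chars.by_ref() {
        if c == '(' {
            break;
        }
    }
    let mut depth = 1usize;
    let mut buf = String::new();
    for c in chars {
        match c {
            '(' => {
                depth += 1;
                buf.push(c);
            }
            ')' => {
                depth -= 1;
                if depth == 0 {
                    return Some(buf);
                }
                buf.push(c);
            }
            _ => buf.push(c),
        }
    }
    None // unclosed delimiter
}

fn main() {
    assert_eq!(
        extract_parenthesized("f(a, (b), c) rest"),
        Some("a, (b), c".to_string())
    );
    assert_eq!(extract_parenthesized("f(a, (b"), None);
    println!("ok");
}
```

Note how the nested `(b)` pair is retained verbatim in the output: only the outermost pair is stripped, which is the behaviour the sequence diagram's "track nesting" step refers to.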
Possibly related PRs
Code Health Improved
(1 file improves in Code Health)
Gates Failed
Enforce critical code health rules
(1 file with Deep, Nested Complexity)
Enforce advisory code health rules
(1 file with Excess Number of Function Arguments)
Gates Passed
4 Quality Gates Passed
See analysis details in CodeScene
Reason for failure
| Enforce critical code health rules | Violations | Code Health Impact | |
|---|---|---|---|
| delimiter.rs | 1 critical rule | 10.00 → 9.39 | Suppress |
| Enforce advisory code health rules | Violations | Code Health Impact | |
|---|---|---|---|
| token_utils.rs | 1 advisory rule | 10.00 → 9.69 | Suppress |
View Improvements
| File | Code Health Impact | Categories Improved |
|---|---|---|
| type_parsing.rs | 6.00 → 7.18 | Low Cohesion, Overall Code Complexity, Deep, Nested Complexity, Excess Number of Function Arguments |
Quality Gate Profile: Pay Down Tech Debt
Want more control? Customize Code Health rules or catch issues early with our IDE extension and CLI tool.
Hey @leynos - I've reviewed your changes and they look great!
Prompt for AI Agents
Please address the comments from this code review:
## Individual Comments
### Comment 1
<location> `src/parser/ast/parse_utils/delimiter.rs:39` </location>
<code_context>
+ None
+}
+
+fn collect_balanced_content<I>(
+ iter: &mut std::iter::Peekable<I>,
+ open_kind: SyntaxKind,
</code_context>
<issue_to_address>
collect_balanced_content returns None if the end of the iterator is reached without closing.
Consider returning the partially collected buffer or emitting an error for unclosed delimiters to improve diagnostics.
Suggested implementation:
```rust
#[derive(Debug)]
pub struct UnclosedDelimiterError {
pub collected: String,
pub expected: SyntaxKind,
}
fn collect_balanced_content<I>(
iter: &mut std::iter::Peekable<I>,
open_kind: SyntaxKind,
close_kind: SyntaxKind,
) -> Result<String, UnclosedDelimiterError>
where
I: Iterator<Item = SyntaxElement<DdlogLanguage>>,
{
let mut buffer = String::new();
let mut depth = 1;
while let Some(e) = iter.next() {
if e.kind() == open_kind {
depth += 1;
} else if e.kind() == close_kind {
depth -= 1;
if depth == 0 {
return Ok(buffer);
}
}
buffer.push_str(&e.text());
}
Err(UnclosedDelimiterError {
collected: buffer,
expected: close_kind,
})
}
```
```rust
fn advance_to_delimiter<I>(iter: &mut std::iter::Peekable<I>, open_kind: SyntaxKind) -> Option<()>
```
- Update all call sites of `collect_balanced_content` to handle the new `Result<String, UnclosedDelimiterError>` return type instead of `Option<String>`.
- If you want to emit diagnostics, you can use the `UnclosedDelimiterError` struct to provide more context.
- You may want to move the `UnclosedDelimiterError` definition to a common error module if it will be reused elsewhere.
</issue_to_address>
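The suggested `Result`-returning shape can be demonstrated in isolation with a character stream standing in for the token iterator. The error struct mirrors the suggested `UnclosedDelimiterError`, with `expected` simplified from a `SyntaxKind` to a `char`; the bodies are illustrative, not the PR's actual code.

```rust
#[derive(Debug, PartialEq)]
struct UnclosedDelimiterError {
    collected: String, // partial content gathered before the stream ended
    expected: char,    // the delimiter we were still waiting for
}

/// Collect text until the matching close delimiter; assumes the opener
/// has already been consumed (depth starts at 1, as in the suggestion).
fn collect_balanced_content(
    iter: &mut impl Iterator<Item = char>,
    open: char,
    close: char,
) -> Result<String, UnclosedDelimiterError> {
    let mut buffer = String::new();
    let mut depth = 1usize;
    for c in iter {
        if c == open {
            depth += 1;
        } else if c == close {
            depth -= 1;
            if depth == 0 {
                return Ok(buffer);
            }
        }
        buffer.push(c);
    }
    Err(UnclosedDelimiterError {
        collected: buffer,
        expected: close,
    })
}

fn main() {
    let mut ok = "a<b>c>rest".chars();
    assert_eq!(collect_balanced_content(&mut ok, '<', '>').unwrap(), "a<b>c");

    let mut bad = "a<b".chars();
    let err = collect_balanced_content(&mut bad, '<', '>').unwrap_err();
    assert_eq!(err.collected, "a<b"); // partial output preserved for diagnostics
    println!("ok");
}
```

The key difference from the `Option<String>` version is the error branch: the caller still learns that parsing failed, but the partially collected text survives for error reporting instead of being discarded.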
### Comment 2
<location> `src/parser/ast/parse_utils/delimiter.rs:15` </location>
<code_context>
+}
+
+#[must_use]
+pub fn extract_parenthesized<I>(
+ iter: &mut std::iter::Peekable<I>,
+ open_kind: SyntaxKind,
</code_context>
<issue_to_address>
Consider collapsing the multiple small helper functions into a single loop within extract_parenthesized to streamline the logic.
```suggestion
// You can collapse `advance_to_delimiter`, `collect_balanced_content`,
// `handle_*`, and `extract_element_text` into one small loop and remove
// the extra helpers. This keeps the same behavior with less boilerplate:
pub fn extract_parenthesized<I>(
iter: &mut std::iter::Peekable<I>,
open_kind: SyntaxKind,
close_kind: SyntaxKind,
) -> Option<String>
where
I: Iterator<Item = SyntaxElement<DdlogLanguage>>,
{
// drop until the first opening delimiter
let mut depth = 0;
let mut buf = String::new();
while let Some(elem) = iter.next() {
if elem.kind() == open_kind {
// found the opener
buf.push_str(elem.text()); // .text() works on both Token & Node
depth = 1;
break;
}
}
if depth == 0 {
return None;
}
// collect until depth goes back to zero
while let Some(elem) = iter.next() {
let txt = elem.text();
match elem.kind() {
k if k == open_kind => {
depth += 1;
buf.push_str(txt);
}
k if k == close_kind => {
depth -= 1;
buf.push_str(txt);
if depth == 0 {
return Some(buf);
}
}
_ => buf.push_str(txt),
}
}
None
}
// Now you can delete:
// - advance_to_delimiter
// - collect_balanced_content
// - handle_opening_delimiter
// - handle_closing_delimiter
// - extract_element_text
```
This keeps the same one‐pass depth counting and string accumulation, but collapses all the tiny helpers into a single focused function.
</issue_to_address>
Actionable comments posted: 14
🔭 Outside diff range comments (5)
src/parser/ast/parse_utils/type_parsing.rs (5)
**11-73: Add function documentation**

Document this key parsing function with examples:

```diff
+/// Parses parameter name-type pairs from a token stream.
+///
+/// Extracts `(name: type, ...)` patterns from function or relation declarations,
+/// handling nested type expressions with balanced delimiters.
+///
+/// # Returns
+/// A tuple of successfully parsed name-type pairs and any parse errors encountered.
+///
+/// # Example
+/// ```
+/// // For input "(x: u32, y: Vec<string>)", returns:
+/// // (vec![("x", "u32"), ("y", "Vec<string>")], vec![])
+/// ```
 pub(crate) fn parse_name_type_pairs<I>(iter: I) -> (Vec<(String, String)>, Vec<ParseError>)
```
**90-118: Add function documentation**

Document the output list parsing function:

```diff
+/// Parses a comma-separated list of identifiers after a colon.
+///
+/// Used to parse output declarations like `: foo, bar, baz`.
+///
+/// # Example
+/// ```
+/// // For input "function f(): a, b", returns vec!["a", "b"]
+/// ```
 pub(crate) fn parse_output_list<I>(iter: I) -> Vec<String>
```
**120-216: Add function documentation**

Document the type expression parser:

```diff
+/// Parses a type expression with balanced delimiters.
+///
+/// Handles complex nested types like `Vec<Map<string, Vec<u8>>>`,
+/// tracking delimiter balance and generating errors for mismatches.
+///
+/// # Returns
+/// A tuple of the parsed type string and any delimiter errors.
 pub(crate) fn parse_type_expr<I>(iter: &mut std::iter::Peekable<I>) -> (String, Vec<ParseError>)
```
**218-251: Add function documentation**

Document the return type parsing function:

```diff
+/// Parses a type annotation after a colon in function signatures.
+///
+/// Extracts the return type from declarations like `function f(): bool {}`.
+/// Stops at braces, semicolons, or newlines.
+///
+/// # Returns
+/// The parsed type string, or None if no type is specified.
 pub(crate) fn parse_type_after_colon<I>(iter: &mut std::iter::Peekable<I>) -> Option<String>
```
**202-213: Improve error handling for unclosed delimiters**

The span for unclosed delimiters is always empty at position 0, which provides poor error location information. Track the position of opened delimiters to provide accurate error spans:

```diff
-errors.push(ParseError::UnclosedDelimiter {
-    delimiter: ch,
-    span: TextRange::empty(0.into()),
-});
+// TODO: Track delimiter positions in DelimStack to provide accurate spans
+errors.push(ParseError::UnclosedDelimiter {
+    delimiter: ch,
+    span: TextRange::empty(0.into()), // FIXME: Use actual delimiter position
+});
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
📒 Files selected for processing (5)
- `src/parser/ast/parse_utils/delimiter.rs` (1 hunks)
- `src/parser/ast/parse_utils/errors.rs` (1 hunks)
- `src/parser/ast/parse_utils/mod.rs` (1 hunks)
- `src/parser/ast/parse_utils/token_utils.rs` (1 hunks)
- `src/parser/ast/parse_utils/type_parsing.rs` (7 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
**/*.rs
Instructions used from:
Sources:
📄 CodeRabbit Inference Engine
- AGENTS.md
⚙️ CodeRabbit Configuration File
🧬 Code Graph Analysis (1)
src/parser/ast/parse_utils/delimiter.rs (1)
src/parser/lexer_helpers.rs (1)
balanced_block(187-192)
🔇 Additional comments (3)
src/parser/ast/parse_utils/mod.rs (1)

**1-18: Well-structured module with clear documentation**

The module is properly documented with a `//!` comment explaining its purpose, and the re-exports are appropriately scoped. The organization into focused submodules aligns well with the PR objectives.

src/parser/ast/parse_utils/delimiter.rs (2)
**27-37: Document internal helper functions**

Add brief documentation for clarity:

```diff
+/// Advances the iterator until the specified opening delimiter is found.
 fn advance_to_delimiter<I>(iter: &mut std::iter::Peekable<I>, open_kind: SyntaxKind) -> Option<()>
```

Likely an incorrect or invalid review comment.
**39-62: Document helper function logic**

Add documentation explaining the balanced content collection:

```diff
+/// Collects text between balanced delimiters, tracking nesting depth.
+///
+/// Starts with depth 1 (after opening delimiter) and returns when depth
+/// reaches 0 (matching closing delimiter found).
 fn collect_balanced_content<I>(
```

Likely an incorrect or invalid review comment.
```rust
pub(crate) fn open_and_push(
    token: &rowan::SyntaxToken<DdlogLanguage>,
    buf: &mut String,
    stack: &mut DelimStack,
    delim: Delim,
    count: usize,
) {
    stack.open(delim, count);
    buf.push_str(token.text());
}
```
🛠️ Refactor suggestion
Add function documentation
Document public APIs using Rustdoc comments.
Add documentation:

```diff
+/// Opens a delimiter and pushes its token text to the buffer.
+///
+/// Updates the delimiter stack to track the opened delimiter(s) and
+/// appends the token's text representation to the output buffer.
 pub(crate) fn open_and_push(
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```rust
/// Opens a delimiter and pushes its token text to the buffer.
///
/// Updates the delimiter stack to track the opened delimiter(s) and
/// appends the token's text representation to the output buffer.
pub(crate) fn open_and_push(
    token: &rowan::SyntaxToken<DdlogLanguage>,
    buf: &mut String,
    stack: &mut DelimStack,
    delim: Delim,
    count: usize,
) {
    stack.open(delim, count);
    buf.push_str(token.text());
}
```
🤖 Prompt for AI Agents
In src/parser/ast/parse_utils/token_utils.rs around lines 5 to 14, the public
function open_and_push lacks Rustdoc comments. Add a Rustdoc comment above the
function describing its purpose, parameters, and behavior to improve code
clarity and maintainability.
```rust
pub(crate) fn close_and_push(
    token: &rowan::SyntaxToken<DdlogLanguage>,
    buf: &mut String,
    stack: &mut DelimStack,
    delim: Delim,
    count: usize,
) -> usize {
    let closed = stack.close(delim, count);
    buf.push_str(token.text());
    closed
}
```
🛠️ Refactor suggestion
Add function documentation
Document the return value behaviour.
Add documentation:

```diff
+/// Closes a delimiter and pushes its token text to the buffer.
+///
+/// Attempts to close the specified number of delimiters on the stack.
+/// Returns the actual number of delimiters successfully closed.
 pub(crate) fn close_and_push(
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```rust
/// Closes a delimiter and pushes its token text to the buffer.
///
/// Attempts to close the specified number of delimiters on the stack.
/// Returns the actual number of delimiters successfully closed.
pub(crate) fn close_and_push(
    token: &rowan::SyntaxToken<DdlogLanguage>,
    buf: &mut String,
    stack: &mut DelimStack,
    delim: Delim,
    count: usize,
) -> usize {
    let closed = stack.close(delim, count);
    buf.push_str(token.text());
    closed
}
```
🤖 Prompt for AI Agents
In src/parser/ast/parse_utils/token_utils.rs around lines 16 to 26, the function
close_and_push lacks documentation describing its return value. Add a doc
comment above the function explaining what the function does and specifically
detail the meaning and behavior of the returned usize value to clarify its
purpose for future readers.
Add module-level documentation to newly created submodules and document the delimiter stack is_empty helper. Refactor collect_balanced_content to delegate token handling for lower nesting.
Code Health Improved
(1 file improves in Code Health)
Gates Failed
Enforce advisory code health rules
(2 files with Excess Number of Function Arguments)
Gates Passed
5 Quality Gates Passed
See analysis details in CodeScene
Reason for failure
| Enforce advisory code health rules | Violations | Code Health Impact | |
|---|---|---|---|
| delimiter.rs | 1 advisory rule | 10.00 → 9.69 | Suppress |
| token_utils.rs | 1 advisory rule | 10.00 → 9.69 | Suppress |
View Improvements
| File | Code Health Impact | Categories Improved |
|---|---|---|
| type_parsing.rs | 6.00 → 7.18 | Low Cohesion, Overall Code Complexity, Deep, Nested Complexity, Excess Number of Function Arguments |
Quality Gate Profile: Pay Down Tech Debt
Want more control? Customize Code Health rules or catch issues early with our IDE extension and CLI tool.
Code Health Improved
(1 file improves in Code Health)
Gates Failed
Enforce critical code health rules
(1 file with Bumpy Road Ahead, Deep, Nested Complexity)
Enforce advisory code health rules
(1 file with Complex Method)
Gates Passed
4 Quality Gates Passed
See analysis details in CodeScene
Reason for failure
| Enforce critical code health rules | Violations | Code Health Impact | |
|---|---|---|---|
| delimiter.rs | 2 critical rules | 10.00 → 9.24 | Suppress |
| Enforce advisory code health rules | Violations | Code Health Impact | |
|---|---|---|---|
| type_parsing.rs | 1 advisory rule | 6.00 → 7.18 | Suppress |
View Improvements
| File | Code Health Impact | Categories Improved |
|---|---|---|
| type_parsing.rs | 6.00 → 7.18 | Low Cohesion, Overall Code Complexity, Deep, Nested Complexity, Excess Number of Function Arguments |
Quality Gate Profile: Pay Down Tech Debt
Want more control? Customize Code Health rules or catch issues early with our IDE extension and CLI tool.
Code Health Improved
(1 file improves in Code Health)
Gates Failed
Enforce critical code health rules
(1 file with Bumpy Road Ahead, Deep, Nested Complexity)
Enforce advisory code health rules
(1 file with Complex Method)
Gates Passed
4 Quality Gates Passed
See analysis details in CodeScene
Reason for failure
| Enforce critical code health rules | Violations | Code Health Impact | |
|---|---|---|---|
| delimiter.rs | 2 critical rules | 10.00 → 9.24 | Suppress |
| Enforce advisory code health rules | Violations | Code Health Impact | |
|---|---|---|---|
| type_parsing.rs | 1 advisory rule | 6.00 → 7.14 | Suppress |
View Improvements
| File | Code Health Impact | Categories Improved |
|---|---|---|
| type_parsing.rs | 6.00 → 7.14 | Low Cohesion, Overall Code Complexity, Deep, Nested Complexity, Excess Number of Function Arguments |
Quality Gate Profile: Pay Down Tech Debt
Want more control? Customize Code Health rules or catch issues early with our IDE extension and CLI tool.
Actionable comments posted: 1
🔭 Outside diff range comments (1)
src/parser/ast/parse_utils/type_parsing.rs (1)
**126-232: Address increasing complexity in `parse_type_expr`.**

Whilst the refactoring successfully delegates to utility modules, the function remains overly complex with high cyclomatic complexity. Extract meaningful helper functions for the different delimiter handling branches to reduce complexity and improve maintainability.

Extract delimiter-specific handlers:

```diff
+fn handle_paren_delimiters<I>(
+    token: &rowan::SyntaxToken<DdlogLanguage>,
+    ctx: &mut TokenParseContext,
+    iter: &mut std::iter::Peekable<I>,
+    is_opening: bool,
+) -> bool
+where
+    I: Iterator<Item = SyntaxElement<DdlogLanguage>>,
+{
+    if is_opening {
+        open_delimiter(&mut *ctx.stack, Delim::Paren, 1);
+        push(token, ctx);
+        iter.next();
+        false
+    } else {
+        let should_break = ctx.stack.close(Delim::Paren, 1) == 0;
+        push(token, ctx);
+        iter.next();
+        should_break
+    }
+}
+
+fn handle_angle_delimiters<I>(
+    token: &rowan::SyntaxToken<DdlogLanguage>,
+    ctx: &mut TokenParseContext,
+    iter: &mut std::iter::Peekable<I>,
+    errors: &mut Vec<ParseError>,
+    kind: SyntaxKind,
+) where
+    I: Iterator<Item = SyntaxElement<DdlogLanguage>>,
+{
+    match kind {
+        SyntaxKind::T_LT => open_delimiter(&mut *ctx.stack, Delim::Angle, 1),
+        SyntaxKind::T_SHL => open_delimiter(&mut *ctx.stack, Delim::Angle, 2),
+        SyntaxKind::T_GT => {
+            if close_delimiter(&mut *ctx.stack, Delim::Angle, 1) < 1 {
+                push_error(errors, Delim::Angle, token);
+            }
+        }
+        SyntaxKind::T_SHR => {
+            if close_delimiter(&mut *ctx.stack, Delim::Angle, 2) < 2 {
+                push_error(errors, Delim::Angle, token);
+            }
+        }
+        _ => unreachable!(),
+    }
+    push(token, ctx);
+    iter.next();
+}
```
♻️ Duplicate comments (3)
src/parser/ast/parse_utils/errors.rs (3)
**19-20: Encapsulate the public field.**

Make the field private and provide necessary methods:

```diff
 #[derive(Default, Debug)]
-pub(crate) struct DelimStack(pub(crate) Vec<Delim>);
+pub(crate) struct DelimStack(Vec<Delim>);
```

Add a method for controlled access:

```rust
impl DelimStack {
    /// Returns an iterator over unclosed delimiters for error reporting.
    pub(crate) fn unclosed(&mut self) -> impl Iterator<Item = Delim> + '_ {
        self.0.drain(..)
    }
}
```
**87-92: Add method documentation.**

Document the behaviour of delimiter stack operations:

```diff
 impl DelimStack {
+    /// Opens one or more delimiters of the same type.
+    ///
+    /// Pushes `count` instances of `delim` onto the stack.
     pub(super) fn open(&mut self, delim: Delim, count: usize) {
```
**94-107: Add method documentation and improve logic.**

Document the closing behaviour:

```diff
+    /// Attempts to close delimiters of the specified type.
+    ///
+    /// Pops up to `count` matching delimiters from the stack.
+    /// Returns the actual number of delimiters closed. Stops early
+    /// if a non-matching delimiter is encountered.
     pub(super) fn close(&mut self, delim: Delim, count: usize) -> usize {
```

Consider checking the top element before popping to avoid unnecessary push/pop operations.
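The peek-before-pop idea in that last remark can be sketched as follows. This is a hypothetical, stripped-down `DelimStack` (the real one lives in `errors.rs` and holds the PR's full `Delim` enum), shown only to illustrate checking the top element before popping so a mismatched delimiter is never popped and pushed back.

```rust
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum Delim {
    Paren,
    Angle,
}

struct DelimStack(Vec<Delim>);

impl DelimStack {
    /// Close up to `count` delimiters, inspecting the top *before* popping
    /// so a non-matching delimiter is left untouched.
    fn close(&mut self, delim: Delim, count: usize) -> usize {
        let mut closed = 0;
        while closed < count && self.0.last() == Some(&delim) {
            self.0.pop();
            closed += 1;
        }
        closed
    }
}

fn main() {
    let mut stack = DelimStack(vec![Delim::Paren, Delim::Angle]);
    // Top is Angle, so closing Paren touches nothing.
    assert_eq!(stack.close(Delim::Paren, 1), 0);
    assert_eq!(stack.0, vec![Delim::Paren, Delim::Angle]);
    // Closing Angle pops exactly the matching top element.
    assert_eq!(stack.close(Delim::Angle, 2), 1);
    println!("ok");
}
```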
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
📒 Files selected for processing (6)
- `src/parser/ast/parse_utils/delimiter.rs` (1 hunks)
- `src/parser/ast/parse_utils/errors.rs` (1 hunks)
- `src/parser/ast/parse_utils/token_utils.rs` (1 hunks)
- `src/parser/ast/parse_utils/type_parsing.rs` (9 hunks)
- `src/parser/ast/relation.rs` (1 hunks)
- `src/syntax_utils.rs` (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
**/*.rs
Instructions used from:
Sources:
📄 CodeRabbit Inference Engine
- AGENTS.md
⚙️ CodeRabbit Configuration File
🔇 Additional comments (4)
src/parser/ast/relation.rs (1)
146-147: LGTM: Correct adaptation to new error handling pattern.The change from direct error propagation to
.ok()?correctly adapts to the newResult<String, UnclosedDelimiterError>return type fromextract_parenthesized, maintaining the existing API contract whilst gracefully handling parsing errors.src/syntax_utils.rs (1)
**41-41: Excellent improvement to error resilience.**

Preserving the partially collected text from `UnclosedDelimiterError` instead of defaulting to an empty string provides much better parsing resilience and maintains useful partial results when delimiters are unclosed.

src/parser/ast/parse_utils/token_utils.rs (1)
**1-50: Well-structured module addressing previous concerns.**

The introduction of `TokenParseContext` successfully reduces function parameter counts whilst maintaining clear separation of concerns. The module provides clean abstractions for delimiter stack management and error recording.

src/parser/ast/parse_utils/delimiter.rs (1)
**51-91: Address remaining complexity issues from previous reviews.**

Whilst the core functionality correctly implements error handling with partial content preservation, the nested complexity and multiple small helper functions flagged by previous reviewers remain unaddressed. Collapse the helper functions into a single streamlined loop within `extract_parenthesized` to reduce boilerplate and complexity.

Apply the previously suggested refactoring to combine the logic into a single function:

```diff
 pub fn extract_parenthesized<I>(
     iter: &mut std::iter::Peekable<I>,
     open_kind: SyntaxKind,
     close_kind: SyntaxKind,
 ) -> Result<String, UnclosedDelimiterError>
 where
     I: Iterator<Item = SyntaxElement<DdlogLanguage>>,
 {
-    // Skip to the first opening delimiter.
-    for e in iter.by_ref() {
-        if e.kind() == open_kind {
-            break;
-        }
-    }
-
-    let mut depth = 1usize;
-    let mut buf = String::new();
-
-    for e in iter.by_ref() {
-        let text = element_text(&e);
-        match e.kind() {
-            k if k == open_kind => {
-                depth += 1;
-                buf.push_str(&text);
-            }
-            k if k == close_kind => {
-                depth -= 1;
-                if depth == 0 {
-                    return Ok(buf);
-                }
-                buf.push_str(&text);
-            }
-            _ => buf.push_str(&text),
-        }
-    }
-
-    Err(UnclosedDelimiterError {
-        collected: buf,
-        expected: close_kind,
-    })
+    let mut depth = 0;
+    let mut buf = String::new();
+
+    while let Some(elem) = iter.next() {
+        let txt = element_text(&elem);
+        match elem.kind() {
+            k if k == open_kind => {
+                depth += 1;
+                buf.push_str(&txt);
+                if depth == 1 {
+                    break; // Found the first opener
+                }
+            }
+            _ if depth == 0 => continue, // Skip until first opener
+            _ => {
+                buf.push_str(&txt);
+            }
+        }
+    }
+
+    if depth == 0 {
+        return Err(UnclosedDelimiterError {
+            collected: String::new(),
+            expected: open_kind,
+        });
+    }
+
+    while let Some(elem) = iter.next() {
+        let txt = element_text(&elem);
+        match elem.kind() {
+            k if k == open_kind => {
+                depth += 1;
+                buf.push_str(&txt);
+            }
+            k if k == close_kind => {
+                depth -= 1;
+                if depth == 0 {
+                    return Ok(buf);
+                }
+                buf.push_str(&txt);
+            }
+            _ => buf.push_str(&txt),
+        }
+    }
+
+    Err(UnclosedDelimiterError {
+        collected: buf,
+        expected: close_kind,
+    })
 }
```

Likely an incorrect or invalid review comment.
Code Health Improved
(1 file improves in Code Health)
Gates Failed
Enforce critical code health rules
(1 file with Bumpy Road Ahead)
Enforce advisory code health rules
(1 file with Excess Number of Function Arguments)
Gates Passed
4 Quality Gates Passed
See analysis details in CodeScene
Reason for failure
| Enforce critical code health rules | Violations | Code Health Impact | |
|---|---|---|---|
| delimiter.rs | 1 critical rule | 10.00 → 9.54 | Suppress |
| Enforce advisory code health rules | Violations | Code Health Impact | |
|---|---|---|---|
| delimiter.rs | 1 advisory rule | 10.00 → 9.54 | Suppress |
View Improvements
| File | Code Health Impact | Categories Improved |
|---|---|---|
| type_parsing.rs | 6.00 → 7.32 | Low Cohesion, Complex Method, Deep, Nested Complexity, Excess Number of Function Arguments |
Quality Gate Profile: Pay Down Tech Debt
Want more control? Customize Code Health rules or catch issues early with our IDE extension and CLI tool.
@coderabbitai review

✅ Actions performed

Review triggered.
Actionable comments posted: 4
♻️ Duplicate comments (1)
src/parser/ast/parse_utils/type_parsing.rs (1)
126-232: Refactor to reduce cyclomatic complexity below the threshold of 12.The
`parse_type_expr` function has excessive cyclomatic complexity due to the large match statement with many delimiter handling branches. Extract delimiter handling logic into helper functions:

```diff
+/// Handles an opening delimiter token.
+fn handle_opening_delimiter(
+    kind: SyntaxKind,
+    token: &rowan::SyntaxToken<DdlogLanguage>,
+    ctx: &mut TokenParseContext,
+) {
+    let (delim, count) = match kind {
+        SyntaxKind::T_LPAREN => (Delim::Paren, 1),
+        SyntaxKind::T_LT => (Delim::Angle, 1),
+        SyntaxKind::T_SHL => (Delim::Angle, 2),
+        SyntaxKind::T_LBRACKET => (Delim::Bracket, 1),
+        SyntaxKind::T_LBRACE => (Delim::Brace, 1),
+        _ => return,
+    };
+    open_delimiter(&mut *ctx.stack, delim, count);
+    push(token, ctx);
+}
+
+/// Handles a closing delimiter token.
+fn handle_closing_delimiter(
+    kind: SyntaxKind,
+    token: &rowan::SyntaxToken<DdlogLanguage>,
+    ctx: &mut TokenParseContext,
+    errors: &mut Vec<ParseError>,
+) -> bool {
+    let (delim, count, should_continue) = match kind {
+        SyntaxKind::T_RPAREN => (Delim::Paren, 1, ctx.stack.close(Delim::Paren, 1) > 0),
+        SyntaxKind::T_GT => (Delim::Angle, 1, true),
+        SyntaxKind::T_SHR => (Delim::Angle, 2, true),
+        SyntaxKind::T_RBRACKET => (Delim::Bracket, 1, true),
+        SyntaxKind::T_RBRACE => (Delim::Brace, 1, true),
+        _ => return true,
+    };
+
+    if kind != SyntaxKind::T_RPAREN && close_delimiter(&mut *ctx.stack, delim, count) < count {
+        push_error(errors, delim, token);
+    }
+
+    if should_continue {
+        push(token, ctx);
+    }
+
+    should_continue
+}
+
 pub(crate) fn parse_type_expr<I>(iter: &mut std::iter::Peekable<I>) -> (String, Vec<ParseError>)
 where
     I: Iterator<Item = SyntaxElement<DdlogLanguage>>,
 {
     use rowan::NodeOrToken;

     let mut buf = String::new();
     let mut errors = Vec::new();
     let mut depth = DelimStack::default();
     let mut ctx = TokenParseContext::new(&mut buf, &mut depth);

     while let Some(e) = iter.peek() {
         match e {
             NodeOrToken::Token(t) => match t.kind() {
-                SyntaxKind::T_LPAREN => {
-                    open_delimiter(&mut *ctx.stack, Delim::Paren, 1);
-                    push(t, &mut ctx);
-                    iter.next();
-                }
-                SyntaxKind::T_RPAREN => {
-                    if ctx.stack.close(Delim::Paren, 1) == 0 {
-                        break;
-                    }
-                    push(t, &mut ctx);
-                    iter.next();
-                }
-                SyntaxKind::T_LT => {
-                    open_delimiter(&mut *ctx.stack, Delim::Angle, 1);
-                    push(t, &mut ctx);
-                    iter.next();
-                }
-                SyntaxKind::T_SHL => {
-                    open_delimiter(&mut *ctx.stack, Delim::Angle, 2);
-                    push(t, &mut ctx);
-                    iter.next();
-                }
-                SyntaxKind::T_GT => {
-                    if close_delimiter(&mut *ctx.stack, Delim::Angle, 1) < 1 {
-                        push_error(&mut errors, Delim::Angle, t);
-                    }
-                    push(t, &mut ctx);
-                    iter.next();
-                }
-                SyntaxKind::T_SHR => {
-                    if close_delimiter(&mut *ctx.stack, Delim::Angle, 2) < 2 {
-                        push_error(&mut errors, Delim::Angle, t);
-                    }
-                    push(t, &mut ctx);
-                    iter.next();
-                }
-                SyntaxKind::T_LBRACKET => {
-                    open_delimiter(&mut *ctx.stack, Delim::Bracket, 1);
-                    push(t, &mut ctx);
-                    iter.next();
-                }
-                SyntaxKind::T_RBRACKET => {
-                    if close_delimiter(&mut *ctx.stack, Delim::Bracket, 1) < 1 {
-                        push_error(&mut errors, Delim::Bracket, t);
-                    }
-                    push(t, &mut ctx);
-                    iter.next();
-                }
-                SyntaxKind::T_LBRACE => {
-                    open_delimiter(&mut *ctx.stack, Delim::Brace, 1);
-                    push(t, &mut ctx);
-                    iter.next();
-                }
-                SyntaxKind::T_RBRACE => {
-                    if close_delimiter(&mut *ctx.stack, Delim::Brace, 1) < 1 {
-                        push_error(&mut errors, Delim::Brace, t);
-                    }
-                    push(t, &mut ctx);
-                    iter.next();
-                }
+                kind if is_opening_delimiter(kind) => {
+                    handle_opening_delimiter(kind, t, &mut ctx);
+                    iter.next();
+                }
+                kind if is_closing_delimiter(kind) => {
+                    if !handle_closing_delimiter(kind, t, &mut ctx, &mut errors) {
+                        break;
+                    }
+                    iter.next();
+                }
                 SyntaxKind::T_COMMA if ctx.stack.is_empty() => break,
                 _ => {
                     push(t, &mut ctx);
                     iter.next();
                 }
             },
             NodeOrToken::Node(n) => {
                 let text = n.text().to_string();
                 let is_whitespace = text.chars().all(char::is_whitespace);
                 let is_comment = n.kind() == SyntaxKind::T_COMMENT;
                 if !is_whitespace && !is_comment {
                     ctx.buf.push_str(&text);
                 }
                 iter.next();
             }
         }
     }
     // ... rest of function remains the same
```

Add these predicate functions:

```rust
/// Checks if a token kind is an opening delimiter.
fn is_opening_delimiter(kind: SyntaxKind) -> bool {
    matches!(
        kind,
        SyntaxKind::T_LPAREN
            | SyntaxKind::T_LT
            | SyntaxKind::T_SHL
            | SyntaxKind::T_LBRACKET
            | SyntaxKind::T_LBRACE
    )
}

/// Checks if a token kind is a closing delimiter.
fn is_closing_delimiter(kind: SyntaxKind) -> bool {
    matches!(
        kind,
        SyntaxKind::T_RPAREN
            | SyntaxKind::T_GT
            | SyntaxKind::T_SHR
            | SyntaxKind::T_RBRACKET
            | SyntaxKind::T_RBRACE
    )
}
```
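To illustrate how this guard-based dispatch collapses the arm count, here is a self-contained toy sketch. The `TokenKind` enum is invented for the example and merely stands in for the crate's `SyntaxKind`; only the dispatch shape matters.

```rust
// Toy illustration of guard-based match dispatch: two predicates plus two
// guard arms replace one match arm per delimiter kind.
#[derive(Clone, Copy)]
enum TokenKind {
    LParen,
    RParen,
    Lt,
    Gt,
    Comma,
    Ident,
}

/// Groups all opening delimiters behind one predicate.
fn is_opening(kind: TokenKind) -> bool {
    matches!(kind, TokenKind::LParen | TokenKind::Lt)
}

/// Groups all closing delimiters behind one predicate.
fn is_closing(kind: TokenKind) -> bool {
    matches!(kind, TokenKind::RParen | TokenKind::Gt)
}

/// Two guard arms handle every delimiter; specific arms cover the rest.
fn classify(kind: TokenKind) -> &'static str {
    match kind {
        k if is_opening(k) => "open",
        k if is_closing(k) => "close",
        TokenKind::Comma => "separator",
        _ => "other",
    }
}

fn main() {
    assert_eq!(classify(TokenKind::LParen), "open");
    assert_eq!(classify(TokenKind::Lt), "open");
    assert_eq!(classify(TokenKind::RParen), "close");
    assert_eq!(classify(TokenKind::Gt), "close");
    assert_eq!(classify(TokenKind::Comma), "separator");
    assert_eq!(classify(TokenKind::Ident), "other");
}
```

Adding a new delimiter then means touching only the predicate and the table in the helper, not the main loop.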
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
📒 Files selected for processing (5)
- docs/function-parsing-design.md (1 hunks)
- src/parser/ast/parse_utils/delimiter.rs (1 hunks)
- src/parser/ast/parse_utils/errors.rs (1 hunks)
- src/parser/ast/parse_utils/token_utils.rs (1 hunks)
- src/parser/ast/parse_utils/type_parsing.rs (1 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.md
Instructions used from:
Sources:
⚙️ CodeRabbit Configuration File
**/*.rs
Instructions used from:
Sources:
⚙️ CodeRabbit Configuration File
🔇 Additional comments (8)
docs/function-parsing-design.md (1)
46-49: Fix line wrapping to comply with the 80-column limit.

Lines 46-49 exceed the 80-column limit. Wrap these lines appropriately.

```diff
-A hierarchy of error types supports rich diagnostics when delimiters do not
-match or names and types are missing. The following diagram shows the
-relationships between these types.
+A hierarchy of error types supports rich diagnostics when delimiters do not
+match or names and types are missing. The following diagram shows the
+relationships between these types.
```

Likely an incorrect or invalid review comment.
src/parser/ast/parse_utils/token_utils.rs (7)
1-6: Excellent module documentation.

The module documentation clearly explains the purpose and utility, properly uses British English, and follows the required `//!` format.
7-9: Clean import organisation.

The imports are well-structured and appropriately scoped.

11-22: Well-designed context struct.

The `TokenParseContext` effectively groups related parameters and provides a clean API with proper lifetime management.
24-27: Clean single-responsibility function.

The function properly focuses on delimiter opening operations and maintains a simple, clear interface.

29-32: Excellent delimiter closing function.

The function maintains single responsibility and provides clear return value semantics.

34-37: Simple and effective utility function.

The push function maintains consistency with the context pattern and has a single, clear responsibility.

39-53: Comprehensive error recording function.

The function provides excellent error handling with clear documentation and proper error construction including all necessary context.
Code Health Improved
(1 file improves in Code Health)
Gates Passed
6 Quality Gates Passed
See analysis details in CodeScene
View Improvements
| File | Code Health Impact | Categories Improved |
|---|---|---|
| type_parsing.rs | 6.00 → 7.32 | Low Cohesion, Complex Method, Deep, Nested Complexity, Excess Number of Function Arguments |
Quality Gate Profile: Pay Down Tech Debt
Want more control? Customize Code Health rules or catch issues early with our IDE extension and CLI tool.
@coderabbitai review

✅ Actions performed

Review triggered.
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
📒 Files selected for processing (2)
- src/parser/ast/parse_utils/delimiter.rs (1 hunks)
- src/parser/ast/parse_utils/errors.rs (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
**/*.rs
Instructions used from:
Sources:
⚙️ CodeRabbit Configuration File
🔇 Additional comments (1)
src/parser/ast/parse_utils/delimiter.rs (1)
1-170: Excellent modular design and comprehensive documentation!

The delimiter parsing utilities are well-structured with proper error handling, comprehensive documentation including examples, and good separation of concerns. The refactoring to reduce complexity and parameter counts has been effectively implemented using helper functions and context structs.
Code Health Improved
(1 file improves in Code Health)
Gates Failed
Enforce advisory code health rules
(1 file with Code Duplication)
Gates Passed
5 Quality Gates Passed
See analysis details in CodeScene
Reason for failure
| Enforce advisory code health rules | Violations | Code Health Impact | |
|---|---|---|---|
| type_parsing.rs | 1 advisory rule | 6.00 → 6.62 | Suppress |
View Improvements
| File | Code Health Impact | Categories Improved |
|---|---|---|
| type_parsing.rs | 6.00 → 6.62 | Low Cohesion, Complex Method, Deep, Nested Complexity, Excess Number of Function Arguments |
Quality Gate Profile: Pay Down Tech Debt
@coderabbitai Are the following comments still relevant?

🔭 Outside diff range comments (1)

♻️ Duplicate comments (3)

Add a method for controlled access:

```rust
impl DelimStack {
```

87-92: Add method documentation.

Document the behaviour of delimiter stack operations:

```rust
impl DelimStack {
```

94-107: Add method documentation and improve logic.

Document the closing behaviour:

Consider checking the top element before popping to avoid unnecessary push/pop operations.
Tip: For best results, initiate chat on the files or code changes.

🧩 Analysis chain

@leynos Let me examine the current state of the code to assess whether these comments are still relevant.

Based on my examination of the current code, here's the relevance assessment of those comments:

❌ Comments are NO LONGER relevant

1. parse_type_expr complexity comment (lines 126-232)

The function has been significantly simplified since the original review. The current implementation:
2. DelimStack field encapsulation comments

The current `DelimStack` already keeps its inner `Vec` private.

3. Method documentation comments

The current implementation already includes comprehensive documentation:
✅ Conclusion

These comments are outdated and no longer apply to the current codebase. The refactoring successfully addressed all the concerns raised in the original review. The code is now well-modularized, properly documented, and maintains appropriate encapsulation.

@leynos Based on my examination of the current code, these comments are no longer relevant. The refactoring has successfully addressed all the concerns:

✅ All Issues Have Been Resolved

1. parse_type_expr Complexity (lines 126-232) - ADDRESSED

The function has been dramatically simplified:
The suggested refactoring has been applied.

2. DelimStack Encapsulation - ALREADY IMPLEMENTED

The current implementation already follows best practices:

```rust
pub(crate) struct DelimStack(Vec<Delim>); // Private Vec, not pub(crate)
```
3. Method Documentation - COMPREHENSIVE

All methods now have excellent documentation:

```rust
/// Opens one or more delimiters of the same type.
/// Pushes `count` instances of `delim` onto the stack...

/// Attempts to close delimiters of the specified type.
/// Pops up to `count` matching delimiters from the stack and returns the
/// number of delimiters actually closed...
```

🎯 Conclusion

These review comments are outdated and no longer apply. The refactoring successfully transformed the code from having these issues into a well-structured, properly documented, and maintainable implementation. The original concerns were valid but have been completely resolved.
Code Health Improved
(1 file improves in Code Health)
Gates Passed
6 Quality Gates Passed
See analysis details in CodeScene
View Improvements
| File | Code Health Impact | Categories Improved |
|---|---|---|
| type_parsing.rs | 6.00 → 7.03 | Low Cohesion, Complex Method, Deep, Nested Complexity, Excess Number of Function Arguments |
Quality Gate Profile: Pay Down Tech Debt
Summary
`parse_utils` module

Testing

- make lint
- make test

https://chatgpt.com/codex/tasks/task_e_687c6271d1708322a061878921a1aefe
Summary by Sourcery
Refactor the parsing utilities by reorganizing parse_utils into focused submodules, standardizing error handling, and updating exports and tests to match the new structure.
Enhancements:
Tests: