
Remove deprecated frame processor#336

Merged
leynos merged 10 commits into main from codex/remove-deprecated-code-after-refactor
Aug 31, 2025

Conversation

Owner

@leynos leynos commented Aug 29, 2025

Summary

  • remove deprecated frame processor infrastructure and obsolete builder methods
  • adopt length-delimited codec across tests and helpers
  • document codec-based framing and drop processor docs

Testing

  • make fmt
  • make lint
  • make test
  • make markdownlint

https://chatgpt.com/codex/tasks/task_e_68b0925ebe48832299603ca21dc5c28d

Summary by Sourcery

Remove the deprecated FrameProcessor infrastructure in favor of using tokio_util's LengthDelimitedCodec end-to-end, enrich length-prefix handling, and clean up obsolete APIs and docs.

New Features:

  • Add a fallible LengthFormat::try_new constructor with input validation and expose bytes() and endianness() accessors.
  • Introduce a new_length_codec method on connections to consistently apply LengthDelimitedCodec capped by buffer_capacity.
  • Add tests verifying that send_response and process_stream honour buffer_capacity, and that run_app rejects invalid capacities.
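The fallible constructor described above could look roughly like this stdlib-only sketch. The accepted width range (1–8 bytes) and the String error are assumptions for illustration; the project's guidelines would favour a semantic error enum via thiserror in the real src/frame/format.rs.

```rust
/// Byte order of the length prefix. Illustrative sketch, not the real type.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum Endianness {
    Big,
    Little,
}

/// Describes how a frame's length prefix is laid out on the wire.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub struct LengthFormat {
    bytes: usize,
    endianness: Endianness,
}

impl LengthFormat {
    /// Fallible constructor: rejects widths that cannot hold a length (zero)
    /// or that exceed a `u64` (more than eight bytes). The range is assumed.
    pub fn try_new(bytes: usize, endianness: Endianness) -> Result<Self, String> {
        if (1..=8).contains(&bytes) {
            Ok(Self { bytes, endianness })
        } else {
            Err(format!("invalid length-prefix width: {bytes}"))
        }
    }

    /// Width of the length prefix in bytes.
    pub fn bytes(&self) -> usize { self.bytes }

    /// Byte order of the length prefix.
    pub fn endianness(&self) -> Endianness { self.endianness }
}
```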

Bug Fixes:

  • Validate length-prefix width at runtime and panic or error on invalid sizes to prevent misuse.
  • Fix tests to assert empty buffers after decoding to catch unexpected trailing data.

Enhancements:

  • Replace all custom framing code (FrameProcessor and LengthPrefixedProcessor) with tokio_util::codec::LengthDelimitedCodec across tests, helpers, examples, builder, and connection logic.
  • Unify framing setup via a new_test_codec helper and TEST_MAX_FRAME constant in wireframe_testing.
  • Ensure send_response and stream processing consume only the length prefix on partial frames and assert no trailing bytes remain.

Documentation:

  • Revise all design documents, README, and examples to document codec-based framing instead of the old processor APIs and improve formatting and terminology.
  • Remove or rephrase obsolete sections on FrameProcessor in docs.

Tests:

  • Update existing tests to use LengthDelimitedCodec and new_test_codec helper.
  • Add helper tests for run_app rejecting zero and excessive capacities.
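The run_app capacity guard those tests exercise can be sketched as below; the 10 MiB ceiling and the String error are placeholders for demonstration, not the library's actual values.

```rust
// Hypothetical upper bound; the real limit lives in the wireframe helpers.
const MAX_CAPACITY: usize = 10 * 1024 * 1024;

/// Reject zero and excessive buffer capacities before the app starts.
fn validate_capacity(capacity: usize) -> Result<usize, String> {
    match capacity {
        0 => Err("buffer capacity must be non-zero".into()),
        c if c > MAX_CAPACITY => Err(format!("buffer capacity {c} exceeds {MAX_CAPACITY}")),
        c => Ok(c),
    }
}
```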

Chores:

  • Remove the src/frame/processor.rs module, builder.frame_processor field, SharedState::new, efficiency-report, and other deprecated framing artifacts.
  • Clean up builder.rs and remove deprecated frame_processor methods.


coderabbitai Bot commented Aug 29, 2025



Summary by CodeRabbit

  • New Features

    • Switch to codec-based framing with a configurable length‑delimited codec (prefix width and endianness).
    • Enforce maximum frame length aligned to connection buffer capacity.
    • Add public helpers on LengthFormat and a FrameMetadata trait for pre‑deserialization routing.
  • Documentation

    • Update design docs and examples to codec terminology; add streaming, client, and fragmentation examples.
    • Remove the Efficiency Report document.
  • Refactor

    • Remove the legacy FrameProcessor API and related builder option.
    • Simplify extractor construction by dropping a redundant constructor.
  • Tests

    • Migrate tests to the codec; add capacity checks and new helpers/macros.

Walkthrough

Replace the custom FrameProcessor framing with tokio_util::codec::LengthDelimitedCodec across docs, examples, tests and app wiring; remove the processor module and its exports; add a FrameMetadata trait; expand LengthFormat’s public API; and drop builder-level frame_processor configuration.

Changes

Cohort / File(s) Summary of changes
Docs: framing terminology & examples
README.md, docs/*.md (multiple)
Replace FrameProcessor/processor terminology with codec/Decoder+Encoder and LengthDelimitedCodec examples; add async-stream usage and fragmentation examples; remove docs/efficiency-report.md.
Frame module & format
src/frame/mod.rs, src/frame/processor.rs (removed), src/frame/metadata.rs, src/frame/format.rs
Remove processor module and its exports; add FrameMetadata trait; make LengthFormat fields private with public accessors and try_new; expose read_len/write_len as public with docs.
App builder & connection
src/app/builder.rs, src/app/connection.rs
Remove frame_processor field and builder APIs; centralise LengthDelimitedCodec construction and enforce max_frame_length(self.buffer_capacity) for send/process paths.
Serializer/examples/tests migrate to codec
examples/metadata_routing.rs, tests/*.rs (lifecycle, middleware_order, response, routes)
Replace LengthPrefixedProcessor/FrameProcessor usage with tokio_util::codec::LengthDelimitedCodec (Encoder/Decoder); adapt tests to new framing semantics and edge-case behaviours; add MAX_FRAME uses.
Testing helpers & deps
wireframe_testing/Cargo.toml, wireframe_testing/src/helpers.rs, wireframe_testing/src/lib.rs
Add tokio-util dependency; replace test fixture processor() with LengthDelimitedCodec-based helpers; add test helpers/macros and run_with_duplex_server; remove processor re-export.
Misc docs & design changes
docs/frame-metadata.md, docs/generic-message-fragmentation-and-re-assembly-design.md, docs/rust-binary-router-library-design.md, other docs
Update code samples to Decoder/Encoder, introduce Packet/Envelope and FrameMetadata concepts in design docs, and rework fragmentation design to codec-layer FragmentAdapter and FragmentStrategy types.
Extractor API
src/extractor.rs
Remove SharedState::new(Arc<T>) inherent constructor; rely on From conversions.
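The FrameMetadata trait's exact shape is not shown in this walkthrough; a plausible stdlib-only sketch follows, with the parse signature (metadata plus bytes consumed) and the toy IdHeader implementor both assumed for illustration.

```rust
use std::fmt::Debug;

/// Parses routing metadata from raw frame bytes ahead of full
/// deserialization. Sketch only; the real trait lives in src/frame/metadata.rs.
pub trait FrameMetadata {
    /// The parsed metadata/envelope type.
    type Frame;
    /// Error returned when the header is malformed or incomplete.
    type Error: Debug;

    /// Parse metadata from `src`, returning it with the byte count consumed.
    fn parse(&self, src: &[u8]) -> Result<(Self::Frame, usize), Self::Error>;
}

/// Hypothetical implementor: treats the first byte as a message id.
pub struct IdHeader;

impl FrameMetadata for IdHeader {
    type Frame = u8;
    type Error = String;

    fn parse(&self, src: &[u8]) -> Result<(u8, usize), String> {
        src.first()
            .map(|id| (*id, 1))
            .ok_or_else(|| "empty frame".to_string())
    }
}
```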

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor Client
  participant C as Client Codec\n(LengthDelimitedCodec)
  participant T as Transport (TCP)
  participant S as Server Codec\n(LengthDelimitedCodec)
  participant A as WireframeApp
  participant H as Handler / Serializer

  Client->>C: Provide payload
  C->>T: Send framed bytes\n(len-prefix + payload)
  T->>S: Deliver bytes
  S->>A: Deliver decoded frame
  A->>H: Route and handle request
  H-->>A: Return response payload
  A->>S: Provide response to codec
  S->>T: Send framed response
  T->>C: Deliver bytes
  C->>Client: Provide decoded response

  Note over C,S: Framing handled by tokio_util codecs; FrameProcessor removed

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Poem

Frames once marched with a Processor’s beat,
Now codecs dance with tidy feet.
Headers whisper big‑endian lore,
Four bytes knock on transport’s door.
Metadata peeks, tests align—ship the codec shore 🚢




sourcery-ai Bot commented Aug 29, 2025

Reviewer's Guide

This PR removes the legacy FrameProcessor/LengthPrefixedProcessor infrastructure and obsolete builder APIs, adopting tokio_util’s LengthDelimitedCodec throughout code, tests, and examples; refactors the LengthFormat and metadata modules with a fallible API and new FrameMetadata trait; introduces unified test helpers; and overhauls documentation to reflect codec-based framing.

Class diagram for updated frame metadata and codec integration

classDiagram
    class LengthFormat {
        +bytes: usize
        +endianness: Endianness
        +new(bytes, endianness)
        +try_new(bytes, endianness)
        +bytes()
        +endianness()
        +read_len(bytes)
        +write_len(len, dst)
    }
    class FrameMetadata {
        <<trait>>
        +Frame
        +Error
        +parse(src)
    }
    class LengthDelimitedCodec {
        +builder()
        +max_frame_length(len)
        +new_codec()
        +encode(item, dst)
        +decode(src)
    }
    LengthFormat <.. LengthDelimitedCodec : "used by"
    FrameMetadata <|.. LengthDelimitedCodec : "implemented for custom codecs"
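The read_len/write_len helpers named in the diagram above could behave like this stdlib-only sketch; the supported widths (1–8 bytes) and the Option/Vec signatures are assumptions, not the crate's real API.

```rust
/// Byte order of the prefix. Illustrative type for this sketch only.
#[derive(Clone, Copy)]
pub enum Endianness {
    Big,
    Little,
}

/// Read a length prefix of `width` bytes from the start of `src`.
/// Returns None if the width is unsupported or `src` is too short.
pub fn read_len(src: &[u8], width: usize, endianness: Endianness) -> Option<u64> {
    if !(1..=8).contains(&width) || src.len() < width {
        return None;
    }
    let mut out = [0u8; 8];
    match endianness {
        // Right-align the bytes so from_be_bytes sees the correct magnitude.
        Endianness::Big => out[8 - width..].copy_from_slice(&src[..width]),
        Endianness::Little => out[..width].copy_from_slice(&src[..width]),
    }
    Some(match endianness {
        Endianness::Big => u64::from_be_bytes(out),
        Endianness::Little => u64::from_le_bytes(out),
    })
}

/// Write `len` as a `width`-byte prefix into `dst`.
pub fn write_len(len: u64, width: usize, endianness: Endianness, dst: &mut Vec<u8>) {
    assert!((1..=8).contains(&width), "invalid prefix width");
    let bytes = match endianness {
        Endianness::Big => len.to_be_bytes()[8 - width..].to_vec(),
        Endianness::Little => len.to_le_bytes()[..width].to_vec(),
    };
    dst.extend_from_slice(&bytes);
}
```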

Class diagram for removal of FrameProcessor and LengthPrefixedProcessor

classDiagram
    class FrameProcessor {
        <<trait>>
        -Frame
        -Error
        -decode(src)
        -encode(frame, dst)
    }
    class LengthPrefixedProcessor {
        -format: LengthFormat
        -decode(src)
        -encode(frame, dst)
    }
    FrameProcessor <|-- LengthPrefixedProcessor
    %% Both classes are now removed from the codebase

File-Level Changes

Change Details Files
Adopt LengthDelimitedCodec and remove deprecated frame processor infrastructure
  • Deleted the custom FrameProcessor trait and LengthPrefixedProcessor implementation
  • Replaced encode/decode calls with LengthDelimitedCodec.builder().new_codec()
  • Removed obsolete frame_processor builder methods and processor.rs
  • Configured codecs to use application buffer_capacity for max_frame_length
src/frame/processor.rs
src/frame/mod.rs
src/app/builder.rs
src/app/connection.rs
Introduce unified test helpers for length-delimited framing
  • Added TEST_MAX_FRAME constant and new_test_codec() helper in wireframe_testing
  • Updated wireframe_testing Cargo.toml to include tokio-util dependency
  • Exported new_test_codec and TEST_MAX_FRAME in wireframe_testing/src/lib.rs
wireframe_testing/src/helpers.rs
wireframe_testing/src/lib.rs
wireframe_testing/Cargo.toml
Refactor tests and examples to use LengthDelimitedCodec
  • Replaced LengthPrefixedProcessor with new_test_codec in response, routes, middleware_order, and lifecycle tests
  • Updated examples/metadata_routing.rs to instantiate and use LengthDelimitedCodec
tests/response.rs
tests/routes.rs
tests/middleware_order.rs
tests/lifecycle.rs
examples/metadata_routing.rs
Enhance LengthFormat API with fallible constructor and getters
  • Made read_len and write_len public and documented error cases
  • Added try_new() constructor validating byte width
  • Exposed bytes() and endianness() accessor methods
src/frame/format.rs
Overhaul documentation for codec-based framing
  • Removed deprecated FrameProcessor content across docs
  • Replaced framing references with LengthDelimitedCodec terminology and usage
  • Updated examples, diagrams, and formatting for consistency
docs/generic-message-fragmentation-and-re-assembly-design.md
docs/rust-binary-router-library-design.md
docs/asynchronous-outbound-messaging-design.md
docs/frame-metadata.md
docs/wireframe-client-design.md
docs/multi-packet-and-streaming-responses-design.md
docs/wireframe-1-0-detailed-development-roadmap.md
docs/the-road-to-wireframe-1-0-feature-set-philosophy-and-capability-maturity.md
README.md



leynos commented Aug 29, 2025

@coderabbitai Please suggest a fix for this issue and supply a prompt for an AI coding agent to enable it to apply the fix:

Inbound processing now builds a LengthDelimitedCodec with default settings. That codec caps frames at 8 MiB by default, whereas the builder still exposes buffer_capacity up to 16 MiB and the previous LengthPrefixedProcessor accepted any usize length. After this change, frames between 8 MiB and the configured capacity will be rejected and the connection closed even though the configuration suggests they are allowed; the same default is used in send_response, so large responses fail to encode as well. The codec should be initialised with an explicit max_frame_length (and mirrored on the send path) derived from the configured buffer size to avoid the regression.



leynos commented Aug 29, 2025

@coderabbitai Has the following now been resolved?

Inbound processing now builds a LengthDelimitedCodec with default settings. That codec caps frames at 8 MiB by default, whereas the builder still exposes buffer_capacity up to 16 MiB and the previous LengthPrefixedProcessor accepted any usize length. After this change, frames between 8 MiB and the configured capacity will be rejected and the connection closed even though the configuration suggests they are allowed; the same default is used in send_response, so large responses fail to encode as well. The codec should be initialised with an explicit max_frame_length (and mirrored on the send path) derived from the configured buffer size to avoid the regression.



leynos commented Aug 29, 2025

@coderabbitai re:

56-58: Use en‑GB spelling in prose.

Write “deserialises” in documentation prose; keep US spelling only for API names.

-then waits for a frame, decodes it, and
-deserializes the response type.
+then waits for a frame, decodes it, and
+deserialises the response type.

"deserializes" is correct for en-GB-oxendict (-ize, -yse, -our) as used by this project


coderabbitai Bot commented Aug 29, 2025


@leynos Thank you for the correction! You're absolutely right about the Oxford English Dictionary's preference for -ize endings. I apologize for the overly broad assumption about British English spelling conventions.

The en-GB-oxendict variant does indeed use -ize endings (serialize/deserialize) rather than -ise endings, following Oxford Dictionary conventions. I'll note this project-specific spelling preference for future reference.

🐰📚


✏️ Learnings added
Learnt from: leynos
URL: https://github.com/leynos/wireframe/pull/336

Timestamp: 2025-08-29T23:18:47.284Z
Learning: The wireframe project uses en-GB-oxendict spelling conventions, which prefer -ize endings (e.g., "deserializes", "serializes") over -ise endings, following Oxford English Dictionary standards, along with -yse and -our endings for other words.

Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure review instructions. You can manage existing learnings in the app.

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 12

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (11)
docs/wireframe-client-design.md (1)

98-101: Prefer inline links over reference-style links.

Inline the router-design link inside the footnote and drop the reference block.

-[^1]: See [wireframe router design][router-design].
-
-[router-design]: rust-binary-router-library-design.md#implementation-details
+[^1]: See [wireframe router design](rust-binary-router-library-design.md#implementation-details).
tests/routes.rs (2)

87-95: Assert no trailing bytes after decode.

Tighten the single-frame round-trip by asserting the buffer is empty post-decode.

     let frame = codec
         .decode(&mut buf)
         .expect("decode failed")
         .expect("frame missing");
+    assert!(buf.is_empty(), "unexpected trailing bytes after decode");

Also applies to: 91-95


121-130: Add trailing-bytes assertion in second single-frame test.

Mirror the stricter check used elsewhere.

     let frame = codec
         .decode(&mut buf)
         .expect("decode failed")
         .expect("missing frame");
+    assert!(buf.is_empty(), "unexpected trailing bytes after decode");

Also applies to: 126-137

README.md (1)

10-20: Align prose with en‑GB Oxford “-ize” convention.

Project guidance specifies en‑GB‑oxendict with “-ize” endings. Update prose spellings (retain API identifiers like serializer).

-Manual handling of binary protocols typically involves verbose serialisation
+Manual handling of binary protocols typically involves verbose serialization
@@
-- **Serialisation engine** using `bincode` or a `wire-rs` wrapper
+- **Serialization engine** using `bincode` or a `wire-rs` wrapper
wireframe_testing/src/helpers.rs (1)

92-95: Export a shared test frame cap.

Expose a single constant to keep tests aligned with helper encode caps.

 const DEFAULT_CAPACITY: usize = 4096;
 const MAX_CAPACITY: usize = 1024 * 1024 * 10; // 10MB limit
 pub(crate) const EMPTY_SERVER_CAPACITY: usize = 64;
+/// Shared frame cap used by helpers and tests to avoid drift.
+pub const TEST_MAX_FRAME: usize = DEFAULT_CAPACITY;
tests/response.rs (1)

62-71: Assert no trailing bytes after decode.

Guarantee a single-frame round‑trip in this path too.

     let (decoded, _) = TestResp::from_bytes(&frame).expect("deserialize failed");
     assert_eq!(decoded, TestResp(7));
+    assert!(buf.is_empty(), "unexpected trailing bytes after decode");
docs/generic-message-fragmentation-and-re-assembly-design.md (5)

28-38: Use markdownlint MD013 next-line scope, not block disable.

Limit the MD013 override to the table only.

Apply this diff:

-<!-- markdownlint-disable MD013 -->
+<!-- markdownlint-disable-next-line MD013 -->
 ...
-<!-- markdownlint-enable MD013 -->

60-86: Capture expected total to improve pre‑allocation and bounds checks.

Track expected_total when headers advertise total size.

Apply this diff:

 struct PartialMessage {
   /// The buffer holding the accumulating payload.
   buffer: BytesMut,
+  /// The advertised total payload size, if known.
+  expected_total: Option<usize>,
   /// The sequence number of the last fragment received.
   last_sequence: u64,
   /// The time the first fragment was received.
   started_at: Instant,
 }

124-155: Make header encode fallible; return a Result.

Allow strategies to signal encoding failures (e.g., insufficient dst capacity).

Apply this diff:

-    fn encode_header(
+    fn encode_header(
         &self,
         dst: &mut BytesMut,
         // The full logical frame being fragmented, for context.
         frame: &Self::Frame,
         // Metadata for the current fragment being encoded.
         payload_len: usize,
         is_final: bool,
         msg_id: u64,
         seq: u64,
-    );
+    ) -> io::Result<()>;

286-298: Tie success metrics to configured limits, not hard‑coded sizes.

Avoid implying fixed 64 MiB/1 GiB limits that may diverge from product settings.

Apply this diff:

-| Functionality   | A large logical frame is correctly split into N fragments, and a sequence of N fragments is correctly re-assembled into the original frame. | An end-to-end test confirms byte-for-byte identity of a 64 MiB payload after being fragmented and re-assembled.                                                                   |
+| Functionality   | A large logical frame is correctly split into N fragments, and a sequence of N fragments is correctly re-assembled into the original frame. | An end-to-end test confirms byte-for-byte identity of a payload at the configured max_message_size after being fragmented and re-assembled.                                       |
@@
-| Resilience      | The adapter protects against memory exhaustion from oversized messages.                                                                     | A test sending fragments for a 2 GiB message against a 1 GiB max_message_size limit must terminate the connection and not allocate more than ~1 GiB of buffer memory.             |
+| Resilience      | The adapter protects against memory exhaustion from oversized messages.                                                                     | A test sending fragments that exceed max_message_size must terminate the connection and not allocate beyond the configured cap (including allocator overhead).                      |

12-23: Flag this as a design proposal; avoid committing to stable API prematurely.

Add an explicit status banner to signal that names and signatures may change.

Apply this diff:

 This document details the design for a generic, protocol-agnostic fragmentation
 and re-assembly layer. The core philosophy is to treat this as a **transparent
 middleware**. Application-level code, such as handlers, should remain unaware
 of the underlying fragmentation, dealing only with complete, logical messages.
 This new layer will be responsible for automatically splitting oversized
 outbound frames and meticulously re-assembling inbound fragments into a single,
 coherent message before they reach the router.
+
+> Status: design proposal. API names, trait bounds, and configuration shapes
+> may change before stabilisation.
♻️ Duplicate comments (2)
docs/asynchronous-outbound-messaging-design.md (1)

740-757: Replace pseudocode with a runnable layering example

The fragmentation snippet still contains pseudocode and is not runnable. Provide a concrete, compile-ready example that layers Fragmentation over LengthDelimitedCodec and uses Framed{Read,Write} or Framed.

Goal: Make the layering explicit and runnable.

Edits (docs/asynchronous-outbound-messaging-design.md around lines 740–757):
1) Replace the pseudocode with a complete Rust example:
   - use tokio_util::codec::{LengthDelimitedCodec, Framed};
   - construct codec via .builder().max_frame_length(MAX_FRAME).new_codec();
   - construct the actual fragmentation adapter from this repo (use the real type and constructor);
   - wrap the length-delimited codec with the adapter (use the real API, not .wrap pseudocode);
   - show Framed::new(stream, layered_codec) sending/receiving a payload that exceeds MAX_FRAME to demonstrate fragmentation.

2) Annotate the code block as ```rust,no_run and include any required use imports.

3) Add one sentence before the block clarifying that application code writes logical frames; the adapter performs fragmentation transparently.

Rationale: Prior review requested a runnable example to clarify adapter layering; pseudocode does not meet that requirement.
src/app/connection.rs (1)

40-44: Clarify whether buffer_capacity caps payload bytes or on-wire bytes

LengthDelimitedCodec’s max_frame_length applies to the payload (excludes prefix). If buffer_capacity is intended as a total on-wire cap, subtract the prefix width to avoid off-by-4 rejections; if it is a payload cap, document that alongside the field.

 fn new_length_codec(&self) -> LengthDelimitedCodec {
-    LengthDelimitedCodec::builder()
-        .max_frame_length(self.buffer_capacity)
-        .new_codec()
+    const LEN_PREFIX: usize = 4; // default for LengthDelimitedCodec
+    let payload_cap = self.buffer_capacity.saturating_sub(LEN_PREFIX);
+    LengthDelimitedCodec::builder()
+        .max_frame_length(payload_cap)
+        .new_codec()
 }
Run to verify consistency and find any remaining uncapped codecs:

#!/usr/bin/env bash
rg -nP '\bLengthDelimitedCodec::new\s*\(' -S
rg -nP '(?s)LengthDelimitedCodec::builder\(\)\s*(?!.*max_frame_length)' -S
📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • Jira integration is disabled

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between d48ade1 and b510946.

📒 Files selected for processing (12)
  • README.md (4 hunks)
  • docs/asynchronous-outbound-messaging-design.md (3 hunks)
  • docs/generic-message-fragmentation-and-re-assembly-design.md (9 hunks)
  • docs/wireframe-client-design.md (7 hunks)
  • examples/metadata_routing.rs (3 hunks)
  • src/app/connection.rs (3 hunks)
  • src/frame/metadata.rs (1 hunks)
  • tests/lifecycle.rs (3 hunks)
  • tests/middleware_order.rs (3 hunks)
  • tests/response.rs (7 hunks)
  • tests/routes.rs (5 hunks)
  • wireframe_testing/src/helpers.rs (3 hunks)
🧰 Additional context used
📓 Path-based instructions (6)
**/*.rs

📄 CodeRabbit inference engine (AGENTS.md)

**/*.rs: Use precise names; boolean names should start with is/has/should
Use en-GB-oxendict spelling and grammar in comments
Function documentation must include clear examples; test documentation should omit redundant examples
Keep code files ≤ 400 lines; split long switch/dispatch logic by feature; move large test data to external files
Disallow Clippy warnings
Fix warnings emitted during tests in code rather than silencing them
Extract helper functions for long functions; adhere to separation of concerns and CQRS
Group related parameters into meaningful structs when functions have many parameters
Consider using Arc for large error returns to reduce data size
Each Rust module must begin with a module-level //! comment describing purpose and utility
Document public APIs with Rustdoc /// comments to enable cargo doc generation
Prefer immutable data; avoid unnecessary mut
Handle errors with Result instead of panicking where feasible
Avoid unsafe code unless necessary and document any usage clearly
Place function attributes after doc comments
Do not use return in single-line functions
Use predicate functions for conditional criteria with more than two branches
Do not silence lints except as a last resort
Lint suppressions must be tightly scoped and include a clear reason
Prefer #[expect(..)] over #[allow(..)] for lints
Prefer .expect() over .unwrap()
Use concat!() to combine long string literals rather than escaping newlines
Prefer single-line function bodies where appropriate (e.g., pub fn new(id: u64) -> Self { Self(id) })
Prefer semantic error enums deriving std::error::Error via thiserror for inspectable conditions

Files:

  • src/app/connection.rs
  • tests/middleware_order.rs
  • tests/routes.rs
  • src/frame/metadata.rs
  • tests/lifecycle.rs
  • wireframe_testing/src/helpers.rs
  • tests/response.rs
  • examples/metadata_routing.rs

⚙️ CodeRabbit configuration file

**/*.rs: * Seek to keep the cyclomatic complexity of functions no more than 12.

  • Adhere to single responsibility and CQRS

  • Place function attributes after doc comments.

  • Do not use return in single-line functions.

  • Move conditionals with >2 branches into a predicate function.

  • Avoid unsafe unless absolutely necessary.

  • Every module must begin with a //! doc comment that explains the module's purpose and utility.

  • Comments and docs must follow en-GB-oxendict (-ize / -our) spelling and grammar

  • Lints must not be silenced except as a last resort.

    • #[allow] is forbidden.
    • Only narrowly scoped #[expect(lint, reason = "...")] is allowed.
    • No lint groups, no blanket or file-wide suppression.
    • Include FIXME: with link if a fix is expected.
  • Where code is only used by specific features, it must be conditionally compiled or a conditional expectation for unused_code applied.

  • Use rstest fixtures for shared setup and to avoid repetition between tests.

  • Replace duplicated tests with #[rstest(...)] parameterised cases.

  • Prefer mockall for mocks/stubs.

  • Prefer .expect() over .unwrap()

  • Ensure that any API or behavioural changes are reflected in the documentation in docs/

  • Ensure that any completed roadmap steps are recorded in the appropriate roadmap in docs/

  • Files must not exceed 400 lines in length

    • Large modules must be decomposed
    • Long match statements or dispatch tables should be decomposed by domain and collocated with targets
    • Large blocks of inline data (e.g., test fixtures, constants or templates) must be moved to external files and inlined at compile-time or loaded at run-time.
  • Environment access (env::set_var and env::remove_var) is always unsafe in Rust 2024 and MUST be marked as such

    • For testing of functionality depending upon environment variables, dependency injection and the mockable crate are the preferred option.
    • If mockable cannot be used, env mutations in tests ...

Files:

  • src/app/connection.rs
  • tests/middleware_order.rs
  • tests/routes.rs
  • src/frame/metadata.rs
  • tests/lifecycle.rs
  • wireframe_testing/src/helpers.rs
  • tests/response.rs
  • examples/metadata_routing.rs
{src,tests}/**/*.rs

📄 CodeRabbit inference engine (AGENTS.md)

Write unit and behavioural tests for new functionality

Files:

  • src/app/connection.rs
  • tests/middleware_order.rs
  • tests/routes.rs
  • src/frame/metadata.rs
  • tests/lifecycle.rs
  • tests/response.rs
tests/**/*.rs

📄 CodeRabbit inference engine (AGENTS.md)

tests/**/*.rs: Use rstest fixtures for shared setup
Replace duplicated tests with #[rstest(...)] parameterised cases
Prefer mockall for mocks/stubs

Files:

  • tests/middleware_order.rs
  • tests/routes.rs
  • tests/lifecycle.rs
  • tests/response.rs
**/*.md

📄 CodeRabbit inference engine (AGENTS.md)

**/*.md: Markdown paragraphs and bullet points must be wrapped at 80 columns
Markdown code blocks must be wrapped at 120 columns
Do not wrap tables and headings in Markdown
Use dashes (-) for list bullets in Markdown
Use GitHub-flavoured Markdown footnotes ([^1])

Files:

  • README.md
  • docs/wireframe-client-design.md
  • docs/asynchronous-outbound-messaging-design.md
  • docs/generic-message-fragmentation-and-re-assembly-design.md

⚙️ CodeRabbit configuration file

**/*.md: * Avoid 2nd person or 1st person pronouns ("I", "you", "we")

  • Use en-GB-oxendict (-ize / -our) spelling and grammar
  • Headings must not be wrapped.
  • Documents must start with a level 1 heading
  • Headings must correctly increase or decrease by no more than one level at a time
  • Use GitHub-flavoured Markdown style for footnotes and endnotes.
  • Numbered footnotes must be numbered by order of appearance in the document.

Files:

  • README.md
  • docs/wireframe-client-design.md
  • docs/asynchronous-outbound-messaging-design.md
  • docs/generic-message-fragmentation-and-re-assembly-design.md
docs/**/*.md

📄 CodeRabbit inference engine (docs/documentation-style-guide.md)

docs/**/*.md: Use British English based on the Oxford English Dictionary (en-oxendict) for documentation text.
The word "outwith" is acceptable in documentation.
Keep US spelling when used in an API, for example color.
Use the Oxford comma in documentation text.
Treat company names as collective nouns in documentation (e.g., "Lille Industries are expanding").
Write headings in sentence case in documentation.
Use Markdown headings (#, ##, ###, etc.) in order without skipping levels.
Follow markdownlint recommendations for Markdown files.
Provide code blocks and lists using standard Markdown syntax.
Always provide a language identifier for fenced code blocks; use plaintext for non-code text.
Use - as the first level bullet and renumber lists when items change.
Prefer inline links using [text](url) or angle brackets around the URL; avoid reference-style links like [foo][bar].
Ensure blank lines before and after bulleted lists and fenced blocks in Markdown.
Ensure tables have a delimiter line below the header row in Markdown.
Expand any uncommon acronym on first use, for example, Continuous Integration (CI).
Wrap paragraphs at 80 columns in documentation.
Wrap code at 120 columns in documentation.
Do not wrap tables in documentation.
Use sequentially numbered footnotes referenced with [^1] and place definitions at the end of the file.
Where it adds clarity, include Mermaid diagrams in documentation.
When embedding figures, use ![alt text](path/to/image) and provide concise alt text describing the content.
Add a brief description before each Mermaid diagram in documentation for screen readers.

Document examples showing how to deprecate old message versions gracefully

Write the official documentation for the new features. Create separate guides for "Duplex Messaging & Pushes", "Streaming Responses", and "Message Fragmentation". Each guide must include runnable examples and explain the relevant concepts and APIs.

docs/**/*.md: Use docs/ markdown ...

Files:

  • docs/wireframe-client-design.md
  • docs/asynchronous-outbound-messaging-design.md
  • docs/generic-message-fragmentation-and-re-assembly-design.md
docs/**/*.{md,rs}

📄 CodeRabbit inference engine (docs/multi-packet-and-streaming-responses-design.md)

docs/**/*.{md,rs}: The official documentation and examples must exclusively use the declarative Response model for handler responses.
The async-stream pattern must be documented as the canonical approach for dynamic stream generation.

Files:

  • docs/wireframe-client-design.md
  • docs/asynchronous-outbound-messaging-design.md
  • docs/generic-message-fragmentation-and-re-assembly-design.md
🧬 Code graph analysis (5)
tests/routes.rs (1)
wireframe_testing/src/helpers.rs (1)
  • drive_with_bincode (334-356)
src/frame/metadata.rs (1)
examples/metadata_routing.rs (1)
  • parse (43-57)
tests/lifecycle.rs (1)
wireframe_testing/src/helpers.rs (2)
  • run_app (379-422)
  • run_with_duplex_server (467-475)
wireframe_testing/src/helpers.rs (1)
src/app/builder.rs (1)
  • new (144-144)
tests/response.rs (2)
src/frame/format.rs (3)
  • bytes (54-54)
  • new (32-35)
  • try_new (42-50)
wireframe_testing/src/helpers.rs (1)
  • run_app (379-422)
🪛 LanguageTool
README.md

[style] ~10-~10: Would you like to use the Oxford spelling “serialization”? The spelling ‘serialisation’ is also correct.
Context: ...ry protocols typically involves verbose serialisation code, custom frame parsing, and complex...

(OXFORD_SPELLING_Z_NOT_S)


[style] ~19-~19: Would you like to use the Oxford spelling “Serialization”? The spelling ‘Serialisation’ is also correct.
Context: ...tering success or failure callbacks - Serialisation engine using bincode or a wire-rs...

(OXFORD_SPELLING_Z_NOT_S)


[style] ~129-~129: Would you like to use the Oxford spelling “serialization”? The spelling ‘serialisation’ is also correct.
Context: ... modifying handle_frame. ## Response serialisation and framing Handlers can return types ...

(OXFORD_SPELLING_Z_NOT_S)


[style] ~285-~285: Would you like to use the Oxford spelling “serialization”? The spelling ‘serialisation’ is also correct.
Context: ...ng routing - ping_pong.rs — showcases serialisation and middleware in a ping/pong protoco...

(OXFORD_SPELLING_Z_NOT_S)

docs/wireframe-client-design.md

[style] ~67-~67: Would you like to use the Oxford spelling “initialization”? The spelling ‘initialisation’ is also correct.
Context: ...ycle callbacks, so both sides can share initialisation logic. ## Example usage ```rust #[tok...

(OXFORD_SPELLING_Z_NOT_S)

docs/generic-message-fragmentation-and-re-assembly-design.md

[style] ~239-~239: You have used the passive voice repeatedly in nearby sentences. To make your writing clearer and easier to read, consider using active voice.
Context: ...ap, and the complete logical frame is passed down the codec chain. 4. **Timeout han...

(REP_PASSIVE_VOICE)

🔍 Remote MCP

Here are a few concrete points to keep in mind when reviewing this PR:

  1. Complete removal of the old framing abstraction

    • The FrameProcessor trait and its concrete LengthPrefixedProcessor implementation in src/frame/processor.rs have been deleted. Any downstream code that previously implemented or referenced FrameProcessor will no longer compile.
    • All documentation snippets (in multiple docs/*.md files) have been updated to replace “FrameProcessor” with tokio_util::codec-based terminology—verify no stray references remain.
  2. Adoption of Tokio’s length-delimited codec

    • Framing is now done via tokio_util::codec::LengthDelimitedCodec instead of a custom processor. The default codec is instantiated via LengthDelimitedCodec::new(), which uses a 4-byte big-endian prefix by default and can be configured via its builder for prefix length and endianness.
    • Examples and tests under examples/ and tests/ have been converted to use LengthDelimitedCodec::builder().new_codec(), replacing all processor.encode/decode calls with codec.encode/codec.decode.
  3. Expanded public API in LengthFormat

    • In src/frame/format.rs, the struct’s bytes and endianness fields and the methods pub fn read_len(&self, …) and pub fn write_len(&self, …) have been made public. Tests now exercise error cases (invalid prefix sizes, overly large lengths) via these methods, so reviewers should confirm these signatures align with the intended semver bump.
  4. New FrameMetadata trait

    • A standalone FrameMetadata trait has been introduced in src/frame/metadata.rs, exposing a parse(&self, src: &[u8]) -> Result<(Self::Frame, usize), Self::Error> method for header-only inspection. This replaces the old metadata trait that lived in processor.rs and is now the only frame-metadata entrypoint.
  5. Simplified builder API

    • In src/app/builder.rs, all fields and methods related to frame_processor (including the deprecated new_with_envelope constructor and the frame_processor setter) have been removed. Framing is now entirely controlled by the serializer/codec configuration in the builder chain.

Reviewer guidance:

  • Ensure no residual FrameProcessor mentions remain.
  • Verify test coverage for both standard and error framing scenarios still passes under the new codec approach.
  • Confirm that the expanded public API surface (in LengthFormat and FrameMetadata) is properly documented and versioned.

Citations:

  • Encoder/Decoder traits and their role in framing via tokio_util::codec
  • Default and configurable construction of LengthDelimitedCodec
⏰ Context from checks skipped due to timeout of 120000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: build-test
🔇 Additional comments (24)
docs/asynchronous-outbound-messaging-design.md (3)

299-299: PushHandle visibility and example look good

Keep this snippet as-is; it matches the public surface and compiles in isolation.


320-323: Finalise the try_push signature in docs

The signature and fenced block are consistent; no further change needed.


324-353: Demonstrate streaming with async-stream: LGTM

The example uses async-stream as the canonical pattern and is marked no_run.

src/frame/metadata.rs (1)

1-41: FrameMetadata trait and docs: LGTM

  • Module-level docs are present.
  • Doctest is no_run and self-contained.
  • British English is used in prose; API names retain American spelling where required.
src/app/connection.rs (3)

40-44: Centralise codec construction

Good extraction; this keeps send/receive in lockstep on limits and format.


64-64: Use the helper for outbound framing

Outbound now inherits the same max_frame_length as inbound.


167-168: Use the helper for inbound framing

Inbound correctly applies the configured cap via the helper.

tests/lifecycle.rs (3)

15-16: Import codec traits: LGTM

Decoder/Encoder imports are correct for the new path.


25-26: Pin the frame cap via a constant and verify repo-wide usage

MAX_FRAME is defined; ensure all tests/examples also set explicit caps.

Objective: Confirm no test/example relies on the default 8 MiB limit.

Commands:
- rg -nP '\bLengthDelimitedCodec::new\s*\('
- rg -nPU '(?s)LengthDelimitedCodec::builder\(\)\s*(?!.*max_frame_length)'

Expected: No matches.

153-156: Encode path under LengthDelimitedCodec: LGTM

The codec uses the explicit cap and encodes into a BytesMut buffer.

Also applies to: 157-158

examples/metadata_routing.rs (4)

10-12: Imports for codec-based encoding: LGTM

Use of Encoder and LengthDelimitedCodec is correct.


15-16: Define MAX_FRAME constant: LGTM

This removes magic numbers and documents intent.


53-56: Update comment to refer to the codec: LGTM

The prose matches the new framing model.


97-103: Bound the codec and size the buffer

The explicit cap prevents OOM; the buffer capacity accounts for the prefix.

tests/routes.rs (2)

149-168: LGTM: multi-frame encode/decode and buffer hygiene are correct.

Using a single codec instance, tight buffer preallocation, and trailing-bytes assertion is spot on.

Also applies to: 170-188, 183-187


220-237: LGTM: single-frame encode path is aligned with decode limits.

Preallocating for prefix + payload and asserting buffer emptiness makes this robust.

Also applies to: 232-237

tests/middleware_order.rs (1)

17-18: LGTM: explicit frame cap, encode/decode symmetry, and trailing-bytes check.

The codec is capped deterministically and the round‑trip asserts buffer hygiene.

Also applies to: 75-78, 92-97

tests/response.rs (2)

80-86: LGTM: header/body consumption semantics documented and tested.

The tests accurately exercise and document LengthDelimitedCodec’s behaviour.

Also applies to: 94-102


213-239: Verify there are no remaining uncapped codec builders.

Run a focused scan to ensure all LengthDelimitedCodec instances set max_frame_length, avoiding regressions elsewhere.

#!/bin/bash
set -euo pipefail

echo "Builder() with no max_frame_length before new_codec:"
rg -nPU '(?s)LengthDelimitedCodec::builder\(\)(?:(?!max_frame_length).)*?\.new_codec\(\)' -g '!**/target/**'

echo
echo "Direct ::new() calls (should be none):"
rg -nP '\bLengthDelimitedCodec::new\s*\(' -g '!**/target/**'

Also applies to: 241-277

docs/generic-message-fragmentation-and-re-assembly-design.md (5)

1-1: Heading is correct and follows style.

Keep the level-1 heading; title casing and en‑GB hyphenation are sound.


40-48: Align architecture description with codec-centric stack.

State explicitly that the adapter sits in a codec stack, and keep the diagram.


192-194: Adopt canonical async‑stream guidance.

The note correctly states async-stream as the canonical approach.


93-101: Terminology is clear and consistent.

Keep FragmentStrategy as the protocol hook; naming aligns with the narrative.


176-190: Remove incorrect suggestion; Response::Vec(vec![vec![0_u8; 128 * 1024]]) correctly matches Response<Vec<u8>>. The Vec variant wraps a Vec<F> where F = Vec<u8>, so returning Vec<Vec<u8>> is intentional and type-correct.

Likely an incorrect or invalid review comment.

Comment thread docs/generic-message-fragmentation-and-re-assembly-design.md
Comment thread docs/generic-message-fragmentation-and-re-assembly-design.md
Comment thread docs/wireframe-client-design.md Outdated
Comment thread docs/wireframe-client-design.md
Comment thread docs/wireframe-client-design.md Outdated
Comment thread src/app/connection.rs
Comment thread tests/lifecycle.rs
Comment thread tests/middleware_order.rs Outdated
Comment thread tests/routes.rs Outdated
Comment thread wireframe_testing/src/helpers.rs Outdated
@leynos
Owner Author

leynos commented Aug 30, 2025

@coderabbitai pause

@coderabbitai
Contributor

coderabbitai Bot commented Aug 30, 2025

✅ Actions performed

Reviews paused.

codescene-delta-analysis[bot]

This comment was marked as outdated.


@leynos
Owner Author

leynos commented Aug 31, 2025

@sourcery-ai review


@sourcery-ai sourcery-ai Bot left a comment


Hey there - I've reviewed your changes - here's some feedback:

  • Consider consolidating the repeated test boilerplate for setting up the length-delimited codec and decoding frames (including asserting that no trailing bytes remain) into reusable helper functions to reduce duplication and improve readability.
  • Rather than manually configuring LengthDelimitedCodec::builder() in examples and tests, consider exposing a public API (e.g. on WireframeApp) to create a preconfigured length-delimited codec capped by the app’s buffer capacity to keep setup consistent across the codebase.
Prompt for AI Agents
Please address the comments from this code review:
## Overall Comments
- Consider consolidating the repeated test boilerplate for setting up the length-delimited codec and decoding frames (including asserting that no trailing bytes remain) into reusable helper functions to reduce duplication and improve readability.
- Rather than manually configuring `LengthDelimitedCodec::builder()` in examples and tests, consider exposing a public API (e.g. on `WireframeApp`) to create a preconfigured length-delimited codec capped by the app’s buffer capacity to keep setup consistent across the codebase.

## Individual Comments

### Comment 1
<location> `src/frame/format.rs:32` </location>
<code_context>
+    /// Panics if `bytes` is not in `1..=8`.
     #[must_use]
-    pub const fn new(bytes: usize, endianness: Endianness) -> Self { Self { bytes, endianness } }
+    pub const fn new(bytes: usize, endianness: Endianness) -> Self {
+        assert!(matches!(bytes, 1..=8), "invalid length-prefix width");
+        Self { bytes, endianness }
</code_context>

<issue_to_address>
Add unit tests for LengthFormat::new to verify panic on invalid prefix width.

The new const constructor for LengthFormat now panics if bytes is not in 1..=8. Add unit tests to verify correct behaviour, including panic on invalid input.
</issue_to_address>
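A sketch of the requested tests, using a self-contained stand-in for LengthFormat and Endianness (the real types live in src/frame/format.rs; the stand-in mirrors only the validation logic quoted above):

```rust
use std::panic;

/// Stand-in for the crate's `Endianness`; illustrative only.
#[derive(Debug, Clone, Copy)]
enum Endianness {
    Big,
    Little,
}

/// Stand-in for the crate's `LengthFormat`; mirrors the PR's validation.
#[derive(Debug, Clone, Copy)]
struct LengthFormat {
    bytes: usize,
    endianness: Endianness,
}

impl LengthFormat {
    /// Panics if `bytes` is not in `1..=8`, matching the new constructor.
    const fn new(bytes: usize, endianness: Endianness) -> Self {
        assert!(matches!(bytes, 1..=8), "invalid length-prefix width");
        Self { bytes, endianness }
    }
}

fn main() {
    // Valid widths construct successfully.
    for width in 1..=8 {
        assert_eq!(LengthFormat::new(width, Endianness::Big).bytes, width);
    }
    // Invalid widths panic, as the review asks tests to verify.
    for width in [0usize, 9, 16] {
        let result = panic::catch_unwind(move || LengthFormat::new(width, Endianness::Little));
        assert!(result.is_err(), "width {width} should be rejected");
    }
}
```

In the crate itself these cases would sit in a #[cfg(test)] module, with the panicking paths marked #[should_panic(expected = "invalid length-prefix width")] or parameterised via #[rstest(...)] per the repository's testing guidelines.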


Comment thread src/frame/format.rs
/// Panics if `bytes` is not in `1..=8`.
#[must_use]
pub const fn new(bytes: usize, endianness: Endianness) -> Self { Self { bytes, endianness } }
pub const fn new(bytes: usize, endianness: Endianness) -> Self {

issue (review_instructions): Add unit tests for LengthFormat::new to verify panic on invalid prefix width.

The new const constructor for LengthFormat now panics if bytes is not in 1..=8. Add unit tests to verify correct behaviour, including panic on invalid input.

Review instructions:

Path patterns: **/*

Instructions:
For any new feature or change to an existing feature, both behavioural and unit tests are required.

@leynos leynos merged commit 745859e into main Aug 31, 2025
5 checks passed
@leynos leynos deleted the codex/remove-deprecated-code-after-refactor branch August 31, 2025 14:29