
add URL scheme validation in DAHR and handleWeb2ProxyRequest #465

Merged
massouji82 merged 5 commits into testnet from add-DAHR-url-validation
Sep 7, 2025

Conversation


massouji82 (Contributor) commented Sep 5, 2025

User description

add URL scheme validation in DAHR and handleWeb2ProxyRequest


PR Type

Enhancement


Description

  • Add URL scheme validation to prevent transport crashes

  • Restrict proxy requests to HTTP/HTTPS protocols only

  • Update demosdk dependency to version 2.3.22


Diagram Walkthrough

flowchart LR
  A["Web2 Proxy Request"] --> B["URL Validation"]
  B --> C["Check Protocol"]
  C --> D["Allow HTTP/HTTPS Only"]
  D --> E["Proceed with Request"]
  C --> F["Reject Invalid Schemes"]

File Walkthrough

Relevant files

Enhancement

  • DAHR.ts (src/features/web2/dahr/DAHR.ts) — Add URL scheme validation in DAHR (+12/-0)
    • Add URL parsing and protocol validation
    • Throw error for non-HTTP/HTTPS schemes
    • Include detailed error messages for invalid URLs
  • handleWeb2ProxyRequest.ts (src/libs/network/routines/transactions/handleWeb2ProxyRequest.ts) — Add URL validation in proxy handler (+21/-0)
    • Add URL scheme validation before processing requests
    • Return 400 error for invalid protocols
    • Handle URL parsing errors gracefully

Dependencies

  • package.json — Update demosdk dependency version (+1/-1)
    • Update @kynesyslabs/demosdk from 2.3.17 to 2.3.22

Summary by CodeRabbit

  • Bug Fixes

    • Added runtime HTTP(S) URL validation and normalization for web2 proxy requests with SSRF protections: trims input, rejects invalid formats/schemes, embedded credentials, missing host, and private/loopback/localhost targets with clear 400 errors.
  • New Features

    • DNS-based preflight checks at request time to block disallowed hosts and ensure canonicalized URLs are used when proxying.
  • Chores

    • Updated dependency: @kynesyslabs/demosdk to ^2.3.22.

massouji82 requested a review from cwilvx September 5, 2025 21:22
massouji82 self-assigned this Sep 5, 2025

coderabbitai bot commented Sep 5, 2025

Walkthrough

Adds an HTTP(S) URL validator with SSRF protections, integrates it into START_PROXY handler and DAHR.startProxy, and adds runtime DNS-based SSRF preflight checks in the proxy. Also bumps @kynesyslabs/demosdk version in package.json.

Changes

  • Web2 URL validator (new) — src/features/web2/validator.ts
    New module exporting UrlValidationResult and validateAndNormalizeHttpUrl(input: string) that trims, parses, enforces the http(s) scheme, rejects embedded credentials, missing host, localhost, and private/loopback/link-local addresses, canonicalizes (lowercase host, strip default ports, remove fragment), and returns { ok: true, normalizedUrl } or { ok: false, status: 400, message }.
  • START_PROXY handler — src/libs/network/routines/transactions/handleWeb2ProxyRequest.ts
    Calls validateAndNormalizeHttpUrl on web2Request.raw.url; on validation failure returns an RPC 400 with the validation message; on success updates web2Request.raw.url to normalizedUrl before invoking DAHR.startProxy.
  • DAHR.startProxy defensive validation — src/features/web2/dahr/DAHR.ts
    Imports and invokes validateAndNormalizeHttpUrl after ensuring _web2Request; throws an Error with .status when invalid; uses validation.normalizedUrl when constructing the proxied request.
  • Proxy DNS preflight hardening — src/features/web2/proxy/Proxy.ts
    Adds DNS-based SSRF hardening: uses node:dns/promises lookups at request time and disallowed-address checks (private/link-local/loopback ranges) to preflight target hostnames; the request handler becomes async and rejects disallowed targets (400) before proxying.
  • Dependency bump — package.json
    Updated @kynesyslabs/demosdk from ^2.3.17 to ^2.3.22.

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor Client
  participant Handler as START_PROXY handler
  participant Validator as validateAndNormalizeHttpUrl
  participant DAHR as DAHR.startProxy
  participant DNS as DNS lookup
  participant Proxy as HTTP Proxy

  Client->>Handler: START_PROXY (web2Request.raw.url)
  Handler->>Validator: validateAndNormalizeHttpUrl(url)
  alt validation fails
    Validator-->>Handler: {ok:false, status:400, message}
    Handler-->>Client: RPC 400 (message)
  else validation passes
    Validator-->>Handler: {ok:true, normalizedUrl}
    Handler->>DAHR: startProxy(with normalizedUrl)
    DAHR->>Validator: validateAndNormalizeHttpUrl(normalizedUrl)
    Note right of Validator: defensive re-check
    alt defensive validation fails
      Validator-->>DAHR: {ok:false, status:400, message}
      DAHR-->>Handler: throw Error(status/message)
      Handler-->>Client: Error propagated
    else defensive validation passes
      DAHR->>DNS: resolve(targetHostname)
      Note right of DNS: runtime DNS preflight
      DNS-->>DAHR: addresses
      alt disallowed address
        DAHR-->>Client: 400 "Invalid target host"
      else allowed
        DAHR->>Proxy: forward request to normalizedUrl
        Proxy-->>DAHR: response
        DAHR-->>Handler: result
        Handler-->>Client: Success
      end
    end
  end

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Poem

I hop through hosts with careful eyes,
I trim the links and ban disguise.
No localhost or secret nest—
Only safe hops pass my test.
A tiny patch, a cautious cheer! 🐰✨

Warning

Review ran into problems

🔥 Problems

Git: Failed to clone repository. Please run the @coderabbitai full review command to re-trigger a full review. If the issue persists, set path_filters to include or exclude specific files.


📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

💡 Knowledge Base configuration:

  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 4471fa9 and a4b798c.

📒 Files selected for processing (2)
  • src/features/web2/proxy/Proxy.ts (3 hunks)
  • src/features/web2/validator.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
  • src/features/web2/validator.ts
  • src/features/web2/proxy/Proxy.ts

@qodo-code-review

PR Reviewer Guide 🔍

Here are some key observations to aid the review process:

⏱️ Estimated effort to review: 2 🔵🔵⚪⚪⚪
🧪 No relevant tests
🔒 Security concerns

Sensitive information exposure:
Error messages include the raw URL value, which might contain credentials or PII in query strings. Consider logging a sanitized version and returning a generic message to clients (e.g., "Invalid URL format" without echoing the URL).

⚡ Recommended focus areas for review

Error Message Clarity

The thrown error for invalid URL format includes the full URL, which may expose sensitive data (e.g., query params). Consider omitting or redacting the URL in error messages returned to users/logs.

    throw new Error(`Invalid URL format: ${url}. ${e?.message || ""}`)
}
Duplicate Validation

URL scheme validation is implemented in both the handler and DAHR. Verify this duplication is intentional; otherwise centralize validation to avoid drift and ensure consistent error handling.

try {
    const parsed = new URL(web2Request.raw.url)
    if (
        parsed.protocol !== "http:" &&
        parsed.protocol !== "https:"
    ) {
        return createRPCResponse(
            400,
            null,
            `Invalid URL scheme: ${parsed.protocol}. Only http(s) are allowed.`,
        )
    }
} catch {
    return createRPCResponse(
        400,
        null,
        `Invalid URL format: ${web2Request.raw.url}`,
    )
}
Normalization

Consider normalizing/stripping whitespace before URL parsing to reduce false negatives (e.g., leading/trailing spaces) and provide consistent handling.

try {
    const parsed = new URL(web2Request.raw.url)
    if (
        parsed.protocol !== "http:" &&
        parsed.protocol !== "https:"
    ) {


qodo-code-review bot commented Sep 5, 2025

PR Code Suggestions ✨

Latest suggestions up to ce64b28

Security
Block localhost/loopback targets
Suggestion Impact:The commit implemented checks to reject localhost and loopback addresses/hostnames, including IPv4 127.0.0.0/8 and IPv6 ::1, aligning with the suggestion.

code diff:

+        const hostLower = parsed.hostname.toLowerCase()
+
+        // 4. Reject localhost and loopback hostnames
+        if (hostLower === "localhost" || hostLower.endsWith(".localhost")) {
+            return {
+                ok: false,
+                status: 400,
+                message: "Localhost targets are not allowed",
+            }
+        }
+
+        // 5. Basic loopback check for IPv4 and IPv6
+        const isIPv6Loopback = hostLower === "::1" || hostLower === "[::1]"
+        const isIPv4Loopback = /^127(?:\.\d{1,3}){3}$/.test(hostLower)
+        if (isIPv4Loopback || isIPv6Loopback) {
+            return {
+                ok: false,
+                status: 400,
+                message: "Loopback targets are not allowed",
+            }
+        }

Guard against SSRF by optionally restricting private/loopback addresses when the
host is an IP. At minimum, reject literal localhost and loopback hostnames to
prevent internal access.

src/features/web2/validator.ts [17-29]

 try {
     const parsed = new URL(trimmed)
     if (parsed.protocol !== "http:" && parsed.protocol !== "https:") {
         return {
             ok: false,
             status: 400,
             message: "Invalid URL scheme. Only http(s) are allowed",
         }
     }
+    const hostLower = parsed.hostname.toLowerCase()
+    if (hostLower === "localhost" || hostLower.endsWith(".localhost")) {
+        return { ok: false, status: 400, message: "Localhost targets are not allowed" }
+    }
+    // Basic loopback check for IPv4 and IPv6
+    const isIPv6Loopback = hostLower === "::1" || hostLower === "[::1]"
+    const isIPv4Loopback = /^127(?:\.\d{1,3}){3}$/.test(hostLower)
+    if (isIPv4Loopback || isIPv6Loopback) {
+        return { ok: false, status: 400, message: "Loopback targets are not allowed" }
+    }
     return { ok: true, normalizedUrl: parsed.toString() }
 } catch {
     return { ok: false, status: 400, message: "Invalid URL format" }
 }

[Suggestion processed]

Suggestion importance[1-10]: 10

Why: This suggestion addresses a critical Server-Side Request Forgery (SSRF) vulnerability by preventing the proxy from making requests to internal localhost or loopback addresses.

Impact: High
Incremental [*]
Harden URL normalization and checks
Suggestion Impact:The commit implemented the suggested hardening: it rejects embedded credentials, lowercases the host, strips default ports, removes fragments, and enforces http(s). It also added extra checks (hostname required, localhost/loopback rejection, redacted errors).

code diff:

+        // 1. Ensure protocol is http(s)
         if (parsed.protocol !== "http:" && parsed.protocol !== "https:") {
             return {
                 ok: false,
@@ -23,7 +30,65 @@
                 message: "Invalid URL scheme. Only http(s) are allowed",
             }
         }
-        return { ok: true, normalizedUrl: parsed.toString() }
+
+        // 2. Reject URLs with embedded credentials (username/password)
+        if (parsed.username || parsed.password) {
+            return {
+                ok: false,
+                status: 400,
+                message: "Invalid URL: embedded credentials are not allowed",
+            }
+        }
+
+        // 3. Reject URLs without a hostname
+        if (!parsed.hostname) {
+            return {
+                ok: false,
+                status: 400,
+                message: "Invalid URL: URL must have a hostname",
+            }
+        }
+
+        const hostLower = parsed.hostname.toLowerCase()
+
+        // 4. Reject localhost and loopback hostnames
+        if (hostLower === "localhost" || hostLower.endsWith(".localhost")) {
+            return {
+                ok: false,
+                status: 400,
+                message: "Localhost targets are not allowed",
+            }
+        }
+
+        // 5. Basic loopback check for IPv4 and IPv6
+        const isIPv6Loopback = hostLower === "::1" || hostLower === "[::1]"
+        const isIPv4Loopback = /^127(?:\.\d{1,3}){3}$/.test(hostLower)
+        if (isIPv4Loopback || isIPv6Loopback) {
+            return {
+                ok: false,
+                status: 400,
+                message: "Loopback targets are not allowed",
+            }
+        }
+
+        // 6. Canonicalize the URL (lowercase host, strip default ports, remove fragment)
+        const canonicalUrlObject = new URL(parsed.toString())
+        canonicalUrlObject.hostname = canonicalUrlObject.hostname.toLowerCase()
+
+        // Strip default ports
+        if (
+            (canonicalUrlObject.protocol === "http:" &&
+                canonicalUrlObject.port === "80") ||
+            (canonicalUrlObject.protocol === "https:" &&
+                canonicalUrlObject.port === "443")
+        ) {
+            canonicalUrlObject.port = ""
+        }
+
+        // Remove fragment
+        canonicalUrlObject.hash = ""
+
+        return { ok: true, normalizedUrl: canonicalUrlObject.toString() }

Normalize by stripping default ports and fragments and lowercasing the host to
avoid proxy mismatches and accidental leakage of URL fragments; also reject
credentials in the URL for security.

src/features/web2/validator.ts [1-30]

 export type UrlValidationResult =
     | { ok: true; normalizedUrl: string }
     | { ok: false; status: 400; message: string }
 
 /**
  * Validate and normalize a URL for DAHR.
  * - Trims whitespace
  * - Ensures protocol is http(s)
+ * - Lowercases host, strips default ports and fragments
+ * - Rejects URLs containing credentials
  */
 export function validateAndNormalizeHttpUrl(
     input: string,
 ): UrlValidationResult {
     const trimmed = (input ?? "").trim()
     if (!trimmed) {
         return { ok: false, status: 400, message: "Invalid URL: empty value" }
     }
     try {
         const parsed = new URL(trimmed)
+
         if (parsed.protocol !== "http:" && parsed.protocol !== "https:") {
             return {
                 ok: false,
                 status: 400,
                 message: "Invalid URL scheme. Only http(s) are allowed",
             }
         }
+
+        if (parsed.username || parsed.password) {
+            return {
+                ok: false,
+                status: 400,
+                message: "Invalid URL: credentials are not allowed",
+            }
+        }
+
+        // Normalize: lowercase host
+        parsed.hostname = parsed.hostname.toLowerCase()
+
+        // Strip default ports
+        if ((parsed.protocol === "http:" && parsed.port === "80") ||
+            (parsed.protocol === "https:" && parsed.port === "443")) {
+            parsed.port = ""
+        }
+
+        // Remove fragment to avoid leaking client-side data
+        parsed.hash = ""
+
         return { ok: true, normalizedUrl: parsed.toString() }
     } catch {
         return { ok: false, status: 400, message: "Invalid URL format" }
     }
 }

[Suggestion processed]

Suggestion importance[1-10]: 8

Why: The suggestion significantly hardens the new URL validation by adding security checks for credentials and fragments, and improves normalization, which is crucial for a proxy feature.

Impact: Medium
Preserve validation error status
Suggestion Impact:The commit changed the error handling to create an Error, attach the validation.status to it, and throw it, thereby preserving the status code as suggested.

code diff:

-            throw new Error(validation.message)
+            const err = new Error(validation.message)
+            ;(err as any).status = validation.status
+            throw err

Preserve and propagate the structured error status when validation fails instead
of throwing a generic Error string, so callers can map the correct HTTP status
and avoid converting expected validation failures into 500s.

src/features/web2/dahr/DAHR.ts [74-78]

 // Validate and normalize URL without echoing sensitive details
 const validation = validateAndNormalizeHttpUrl(url)
 if (!validation.ok) {
-    throw new Error(validation.message)
+    const err: Error & { status?: number } = new Error(validation.message)
+    err.status = validation.status
+    throw err
 }

[Suggestion processed]

Suggestion importance[1-10]: 7

Why: This is a good suggestion for improving error handling by propagating the status code from the validation result, preventing a validation error from becoming a generic server error.

Impact: Medium
Return correct validation status
Suggestion Impact:The code was updated to pass validation.status to createRPCResponse instead of the hardcoded 400, aligning error signaling with the validator's structured status.

code diff:

-                    return createRPCResponse(400, null, validation.message)
+                    return createRPCResponse(
+                        validation.status,
+                        null,
+                        validation.message,
+                    )

Use the structured status from validation for the RPC response instead of
hardcoding 400, to keep consistent error signaling if validation evolves (e.g.,
different status codes).

src/libs/network/routines/transactions/handleWeb2ProxyRequest.ts [61-66]

 const validation = validateAndNormalizeHttpUrl(
     web2Request.raw.url,
 )
 if (!validation.ok) {
-    return createRPCResponse(400, null, validation.message)
+    return createRPCResponse(validation.status, null, validation.message)
 }

[Suggestion processed]

Suggestion importance[1-10]: 6

Why: The suggestion correctly points out that the validation.status should be used instead of a hardcoded 400, making the error handling more robust and consistent with the validator's logic.

Impact: Low

Previous suggestions

✅ Suggestions up to commit a242c53
Security
Remove URL from error messages
Suggestion Impact:The commit replaced the try/catch that threw "Invalid URL format: ${url}..." with a validator that returns a generic message without echoing the URL, aligning with the suggestion to avoid including the raw URL in errors.

code diff:

-        // Validate URL scheme to prevent transport crashes
-        try {
-            const parsed = new URL(url)
-            if (parsed.protocol !== "http:" && parsed.protocol !== "https:") {
-                throw new Error(
-                    `Invalid URL scheme: ${parsed.protocol}. Only http(s) are allowed.`,
-                )
-            }
-        } catch (e: any) {
-            throw new Error(`Invalid URL format: ${url}. ${e?.message || ""}`)
+        // Validate and normalize URL without echoing sensitive details
+        const validation = validateAndNormalizeHttpUrl(url)
+        if (!validation.ok) {
+            throw new Error(validation.message)
         }

Do not include the raw url in error messages to avoid leaking sensitive
information (credentials, tokens) and prevent log/response injection. Return a
generic message while keeping the underlying error detail if needed.

src/features/web2/dahr/DAHR.ts [81-83]

 } catch (e: any) {
-            throw new Error(`Invalid URL format: ${url}. ${e?.message || ""}`)
+            throw new Error(`Invalid URL format. ${e?.message || ""}`)
         }

[Suggestion processed]

Suggestion importance[1-10]: 8

Why: The suggestion correctly identifies a security risk of leaking sensitive information from the url in an error message and proposes a valid fix to prevent this.

Impact: Medium
Stop echoing user input in errors
Suggestion Impact:The commit replaced direct URL parsing and specific error strings (which echoed the user URL) with a validator function that returns a sanitized error message, thus stopping the echo of user input in error responses.

code diff:

-                // Validate URL scheme: only http(s) are allowed
-                try {
-                    const parsed = new URL(web2Request.raw.url)
-                    if (
-                        parsed.protocol !== "http:" &&
-                        parsed.protocol !== "https:"
-                    ) {
-                        return createRPCResponse(
-                            400,
-                            null,
-                            `Invalid URL scheme: ${parsed.protocol}. Only http(s) are allowed.`,
-                        )
-                    }
-                } catch {
-                    return createRPCResponse(
-                        400,
-                        null,
-                        `Invalid URL format: ${web2Request.raw.url}`,
-                    )
+                const validation = validateAndNormalizeHttpUrl(
+                    web2Request.raw.url,
+                )
+                if (!validation.ok) {
+                    return createRPCResponse(400, null, validation.message)
                 }

Avoid echoing user-provided URLs in error responses to prevent leaking
credentials and mitigate CRLF/log injection risks. Return a sanitized, generic
message instead.

src/libs/network/routines/transactions/handleWeb2ProxyRequest.ts [73-79]

 } catch {
                     return createRPCResponse(
                         400,
                         null,
-                        `Invalid URL format: ${web2Request.raw.url}`,
+                        "Invalid URL format",
                     )
                 }

[Suggestion processed]

Suggestion importance[1-10]: 8

Why: The suggestion correctly identifies a security risk of echoing the user-provided url in an error response, which could leak sensitive data, and provides a valid fix.

Impact: Medium

Massoud Valipoor added 2 commits September 5, 2025 23:56
…t, and loopback addresses; update error handling in DAHR and handleWeb2ProxyRequest

coderabbitai bot left a comment


Actionable comments posted: 3

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
src/libs/network/routines/transactions/handleWeb2ProxyRequest.ts (2)

77-85: Sanitize forwarded headers; set Host to the normalized target.

Passing user-supplied Host/X-Forwarded-* can cause cache poisoning or origin confusion. Strip sensitive hop-by-hop and proxy headers and set Host from the normalized URL.

Apply this diff:

-                const { method, headers } = web2Request.raw
+                const { method, headers } = web2Request.raw
+                const target = new URL(validation.normalizedUrl)
+                const sanitizedHeaders = { ...headers }
+                delete (sanitizedHeaders as any).host
+                delete (sanitizedHeaders as any)["x-forwarded-host"]
+                delete (sanitizedHeaders as any)["x-forwarded-proto"]
+                delete (sanitizedHeaders as any)["x-forwarded-for"]
+                delete (sanitizedHeaders as any)["proxy-authorization"]
+                sanitizedHeaders.host = target.host
@@
-                const response = await dahr.startProxy({
+                const response = await dahr.startProxy({
                     method,
-                    headers,
+                    headers: sanitizedHeaders,
                     payload,
                     authorization,
                     url: validation.normalizedUrl,
                 })

98-102: Map thrown status codes and avoid returning raw Error objects.

Currently all errors are returned as 500 with the Error object in response. Respect error.status (e.g., 400 from DAHR validation) and only return the message.

Apply this diff:

-    } catch (error: any) {
-        console.error("Error in handleWeb2ProxyRequest:", error)
-
-        return createRPCResponse(500, error, error.message)
-    }
+    } catch (error: any) {
+        console.error("Error in handleWeb2ProxyRequest:", error)
+        const status =
+            typeof error?.status === "number" && error.status >= 400 && error.status < 600
+                ? error.status
+                : 500
+        const msg = typeof error?.message === "string" ? error.message : "Internal error"
+        return createRPCResponse(status, null, msg)
+    }
🧹 Nitpick comments (3)
src/features/web2/dahr/DAHR.ts (1)

74-81: Good: validate in DAHR as well (defense-in-depth), but use a typed error.

Monkey-patching err.status is brittle. Prefer a small HttpError class.

Apply:

+class HttpError extends Error {
+    status: number
+    constructor(status: number, message: string) {
+        super(message)
+        this.status = status
+    }
+}
@@
-        const validation = validateAndNormalizeHttpUrl(url)
-        if (!validation.ok) {
-            const err = new Error(validation.message)
-            ;(err as any).status = validation.status
-            throw err
-        }
+        const validation = validateAndNormalizeHttpUrl(url)
+        if (!validation.ok) {
+            throw new HttpError(validation.status, validation.message)
+        }
src/features/web2/validator.ts (2)

74-91: Nit: reusing the same URL object avoids extra parse.

You can mutate the existing parsed instead of round-tripping through new URL(parsed.toString()).

Apply:

-        const canonicalUrlObject = new URL(parsed.toString())
-        canonicalUrlObject.hostname = canonicalUrlObject.hostname.toLowerCase()
+        const canonicalUrlObject = parsed
+        canonicalUrlObject.hostname = canonicalUrlObject.hostname.toLowerCase()

1-3: Type is clear and ergonomic. Add unit tests for edge cases.

Please add tests for:

  • IPv4-mapped IPv6 (::ffff:127.0.0.1)
  • RFC1918 IPv4 ranges
  • Link-local (169.254.0.0/16, fe80::/10)
  • IDN/punycode hostnames
  • Default port stripping and fragment removal

I can generate a jest test suite covering these cases if helpful.

📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between 7b3c3ee and bb54060.

📒 Files selected for processing (4)
  • package.json (1 hunks)
  • src/features/web2/dahr/DAHR.ts (2 hunks)
  • src/features/web2/validator.ts (1 hunks)
  • src/libs/network/routines/transactions/handleWeb2ProxyRequest.ts (3 hunks)
🔇 Additional comments (5)
package.json (1)

49-49: Confirm necessity of new dependency @aptos-labs/ts-sdk.

Looks unrelated to the URL-validation change. If unused, please drop it to keep the diff focused.

Run to verify usage:

#!/bin/bash
# Find imports/usages of @aptos-labs/ts-sdk
rg -nP --type=ts -C2 $'from\\s+[\'"]@aptos-labs/ts-sdk[\'"]|require\\([\'"]@aptos-labs/ts-sdk[\'"]\\)'
src/libs/network/routines/transactions/handleWeb2ProxyRequest.ts (1)

61-71: Good defensive validation with clear 400s.

Early scheme validation and normalized URL usage are solid.

src/features/web2/dahr/DAHR.ts (1)

82-89: LGTM: normalized URL is used for the proxied request.

This keeps downstream components consistent.

src/features/web2/validator.ts (2)

15-33: LGTM: clear parse and scheme checks.

Trimming, URL() parse, and http/https enforcement are correct.


34-42: LGTM: credential-in-URL rejection.

Prevents accidental secrets leakage.

Comment on lines +61 to +71
const validation = validateAndNormalizeHttpUrl(
web2Request.raw.url,
)
if (!validation.ok) {
return createRPCResponse(
validation.status,
null,
validation.message,
)
}


⚠️ Potential issue

SSRF gaps remain: DNS and redirects can still target private/metadata IPs.

Blocking only localhost/loopback at parse-time is insufficient. Attackers can use a public hostname that resolves to 169.254.169.254, 10.0.0.0/8, fc00::/7, etc., or abuse redirects. Enforce IP allow/block after DNS resolution and on every redirect in the Proxy layer.

Action:

  • In Proxy HTTP client, provide a custom lookup/agent to resolve the target and reject connections to private/link-local/reserved ranges (IPv4 and IPv6, including IPv4-mapped IPv6).
  • Re-check the resolved address after each redirect hop before following it.
  • Consider an allowlist instead (e.g., public CIDRs) if feasible.
    Want a patch against Proxy to add a guarded lookup + redirect guard?
🤖 Prompt for AI Agents
In src/libs/network/routines/transactions/handleWeb2ProxyRequest.ts around lines
61-71: the current URL validation only blocks parse-time localhost/loopback
which leaves SSRF via DNS and redirects possible; update the Proxy HTTP client
to use a custom DNS lookup/agent that resolves hostnames before connecting and
rejects/resolves-to addresses in private/link-local/reserved ranges (e.g.,
10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16, 127.0.0.0/8, 169.254.0.0/16,
100.64.0.0/10, 192.0.0.0/24, 198.18.0.0/15, fc00::/7, fe80::/10, IPv4-mapped
IPv6 ranges), add the same check for every redirect hop before following it, and
fail the request if a resolved IP is disallowed; implement this by injecting a
guarded lookup function into the HTTP/HTTPS agent used by the proxy, optionally
replace the blacklist with an allowlist of public CIDRs if policy permits, and
ensure errors include clear status/messages so createRPCResponse can return
appropriate error codes.
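The range checks this prompt asks for can be sketched as a small pure helper that a guarded agent lookup would call for each resolved address. This is an illustrative sketch only: the name isDisallowedAddress and the exact range list are assumptions, not the code merged into Proxy.ts.

```typescript
import { isIP } from "node:net"

// Hypothetical disallowed-address preflight check (sketch, not merged code)
function isDisallowedAddress(addr: string): boolean {
    // Unwrap IPv4-mapped IPv6 (e.g. ::ffff:127.0.0.1) to its IPv4 form
    const mapped = addr.toLowerCase().match(/^::ffff:(\d{1,3}(?:\.\d{1,3}){3})$/)
    const candidate = mapped ? mapped[1] : addr
    if (isIP(candidate) === 4) {
        const [a, b] = candidate.split(".").map(Number)
        return (
            a === 10 ||                          // 10.0.0.0/8
            a === 127 ||                         // 127.0.0.0/8 loopback
            (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
            (a === 192 && b === 168) ||          // 192.168.0.0/16
            (a === 169 && b === 254) ||          // 169.254.0.0/16 link-local
            (a === 100 && b >= 64 && b <= 127)   // 100.64.0.0/10 CGNAT
        )
    }
    if (isIP(candidate) === 6) {
        const h = candidate.toLowerCase()
        return (
            h === "::1" ||                              // loopback
            h.startsWith("fc") || h.startsWith("fd") || // fc00::/7 ULA
            /^fe[89ab]/.test(h)                         // fe80::/10 link-local
        )
    }
    // Not an IP literal: resolve via DNS first, then re-check each address
    return true
}
```

A guarded agent would run this over every address returned by dns.lookup before connecting, and again on each redirect hop, to close the TOCTOU/rebinding gap described above.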

…ivate, link-local, and loopback addresses; update dependencies

coderabbitai bot left a comment


Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/features/web2/proxy/Proxy.ts (1)

250-266: Mitigate DNS rebinding by enforcing a safe lookup in the upstream connection.

You validate via preflight, but the actual http(s).request in http-proxy will resolve again, enabling TOCTOU DNS rebinding. Provide an Agent with a guarded lookup that rejects disallowed IPs.

             const proxyServer = httpProxy.createProxyServer({
                 target: targetUrl,
                 changeOrigin: true,
                 secure:
                     targetProtocol === "https:"
                         ? this._sslConfig.verifyCertificates
                         : false,
                 ssl:
                     targetProtocol === "https:"
                         ? {
                               rejectUnauthorized:
                                   this._sslConfig.verifyCertificates,
                           }
                         : undefined,
+                // Use a guarded lookup to ensure the connected IP remains allowed
+                agent: targetProtocol === "https:"
+                    ? new https.Agent({
+                          keepAlive: true,
+                          rejectUnauthorized: this._sslConfig.verifyCertificates,
+                          lookup: (hostname, options, cb) => {
+                              dns.lookup(hostname, options as any, (err: any, address: any, family: any) => {
+                                  if (err) return cb(err, address, family)
+                                  if (Array.isArray(address)) {
+                                      const allowed = address.find(a => !isDisallowedAddress(a.address))
+                                      return allowed
+                                          ? cb(null, allowed.address, allowed.family)
+                                          : cb(new Error("Resolved to disallowed address"), undefined as any, undefined as any)
+                                  }
+                                  return isDisallowedAddress(String(address))
+                                      ? cb(new Error("Resolved to disallowed address"), undefined as any, undefined as any)
+                                      : cb(null, address, family)
+                              })
+                          },
+                      })
+                    : new http.Agent({
+                          keepAlive: true,
+                          lookup: (hostname, options, cb) => {
+                              dns.lookup(hostname, options as any, (err: any, address: any, family: any) => {
+                                  if (err) return cb(err, address, family)
+                                  if (Array.isArray(address)) {
+                                      const allowed = address.find(a => !isDisallowedAddress(a.address))
+                                      return allowed
+                                          ? cb(null, allowed.address, allowed.family)
+                                          : cb(new Error("Resolved to disallowed address"), undefined as any, undefined as any)
+                                  }
+                                  return isDisallowedAddress(String(address))
+                                      ? cb(new Error("Resolved to disallowed address"), undefined as any, undefined as any)
+                                      : cb(null, address, family)
+                              })
+                          },
+                      }),
             })

This binds the connection to an IP vetted by your policy and prevents rebinding between preflight and connect.

Also applies to: 342-349

🧹 Nitpick comments (4)
src/features/web2/proxy/Proxy.ts (2)

223-248: Don’t reject the already-resolved server Promise from inside preflight; add a DNS timeout.

Calling reject(e) here targets the outer createNewServer promise, which is already settled by request time—this is a no-op and confusing. Also, dns.lookup can hang; add a short timeout.

-            const preflight = async () => {
+            const DNS_TIMEOUT_MS = 2000
+            const preflight = async () => {
                 try {
                     // If hostname is already an IP, just check it; otherwise resolve all
                     const ipVersion = net.isIP(targetHostname)
                     if (ipVersion) {
                         if (isDisallowedAddress(targetHostname)) {
                             throw new Error(
                                 "Target resolves to a private/link-local/loopback address",
                             )
                         }
                     } else {
-                        const answers = await dns.lookup(targetHostname, {
-                            all: true,
-                        })
+                        const answers = await Promise.race([
+                            dns.lookup(targetHostname, { all: true }),
+                            new Promise<never>((_, rej) =>
+                                setTimeout(() => rej(new Error("DNS lookup timeout")), DNS_TIMEOUT_MS),
+                            ),
+                        ])
                         if (answers.some(a => isDisallowedAddress(a.address))) {
                             throw new Error(
                                 "Target resolves to a private/link-local/loopback address",
                             )
                         }
                     }
                 } catch (e) {
-                    reject(e)
                     return false
                 }
                 return true
             }
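The timeout pattern in the diff above can be factored into a small reusable wrapper. This is a sketch under the same 2000 ms default as the proposed `DNS_TIMEOUT_MS`; the helper name is an assumption:

```typescript
import dns from "node:dns/promises"

// Resolve all addresses for a hostname, failing fast if DNS hangs.
async function lookupAllWithTimeout(
    hostname: string,
    timeoutMs = 2000,
): Promise<{ address: string; family: number }[]> {
    let timer: NodeJS.Timeout | undefined
    const timeout = new Promise<never>((_, rej) => {
        timer = setTimeout(
            () => rej(new Error("DNS lookup timeout")),
            timeoutMs,
        )
    })
    try {
        return await Promise.race([
            dns.lookup(hostname, { all: true }),
            timeout,
        ])
    } finally {
        // Always clear the timer so it cannot keep the process alive.
        clearTimeout(timer)
    }
}
```

Returning `false` from the preflight (rather than rejecting the already-settled server promise) then composes naturally with this helper: catch its rejection, log it, and refuse the request.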

320-334: Per-request preflight is good; consider returning a JSON error body for clients.

Small DX improvement: include a machine-readable error.

-                if (!ok) {
-                    res.writeHead(400)
-                    res.end("Invalid target host")
+                if (!ok) {
+                    res.writeHead(400, { "Content-Type": "application/json" })
+                    res.end(JSON.stringify({ error: "Invalid target host" }))
                     return
                 }
src/features/web2/validator.ts (2)

54-64: Tighten localhost checks to catch trailing-dot variants.

localhost. and *.localhost. should also be rejected.

-        if (hostLower === "localhost" || hostLower.endsWith(".localhost")) {
+        if (
+            hostLower === "localhost" ||
+            hostLower === "localhost." ||
+            hostLower.endsWith(".localhost") ||
+            hostLower.endsWith(".localhost.")
+        ) {
             return {
                 ok: false,
                 status: 400,
                 message: "Localhost targets are not allowed",
             }
         }
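Rather than enumerating every dotted variant, the same effect can be had by normalizing the hostname first. A hypothetical sketch (the name `isLocalhostName` is an assumption):

```typescript
// Treat "localhost", "localhost.", "api.localhost", and
// "api.localhost." (and multi-dot variants) uniformly by stripping
// trailing root-label dots before comparing.
function isLocalhostName(hostname: string): boolean {
    const h = hostname.toLowerCase().replace(/\.+$/, "")
    return h === "localhost" || h.endsWith(".localhost")
}
```

Normalizing once also catches inputs with several trailing dots, which the enumerated-variant check would miss.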

17-24: Avoid drift: share the address-blocking helper with Proxy.

validator.ts and Proxy.ts now implement overlapping blocklists. Extract a small shared helper (e.g., src/features/web2/net/isDisallowedAddress.ts) and use it in both places to keep behavior consistent.

Also applies to: 103-121

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

💡 Knowledge Base configuration:

  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between bb54060 and 4471fa9.

📒 Files selected for processing (3)
  • package.json (1 hunks)
  • src/features/web2/proxy/Proxy.ts (3 hunks)
  • src/features/web2/validator.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • package.json
🔇 Additional comments (1)
src/features/web2/proxy/Proxy.ts (1)

6-6: LGTM: importing node:dns/promises for preflight is appropriate.

…l IPv6 multicast and unspecified addresses; enhance disallowed address checks for IPv4
@massouji82 massouji82 merged commit 4ee56f9 into testnet Sep 7, 2025
1 check passed
@coderabbitai coderabbitai bot mentioned this pull request Sep 14, 2025
@massouji82 massouji82 deleted the add-DAHR-url-validation branch October 2, 2025 16:06
@coderabbitai coderabbitai bot mentioned this pull request Nov 4, 2025
