Conversation
Walkthrough

This update introduces a major overhaul of the documentation system to support LLM-optimized, consolidated single-file MDX docs for HyperIndex and HyperSync. It adds new supported network documentation, updates network listings and tiers, implements a Node.js consolidation script, reworks the landing page UI, and configures Docusaurus for the new doc structure.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant NodeScript as consolidate-hyperindex-docs.js
    participant FileSystem
    participant Docusaurus
    User->>NodeScript: Run "yarn consolidate-docs"
    NodeScript->>FileSystem: Scan docs/HyperIndex/* and docs/HyperSync/*
    NodeScript->>FileSystem: Read .md/.mdx files
    NodeScript->>NodeScript: Clean & sanitize content
    NodeScript->>FileSystem: Write consolidated MDX files
    User->>Docusaurus: Run "yarn build-llm" or "yarn start-llm"
    Docusaurus->>FileSystem: Load LLM MDX docs and config
    Docusaurus->>User: Serve LLM-optimized documentation site
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~40 minutes
Actionable comments posted: 9
🧹 Nitpick comments (13)
sidebarsHyperSync.js (1)
32-36: Hard-coded absolute path may break with non-root baseUrl

Using `href: "/docs/HyperSync-LLM/hypersync-complete"` ties the link to the site root (`/`). If `baseUrl` in `docusaurus.config.js` changes (e.g., the docs are hosted under `/documentation/`), this link will 404. Consider building the URL at runtime:

```diff
- href: "/docs/HyperSync-LLM/hypersync-complete",
+ href: `${require("@docusaurus/core/lib/paths").DEFAULT_CONFIG.baseUrl || "/"}docs/HyperSync-LLM/hypersync-complete`,
```

Or convert this entry to a `"doc"` type inside the HyperSync-LLM plugin and reference the doc id instead of a fixed path.

sidebarsHyperIndex.js (1)
144-148: Prefer `type: "doc"` for internal docs to avoid baseUrl issues

Using `type: "link"` with a hard-coded `/docs/...` path bypasses Docusaurus' baseUrl/localisation handling and can break in deployments served from a sub-path (e.g. `/envio-docs/`). Switch to a `doc` reference so Docusaurus resolves the correct URL in every environment.

```diff
- {
-   type: "link",
-   label: "LLM Documentation",
-   href: "/docs/HyperIndex-LLM/hyperindex-complete",
- },
+ {
+   type: "doc",
+   id: "HyperIndex-LLM/hyperindex-complete",
+   label: "LLM Documentation",
+ },
```

docs/HyperIndex/migration-guide.md (2)
16-16: Use relative link instead of absolute `/docs/...`

Most links in this file are relative (`../HyperIndex/...`). Keep that pattern so the docs resolve under any `baseUrl`.

```diff
-(don't forget to use our [ai friendly docs](/docs/HyperIndex-LLM/hyperindex-complete)).
+(don't forget to use our [AI-friendly docs](HyperIndex-LLM/hyperindex-complete)).
```
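As an aside, the baseUrl-aware path join these comments keep circling can be captured in a tiny helper. This is a hypothetical sketch, not part of the PR and not a Docusaurus API:

```javascript
// Hypothetical helper: prefix a docs path with the configured baseUrl,
// avoiding doubled or missing slashes.
function withBaseUrl(baseUrl, docPath) {
  return `${baseUrl.replace(/\/+$/, "")}/${docPath.replace(/^\/+/, "")}`;
}

// withBaseUrl("/", "/docs/x") and withBaseUrl("/envio-docs/", "docs/x")
// both yield a single-slash path under the respective root.
```

In real Docusaurus code the `useBaseUrl` hook or a `doc`-type sidebar entry does this for you; the helper only illustrates the slash bookkeeping the review is warning about.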
137-137: Tone: replace weak intensifier “quite simple”

```diff
-The HyperIndex syntax is usually in typescript. Since assemblyscript is a subset of typescript, it's quite simple to copy and paste the code, especially so for pure functions.
+The HyperIndex syntax is usually in TypeScript. Because AssemblyScript is a subset of TypeScript, copying code—especially pure functions—is straightforward.
```

docs/HyperSync/HyperRPC/hyperrpc-url-endpoints.md (1)
96-97: Taraxa & Tangle rows – missing trace column

For new rows, explicitly set the “Supports Traces” column (✔️ or blank) to avoid ambiguity for users.
50-51: Alphabetical order drift
`Chainweb Testnet 20/21` are inserted above `Chiliz`, which breaks the existing alphabetical ordering by network name. Consider re-ordering to keep the table easy to scan.
89-95: Scroll tier changed to 🪨 – flag downstream pricing docs

Changing the tier impacts rate limits and pricing. Ensure the hosted-service billing docs and marketing pages are updated in tandem.
scripts/consolidate-hyperindex-docs.js (2)
117-120: Consider preserving code blocks selectively.

The current implementation removes ALL code blocks, which might eliminate valuable documentation content. Consider preserving certain types of code blocks (like configuration examples) while only removing problematic ones.

````diff
-  // Remove any remaining problematic syntax
-  content = content.replace(/```[^`]*```/g, "");
-
-  // Remove any remaining code blocks that might cause issues
-  content = content.replace(/```[\s\S]*?```/g, "");
+  // Remove specific problematic code blocks while preserving others
+  // Only remove code blocks that contain known problematic patterns
+  content = content.replace(/```(?:jsx?|tsx?|html|css)[^`]*```/g, "");
````
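One way to sanity-check a selective filter like the suggested one. The helper below is hypothetical and uses a close variant of the suggested regex (anchored on the language tag plus a newline) so that, e.g., yaml configuration examples survive:

```javascript
// Hypothetical: drop only fenced blocks in "problematic" languages,
// keeping other fences (e.g. yaml configuration examples) intact.
function stripProblematicFences(content) {
  return content.replace(/```(?:jsx?|tsx?|html|css)\n[\s\S]*?```/g, "");
}
```

Feeding it a document with a yaml fence and a jsx fence should remove only the latter.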
195-256: Consider refactoring to reduce code duplication.

This function is nearly identical to `consolidateHyperIndexDocs`. While it works correctly, consider extracting a generic consolidation function to reduce duplication.

```diff
+function consolidateDocs(sourceDir, outputFile, id, title) {
+  // Create output directory if it doesn't exist
+  const outputDir = path.dirname(outputFile);
+  if (!fs.existsSync(outputDir)) {
+    fs.mkdirSync(outputDir, { recursive: true });
+  }
+
+  // Find all markdown files
+  const markdownFiles = findMarkdownFiles(sourceDir);
+  console.log(`Found ${markdownFiles.length} markdown files to consolidate`);
+
+  let consolidatedContent = `---
+id: ${id}
+title: ${title}
+sidebar_label: ${title}
+slug: /${id}
+---
+
+# ${title}
+
+This document contains all ${title.split(' ')[0]} documentation consolidated into a single file for LLM consumption.
+
+---
+
+`;
+  // ... rest of processing logic
+}

 function consolidateHyperSyncDocs() {
-  const hyperSyncDir = path.join(__dirname, "../docs/HyperSync");
-  const outputFile = path.join(
-    __dirname,
-    "../docs/HyperSync-LLM/hypersync-complete.mdx"
-  );
-  // ... rest of the function
+  consolidateDocs(
+    path.join(__dirname, "../docs/HyperSync"),
+    path.join(__dirname, "../docs/HyperSync-LLM/hypersync-complete.mdx"),
+    "hypersync-complete",
+    "HyperSync Complete Documentation"
+  );
 }
```

LLM_DOCS_README.md (1)
17-19: Script name no longer reflects its broader scope

`scripts/consolidate-hyperindex-docs.js` now consolidates both HyperIndex and HyperSync docs, yet the file name still implies it is HyperIndex-only. This mis-naming will mislead contributors scanning the scripts folder. Consider renaming the file (and updating `package.json` references) to something neutral, e.g. `consolidate-docs.js`.

src/pages/index.module.css (2)
51-55: Add `-webkit-backdrop-filter` for Safari support

Safari (including iOS) still requires the vendor-prefixed property for `backdrop-filter`; without it, translucent cards fall back to a fully opaque background and break the glass-morphism effect.

```diff
 backdrop-filter: blur(10px);
+-webkit-backdrop-filter: blur(10px);
```

Repeat for the `.docLink` class.

Also applies to: 96-97
57-61: Respect reduced-motion user preference

The hover `transform: translateY(-2px)` (and similar) produces motion that may trigger vestibular issues. Consider guarding these transitions with a media query:

```css
@media (prefers-reduced-motion: no-preference) {
  .docSection:hover {
    transform: translateY(-2px);
  }
}
```
60-60: Typo: “furhter” → “further”

Minor spelling mistake in “Client examples are listed furhter below.”
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (23)
- LLM_DOCS_README.md (1 hunks)
- docs/HyperIndex/migration-guide.md (6 hunks)
- docs/HyperIndex/supported-networks/aurora-turbo.md (1 hunks)
- docs/HyperIndex/supported-networks/chainweb-testnet-20.md (1 hunks)
- docs/HyperIndex/supported-networks/chainweb-testnet-21.md (1 hunks)
- docs/HyperIndex/supported-networks/plume.md (1 hunks)
- docs/HyperIndex/supported-networks/scroll.md (1 hunks)
- docs/HyperIndex/supported-networks/tangle.md (1 hunks)
- docs/HyperIndex/supported-networks/taraxa.md (1 hunks)
- docs/HyperSync-LLM/hypersync-complete.mdx (1 hunks)
- docs/HyperSync/HyperRPC/hyperrpc-url-endpoints.md (4 hunks)
- docs/HyperSync/hypersync-supported-networks.md (4 hunks)
- docusaurus.config.js (2 hunks)
- docusaurus.config.llm.js (1 hunks)
- package.json (1 hunks)
- scripts/consolidate-hyperindex-docs.js (1 hunks)
- sidebarsHyperIndex.js (1 hunks)
- sidebarsHyperIndexLLM.js (1 hunks)
- sidebarsHyperSync.js (1 hunks)
- sidebarsHyperSyncLLM.js (1 hunks)
- src/pages/index.js (1 hunks)
- src/pages/index.module.css (1 hunks)
- supported-networks.json (1 hunks)
🧰 Additional context used
🪛 LanguageTool
docs/HyperIndex/migration-guide.md
[style] ~137-~137: As an alternative to the over-used intensifier ‘quite’, consider replacing this phrase.
Context: ...yscript is a subset of typescript, it's quite simple to copy and paste the code, especially ...
(EN_WEAK_ADJECTIVE)
🪛 Gitleaks (8.27.2)
docusaurus.config.llm.js
80-80: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
🪛 markdownlint-cli2 (0.17.2)
docs/HyperSync/hypersync-supported-networks.md

Lines 38, 50, 51, 89, 90, 91, 92, 93, 94, 95, 104, 105 (two findings per line): Bare URL used (MD034, no-bare-urls)
🔇 Additional comments (34)
docs/HyperIndex/supported-networks/scroll.md (1)
22-22: Tier label updated – verify global consistency

Looks good here, but please double-check that every instance of the Scroll tier (e.g., summary tables, JSON configs, sidebar listings) now reflects “STONE 🪨” to avoid mismatched references.
docs/HyperIndex/supported-networks/tangle.md (1)
22-22: Tier label updated – confirm other docs match

Same note as for Scroll: ensure all Tangle tier mentions (supported-networks tables, JSON, HyperSync docs) were updated to “STONE 🪨” so users never see conflicting tiers.
sidebarsHyperSyncLLM.js (1)
1-3: LGTM – minimal, correct sidebar export

The sidebar file follows the same pattern as existing sidebars and exposes the required entry.
sidebarsHyperIndexLLM.js (1)
1-3: LGTM – minimal, correct sidebar export

Matches the style used for the HyperSync LLM sidebar.
docs/HyperSync/HyperRPC/hyperrpc-url-endpoints.md (2)
30-31: Verify new network entry ordering and ID correctness
`Aurora Turbo` (1313161567) is inserted after `Aurora`. To keep the table alphabetically ordered it should come directly after `Aurora` (fine), but please double-check the chain ID—Aurora Turbo sometimes publishes 1313161583 in upstream specs. Confirm the ID and URL before publishing.
42-44: Chainweb testnet rows: ensure canonical naming

Kadena’s Chainweb test network usually refers to “Chainweb Testnet **pact-20**/pact-21”. Confirm the wording and URLs with upstream docs—the current endpoints will 404 if the hyphenation or chain number is wrong.

docs/HyperSync/hypersync-supported-networks.md (1)
104-105: Taraxa marked 🥉 but HyperRPC lists it as Bronze too – good

No issues here; thanks for keeping tiers consistent across docs.
docs/HyperIndex/supported-networks/taraxa.md (3)
1-6: LGTM: Well-structured frontmatter configuration.

The document frontmatter follows the expected structure with proper id, title, sidebar_label, and slug configuration for Docusaurus integration.
38-53: LGTM: Clear and accurate YAML configuration example.

The YAML configuration example is well-structured and provides clear guidance for users setting up indexers for the Taraxa network. The network ID (841) matches the documented Chain ID consistently.
12-16: Network configuration validated – no changes required
The Taraxa Chain ID (841) and both HyperSync (https://taraxa.hypersync.xyz) and HyperRPC (https://taraxa.rpc.hypersync.xyz) endpoints resolve correctly at the network level, and the Chain ID is consistently referenced across all documentation.

src/pages/index.js (3)
2-5: LGTM: Proper Docusaurus imports and structure.

The imports correctly use Docusaurus components and hooks, following the framework's best practices for creating custom pages.
8-14: LGTM: Clean component initialization and Layout usage.

The component properly utilizes the Docusaurus context hook and Layout component with appropriate title and description configuration.
42-54: LLM Documentation Routes Verified

All referenced documentation files and sidebar configs are present:
- docs/HyperIndex-LLM/hyperindex-complete.mdx
- docs/HyperSync-LLM/hypersync-complete.mdx
- sidebarsHyperIndexLLM.js
- sidebarsHyperSyncLLM.js
No further action needed.
docusaurus.config.llm.js (3)
79-84: API key exposure is acceptable for Algolia search.

The static analysis tool flagged this as a potential security issue, but Algolia search API keys are designed to be publicly exposed in client-side applications. This is the correct way to configure Algolia search in Docusaurus.
127-152: LGTM: Well-configured documentation plugins.

The dual plugin setup for HyperIndex-LLM and HyperSync-LLM is properly configured with correct paths, route base paths, and sidebar references. This supports the new LLM documentation architecture effectively.
20-36: LLM site config intentionally disables blog and docs
- docusaurus.config.llm.js is invoked only by the `build-llm` and `start-llm` scripts in package.json, keeping it separate from the main `docusaurus.config.js`.
- Disabling `docs` and `blog` here aligns with the LLM-only documentation requirements.
- No further action needed.
docs/HyperIndex/supported-networks/plume.md (2)
1-6: LGTM: Consistent frontmatter structure.

The document frontmatter follows the established pattern with proper configuration for Plume network documentation.
20-22: Consistent tier classification.

The "STONE 🪨" tier classification is consistent with the other new network documentation files in this PR.
docs/HyperIndex/supported-networks/chainweb-testnet-21.md (2)
1-6: LGTM: Proper frontmatter configuration.

The frontmatter correctly handles the longer network name while maintaining consistency with the documentation structure.
38-53: LGTM: Accurate YAML configuration.

The YAML configuration correctly uses the network ID (5921) and follows the established template structure for network configuration examples.
scripts/consolidate-hyperindex-docs.js (4)
4-13: LGTM!

The error handling and synchronous file reading approach is appropriate for a build script context.
61-82: LGTM!

The recursive file scanning implementation is efficient and the sorting ensures consistent output.
132-193: LGTM!

The consolidation logic is well-structured with proper directory creation, file processing, and output formatting. The MDX front matter and section organization are appropriate for LLM consumption.
258-282: LGTM!

The command-line interface is well-designed with clear options and sensible defaults. The module exports enable external usage of the consolidation functions.
docusaurus.config.js (2)
379-382: LGTM!

Good formatting consistency improvement by standardizing on double quotes for metadata values.
423-446: LGTM!

The new LLM documentation plugins are properly configured and follow the established patterns. Disabling edit URLs and update times is appropriate for consolidated documentation.
docs/HyperIndex/supported-networks/aurora-turbo.md (4)
1-8: LGTM!

The front matter is properly structured and follows the established conventions for network documentation.
12-17: LGTM!

The network details table is comprehensive and provides both named and chain-ID based endpoints for flexibility.
20-33: LGTM!

The tier classification and overview content are consistent with the established documentation pattern and provide valuable information about Envio's capabilities.
38-62: LGTM!

The YAML configuration example correctly uses the Aurora Turbo chain ID and follows the established structure. The support section provides appropriate contact information.
docs/HyperIndex/supported-networks/chainweb-testnet-20.md (4)
1-17: LGTM!

The front matter and network details table are properly structured and consistent with the established documentation pattern.
20-33: LGTM!

The tier classification and overview content are consistent and properly adapted for Chainweb Testnet 20.
38-58: LGTM!

The YAML configuration example correctly uses the Chainweb Testnet 20 chain ID (5920) and follows the established structure.
59-62: LGTM!

The support section is consistent with other network documentation and provides appropriate contact information.
| **Chainweb Testnet 21 Chain ID** | 5921 |
| **HyperSync URL Endpoint** | [https://chainweb-testnet-21.hypersync.xyz](https://chainweb-testnet-21.hypersync.xyz) or [https://5921.hypersync.xyz](https://5921.hypersync.xyz) |
| **HyperRPC URL Endpoint** | [https://chainweb-testnet-21.rpc.hypersync.xyz](https://chainweb-testnet-21.rpc.hypersync.xyz) or [https://5921.rpc.hypersync.xyz](https://5921.rpc.hypersync.xyz) |
💡 Verification agent
🧩 Analysis chain
Verify Chainweb Testnet 21 configuration.
Please confirm the Chain ID (5921) and endpoint URLs are accurate for Chainweb Testnet 21.
🏁 Script executed:

```bash
#!/bin/bash
# Description: Verify Chainweb Testnet 21 network configuration
# Expected: Chain ID and endpoints should be valid
echo "Checking Chainweb Testnet 21 references:"
rg -A 2 -B 2 "5921|chainweb.*testnet.*21" --type md --ignore-case
echo "Testing Chainweb Testnet 21 endpoint accessibility:"
curl -s -o /dev/null -w "%{http_code}" "https://chainweb-testnet-21.hypersync.xyz" || echo "Chainweb Testnet 21 HyperSync endpoint not accessible"
```

Length of output: 6797
Fix Chainweb Testnet 21 Endpoint URLs
The Chain ID 5921 is correct, but the primary HyperSync endpoint (https://chainweb-testnet-21.hypersync.xyz) returns HTTP 404. Please confirm the live URLs for both HyperSync and HyperRPC and update the docs accordingly.
Affected files/locations:
- docs/HyperIndex/supported-networks/chainweb-testnet-21.md (lines 14–16)
- docs/HyperSync/hypersync-supported-networks.md (Chainweb Testnet 21 row)
- docs/HyperSync/HyperRPC/hyperrpc-url-endpoints.md (Chainweb Testnet 21 row)
You can bulk-verify all four endpoints with:

```bash
#!/bin/bash
endpoints=(
  https://chainweb-testnet-21.hypersync.xyz
  https://5921.hypersync.xyz
  https://chainweb-testnet-21.rpc.hypersync.xyz
  https://5921.rpc.hypersync.xyz
)
for url in "${endpoints[@]}"; do
  printf "%s → " "$url"
  curl -s -o /dev/null -w "%{http_code}\n" "$url"
done
```

Once you've confirmed the correct HTTP 200 endpoints, update each file so that users can successfully connect to Chainweb Testnet 21.
I'm not sure what Chainweb testnet is but might be good to remove
🧩 Analysis chain
@DenhamPreen Let me check the status of both Chainweb testnets to see if they should be removed.
🏁 Script executed:

```bash
#!/bin/bash
echo "Checking status of Chainweb testnet endpoints:"
echo ""
# Test Chainweb Testnet 20 endpoints
echo "=== Chainweb Testnet 20 (ID: 5920) ==="
endpoints_20=(
  https://chainweb-testnet-20.hypersync.xyz
  https://5920.hypersync.xyz
  https://chainweb-testnet-20.rpc.hypersync.xyz
  https://5920.rpc.hypersync.xyz
)
for url in "${endpoints_20[@]}"; do
  printf "%-50s → " "$url"
  status=$(curl -s -o /dev/null -w "%{http_code}" "$url" --connect-timeout 10 --max-time 15)
  echo "$status"
done
echo ""
echo "=== Chainweb Testnet 21 (ID: 5921) ==="
endpoints_21=(
  https://chainweb-testnet-21.hypersync.xyz
  https://5921.hypersync.xyz
  https://chainweb-testnet-21.rpc.hypersync.xyz
  https://5921.rpc.hypersync.xyz
)
for url in "${endpoints_21[@]}"; do
  printf "%-50s → " "$url"
  status=$(curl -s -o /dev/null -w "%{http_code}" "$url" --connect-timeout 10 --max-time 15)
  echo "$status"
done
echo ""
echo "Checking for any other references to Chainweb testnets:"
rg -i "chainweb.*testnet" --type md -A 1 -B 1
```

Length of output: 11503
@DenhamPreen You're absolutely right - both Chainweb testnets should be removed. The endpoint testing shows:
Chainweb Testnet 20 & 21 Status:
- ❌ HyperSync endpoints return HTTP 404 (not found)
- ⚠️ RPC endpoints return HTTP 405 (method not allowed, but this might be normal for RPC)
Since the core HyperSync endpoints don't exist, these networks aren't actually supported. Here are all the files that need to be cleaned up:
Files to remove:
- docs/HyperIndex/supported-networks/chainweb-testnet-20.md
- docs/HyperIndex/supported-networks/chainweb-testnet-21.md
Files to update (remove Chainweb testnet entries):
- docs/HyperSync/hypersync-supported-networks.md
- docs/HyperSync/HyperRPC/hyperrpc-url-endpoints.md
- supported-networks.json
Would you like me to help create an issue to track the removal of these non-functional network documentations?
✏️ Learnings added
Learnt from: DenhamPreen
PR: enviodev/docs#673
File: docs/HyperIndex/supported-networks/chainweb-testnet-21.md:14-16
Timestamp: 2025-07-29T14:30:03.631Z
Learning: Chainweb Testnet 20 (ID: 5920) and Chainweb Testnet 21 (ID: 5921) have non-functional HyperSync endpoints that return HTTP 404, indicating these networks are not properly supported and should be removed from documentation.
| **Plume Chain ID** | 98866 |
| **HyperSync URL Endpoint** | [https://plume.hypersync.xyz](https://plume.hypersync.xyz) or [https://98866.hypersync.xyz](https://98866.hypersync.xyz) |
| **HyperRPC URL Endpoint** | [https://plume.rpc.hypersync.xyz](https://plume.rpc.hypersync.xyz) or [https://98866.rpc.hypersync.xyz](https://98866.rpc.hypersync.xyz) |
💡 Verification agent
🧩 Analysis chain
Verify Plume network configuration.
Please confirm the Chain ID (98866) and endpoint URLs are correct for the Plume network.
🏁 Script executed:

```bash
#!/bin/bash
# Description: Verify Plume network configuration accuracy
# Expected: Chain ID and endpoints should be valid for Plume network
echo "Checking Plume network references across documentation:"
rg -A 2 -B 2 "98866|plume" --type md --ignore-case
echo "Testing Plume endpoint accessibility:"
curl -s -o /dev/null -w "%{http_code}" "https://plume.hypersync.xyz" || echo "Plume HyperSync endpoint not accessible"
curl -s -o /dev/null -w "%{http_code}" "https://plume.rpc.hypersync.xyz" || echo "Plume HyperRPC endpoint not accessible"
```

Length of output: 6152
Plume endpoint URLs are unreachable
The Chain ID (98866) is consistent across documentation, but both listed endpoints return errors (HTTP 404 for HyperSync and HTTP 405 for HyperRPC). Please confirm and update the correct URLs.
• docs/HyperIndex/supported-networks/plume.md (lines 14–16)
• docs/HyperSync/hypersync-supported-networks.md (Plume entry)
• docs/HyperSync/HyperRPC/hyperrpc-url-endpoints.md (Plume entry)
Current snippet:
| **Plume Chain ID** | 98866 |
| **HyperSync URL Endpoint** | https://plume.hypersync.xyz or https://98866.hypersync.xyz ← returns HTTP 404 |
| **HyperRPC URL Endpoint** | https://plume.rpc.hypersync.xyz or https://98866.rpc.hypersync.xyz ← returns HTTP 405 |
```
# ReceiptSelection
The query takes an array of ReceiptSelection objects and returns receipts that match any of the selections. All fields are optional.
Below is an exhaustive list of all fields in a ReceiptSelection JSON object. Reference the [Fuel docs on receipts for field explanations.
```
Broken Markdown link – missing closing bracket
Reference the [Fuel docs on receipts for field explanations. lacks a terminating ], which will fail MDX compilation.
```diff
-Reference the [Fuel docs on receipts for field explanations.
+Reference the [Fuel docs on receipts](https://github.com/FuelLabs/fuel-specs/blob/master/src/...) for field explanations.
```
| **Chain Data** | `eth_chainId``eth_blockNumber` |
| **Block Data** | `eth_getBlockByNumber``eth_getBlockByHash``eth_getBlockReceipts` |
| **Transaction Data** | `eth_getTransactionByHash``eth_getTransactionByBlockHashAndIndex``eth_getTransactionByBlockNumberAndIndex``eth_getTransactionReceipt` |
🛠️ Refactor suggestion
Method tables rendered incorrectly – back-ticks jammed together
The inline code blocks have no separators, e.g. `eth_chainId``eth_blockNumber`. MDX treats this as one long literal and breaks table formatting.
| **Chain Data** | `eth_chainId`, `eth_blockNumber` |
| **Block Data** | `eth_getBlockByNumber`, `eth_getBlockByHash`, `eth_getBlockReceipts` |
| **Transaction Data** | `eth_getTransactionByHash`, `eth_getTransactionByBlockHashAndIndex`, `eth_getTransactionByBlockNumberAndIndex`, `eth_getTransactionReceipt` |
```bash
# Build the LLM documentation
yarn build-llm

# Start the LLM documentation server
yarn start-llm

# Consolidate documentation files
yarn consolidate-docs
```

- In Vercel, go to Deployments
- Create a new deployment
- Select your repository and branch
- Choose the environment where `DOCS_FOR_LLM=true`
- Deploy

3. **Alternative: Create a Git Branch**:
- Create a branch named `llm-docs` in your repository
- Configure Vercel to use the `DOCS_FOR_LLM=true` environment variable for this branch
- Vercel will automatically deploy with the LLM-friendly configuration
Update usage commands to match the repo-wide switch to pnpm
The PR description mentions migration to pnpm/pnpx, yet the README still instructs users to run yarn build-llm, yarn start-llm, and yarn consolidate-docs. This will confuse newcomers and break copy-paste setups in environments where Yarn is no longer installed.
```diff
-# Build the LLM documentation
-yarn build-llm
+# Build the LLM documentation
+pnpm build-llm
...
-# Start the LLM documentation server
-yarn start-llm
+# Start the LLM documentation server
+pnpm start-llm
...
-# Consolidate documentation files
-yarn consolidate-docs
+# Consolidate documentation files
+pnpm consolidate-docs
```
The docs don't work with pnpm actually
@DenhamPreen Thanks for the clarification! I understand now that despite the broader repo migration mentioned in the PR description, the documentation build system specifically still requires Yarn to function properly. I'll keep the commands as yarn in the README to ensure they work correctly.
🐰💡
✏️ Learnings added
Learnt from: DenhamPreen
PR: enviodev/docs#673
File: LLM_DOCS_README.md:24-32
Timestamp: 2025-07-29T14:32:31.621Z
Learning: The Envio documentation build system requires Yarn and does not work with pnpm, despite other parts of the repository potentially using pnpm/pnpx. The LLM documentation commands (yarn build-llm, yarn start-llm, yarn consolidate-docs) must remain using yarn to function properly.
Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure review instructions. You can manage existing learnings in the app.
"preinstall": "npx only-allow yarn",
"consolidate-docs": "node ./scripts/consolidate-hyperindex-docs.js --all",
"build-llm": "yarn consolidate-docs && npx docusaurus build --config docusaurus.config.llm.js",
"start-llm": "yarn consolidate-docs && npx docusaurus start --config docusaurus.config.llm.js"
preinstall blocks pnpm / npm – contradicts migration goal
"preinstall": "npx only-allow yarn" forces contributors to use Yarn, yet the PR description mentions migrating to pnpm/pnpx.
This hard-block will fail CI runs or developer setups that rely on pnpm/npm.
- "preinstall": "npx only-allow yarn",
+ // Consider dropping the restriction or switching to pnpm-preferred tooling
+ // "preinstall": "npx only-allow pnpm",

Either remove the guard or align it with the package-manager policy communicated in the docs.
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In package.json lines 22 to 25, the "preinstall" script uses "npx only-allow
yarn" which blocks usage of pnpm and npm, contradicting the migration goal to
pnpm/pnpx. To fix this, remove or update the "preinstall" script to allow the
intended package manager(s) per the project's policy, ensuring it does not block
pnpm or npm usage during installs or CI runs.
@DenhamPreen To resolve this contradiction, you have a few options:
Option 1: Switch to pnpm enforcement
"preinstall": "npx only-allow pnpm"

Option 2: Remove the restriction entirely

// Remove the "preinstall" line completely

Option 3: Allow multiple package managers

"preinstall": "npx only-allow pnpm npm"

Since your PR description mentions migrating to pnpm/pnpx, I'd recommend Option 1 to align with your stated goals. This will ensure consistency across the project and prevent yarn from being used accidentally.
Which approach would you prefer for your migration strategy?
"consolidate-docs": "node ./scripts/consolidate-hyperindex-docs.js --all",
"build-llm": "yarn consolidate-docs && npx docusaurus build --config docusaurus.config.llm.js",
"start-llm": "yarn consolidate-docs && npx docusaurus start --config docusaurus.config.llm.js"
🛠️ Refactor suggestion
Script naming: stick to one package-manager
build-llm and start-llm invoke yarn consolidate-docs but elsewhere the project is moving to pnpm. Using mixed tooling confuses contributors.
Prefer pnpm (or generic npm run) consistently:
- "build-llm": "yarn consolidate-docs && npx docusaurus build --config docusaurus.config.llm.js",
- "start-llm": "yarn consolidate-docs && npx docusaurus start --config docusaurus.config.llm.js"
+ "build-llm": "pnpm run consolidate-docs && pnpm exec docusaurus build --config docusaurus.config.llm.js",
+ "start-llm": "pnpm run consolidate-docs && pnpm exec docusaurus start --config docusaurus.config.llm.js"
🤖 Prompt for AI Agents
In package.json lines 23 to 25, the scripts `build-llm` and `start-llm` use
`yarn consolidate-docs` which conflicts with the project's move to pnpm. Replace
`yarn consolidate-docs` with `pnpm run consolidate-docs` or `npm run
consolidate-docs` to maintain consistent package manager usage across all
scripts.
@DenhamPreen I understand your frustration! You're absolutely right - based on the learning I have from our previous discussions, the Envio documentation build system requires Yarn and does not work with pnpm, despite other parts of the repository potentially using pnpm/pnpx. The LLM documentation commands (yarn build-llm, yarn start-llm, yarn consolidate-docs) must remain using yarn to function properly.
The current implementation using yarn consolidate-docs in these scripts is correct and should not be changed to pnpm. I apologize for the incorrect suggestion from the automated review.
🐰💭
Sometimes even helpful rabbits need reminders about special requirements!
🧠 Learnings used
Learnt from: DenhamPreen
PR: enviodev/docs#673
File: LLM_DOCS_README.md:24-32
Timestamp: 2025-07-29T14:32:31.621Z
Learning: The Envio documentation build system requires Yarn and does not work with pnpm, despite other parts of the repository potentially using pnpm/pnpx. The LLM documentation commands (yarn build-llm, yarn start-llm, yarn consolidate-docs) must remain using yarn to function properly.
// If no title found, use filename
if (!title && frontMatterEnded && line.trim()) {
  title = path
    .basename(filePath, path.extname(filePath))
    .replace(/-/g, " ")
    .replace(/\b\w/g, (l) => l.toUpperCase());
}
🛠️ Refactor suggestion
Fix title extraction logic.
The condition !title && frontMatterEnded && line.trim() may not work as intended because frontMatterEnded might still be false when processing the first content line. This could result in missing title extraction from filenames.
Consider this approach:
- // If no title found, use filename
- if (!title && frontMatterEnded && line.trim()) {
- title = path
- .basename(filePath, path.extname(filePath))
- .replace(/-/g, " ")
- .replace(/\b\w/g, (l) => l.toUpperCase());
- }
+ // If no title found and we're past front matter, use filename
+ if (!title && !inFrontMatter && line.trim()) {
+ title = path
+ .basename(filePath, path.extname(filePath))
+ .replace(/-/g, " ")
+ .replace(/\b\w/g, (l) => l.toUpperCase());
+ }Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In scripts/consolidate-hyperindex-docs.js around lines 47 to 53, the current
condition for setting the title uses frontMatterEnded which might still be false
when processing the first content line, causing the title extraction from the
filename to be skipped. Modify the condition to ensure the title is set from the
filename if no title is found after the front matter ends or if frontMatterEnded
is not yet true but the line is not empty. This ensures the title extraction
logic reliably falls back to the filename when needed.
| "supported-networks/local-anvil", | ||
| "supported-networks/local-hardhat", | ||
| "supported-networks/0g-newton-testnet","supported-networks/abstract","supported-networks/aleph-zero-evm","supported-networks/altlayer-op-demo-testnet","supported-networks/ancient8","supported-networks/arbitrum","supported-networks/arbitrum-blueberry","supported-networks/arbitrum-nova","supported-networks/arbitrum-sepolia","supported-networks/artela-testnet","supported-networks/arthera-mainnet","supported-networks/asset-chain-mainnet","supported-networks/astar-zkevm","supported-networks/astar-zkyoto","supported-networks/aurora","supported-networks/avalanche","supported-networks/b2-hub-testnet","supported-networks/b3","supported-networks/b3-sepolia-testnet","supported-networks/base","supported-networks/base-sepolia","supported-networks/beam","supported-networks/berachain","supported-networks/berachain-artio-testnet","supported-networks/berachain-bartio","supported-networks/bevm-mainnet","supported-networks/bevm-testnet","supported-networks/bitfinity-mainnet","supported-networks/bitfinity-testnet","supported-networks/bitgert-mainnet","supported-networks/bitlayer","supported-networks/blast","supported-networks/blast-sepolia","supported-networks/bob-mainnet","supported-networks/boba","supported-networks/boba-bnb-mainnet","supported-networks/botanix-testnet","supported-networks/bsc","supported-networks/bsc-testnet","supported-networks/canto","supported-networks/canto-testnet","supported-networks/celo","supported-networks/celo-alfajores-testnet","supported-networks/chiliz","supported-networks/chiliz-testnet-spicy","supported-networks/citrea-devnet","supported-networks/citrea-testnet","supported-networks/core","supported-networks/creator-testnet","supported-networks/cronos-zkevm","supported-networks/cronos-zkevm-testnet","supported-networks/crossfi-mainnet","supported-networks/crossfi-mainnet","supported-networks/crossfi-testnet","supported-networks/curtis","supported-networks/cyber","supported-networks/degen-chain","supported-networks/dfk-chain","supported-networks/dogec
hain-mainnet","supported-networks/dogechain-testnet","supported-networks/dos-chain","supported-networks/energy-web","supported-networks/eos","supported-networks/eth","supported-networks/etherlink-testnet","supported-networks/exosama","supported-networks/fantom","supported-networks/fantom-testnet","supported-networks/flare","supported-networks/flare-songbird","supported-networks/flow","supported-networks/flow-testnet","supported-networks/fraxtal","supported-networks/fuel-mainnet","supported-networks/fuel-testnet","supported-networks/fuji","supported-networks/galadriel-devnet","supported-networks/gnosis","supported-networks/gnosis-chiado","supported-networks/gravity-alpha-mainnet","supported-networks/harmony-shard-0","supported-networks/heco-chain","supported-networks/holesky","supported-networks/hyperliquid","supported-networks/immutable-zkevm","supported-networks/immutable-zkevm-testnet","supported-networks/ink","supported-networks/iotex-network","supported-networks/japan-open-chain","supported-networks/kaia","supported-networks/kakarot-starknet-sepolia","supported-networks/kroma","supported-networks/layeredge-testnet","supported-networks/lightlink-pegasus-testnet","supported-networks/lightlink-phoenix","supported-networks/linea","supported-networks/lisk","supported-networks/lukso","supported-networks/lukso-testnet","supported-networks/manta","supported-networks/manta-pacific-sepolia","supported-networks/mantle","supported-networks/megaeth-testnet","supported-networks/merlin","supported-networks/metall2","supported-networks/meter-mainnet","supported-networks/meter-testnet","supported-networks/mev-commit","supported-networks/mint-mainnet","supported-networks/mode","supported-networks/monad-testnet","supported-networks/moonbase-alpha","supported-networks/moonbeam","supported-networks/morph","supported-networks/morph-holesky","supported-networks/nautilus","supported-networks/neo-x-testnet","supported-networks/nibiru-testnet","supported-networks/now-chaint","supported-n
etworks/oasis-emerald","supported-networks/oasis-sapphire","supported-networks/onigiri-subnet","supported-networks/onigiri-test-subnet","supported-networks/ontology-mainnet","supported-networks/ontology-testnet","supported-networks/op-celestia-raspberry","supported-networks/opbnb","supported-networks/optimism","supported-networks/optimism-sepolia","supported-networks/optopia","supported-networks/peaq","supported-networks/pharos-devnet","supported-networks/polygon","supported-networks/polygon-amoy","supported-networks/polygon-zkevm","supported-networks/polygon-zkevm-cardona-testnet","supported-networks/public-goods-network","supported-networks/pulsechain","supported-networks/puppynet-shibarium","supported-networks/ronin","supported-networks/rootstock","supported-networks/saakuru","supported-networks/satoshivm","supported-networks/scroll","supported-networks/scroll-sepolia","supported-networks/sepolia","supported-networks/shibarium","supported-networks/shimmer-evm","supported-networks/skale-europa","supported-networks/soneium","supported-networks/sonic","supported-networks/sophon","supported-networks/sophon-testnet","supported-networks/stratovm-testnet","supported-networks/superseed","supported-networks/superseed-sepolia-testnet","supported-networks/swell","supported-networks/taiko","supported-networks/tanssi-demo","supported-networks/telos-evm-mainnet","supported-networks/telos-evm-testnet","supported-networks/torus-mainnet","supported-networks/torus-testnet","supported-networks/unichain","supported-networks/unichain-sepolia","supported-networks/unicorn-ultra-nebulas-testnet","supported-networks/velas-mainnet","supported-networks/viction","supported-networks/worldchain","supported-networks/x-layer-mainnet","supported-networks/x-layer-testnet","supported-networks/xdc","supported-networks/xdc-testnet","supported-networks/zeta","supported-networks/zeta-testnet","supported-networks/zircuit","supported-networks/zklink-nova-mainnet","supported-networks/zksync","supported-n
etworks/zksync-sepolia-testnet","supported-networks/zora","supported-networks/zora-sepolia"]} No newline at end of file | ||
| "supported-networks/0g-newton-testnet","supported-networks/abstract","supported-networks/aleph-zero-evm","supported-networks/altlayer-op-demo-testnet","supported-networks/ancient8","supported-networks/arbitrum","supported-networks/arbitrum-blueberry","supported-networks/arbitrum-nova","supported-networks/arbitrum-sepolia","supported-networks/artela-testnet","supported-networks/arthera-mainnet","supported-networks/asset-chain-mainnet","supported-networks/astar-zkevm","supported-networks/astar-zkyoto","supported-networks/aurora","supported-networks/aurora-turbo","supported-networks/avalanche","supported-networks/b2-hub-testnet","supported-networks/b3","supported-networks/b3-sepolia-testnet","supported-networks/base","supported-networks/base-sepolia","supported-networks/beam","supported-networks/berachain","supported-networks/berachain-artio-testnet","supported-networks/berachain-bartio","supported-networks/bevm-mainnet","supported-networks/bevm-testnet","supported-networks/bitfinity-mainnet","supported-networks/bitfinity-testnet","supported-networks/bitgert-mainnet","supported-networks/bitlayer","supported-networks/blast","supported-networks/blast-sepolia","supported-networks/bob-mainnet","supported-networks/boba","supported-networks/boba-bnb-mainnet","supported-networks/botanix-testnet","supported-networks/bsc","supported-networks/bsc-testnet","supported-networks/canto","supported-networks/canto-testnet","supported-networks/celo","supported-networks/celo-alfajores-testnet","supported-networks/chainweb-testnet-20","supported-networks/chainweb-testnet-21","supported-networks/chiliz","supported-networks/chiliz-testnet-spicy","supported-networks/citrea-devnet","supported-networks/citrea-testnet","supported-networks/core","supported-networks/creator-testnet","supported-networks/cronos-zkevm","supported-networks/cronos-zkevm-testnet","supported-networks/crossfi-mainnet","supported-networks/crossfi-mainnet","supported-networks/crossfi-testnet","supported-networks/curtis",
"supported-networks/cyber","supported-networks/degen-chain","supported-networks/dfk-chain","supported-networks/dogechain-mainnet","supported-networks/dogechain-testnet","supported-networks/dos-chain","supported-networks/energy-web","supported-networks/eos","supported-networks/eth","supported-networks/etherlink-testnet","supported-networks/exosama","supported-networks/fantom","supported-networks/fantom-testnet","supported-networks/flare","supported-networks/flare-songbird","supported-networks/flow","supported-networks/flow-testnet","supported-networks/fraxtal","supported-networks/fuel-mainnet","supported-networks/fuel-testnet","supported-networks/fuji","supported-networks/galadriel-devnet","supported-networks/gnosis","supported-networks/gnosis-chiado","supported-networks/gravity-alpha-mainnet","supported-networks/harmony-shard-0","supported-networks/heco-chain","supported-networks/holesky","supported-networks/hyperliquid","supported-networks/immutable-zkevm","supported-networks/immutable-zkevm-testnet","supported-networks/ink","supported-networks/iotex-network","supported-networks/japan-open-chain","supported-networks/kaia","supported-networks/kakarot-starknet-sepolia","supported-networks/kroma","supported-networks/layeredge-testnet","supported-networks/lightlink-pegasus-testnet","supported-networks/lightlink-phoenix","supported-networks/linea","supported-networks/lisk","supported-networks/lukso","supported-networks/lukso-testnet","supported-networks/manta","supported-networks/manta-pacific-sepolia","supported-networks/mantle","supported-networks/megaeth-testnet","supported-networks/merlin","supported-networks/metall2","supported-networks/meter-mainnet","supported-networks/meter-testnet","supported-networks/mev-commit","supported-networks/mint-mainnet","supported-networks/mode","supported-networks/monad-testnet","supported-networks/moonbase-alpha","supported-networks/moonbeam","supported-networks/morph","supported-networks/nautilus","supported-networks/neo-x-testnet"
,"supported-networks/nibiru-testnet","supported-networks/now-chaint","supported-networks/oasis-emerald","supported-networks/oasis-sapphire","supported-networks/onigiri-subnet","supported-networks/onigiri-test-subnet","supported-networks/ontology-mainnet","supported-networks/ontology-testnet","supported-networks/op-celestia-raspberry","supported-networks/opbnb","supported-networks/optimism","supported-networks/optimism-sepolia","supported-networks/optopia","supported-networks/peaq","supported-networks/plume","supported-networks/polygon","supported-networks/polygon-amoy","supported-networks/polygon-zkevm","supported-networks/polygon-zkevm-cardona-testnet","supported-networks/public-goods-network","supported-networks/pulsechain","supported-networks/puppynet-shibarium","supported-networks/ronin","supported-networks/rootstock","supported-networks/saakuru","supported-networks/satoshivm","supported-networks/scroll","supported-networks/scroll-sepolia","supported-networks/sepolia","supported-networks/shibarium","supported-networks/shimmer-evm","supported-networks/skale-europa","supported-networks/soneium","supported-networks/sonic","supported-networks/sophon","supported-networks/sophon-testnet","supported-networks/stratovm-testnet","supported-networks/superseed","supported-networks/superseed-sepolia-testnet","supported-networks/swell","supported-networks/taiko","supported-networks/tangle","supported-networks/tanssi-demo","supported-networks/taraxa","supported-networks/telos-evm-mainnet","supported-networks/telos-evm-testnet","supported-networks/torus-mainnet","supported-networks/torus-testnet","supported-networks/unichain","supported-networks/unichain-sepolia","supported-networks/unicorn-ultra-nebulas-testnet","supported-networks/velas-mainnet","supported-networks/viction","supported-networks/worldchain","supported-networks/x-layer-mainnet","supported-networks/x-layer-testnet","supported-networks/xdc","supported-networks/xdc-testnet","supported-networks/zeta","supported-netw
orks/zeta-testnet","supported-networks/zircuit","supported-networks/zklink-nova-mainnet","supported-networks/zksync","supported-networks/zksync-sepolia-testnet","supported-networks/zora","supported-networks/zora-sepolia"]} No newline at end of file |
🛠️ Refactor suggestion
Format JSON for better maintainability.
The single-line format makes this configuration extremely difficult to review and maintain. Consider formatting this JSON properly for better readability and version control.
-{ "supportedNetworks": [
- "supported-networks/any-evm-with-rpc","supported-networks/local-anvil","supported-networks/local-hardhat","supported-networks/0g-newton-testnet","supported-networks/abstract",...]}
+{
+ "supportedNetworks": [
+ "supported-networks/any-evm-with-rpc",
+ "supported-networks/local-anvil",
+ "supported-networks/local-hardhat",
+ "supported-networks/0g-newton-testnet",
+ "supported-networks/abstract",
+ ...
+ ]
+}The network additions (aurora-turbo, chainweb-testnet-20, chainweb-testnet-21, plume, tangle, taraxa) align well with the new documentation files being added.
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In supported-networks.json at line 5, the JSON array is currently formatted as a
single long line, which reduces readability and maintainability. Reformat the
JSON to use multiple lines with each network string on its own line, properly
indented and enclosed within brackets, to improve clarity and ease of review.
Another question: the full list of networks is in here, and 50% of the entire doc is repeated words. This really bloats LLM context. Ideally I think we want the information written just once, with a very simple bullet-point list of all the networks.
Think we can also hoist the LLM documentation almost right to the top of the sidebar. Think it's a really nice addition. We'll also probably want to add a "developing with an LLM" section very soon.
@DenhamPreen Think it's important we figure out the ordering of the LLM docs.
Just checked: the entire thing is 14k tokens, so I don't think trimming the network list down as I suggested is necessarily the biggest initial bang for buck, as it currently fits fine 👍
@DenhamPreen I think an important one: I can't see any code examples at all in the LLM documentation. I think code examples are really, really important for performance. We should consider removing all the JavaScript- and ReScript-related examples to avoid confusion, and maybe that would help the flattening include TypeScript examples?
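If the consolidation script were to keep only TypeScript fences, a helper along these lines could do it — purely a sketch; no such `filterCodeFences` helper exists in the script today:

```javascript
// Keep only fenced code blocks whose language tag is on the allow-list;
// drop javascript/rescript (and untagged) fences entirely.
function filterCodeFences(markdown, allowed = ["typescript", "ts"]) {
  return markdown.replace(/```(\w*)\n[\s\S]*?```/g, (block, lang) =>
    allowed.includes(lang.toLowerCase()) ? block : ""
  );
}
```

An allow-list keeps the decision explicit, so adding (say) `bash` snippets back later is a one-line change.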
Actionable comments posted: 3
♻️ Duplicate comments (3)
scripts/consolidate-hyperindex-docs.js (1)
47-53: Fix title extraction logic.

The condition `!title && frontMatterEnded && line.trim()` may not work as intended because `frontMatterEnded` might still be `false` when processing the first content line. This could result in missing title extraction from filenames.

- // If no title found, use filename
- if (!title && frontMatterEnded && line.trim()) {
+ // If no title found and we're past front matter, use filename
+ if (!title && !inFrontMatter && line.trim()) {
    title = path
      .basename(filePath, path.extname(filePath))
      .replace(/-/g, " ")
      .replace(/\b\w/g, (l) => l.toUpperCase());
  }

docs/HyperSync-LLM/hypersync-complete.mdx (2)
1409-1409: Broken Markdown link – missing closing bracket and URL

The link `Reference the [Fuel docs on receipts for field explanations.` is missing the closing bracket and URL, which will fail MDX compilation.

-Reference the [Fuel docs on receipts for field explanations.
+Reference the [Fuel docs on receipts](https://docs.fuel.network/docs/specs/tx-format/transaction/#receipts) for field explanations.
1312-1314: Method tables rendered incorrectly – inline code blocks need spacing

The inline code blocks in the method tables are jammed together without spaces, causing MDX to treat them as one long literal and breaking table formatting.

-| **Chain Data** | `eth_chainId``eth_blockNumber` |
-| **Block Data** | `eth_getBlockByNumber``eth_getBlockByHash``eth_getBlockReceipts` |
-| **Transaction Data** | `eth_getTransactionByHash``eth_getTransactionByBlockHashAndIndex``eth_getTransactionByBlockNumberAndIndex``eth_getTransactionReceipt` |
+| **Chain Data** | `eth_chainId`, `eth_blockNumber` |
+| **Block Data** | `eth_getBlockByNumber`, `eth_getBlockByHash`, `eth_getBlockReceipts` |
+| **Transaction Data** | `eth_getTransactionByHash`, `eth_getTransactionByBlockHashAndIndex`, `eth_getTransactionByBlockNumberAndIndex`, `eth_getTransactionReceipt` |
🧹 Nitpick comments (2)
scripts/consolidate-hyperindex-docs.js (2)
99-127: Remove redundant image reference replacements.

The function has duplicate patterns for removing image references – first using regex patterns (lines 100-105) and then string replacements (lines 122-127).

  // Remove image references that cause errors - be more aggressive
  content = content.replace(/!\[([^\]]*)\]\([^)]+\)/g, "");
- content = content.replace(/!\[([^\]]*)\]\([^)]+\.png\)/g, "");
- content = content.replace(/!\[([^\]]*)\]\([^)]+\.jpg\)/g, "");
- content = content.replace(/!\[([^\]]*)\]\([^)]+\.jpeg\)/g, "");
- content = content.replace(/!\[([^\]]*)\]\([^)]+\.gif\)/g, "");
- content = content.replace(/!\[([^\]]*)\]\([^)]+\.webp\)/g, "");
  // ... other replacements ...
- // Remove any remaining image references
- content = content.replace(/image\.png/g, "");
- content = content.replace(/image\.jpg/g, "");
- content = content.replace(/image\.jpeg/g, "");
- content = content.replace(/image\.gif/g, "");
- content = content.replace(/image\.webp/g, "");
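A quick standalone check (illustrative only) confirms the first, extension-agnostic pattern already matches what the per-extension variants target:

```javascript
// The generic pattern matches any ![alt](target) regardless of
// extension, so the per-extension .png/.jpg/... variants are redundant.
const imageRef = /!\[([^\]]*)\]\([^)]+\)/g;

const sample =
  "Before ![diagram](assets/flow.png) and ![photo](img/cover.jpeg) after";
const cleaned = sample.replace(imageRef, "");
console.log(cleaned);
```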
400-403: Consider improving default CLI behavior for better user experience.

When no arguments are provided, the script only consolidates HyperIndex documentation, which might be unexpected. Users typically expect no arguments to either show help or do nothing.

  if (require.main === module) {
    const args = process.argv.slice(2);
-   if (args.includes("--hyperindex") || args.length === 0) {
+   if (args.length === 0) {
+     console.log("Usage: node consolidate-hyperindex-docs.js [options]");
+     console.log("Options:");
+     console.log("  --hyperindex  Consolidate HyperIndex documentation");
+     console.log("  --hypersync   Consolidate HyperSync documentation");
+     console.log("  --all         Consolidate all documentation");
+     process.exit(0);
+   }
+
+   if (args.includes("--hyperindex")) {
      console.log("Consolidating HyperIndex documentation...");
      consolidateHyperIndexDocs();
    }
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- docs/HyperSync-LLM/hypersync-complete.mdx (1 hunks)
- scripts/consolidate-hyperindex-docs.js (1 hunks)
🧰 Additional context used
🧠 Learnings (2)
📓 Common learnings
Learnt from: DenhamPreen
PR: enviodev/docs#673
File: LLM_DOCS_README.md:24-32
Timestamp: 2025-07-29T14:32:31.665Z
Learning: The Envio documentation build system requires Yarn and does not work with pnpm, despite other parts of the repository potentially using pnpm/pnpx. The LLM documentation commands (yarn build-llm, yarn start-llm, yarn consolidate-docs) must remain using yarn to function properly.
scripts/consolidate-hyperindex-docs.js (1)
Learnt from: DenhamPreen
PR: #673
File: LLM_DOCS_README.md:24-32
Timestamp: 2025-07-29T14:32:31.665Z
Learning: The Envio documentation build system requires Yarn and does not work with pnpm, despite other parts of the repository potentially using pnpm/pnpx. The LLM documentation commands (yarn build-llm, yarn start-llm, yarn consolidate-docs) must remain using yarn to function properly.
🪛 Biome (2.1.2)
scripts/consolidate-hyperindex-docs.js
[error] 157-157: eval() exposes to security risks and performance issues.
See the MDN web docs for more details.
Refactor the code so that it doesn't need to call eval().
(lint/security/noGlobalEval)
[error] 198-198: eval() exposes to security risks and performance issues.
See the MDN web docs for more details.
Refactor the code so that it doesn't need to call eval().
(lint/security/noGlobalEval)
🪛 LanguageTool
docs/HyperSync-LLM/hypersync-complete.mdx
[style] ~840-~840: Three successive sentences begin with the same word. Consider rewording the sentence or use a thesaurus to find a synonym.
Context: ...event_signature, from_block, to_block)` Fetches logs for the specified event signature ...
(ENGLISH_WORD_REPEAT_BEGINNING_RULE)
[style] ~895-~895: Consider using a less common alternative to make your writing sound more unique and professional.
Context: ...address, either as sender or recipient. Feel free to swap your address into the example. **...
(FEEL_FREE_TO_STYLE_ME)
[style] ~1037-~1037: ‘takes into account’ might be wordy. Consider a shorter alternative.
Context: ...its**: A comprehensive calculation that takes into account multiple factors including data bandwid...
(EN_WORDINESS_PREMIUM_TAKES_INTO_ACCOUNT)
[style] ~1346-~1346: Consider using polite language here.
Context: ...ble as we continue to improve HyperRPC. Let us know about your experience in our Discord. ...
(INSERT_PLEASE)
[style] ~1401-~1401: As an alternative to the over-used intensifier ‘extremely’, consider replacing this phrase.
Context: ...ers of a HyperFuel query. HyperFuel is extremely powerful but learning how to craft queries can t...
(EN_WEAK_ADJECTIVE)
content = content.replace(/```[^`]*```/g, "");

// Remove any remaining code blocks that might cause issues
content = content.replace(/```[\s\S]*?```/g, "");
Overly aggressive code block removal may lose important documentation content.
The function removes ALL code blocks from the documentation, which could eliminate important examples, configuration snippets, or API references that are crucial for LLM understanding.
Consider preserving code blocks or at least extracting their content as plain text:
- // Remove any remaining problematic syntax
- content = content.replace(/```[^`]*```/g, "");
-
- // Remove any remaining code blocks that might cause issues
- content = content.replace(/```[\s\S]*?```/g, "");
+ // Convert code blocks to plain text instead of removing them entirely
+ content = content.replace(/```[a-zA-Z]*\n([\s\S]*?)```/g, (match, code) => {
+ return `\nCode example:\n${code}\n`;
+ });📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```js
// Convert code blocks to plain text instead of removing them entirely
content = content.replace(/```[a-zA-Z]*\n([\s\S]*?)```/g, (match, code) => {
  return `\nCode example:\n${code}\n`;
});
```
🤖 Prompt for AI Agents
In scripts/consolidate-hyperindex-docs.js around lines 117 to 120, the current
code removes all code blocks entirely, which risks losing important
documentation content. Instead of deleting code blocks, modify the regex
replacement to capture the code inside the triple backticks and replace the
entire block with a plain text version prefixed by a label like "Code example:".
This preserves the code content for LLM understanding while removing formatting.
```js
};

// Evaluate the sidebar configuration
const sidebarConfig = eval(`(function() {
```
🛠️ Refactor suggestion
Replace eval() with safer configuration parsing.
Using eval() to parse JavaScript configuration files poses security risks and performance concerns, even for internal files.
Consider these safer alternatives:

- Convert sidebar configs to JSON (recommended):

  ```js
  // Instead of sidebarsHyperIndex.js with module.exports,
  // use sidebarsHyperIndex.json
  const sidebarConfig = JSON.parse(fs.readFileSync(sidebarPath, "utf8"));
  ```

- Use dynamic import (if configs must remain as JS):

  ```js
  // Create an async version
  async function parseSidebarOrder(sidebarPath) {
    try {
      const sidebarModule = await import(sidebarPath);
      return extractFileOrderFromSidebar(sidebarModule.default.someSidebar);
    } catch (error) {
      console.error(`Error parsing sidebar ${sidebarPath}:`, error.message);
      return [];
    }
  }
  ```

- Use a sandbox library like `vm2` or `isolated-vm` if dynamic execution is absolutely necessary.
Also applies to: 198-198
🧰 Tools
🪛 Biome (2.1.2)
[error] 157-157: eval() exposes to security risks and performance issues.
See the MDN web docs for more details.
Refactor the code so that it doesn't need to call eval().
(lint/security/noGlobalEval)
🤖 Prompt for AI Agents
In scripts/consolidate-hyperindex-docs.js at line 157, replace the use of eval()
for parsing the sidebar configuration to eliminate security and performance
risks. Instead, convert the sidebar configuration file to JSON format and parse
it using JSON.parse with fs.readFileSync, or if the config must remain
JavaScript, refactor to use dynamic import asynchronously to safely load the
module. Avoid eval entirely and consider sandbox libraries only if dynamic
execution is absolutely required. Apply the same fix at line 198 as well.
Actionable comments posted: 0
♻️ Duplicate comments (3)
scripts/consolidate-hyperindex-docs.js (3)
47-53: Fix title extraction logic. The condition `!title && frontMatterEnded && line.trim()` may not work as intended because `frontMatterEnded` might still be `false` when processing the first content line. This could result in missing title extraction from filenames.
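One way the fix could look, sketched as a standalone function. The names and the front-matter handling below are assumptions mirroring the review comment, not the script's actual code:

```javascript
// Hypothetical sketch of corrected title extraction: track front-matter
// boundaries explicitly, then take the first source of a title, falling
// back to the filename.
function extractTitle(lines, fileName) {
  let inFrontMatter = false;
  let frontMatterEnded = false;
  for (const line of lines) {
    if (line.trim() === "---") {
      if (!inFrontMatter && !frontMatterEnded) {
        inFrontMatter = true; // opening delimiter
      } else if (inFrontMatter) {
        inFrontMatter = false; // closing delimiter
        frontMatterEnded = true;
      }
      continue;
    }
    if (inFrontMatter) {
      const m = line.match(/^title:\s*(.+)$/);
      if (m) return m[1].trim(); // front-matter title wins
    } else if (line.startsWith("# ")) {
      return line.slice(2).trim(); // first markdown heading
    }
  }
  return fileName.replace(/\.mdx?$/, ""); // fallback: derive from filename
}

console.log(extractTitle(["---", "title: Overview", "---", "body"], "overview.md"));
```

The key difference from the flagged condition is that the fallback path no longer depends on `frontMatterEnded` being set before the first content line.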
151-157: Replace eval() with safer configuration parsing. Using `eval()` to parse JavaScript configuration files poses security risks and performance concerns, even for internal files.
192-195: Replace eval() with safer configuration parsing. This function also uses `eval()`, which poses security risks. Apply the same safer alternatives suggested for the `parseSidebarOrder` function.
🧹 Nitpick comments (1)
scripts/consolidate-hyperindex-docs.js (1)
117-121: Remove redundant image reference replacements. These lines are redundant, as image references have already been removed by the regex patterns on lines 100-105.

```diff
- // Remove any remaining image references
- content = content.replace(/image\.png/g, "");
- content = content.replace(/image\.jpg/g, "");
- content = content.replace(/image\.jpeg/g, "");
- content = content.replace(/image\.gif/g, "");
- content = content.replace(/image\.webp/g, "");
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
`scripts/consolidate-hyperindex-docs.js` (1 hunks)
🧰 Additional context used
🧠 Learnings (2)
📓 Common learnings
Learnt from: DenhamPreen
PR: enviodev/docs#673
File: LLM_DOCS_README.md:24-32
Timestamp: 2025-07-29T14:32:31.665Z
Learning: The Envio documentation build system requires Yarn and does not work with pnpm, despite other parts of the repository potentially using pnpm/pnpx. The LLM documentation commands (yarn build-llm, yarn start-llm, yarn consolidate-docs) must remain using yarn to function properly.
scripts/consolidate-hyperindex-docs.js (1)
Learnt from: DenhamPreen
PR: #673
File: LLM_DOCS_README.md:24-32
Timestamp: 2025-07-29T14:32:31.665Z
Learning: The Envio documentation build system requires Yarn and does not work with pnpm, despite other parts of the repository potentially using pnpm/pnpx. The LLM documentation commands (yarn build-llm, yarn start-llm, yarn consolidate-docs) must remain using yarn to function properly.
🪛 Biome (2.1.2)
scripts/consolidate-hyperindex-docs.js
[error] 151-151: eval() exposes to security risks and performance issues.
See the MDN web docs for more details.
Refactor the code so that it doesn't need to call eval().
(lint/security/noGlobalEval)
[error] 192-192: eval() exposes to security risks and performance issues.
See the MDN web docs for more details.
Refactor the code so that it doesn't need to call eval().
(lint/security/noGlobalEval)
🔇 Additional comments (1)
scripts/consolidate-hyperindex-docs.js (1)
326-328: Verify the arbitrary limit on supported network files. The code limits the processing to only the first 5 supported network files for HyperIndex. Is this limitation intentional? If so, consider adding a comment explaining why only 5 networks are included in the consolidated documentation.
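If the cap is intentional, it could be made self-documenting with a named constant. The constant name and the file list below are made up for illustration; they are not the script's real values:

```javascript
// Hypothetical sketch: replace the magic number with a named,
// commented constant so the 5-file cap reads as intentional.
const MAX_NETWORK_FILES = 5; // keep the consolidated doc compact

const networkFiles = [
  "ethereum.md", "arbitrum.md", "optimism.md",
  "base.md", "polygon.md", "monad.md", "berachain.md",
]; // illustrative list, not the real directory contents

const included = networkFiles.slice(0, MAX_NETWORK_FILES);
console.log(included);
```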
moose-code
left a comment
Epic!!!
Do we need to run a script when making changes? I guess it happens automatically?





https://www.loom.com/share/126589ea575c4ca697379a6c5df8fbec
Lastly, the references in the docs to llm-docs.envio.dev have changed to the new URL routes.
Will open another PR shortly with light styling changes.
Summary by CodeRabbit

- New Features
- Improvements
- Bug Fixes
- Chores