2 changes: 2 additions & 0 deletions apps/docs-smoke/.gitignore
@@ -1,4 +1,6 @@
.output/
content-fixtures/
public/
public-real/
public-real2/
test-results/
9 changes: 9 additions & 0 deletions apps/docs-smoke/content/docs/index.mdx
@@ -28,6 +28,10 @@ description: "Package docs for runtime adapters, remark plugins, conversion, LLM
type: "pipeline",
description: "Generate `llms.txt` and topic-scoped agent context files.",
},
"@inth/docs/search": {
type: "runtime",
description: "Static local search, source-grounded answer context, request guards, and AI SDK streaming helpers.",
},
Comment on lines +31 to +34
⚠️ Potential issue | 🟡 Minor

Update package-surface count in the intro.

After adding @inth/docs/search, the sentence at Line 8 (“has five package surfaces”) is stale and should be updated to six.

📝 Suggested doc fix
-`@inth/docs` has five package surfaces:
+`@inth/docs` has six package surfaces:

"@inth/docs/lint": {
type: "pipeline",
description: "Validation APIs and the `inth-docs-lint` CLI.",
@@ -71,6 +75,11 @@ await convertAllMdx({
## What to open in this app

<Cards>
<Card
title="Search and AI Answers"
description="Explains the headless search surfaces, generated index, AI answer route, abuse guards, and demo."
href="/docs/search"
/>
<Card
title="Quickstart"
description="Shows the basic conversion flow plus hydrated `Tabs` and `PackageCommandTabs`."
1 change: 1 addition & 0 deletions apps/docs-smoke/content/docs/meta.json
@@ -2,6 +2,7 @@
"title": "Docs",
"pages": [
"index",
"search",
"guides/quickstart",
"guides/components-fixture",
"guides/auto-type-table-fixture"
94 changes: 94 additions & 0 deletions apps/docs-smoke/content/docs/search.mdx
@@ -0,0 +1,94 @@
---
title: "Search and AI Answers"
description: "Generate a local docs search index and stream source-grounded AI answers."
---

# Search and AI Answers

`@inth/docs` includes headless search logic for docs sites that want to bring their own UI.

<Callout title="Demo route">
Open the live search example at [/search](/search). Typing runs local search only; the `Ask` button is the only action that can call the model.
</Callout>

## Package Surfaces

<TypeTable
type={{
"@inth/docs/search": {
type: "runtime",
description: "Edge-safe search, answer context, query validation, JSON body limits, and rate limiter helpers.",
required: true,
},
"@inth/docs/search/node": {
type: "build time",
description: "Node-only `generateSearchIndex` helper that reads converted markdown and writes `docs/search-index.json`.",
required: true,
},
"@inth/docs/search/ai": {
type: "runtime",
description: "Vercel AI SDK `streamText` wrapper for source-grounded plain text answer streams.",
},
}}
/>

## Build the Index

Run conversion first, then generate the search index from markdown:

⚠️ Potential issue | 🟡 Minor

Capitalize “Markdown” for consistency.

At Line 37, use “Markdown” (proper noun) for documentation style consistency.

🧰 Tools
🪛 LanguageTool

[uncategorized] ~37-~37: Did you mean the formatting language “Markdown” (= proper noun)?
Context: ...st, then generate the search index from markdown: ```ts import { generateSearchIndex } ...

(MARKDOWN_NNP)



```ts
import { generateSearchIndex } from "@inth/docs/search/node";

await generateSearchIndex({
outDir: "public",
baseUrl: "https://docs.example.com",
});
```

The generated index is static JSON. In the demo app, `scripts/search-generate.ts` copies it into `src/generated/docs-search-index.json` so routes can import it without reading from the file system at request time.

## Runtime Search

Import the generated JSON and query it from your own route handler:

```ts
import { searchDocs, type DocsSearchIndex } from "@inth/docs/search";
import indexJson from "./generated/docs-search-index.json";

const results = searchDocs(indexJson as DocsSearchIndex, "package tabs");
```

Search results include `urlWithHash` and `absoluteUrlWithHash` so the UI can link directly to the matched heading. The demo renders matching heading `id`s with `slugifyDocsHeading`.
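The heading link construction described above can be sketched as follows. The result shape and the slug scheme here are assumptions for illustration; the actual types and `slugifyDocsHeading` behavior come from `@inth/docs/search` and may differ in edge cases.

```ts
// Hypothetical result shape; the real type lives in @inth/docs/search.
interface DocsSearchResult {
  url: string;
  heading: string;
}

// A common slug scheme: lowercase, non-alphanumeric runs collapsed to
// hyphens, leading/trailing hyphens trimmed.
function slugifyHeading(heading: string): string {
  return heading
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "");
}

// Build the deep link to the matched heading.
function withHash(result: DocsSearchResult): string {
  return `${result.url}#${slugifyHeading(result.heading)}`;
}

const link = withHash({ url: "/docs/search", heading: "AI Answers" });
// link === "/docs/search#ai-answers"
```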

## AI Answers

Use `streamDocsAnswer` when you want a simple Vercel AI SDK integration:

```ts
import { streamDocsAnswer } from "@inth/docs/search/ai";

const { response } = streamDocsAnswer({
index,
query,
model: process.env.DOCS_SEARCH_MODEL ?? "moonshotai/kimi-k2.6",
productName: "@inth/docs",
});
```

The answer prompt only includes retrieved docs context, tells the model to cite sources, and asks it to say when the docs do not contain enough context.
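The grounding strategy can be sketched as a prompt builder. This helper is hypothetical; the wording inside `streamDocsAnswer` is not shown here, only the same three rules: context-only answers, numbered source citations, and an explicit "not enough context" escape hatch.

```ts
// Hypothetical chunk shape for retrieved docs context.
interface ContextChunk {
  source: string; // e.g. a docs URL
  text: string;
}

// Assemble a source-grounded prompt from the retrieved chunks.
function buildGroundedPrompt(query: string, chunks: ContextChunk[]): string {
  const context = chunks
    .map((c, i) => `[${i + 1}] ${c.source}\n${c.text}`)
    .join("\n\n");
  return [
    "Answer using ONLY the context below. Cite sources as [n].",
    "If the context is insufficient, say so instead of guessing.",
    "",
    `Context:\n${context}`,
    "",
    `Question: ${query}`,
  ].join("\n");
}
```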

## Abuse Protection

<Steps>
<Step title="Search is cheap">
Debounced typing should call only the local `/api/docs/search` route. It does not call the model.
</Step>
<Step title="Answers are explicit">
Keep model calls behind a button such as `Ask`, `Enter to ask`, or `Cmd+Enter`.
</Step>
<Step title="Limit request paths">
Use `validateDocsQuery`, `readJsonWithLimit`, `getClientIdentifier`, and a `RateLimiter` implementation around public routes.
</Step>
</Steps>

The demo uses an in-memory limiter for local smoke coverage. Production apps should adapt the same `RateLimiter` interface to a shared store such as Redis, Vercel KV, Cloudflare KV, or Durable Objects.
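An in-memory limiter like the demo's can be sketched as a fixed-window counter. The `RateLimiter` interface shape below is an assumption for illustration; match it to the one actually exported by `@inth/docs/search`.

```ts
// Assumed interface shape; align with the real export before use.
interface RateLimiter {
  allow(clientId: string): boolean;
}

// Fixed-window in-memory limiter: fine for a single process, but state
// is per-instance, so serverless fleets need a shared store instead.
class MemoryRateLimiter implements RateLimiter {
  private hits = new Map<string, { count: number; windowStart: number }>();

  constructor(
    private readonly limit: number,
    private readonly windowMs: number
  ) {}

  allow(clientId: string): boolean {
    const now = Date.now();
    const entry = this.hits.get(clientId);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      this.hits.set(clientId, { count: 1, windowStart: now });
      return true;
    }
    entry.count += 1;
    return entry.count <= this.limit;
  }
}
```

Swapping in Redis or KV means reimplementing `allow` against the shared store while keeping the same interface, so route code does not change.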
4 changes: 3 additions & 1 deletion apps/docs-smoke/package.json
@@ -16,7 +16,8 @@
"bench": "bun run pipeline:bench",
"pipeline:convert": "bun run scripts/mdx-convert.ts",
"pipeline:llm": "bun run scripts/llm-generate.ts",
"pipeline:build": "bun run pipeline:convert && bun run pipeline:llm",
"pipeline:search": "bun run scripts/search-generate.ts",
"pipeline:build": "bun run pipeline:convert && bun run pipeline:llm && bun run pipeline:search",
"pipeline:test": "bun run scripts/test-pipeline.ts",
"pipeline:setup-real": "bun run scripts/setup-real-content.ts",
"pipeline:test-real": "bun run pipeline:setup-real && bun run scripts/test-real.ts",
@@ -33,6 +34,7 @@
"@radix-ui/react-slot": "^1.2.3",
"@tanstack/react-router": "^1.167.4",
"@tanstack/react-start": "^1.166.15",
"ai": "^6.0.168",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"nitro": "3.0.260415-beta",
5 changes: 3 additions & 2 deletions apps/docs-smoke/scripts/mdx-convert.ts
@@ -19,8 +19,9 @@ import {

const scriptsRoot = dirname(fileURLToPath(import.meta.url));
const repoRoot = join(scriptsRoot, "..", "..");
const srcDir = join(scriptsRoot, "content");
const outDir = join(scriptsRoot, "public");
const appRoot = join(scriptsRoot, "..");
const srcDir = join(appRoot, "content");
const outDir = join(appRoot, "public");
const typeTableRemarkPlugin: NonNullable<
MdxToMarkdownConfig["remarkPlugins"]
>[number] = [remarkTypeTableToMarkdown, { basePath: repoRoot }];
27 changes: 27 additions & 0 deletions apps/docs-smoke/scripts/search-generate.ts
@@ -0,0 +1,27 @@
#!/usr/bin/env bun
/**
* Generates a static docs search index from converted markdown.
*/

import { copyFile, mkdir } from "node:fs/promises";
import { dirname, join } from "node:path";
import { fileURLToPath } from "node:url";
import { generateSearchIndex } from "../../../packages/docs/src/search/node-index.ts";
🧹 Nitpick | 🔵 Trivial

Consider using the package export for consistency with Node.js deployments.

The package exports ./search/node to ./dist/search/node-index.js, which this script could import via @inth/docs/search/node. This exercises the actual published package export rather than relying on relative paths to source files.

Note: Other scripts in this app use relative imports (e.g., ../../../packages/docs/src/convert/index.ts), so this would be inconsistent within the monorepo. Use this approach only if the team standardizes on package imports for docs utilities.



const scriptsRoot = dirname(fileURLToPath(import.meta.url));
const appRoot = join(scriptsRoot, "..");
const outDir = join(appRoot, "public");
const generatedDir = join(appRoot, "src", "generated");
const generatedIndexPath = join(generatedDir, "docs-search-index.json");

const result = await generateSearchIndex({
outDir,
baseUrl: "https://docs.example.com",
});

await mkdir(generatedDir, { recursive: true });
await copyFile(result.outputPath, generatedIndexPath);

process.stdout.write(
`Search index generated: ${result.docs} docs, ${result.chunks} chunks, ${result.terms} terms\n`
);