Skip to content
Merged
Show file tree
Hide file tree
Changes from all commits
Commits
File filter

Filter by extension

Filter by extension


Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
9 changes: 5 additions & 4 deletions README.md
Original file line number Diff line number Diff line change
Expand Up @@ -8,10 +8,11 @@ Shared docs tooling for Inth docs projects: React MDX rendering, MDX-to-markdown
- `@inth/docs/remark`: remark plugins plus `defaultRemarkPlugins`
- `@inth/docs/convert`: MDX-to-markdown conversion APIs
- `@inth/docs/llm`: `llms.txt` and topic-scoped full-context generation
- `@inth/docs/search`: edge-safe search runtime, content readers, guards, and rate limiter helpers
- `@inth/docs/search`: search runtime, content readers, guards, and rate limiter helpers
- `@inth/docs/search/node`: Node-only search index generation
- `@inth/docs/search/ai`: AI SDK answer streaming helper
- `@inth/docs/search/bash`: optional bash-tool docs inspection adapter
- `@inth/docs/search/vercel`: Vercel AI Gateway / AI SDK answer streaming and bash tools
- `@inth/docs/search/tanstack`: TanStack AI answer streaming and bash tools
- `@inth/docs/search/cloudflare`: Cloudflare AI Gateway / Workers AI adapter helpers and bash tools
- `@inth/docs/lint`: docs validation and the `inth-docs-lint` CLI

## Install
Expand Down Expand Up @@ -114,7 +115,7 @@ await generateDocsSearchFiles({
});
```

At runtime, query the generated JSON with `@inth/docs/search`. Add `@inth/docs/search/ai` only when a user explicitly asks for a source-grounded answer.
At runtime, query the generated JSON with `@inth/docs/search`. Add a provider entrypoint such as `@inth/docs/search/vercel` only when a user explicitly asks for a source-grounded answer.

## Agent Docs

Expand Down
2 changes: 1 addition & 1 deletion apps/docs-smoke/content/docs/guides/quickstart.mdx
Original file line number Diff line number Diff line change
Expand Up @@ -234,7 +234,7 @@ export function search(query: string) {
}
```

If the app also supports AI answers, keep model calls behind an explicit user action and use `@inth/docs/search/ai` from a server route.
If the app also supports AI answers, keep model calls behind an explicit user action and use a provider entrypoint such as `@inth/docs/search/vercel` from a server route.

## 7. Add Package Scripts

Expand Down
16 changes: 10 additions & 6 deletions apps/docs-smoke/content/docs/index.mdx
Original file line number Diff line number Diff line change
Expand Up @@ -32,19 +32,23 @@ description: "Developer reference for rendering MDX, converting docs, generating
},
"@inth/docs/search": {
type: "runtime",
description: "Edge-safe search runtime, content readers, guards, and rate limiter helpers.",
description: "Search runtime, content readers, guards, and rate limiter helpers.",
},
"@inth/docs/search/node": {
type: "build time",
description: "Node-only `generateDocsSearchFiles`.",
},
"@inth/docs/search/ai": {
"@inth/docs/search/vercel": {
type: "optional runtime",
description: "AI SDK answer streaming helper.",
description: "Vercel AI Gateway / AI SDK answer streaming and bash tools.",
},
"@inth/docs/search/bash": {
"@inth/docs/search/tanstack": {
type: "optional runtime",
description: "bash-tool docs inspection adapter.",
description: "TanStack AI answer streaming and native docs bash tools.",
},
"@inth/docs/search/cloudflare": {
type: "optional runtime",
description: "Cloudflare AI Gateway / Workers AI adapters and docs bash tools.",
},
"@inth/docs/lint": {
type: "build time",
Expand All @@ -63,7 +67,7 @@ description: "Developer reference for rendering MDX, converting docs, generating
Use `@inth/docs/convert` with `@inth/docs/remark` to flatten authored MDX into markdown that works in LLM bundles and search indexes.
</Step>
<Step title="Generate and query search">
Generate static search JSON with `@inth/docs/search/node`, query it with `@inth/docs/search`, and add `@inth/docs/search/ai` only for explicit answer requests.
Generate static search JSON with `@inth/docs/search/node`, query it with `@inth/docs/search`, and add a provider entrypoint such as `@inth/docs/search/vercel` only for explicit answer requests.
</Step>
</Steps>

Expand Down
72 changes: 65 additions & 7 deletions apps/docs-smoke/content/docs/search.mdx
Original file line number Diff line number Diff line change
Expand Up @@ -11,6 +11,15 @@ description: "Generate static docs search data, query it at runtime, and stream
Open [/search](/search) to test the generated index. Typing calls local search only. The `Ask` button is the only action that can call the model.
</Callout>

## Entrypoints

| Use case | Import |
| --- | --- |
| Search, content reads, request guards, and rate limiting | `@inth/docs/search` |
| Vercel AI Gateway / AI SDK answer + bash tools | `@inth/docs/search/vercel` |
| TanStack AI answer + bash tools | `@inth/docs/search/tanstack` |
| Cloudflare AI Gateway / Workers AI answer + bash tools | `@inth/docs/search/cloudflare` |

## Build The Index

Use the Node-only entry point after MDX has been converted to markdown.
Expand All @@ -28,7 +37,7 @@ The generator writes `docs/search-index.json` for compact metadata and `docs/sea

## Runtime Search

Use the edge-safe runtime with generated JSON.
Use the provider-neutral runtime with generated JSON.

```ts
import {
Expand All @@ -49,22 +58,59 @@ Results include heading paths, excerpts, `urlWithHash`, and `absoluteUrlWithHash

## Source-Grounded Answers

Use `@inth/docs/search/ai` only when the user explicitly asks for an answer.
Use a provider entrypoint only when the user explicitly asks for an answer. The demo app uses Vercel AI Gateway / AI SDK.

```ts
import { streamDocsAnswer } from "@inth/docs/search/ai";
import { streamDocsAnswer } from "@inth/docs/search/vercel";

const { response, sources } = streamDocsAnswer({
index,
content,
query,
model: process.env.DOCS_SEARCH_MODEL ?? "moonshotai/kimi-k2.6",
model: process.env.DOCS_SEARCH_MODEL ?? "openai/gpt-5.4-mini",
productName: "@inth/docs",
});
```

The prompt includes retrieved docs context, asks for citations, treats docs text as untrusted reference content, and tells the model to say when the retrieved context is not enough.

TanStack AI callers pass their adapter explicitly:

```ts
import { streamDocsAnswer } from "@inth/docs/search/tanstack";

const { response, sources } = streamDocsAnswer({
index,
content,
query,
adapter,
});
```

Cloudflare callers create an explicit TanStack-compatible adapter:

```ts
import {
createCloudflareDocsAdapter,
streamDocsAnswer,
} from "@inth/docs/search/cloudflare";

const adapter = createCloudflareDocsAdapter({
provider: "openai",
model: "gpt-4o",
options: {
binding: env.AI.gateway("docs-gateway"),
},
});

const { response, sources } = streamDocsAnswer({
index,
content,
query,
adapter,
});
```

## Abuse Protection

<Steps>
Expand All @@ -81,14 +127,26 @@ The prompt includes retrieved docs context, asks for citations, treats docs text

The demo uses an in-memory limiter for local smoke coverage. Production apps should adapt the `RateLimiter` interface to a shared store.

## Bash Adapter
## Bash Tools

Use `@inth/docs/search/bash` when an AI SDK agent should inspect docs with safe shell commands.
Use each provider entrypoint when an agent should inspect docs with safe shell commands. Tools are never created inside `streamDocsAnswer`; pass both tools and tool instructions explicitly.

```ts
import { createDocsBashTool } from "@inth/docs/search/bash";
import {
createDocsBashTool,
streamDocsAnswer,
} from "@inth/docs/search/vercel";

const { tools, instructions } = await createDocsBashTool(index, content);

const { response } = streamDocsAnswer({
index,
content,
query,
model,
tools,
toolInstructions: instructions,
});
```

The adapter creates a read-only `/docs` filesystem for `just-bash` and wraps it with `bash-tool`. Agents can use commands such as `ls`, `cat`, `find`, `grep`, and `rg`; network commands and writes are disabled by default.
2 changes: 1 addition & 1 deletion apps/docs-smoke/src/generated/docs-search-content.json

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion apps/docs-smoke/src/generated/docs-search-index.json

Large diffs are not rendered by default.

17 changes: 12 additions & 5 deletions apps/docs-smoke/src/lib/docs.ts
Original file line number Diff line number Diff line change
Expand Up @@ -79,22 +79,29 @@ export const packageSurfaces: PackageSurface[] = [
{
importPath: "@inth/docs/search",
lifecycle: "runtime",
description: "Edge-safe search runtime, content readers, and guards.",
description:
"Search runtime, content readers, guards, and rate limiter helpers.",
},
{
importPath: "@inth/docs/search/node",
lifecycle: "build time",
description: "Node-only `generateDocsSearchFiles`.",
},
{
importPath: "@inth/docs/search/ai",
importPath: "@inth/docs/search/vercel",
lifecycle: "optional runtime",
description: "AI SDK answer streaming helper.",
description: "Vercel AI Gateway / AI SDK answer streaming and bash tools.",
},
{
importPath: "@inth/docs/search/bash",
importPath: "@inth/docs/search/tanstack",
lifecycle: "optional runtime",
description: "bash-tool docs inspection adapter.",
description: "TanStack AI answer streaming and native docs bash tools.",
},
{
importPath: "@inth/docs/search/cloudflare",
lifecycle: "optional runtime",
description:
"Cloudflare AI Gateway / Workers AI adapter helpers and docs bash tools.",
},
{
importPath: "@inth/docs/lint",
Expand Down
4 changes: 2 additions & 2 deletions apps/docs-smoke/src/routes/api/docs/ask.ts
Original file line number Diff line number Diff line change
Expand Up @@ -5,7 +5,7 @@ import {
readJsonWithLimit,
validateDocsQuery,
} from "@inth/docs/search";
import { streamDocsAnswer } from "@inth/docs/search/ai";
import { streamDocsAnswer } from "@inth/docs/search/vercel";
import { createFileRoute } from "@tanstack/react-router";
import {
docsSearchContent,
Expand All @@ -15,7 +15,7 @@ import {
jsonResponse,
} from "@/lib/search";

const DEFAULT_MODEL = "moonshotai/kimi-k2.6";
const DEFAULT_MODEL = "openai/gpt-5.4-mini";
const DEFAULT_MAX_OUTPUT_TOKENS = 2000;
const DEFAULT_TIMEOUT = {
chunkMs: 15_000,
Expand Down
2 changes: 1 addition & 1 deletion apps/docs-smoke/src/routes/playground.tsx
Original file line number Diff line number Diff line change
Expand Up @@ -40,7 +40,7 @@ import { defaultRemarkPlugins, remarkInclude } from "@inth/docs/remark";`,
summary:
"Use the generated index for local search, then stream answers only when a user asks.",
imports: `import { searchDocs } from "@inth/docs/search";
import { streamDocsAnswer } from "@inth/docs/search/ai";`,
import { streamDocsAnswer } from "@inth/docs/search/vercel";`,
code: `const results = searchDocs(index, query, { content });

const { response } = streamDocsAnswer({
Expand Down
4 changes: 2 additions & 2 deletions apps/docs-smoke/tests/e2e/smoke.e2e.ts
Original file line number Diff line number Diff line change
Expand Up @@ -20,7 +20,7 @@ test("home route renders the developer dashboard and package surfaces", async ({

expect(html).toContain("Build docs with");
expect(html).toContain("Implementation contract");
expect(html).toContain("@inth/docs/search/bash");
expect(html).toContain("@inth/docs/search/vercel");

await page.goto("/", { waitUntil: "networkidle" });
await expect(page.getByText(DASHBOARD_HEADING)).toBeVisible();
Expand All @@ -43,7 +43,7 @@ test("docs route renders package docs and extracted ExtractedTypeTable output",
const html = await response.text();

expect(html).toContain("@inth/docs");
expect(html).toContain("@inth/docs/search/bash");
expect(html).toContain("@inth/docs/search/cloudflare");
expect(html).toContain("PipelineExampleOptions");

await page.goto("/docs", { waitUntil: "networkidle" });
Expand Down
Loading
Loading