ANet University

Proprietary & confidential. © 2026 ANet University. All rights reserved. See LICENSE.

The agent-native university (Agent Network University, ANU). Three product lines on one stack:

  1. Learning paths — courses, exams, agentskills.io credentials.
  2. Domain services — academic review (Phase 1 launch), critic, QA, tutor, replay.
  3. Specialist studio — build a domain expert as an installable SKILL bundle.

The entire stack ships as docker compose. No external dependencies are required beyond Docker itself.


Quickstart

A. Bring up the platform (server side)

# 1. Copy environment template and fill in REQUIRED secrets
make env
$EDITOR .env

# 2. Build + start the full stack
make build
make up-full

# 3. Verify
make ps
curl http://localhost:8000/health
curl http://localhost:7900/health
curl http://localhost:8000/api/v1/bootstrap/manifest | jq .
open http://localhost:3000

B. Bring up a brand-new agent (the "machine 93" flow)

On a fresh machine, one line is enough: it installs Hermes + anet, drops the go-to-anu skill, points Hermes at the ANU MCP server, and starts learning:

curl -fsSL http://<anu-host>:8000/install.sh | bash -s -- \
  --learn "federated learning reviewer"

Behind the scenes it:

  1. Installs uv + Python + Hermes Agent (Nous Research)
  2. Installs the anet daemon
  3. Fetches go-to-anu/SKILL.md into ~/.hermes/skills/anu/core/go-to-anu/
  4. Adds mcp_servers.anu.url = http://<anu-host>:8000/mcp to ~/.hermes/config.yaml
  5. Launches Hermes with /go-to-anu "federated learning reviewer"

The Hermes agent then drives ANU itself: anu_route → enroll → exam → finalize → installs the issued credential as a local SKILL.

Logs:

make logs                 # all services
docker compose logs -f api worker

Tear down (keep volumes):

make down

Wipe everything (destroys data):

make fresh

Profiles

Profile           Services
core (smallest)   postgres, redis, minio, minio-init, api, mcp
full (default)    core + anet-daemon, worker, web, mailhog
obs               full + otel-collector + grafana

make up PROFILE=core
make up-obs
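
The profiles nest strictly (core within full, full within obs). A small sketch of that containment; the `services_for` helper is ours, not something the Makefile exposes:

```python
# Service sets per compose profile, as listed in the table above.
CORE = {"postgres", "redis", "minio", "minio-init", "api", "mcp"}
FULL = CORE | {"anet-daemon", "worker", "web", "mailhog"}
OBS = FULL | {"otel-collector", "grafana"}

PROFILES = {"core": CORE, "full": FULL, "obs": OBS}

def services_for(profile: str) -> set[str]:
    """Resolve the full set of services a profile starts (illustrative helper)."""
    return PROFILES[profile]

print(sorted(services_for("core")))
```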

Ports (host)

Service            Default  Setting
Web (Next.js)      3000     WEB_HOST_PORT
API (FastAPI)      8000     API_HOST_PORT
MCP server         7900     MCP_HOST_PORT
Postgres           5432     POSTGRES_HOST_PORT
Redis              6379     REDIS_HOST_PORT
MinIO S3           9000     MINIO_S3_PORT
MinIO Console      9001     MINIO_CONSOLE_PORT
AgentNetwork API   3998     ANET_API_PORT
AgentNetwork P2P   4001     ANET_P2P_PORT
Grafana            3001     GRAFANA_PORT
MailHog UI         8025     MAILHOG_UI_PORT
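
Every host port in the table can be overridden through the env var in the Setting column. A sketch of how a client script might resolve the effective port (the `host_port` helper and its defaults table are illustrative, mirroring three rows above):

```python
import os

# Defaults mirror the ports table above; each can be overridden via its env var.
PORT_DEFAULTS = {
    "API_HOST_PORT": 8000,
    "MCP_HOST_PORT": 7900,
    "WEB_HOST_PORT": 3000,
}

def host_port(setting: str) -> int:
    """Return the effective host port: env override if set, else the documented default."""
    return int(os.environ.get(setting, PORT_DEFAULTS[setting]))

api_base = f"http://localhost:{host_port('API_HOST_PORT')}"
print(api_base)  # http://localhost:8000 unless API_HOST_PORT is overridden
```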

Connecting an agent

Claude Desktop / Claude Code / Cursor / Codex

// MCP server config
{
  "mcpServers": {
    "anu": { "url": "http://localhost:7900/mcp" }
  }
}

Hermes (Nous Research)

# ~/.hermes/config.yaml
mcp_servers:
  anu:
    transport: http
    url: http://localhost:7900/mcp

Any agent via OpenAPI

  • GET http://localhost:8000/openapi.json
  • GET http://localhost:8000/.well-known/ai-plugin.json
  • GET http://localhost:8000/skill.md
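
An agent pointed at the OpenAPI document can enumerate operations mechanically. A sketch, using a tiny inline sample instead of a live fetch so it runs offline (the two sample paths come from the quickstart; the real document has many more):

```python
import json

# In practice you would fetch http://localhost:8000/openapi.json; this inline
# sample keeps the sketch runnable without a running stack.
openapi_doc = json.loads("""
{
  "openapi": "3.1.0",
  "paths": {
    "/health": {"get": {"summary": "Liveness probe"}},
    "/api/v1/bootstrap/manifest": {"get": {"summary": "Bootstrap manifest"}}
  }
}
""")

def list_operations(doc: dict) -> list[str]:
    """Flatten an OpenAPI document into 'METHOD /path' strings an agent can enumerate."""
    ops = []
    for path, methods in doc["paths"].items():
        for method in methods:
            ops.append(f"{method.upper()} {path}")
    return sorted(ops)

print(list_operations(openapi_doc))
```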

Project layout

.
├── docker-compose.yml         ← THE deliverable; one command brings up everything
├── Makefile                   ← lifecycle shortcuts
├── .env.example               ← copy to .env, fill secrets
├── apps/
│   ├── api/                   ← FastAPI backend (Python 3.12)
│   ├── mcp/                   ← MCP server (Streamable HTTP)
│   ├── worker/                ← Arq async tasks (review, grading, studio)
│   └── web/                   ← Next.js 15 (App Router, React 19, Tailwind v4)
├── infra/
│   ├── postgres/              ← custom image: PG16 + pgvector + Apache AGE
│   ├── anet/                  ← AgentNetwork daemon container
│   └── otel/                  ← OpenTelemetry collector config
├── packages/
│   ├── sdk-python/            ← (Phase 1) anu-sdk for clients
│   └── schemas/               ← (Phase 1) shared JSON schemas
├── Docs/                      ← design documents (ANU001–ANU008)
├── Refs/                      ← upstream projects cloned for research only
└── scripts/                   ← (Phase 1) seed data, ops tooling

Does ANU need LLM tokens?

Partially: it depends on which surfaces you enable.

Surface                                                      Needs LLM?          Why
Admission / catalog / enroll / transcript / MCP / auth       No                  Pure CRUD + identity.
Exam state machine (start / submit / finalize)               No                  Bookkeeping.
Exam semantic grading (open-ended scenario marks)            Yes (Sonnet 4.6)    LLM evaluates against the rubric.
Exam assertion-only grading (deterministic trace.* checks)   No                  Same as Clawford Tier-2 style — fully mechanical.
Credential issuance (mint SKILL.md + tarball)                No                  Template render + tarball.
anu.services.review (academic paper review)                  Yes, heavy          $0.50–3 per paper.
anu.services.critic (PR / RFC critic)                        Yes                 $0.10–1 per call.
anu.services.tutor (streaming dialog)                        Yes, ongoing        $0.01–0.05 / min.
anu.services.qa (route to a professor agent)                 No for ANU itself   The professor agent bears its own LLM cost.
anu.studio.build (specialist factory)                        Yes, very heavy     $5–50 per build.
anu.kg.route (LLM planner — Phase 1)                         Yes (Haiku 4.5)     <$0.001 per call.
anu.audit.replay                                             Sometimes           Depends on replay strategy.

Implication: a token-free ANU is a real deployment mode. You get:

  • Admission, courses, modules, assertion-only exams, credential issuance, transcripts, MCP / OpenAPI / installer / bootstrap — all working.
  • No semantic grading, no domain services, no studio.

ANTHROPIC_API_KEY is optional in .env. LLM-backed endpoints will return 503 service_unavailable with a clear message when called without one. Set it to unlock those surfaces.
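
Client code can treat that 503 as a feature flag rather than a failure. A sketch (the status code and error string are from this section; the exact JSON body shape is our assumption for illustration):

```python
class LLMSurfaceDisabled(RuntimeError):
    """Raised when an LLM-backed endpoint is called on a token-free deployment."""

def check_llm_response(status_code: int, body: dict) -> dict:
    """Translate ANU's 503 service_unavailable into a typed client-side error.

    Assumes a body of the form {"error": "service_unavailable"}; the real
    payload shape may differ.
    """
    if status_code == 503 and body.get("error") == "service_unavailable":
        raise LLMSurfaceDisabled("set ANTHROPIC_API_KEY to unlock this surface")
    return body

# On a token-free deployment, semantic grading answers 503:
try:
    check_llm_response(503, {"error": "service_unavailable"})
except LLMSurfaceDisabled as exc:
    print(exc)
```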

Three operational postures

  1. ANU-funded (default in code) — operator pays Anthropic, users pay ANU via Stripe. per_did_daily_token_cap + per_did_daily_usd_cap_cents in settings hard-cap exposure.
  2. BYOK (bring-your-own-key) — student attaches their own Anthropic key at admission; ANU forwards. Zero cost risk for the operator. Not yet wired (~50 LoC to add).
  3. Hybrid — ANU subsidises a free tier, BYOK kicks in for power users.
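
The hard caps in posture 1 amount to a per-DID daily budget check. A sketch using the setting names above (the cap values, the DailyUsage record, and the within_caps helper are all illustrative):

```python
from dataclasses import dataclass

@dataclass
class DailyUsage:
    """Accumulated spend for one DID over the current day (illustrative record)."""
    tokens: int
    usd_cents: int

# Setting names match the text above; the values here are made up.
per_did_daily_token_cap = 2_000_000
per_did_daily_usd_cap_cents = 500  # $5.00

def within_caps(usage: DailyUsage) -> bool:
    """True while the DID is under both the token cap and the dollar cap."""
    return (usage.tokens < per_did_daily_token_cap
            and usage.usd_cents < per_did_daily_usd_cap_cents)

print(within_caps(DailyUsage(tokens=10_000, usd_cents=42)))   # True
print(within_caps(DailyUsage(tokens=10_000, usd_cents=900)))  # False: dollar cap hit
```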

Local-model alternative

LiteLLM is already in the dependency tree. To run fully air-gapped, point ANU_ANTHROPIC_BASE_URL at an Ollama / vLLM / SGLang endpoint serving Qwen 2.5 / Llama 3.x; grading quality drops noticeably, but there is no external spend.
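
For example, a hypothetical .env fragment for this mode (ANU_ANTHROPIC_BASE_URL is from the text above; the port assumes Ollama's default 11434, and the placeholder key assumes the local endpoint ignores authentication):

```
# .env — air-gapped mode (assumed values; adjust to your local server)
ANU_ANTHROPIC_BASE_URL=http://localhost:11434/v1
ANTHROPIC_API_KEY=dummy-local-key
```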


Tech stack

Layer               Choice
Language (backend)  Python 3.12 + uv
Web framework       FastAPI 0.115 + Granian
ORM / migrations    SQLAlchemy 2.0 async + Alembic
Database            PostgreSQL 16 + pgvector + Apache AGE
Cache / queue       Redis 7 + Arq
Object store        MinIO (dev) / Cloudflare R2 (prod)
LLM                 Anthropic Claude (Sonnet / Opus / Haiku) via LiteLLM
Embeddings          voyage-3
Reranker            Cohere Rerank 3.5
MCP                 Anthropic FastMCP, Streamable HTTP transport
Auth (humans)       Clerk
Payments            Stripe
Sandbox             E2B (Phase 1) / Modal (Phase 2)
Frontend            Next.js 15 + React 19 + Tailwind v4 + TanStack Query
Observability       OpenTelemetry → Grafana Cloud + Sentry
Container runtime   Docker + docker compose

Development workflow

# DB migration (after editing models)
make revision M="add credential issuance fields"
make migrate

# Lint + typecheck
make lint
make typecheck
make test

# Re-build only the api image
docker compose build api
docker compose up -d api

apps/api, apps/mcp, and apps/worker mount source via bind volumes in docker-compose.yml, so edits hot-reload (Granian --reload-paths).


Roadmap

See Docs/ANU008-pragmatic-launch-spec.md for the phased plan. Current state: Phase 0 skeleton — boots, exposes discovery, awaits Phase 1 business logic.

Phase  Focus                                                                        Deliverable
0      Skeleton + boot                                                              This commit
1      First service: anu.review.<academic-domain> + Stripe PAYG                    6 weeks
2      Multi-channel install (agentskills.io, Claude marketplace) + cron/webhook    +4 weeks
3      Specialist Studio (Prefect-backed builds)                                    +4 weeks
4      Federation + DID-aware credentials + telemetry-audit slashing                open-ended

Security & legal

  • All secrets via .env (never committed; gated by make env).
  • Default no-egress on uploaded content; 30-day TTL on agent-submitted papers.
  • T&S on-call from Phase 1. See Docs/ANU008 §5 for policy.
  • Proprietary code: see LICENSE.
