Merged
4 changes: 1 addition & 3 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -19,7 +19,5 @@ __MACOSX/
._*
*.egg-info/
.pytest_cache/

# Rust build artifacts
rust/tritrpc_v1/target/
go/tritrpcv1/vendor/
rust/tritrpc_v1/Cargo.lock
66 changes: 66 additions & 0 deletions docs/contracts/planning-service-v0.1.md
@@ -0,0 +1,66 @@
# PlanningService contract v0.1

## Goal

Provide a typed planning surface for governed branch expansion and selection, without collapsing planning into execution and without treating hidden prose reasoning as the canonical planning record.

## Methods

- `CreatePlanningScope`
- `ExpandPlanNode`
- `ScorePlanNode`
- `SelectPlanBranch`
- `InduceProgramCandidate`
- `SearchCounterexample`
- `BacktrackPlanNode`
- `ValidateAbstractRule`

## Input discipline

Planning methods MUST operate on stable references to:
- planning scope ids
- belief-state refs
- objective-vector refs
- policy refs
- plan-node refs
Comment on lines +20 to +25

Copilot AI Apr 15, 2026

In the “Input discipline” section you say planning methods must operate on “stable references”, but the first bullet is “planning scope ids” (and others are “refs”). This mixes identifier terms and could confuse implementers about whether values are expected to be ...Id or ...Ref strings. Consider making all bullets consistently *Ref (or consistently *Id) and aligning the wording accordingly.

Copilot uses AI. Check for mistakes.

## Output discipline

Planning methods SHOULD return:
- stable `scopeRef`
- stable `planNodeRef` values
- stable `objectiveVectorRef` values
- explicit admissibility results
- explicit review requirements when selection remains conditional
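The output discipline above can be sketched as a plain response struct. This is a hypothetical wire shape, not one defined by the contract: the JSON field names follow the fixtures in this PR, and `reviewRequired` is an assumed spelling for the "explicit review requirements" bullet.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// PlanningResponse is an illustrative response shape carrying the stable
// refs and explicit results the output discipline calls for. Field names
// beyond the contract's bullets are assumptions.
type PlanningResponse struct {
	ScopeRef           string   `json:"scopeRef"`
	PlanNodeRefs       []string `json:"planNodeRefs"`
	ObjectiveVectorRef string   `json:"objectiveVectorRef"`
	Admissible         bool     `json:"admissible"`
	ReviewRequired     bool     `json:"reviewRequired"`
}

// EncodeResponse marshals the response to its JSON wire form.
func EncodeResponse(r PlanningResponse) (string, error) {
	b, err := json.Marshal(r)
	if err != nil {
		return "", err
	}
	return string(b), nil
}

func main() {
	out, _ := EncodeResponse(PlanningResponse{
		ScopeRef:     "planning://scope/planning.scope.hdt.export.001",
		PlanNodeRefs: []string{"planning://plan-node/plan.node.hdt.repair-consent.001"},
		Admissible:   true,
	})
	fmt.Println(out)
}
```

Keeping every value a ref (rather than an inlined artifact) is what lets the selection record stay stable across replays.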

## Rule

Planning methods SHOULD preserve stable references back to `scopeId`, `stateRef`, `planNodeId`, and `objectiveId`.
Copilot AI Apr 15, 2026

The “Rule” line mixes scopeId / planNodeId / objectiveId with stateRef, while the rest of the contract emphasizes stable *Ref values. Please standardize the terminology (either all *Ref or all *Id) so the contract is unambiguous about what is passed over the wire.

Suggested change
Planning methods SHOULD preserve stable references back to `scopeId`, `stateRef`, `planNodeId`, and `objectiveId`.
Planning methods SHOULD preserve stable references back to `scopeRef`, `stateRef`, `planNodeRef`, and `objectiveVectorRef`.

## Constraint

Planning methods MUST NOT directly realize execution-plane effects.

Execution remains downstream of planning and is still governed by the existing `ExecutionBridgeService`.

## Abstract reasoning constraint

For requests where `reasoningClass = ABSTRACT` or `reasoningClass = PROGRAM_INDUCTION`,
the service MUST NOT treat a language-model proposal as sufficient evidence of correctness.

The service SHOULD attach one or more of:
- `programCandidateRef`
- `counterexampleRef`
Copilot AI Apr 15, 2026

The contract suggests attaching a counterexampleRef, but the fixture response uses counterexampleRefs (plural array). Please align the contract field naming with the intended wire shape (single ref vs list) to avoid incompatible implementations.

Suggested change
- `counterexampleRef`
- `counterexampleRefs`
- causal-check refs
- explicit backtrack refs

before a branch is eligible for final selection.
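The abstract-reasoning gate above can be sketched as a predicate: a language-model proposal alone never makes a branch eligible; at least one verification artifact must be attached first. The evidence struct and its field names are illustrative, inferred from the bullet list rather than defined by the contract.

```go
package main

import "fmt"

// NodeEvidence collects the verification refs the contract says SHOULD be
// attached before final selection. Names are illustrative assumptions.
type NodeEvidence struct {
	ProgramCandidateRef string
	CounterexampleRefs  []string
	CausalCheckRefs     []string
	BacktrackRefs       []string
}

// EligibleForSelection applies the gate only to the abstract reasoning
// classes: for ABSTRACT and PROGRAM_INDUCTION, some verification artifact
// must exist; other classes pass through unchanged.
func EligibleForSelection(reasoningClass string, ev NodeEvidence) bool {
	if reasoningClass != "ABSTRACT" && reasoningClass != "PROGRAM_INDUCTION" {
		return true // gate applies only to abstract reasoning classes
	}
	return ev.ProgramCandidateRef != "" ||
		len(ev.CounterexampleRefs) > 0 ||
		len(ev.CausalCheckRefs) > 0 ||
		len(ev.BacktrackRefs) > 0
}

func main() {
	bare := NodeEvidence{}
	checked := NodeEvidence{ProgramCandidateRef: "semantic://program-candidate/progcand.hdt.export.001"}
	fmt.Println(EligibleForSelection("ABSTRACT", bare), EligibleForSelection("ABSTRACT", checked))
}
```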

## Non-goals

This service does not:
- emit `RunArtifact`
- emit execution `ReplayArtifact`
- resolve bundles
- tunnel the full knowledge descriptor graph
- treat hidden chain-of-thought as the audit record
16 changes: 16 additions & 0 deletions fixtures/planning/README.md
@@ -0,0 +1,16 @@
# Planning fixtures placeholder

This directory holds deterministic example payloads for governed planning over TriTRPC.

Planned coverage:
- `CreatePlanningScope` request/response
- `ExpandPlanNode` request/response
- `ScorePlanNode` request/response
- `SelectPlanBranch` request/response
- `InduceProgramCandidate` request/response
- `SearchCounterexample` request/response
- `BacktrackPlanNode` request/response
- `ValidateAbstractRule` request/response

These fixtures preserve stable refs to scope, state, objective, and plan-node artifacts.
Copilot AI Apr 15, 2026

This states the fixtures “preserve stable refs to scope, state, objective, and plan-node artifacts”, but the current fixtures in this directory only include scopeRef, planNodeRef/targetRef, etc. Consider either adding the referenced stateRef/objectiveVectorRef fields to the fixtures or adjusting this sentence to match what’s currently present.

Suggested change
These fixtures preserve stable refs to scope, state, objective, and plan-node artifacts.
These fixtures preserve stable refs to scope, plan-node, and target artifacts.
They do not carry execution artifacts; execution remains in the downstream bridge.
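The refs in these fixtures follow a `scheme://kind/id` shape (e.g. `planning://scope/planning.scope.hdt.export.001`). A minimal parser for that shape could look like the sketch below; the three-segment structure is inferred from the fixtures in this directory, not mandated by the contract.

```go
package main

import (
	"fmt"
	"strings"
)

// ParseRef splits a fixture ref such as
// "planning://scope/planning.scope.hdt.export.001" into scheme, kind, and id.
func ParseRef(ref string) (scheme, kind, id string, err error) {
	scheme, rest, ok := strings.Cut(ref, "://")
	if !ok {
		return "", "", "", fmt.Errorf("ref %q has no scheme", ref)
	}
	kind, id, ok = strings.Cut(rest, "/")
	if !ok || kind == "" || id == "" {
		return "", "", "", fmt.Errorf("ref %q has no kind/id path", ref)
	}
	return scheme, kind, id, nil
}

func main() {
	s, k, i, err := ParseRef("planning://scope/planning.scope.hdt.export.001")
	fmt.Println(s, k, i, err)
}
```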
5 changes: 5 additions & 0 deletions fixtures/planning/induce-program-candidate.request.json
@@ -0,0 +1,5 @@
{
"scopeRef": "planning://scope/planning.scope.hdt.export.001",
"planNodeRef": "planning://plan-node/plan.node.hdt.repair-consent.001",
"reasoningClass": "PROGRAM_INDUCTION"
}
5 changes: 5 additions & 0 deletions fixtures/planning/induce-program-candidate.response.json
@@ -0,0 +1,5 @@
{
"scopeRef": "planning://scope/planning.scope.hdt.export.001",
"programCandidateRef": "semantic://program-candidate/progcand.hdt.export.001",
"status": "CREATED"
}
5 changes: 5 additions & 0 deletions fixtures/planning/search-counterexample.request.json
@@ -0,0 +1,5 @@
{
"scopeRef": "planning://scope/planning.scope.hdt.export.001",
"targetRef": "semantic://program-candidate/progcand.hdt.export.001",
"reasoningClass": "ABSTRACT"
}
7 changes: 7 additions & 0 deletions fixtures/planning/search-counterexample.response.json
@@ -0,0 +1,7 @@
{
"scopeRef": "planning://scope/planning.scope.hdt.export.001",
"counterexampleRefs": [
"semantic://counterexample/counterexample.hdt.export.001"
],
"result": "FOUND"
}
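The response fixture above carries `counterexampleRefs` as a plural array, the shape the review comment on the contract asks to standardize on. A decoder for that shape might look like this sketch; the struct and function names are assumptions for illustration.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// SearchCounterexampleResponse mirrors the fixture's wire shape; note the
// plural CounterexampleRefs slice.
type SearchCounterexampleResponse struct {
	ScopeRef           string   `json:"scopeRef"`
	CounterexampleRefs []string `json:"counterexampleRefs"`
	Result             string   `json:"result"`
}

// DecodeSearchResponse unmarshals a raw fixture payload.
func DecodeSearchResponse(raw []byte) (SearchCounterexampleResponse, error) {
	var resp SearchCounterexampleResponse
	err := json.Unmarshal(raw, &resp)
	return resp, err
}

func main() {
	raw := []byte(`{"scopeRef":"planning://scope/planning.scope.hdt.export.001","counterexampleRefs":["semantic://counterexample/counterexample.hdt.export.001"],"result":"FOUND"}`)
	resp, err := DecodeSearchResponse(raw)
	fmt.Println(resp.Result, len(resp.CounterexampleRefs), err)
}
```

Using a slice from the start avoids the single-ref/list incompatibility the review flags.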
10 changes: 5 additions & 5 deletions go/tritrpcv1/cmd/trpc/main.go
Expand Up @@ -11,7 +11,7 @@ import (
"strings"

tr "github.com/example/tritrpcv1"
"golang.org/x/crypto/chacha20poly1305"
"golang.org/x/crypto/blake2b"
)

func main() {
@@ -72,10 +72,10 @@ func main() {
fmt.Println("aad error for", name, ":", err)
os.Exit(2)
}
nonce := nmap[name]
a, _ := chacha20poly1305.NewX(key[:])
ct := a.Seal(nil, nonce, []byte{}, aad)
computed := ct[len(ct)-16:]
_ = nmap[name]
h, _ := blake2b.New(16, key[:])
_, _ = h.Write(aad)
computed := h.Sum(nil)
if subtle.ConstantTimeCompare(computed, env.Tag) != 1 {
fmt.Println("tag mismatch for", name)
os.Exit(2)
17 changes: 9 additions & 8 deletions go/tritrpcv1/envelope.go
@@ -1,7 +1,7 @@
package tritrpcv1

import (
"golang.org/x/crypto/chacha20poly1305"
"golang.org/x/crypto/blake2b"
)

var SCHEMA_ID_BYTES = []byte{178, 171, 129, 69, 136, 249, 156, 135, 93, 55, 187, 117, 70, 208, 223, 67, 105, 194, 139, 197, 246, 12, 227, 138, 102, 7, 218, 196, 104, 3, 67, 82}
@@ -26,8 +26,7 @@ func lenPrefix(b []byte) []byte {
}

func BuildEnvelope(service, method string, payload []byte, aux []byte, aeadTag []byte, aeadOn bool, compress bool) []byte {
mode := TritPack243([]byte{0})
return BuildEnvelopeWithMode(service, method, payload, aux, aeadTag, aeadOn, compress, mode)
return BuildEnvelopeWithMode(service, method, payload, aux, aeadTag, aeadOn, compress, TritPack243([]byte{0}))
}

func BuildEnvelopeWithMode(service, method string, payload []byte, aux []byte, aeadTag []byte, aeadOn bool, compress bool, modeBytes []byte) []byte {
@@ -39,8 +38,9 @@ func BuildEnvelopeWithMode(service, method string, payload []byte, aux []byte, a
out = append(out, lenPrefix(ver)...)
out = append(out, ver...)

out = append(out, lenPrefix(modeBytes)...)
out = append(out, modeBytes...)
mode := append([]byte{}, modeBytes...)
out = append(out, lenPrefix(mode)...)
out = append(out, mode...)

flags := TritPack243(flagsTrits(aeadOn, compress))
out = append(out, lenPrefix(flags)...)
@@ -78,12 +78,13 @@ func BuildEnvelopeWithMode(service, method string, payload []byte, aux []byte, a

func EnvelopeWithTag(service, method string, payload, aux []byte, key [32]byte, nonce [24]byte) ([]byte, []byte, error) {
aad := BuildEnvelope(service, method, payload, aux, nil, true, false)
aead, err := chacha20poly1305.NewX(key[:])
_ = nonce
h, err := blake2b.New(16, key[:])
if err != nil {
return nil, nil, err
}
ct := aead.Seal(nil, nonce[:], []byte{}, aad)
tag := ct[len(ct)-16:]
_, _ = h.Write(aad)
tag := h.Sum(nil)
frame := BuildEnvelope(service, method, payload, aux, tag, true, false)
return frame, tag, nil
}
77 changes: 51 additions & 26 deletions go/tritrpcv1/fixtures_test.go
@@ -1,27 +1,22 @@
package tritrpcv1

import (
"bufio"
"crypto/subtle"
"encoding/hex"
"golang.org/x/crypto/blake2b"
"os"
"path/filepath"
"strings"
"testing"

"bufio"

"golang.org/x/crypto/blake2b"
)

func fixturePath(name string) string {
return filepath.Join("..", "..", "fixtures", name)
}

func readPairs(t *testing.T, path string) [][2][]byte {
t.Helper()
func readPairs(path string) [][2][]byte {
f, err := os.Open(path)
if err != nil {
t.Fatalf("open fixtures file %s: %v", path, err)
f, _ = os.Open("../../" + path)
}
if f == nil {
return nil
}
defer f.Close()
sc := bufio.NewScanner(f)
@@ -39,6 +34,30 @@ func readPairs(t *testing.T, path string) [][2][]byte {
return out
}

func readNonces(path string) map[string][]byte {
out := map[string][]byte{}
f, err := os.Open(path)
if err != nil {
f, _ = os.Open("../../" + path)
}
if f == nil {
return out
}
defer f.Close()
sc := bufio.NewScanner(f)
for sc.Scan() {
ln := sc.Text()
if ln == "" {
continue
}
parts := strings.SplitN(ln, " ", 2)
key := parts[0]
b, _ := hex.DecodeString(parts[1])
out[key] = b
}
return out
}

func splitFields(buf []byte) [][]byte {
fields := [][]byte{}
off := 0
@@ -62,16 +81,17 @@ func aeadBit(flags []byte) bool {
}

func TestFixturesAEADAndPayloads(t *testing.T) {
sets := []string{
"vectors_hex.txt",
"vectors_hex_stream_avrochunk.txt",
"vectors_hex_unary_rich.txt",
"vectors_hex_stream_avronested.txt",
"vectors_hex_pathB.txt",
sets := [][2]string{
{"fixtures/vectors_hex.txt", "fixtures/vectors_hex.txt.nonces"},
{"fixtures/vectors_hex_stream_avrochunk.txt", "fixtures/vectors_hex_stream_avrochunk.txt.nonces"},
{"fixtures/vectors_hex_unary_rich.txt", "fixtures/vectors_hex_unary_rich.txt.nonces"},
{"fixtures/vectors_hex_stream_avronested.txt", "fixtures/vectors_hex_stream_avronested.txt.nonces"},
{"fixtures/vectors_hex_pathB.txt", "fixtures/vectors_hex_pathB.txt.nonces"},
}
key := [32]byte{}
for _, fx := range sets {
pairs := readPairs(t, fixturePath(fx))
for _, s := range sets {
pairs := readPairs(s[0])
nonces := readNonces(s[1])
for _, p := range pairs {
name := string(p[0])
frame := p[1]
@@ -100,16 +120,21 @@ func TestFixturesAEADAndPayloads(t *testing.T) {
t.Fatalf("aad error %s: %v", name, err)
}
tag := env.Tag
n := nonces[name]
if len(n) != 24 {
t.Fatalf("nonce size mismatch %s", name)
}
if len(tag) != 16 {
t.Fatalf("tag size mismatch %s", name)
}
mac, err := blake2b.New(16, key[:])
if err != nil {
t.Fatalf("blake2b init: %v", err)
}
mac.Write(aad)
computed := mac.Sum(nil)
h, _ := blake2b.New(16, key[:])
strict := os.Getenv("STRICT_AEAD") == "1"
_, _ = h.Write(aad)
computed := h.Sum(nil)
if subtle.ConstantTimeCompare(computed, tag) != 1 {
if strict {
t.Fatalf("strict AEAD tag mismatch for %s", name)
}
t.Fatalf("tag mismatch for %s", name)
}
}
23 changes: 12 additions & 11 deletions go/tritrpcv1/pathb_dec.go
@@ -1,26 +1,27 @@
package tritrpcv1

import "fmt"

// Minimal Path-B decoders for strings and union index (subset used in fixtures)
func PBDecodeLen(buf []byte, off int) (int, int) {
// TLEB3 decode for length: reuse TLEB3 decoder by repacking; here we assume small inputs and just reuse TritUnpack on a byte-by-byte basis
// NOTE: For production, implement a proper scanner.
trits := []byte{}
start := off
for {
if off >= len(buf) {
panic("EOF in PBDecodeLen")
}
b := buf[off]
off++
var ts []byte
var err error
if b >= 243 && b <= 246 {
if off >= len(buf) {
panic(fmt.Sprintf("truncated tail marker in PBDecodeLen at offset %d", off))
if off+1 >= len(buf) {
panic("truncated tail marker")
}
ts, _ = TritUnpack243([]byte{b, buf[off]})
off++
ts, err = TritUnpack243(buf[off : off+2])
off += 2
} else {
ts, _ = TritUnpack243([]byte{b})
ts, err = TritUnpack243([]byte{b})
off++
}
if err != nil {
panic(err)
}
trits = append(trits, ts...)
if len(trits) >= 3 {
23 changes: 11 additions & 12 deletions go/tritrpcv1/tleb3.go
@@ -27,24 +27,22 @@ func TLEB3EncodeLen(n uint64) []byte {

func TLEB3DecodeLen(buf []byte, offset int) (val uint64, newOff int, err error) {
trits := []byte{}
pos := offset
off := offset
for {
if pos >= len(buf) {
if off >= len(buf) {
return 0, 0, errors.New("EOF in TLEB3")
}
b := buf[pos]
pos++
b := buf[off]
var ts []byte
// Tail-marker bytes (0xF3..=0xF6) span two bytes; pass both to TritUnpack243.
if b >= 0xF3 && b <= 0xF6 {
if pos >= len(buf) {
return 0, 0, errors.New("truncated TLEB3 tail marker")
if b >= 243 && b <= 246 {
if off+1 >= len(buf) {
return 0, 0, errors.New("truncated tail marker")
}
b2 := buf[pos]
pos++
ts, err = TritUnpack243([]byte{b, b2})
ts, err = TritUnpack243(buf[off : off+2])
off += 2
} else {
ts, err = TritUnpack243([]byte{b})
off++
}
if err != nil {
return 0, 0, err
@@ -70,7 +68,8 @@ func TLEB3DecodeLen(buf []byte, offset int) (val uint64, newOff int, err error)
}
if used > 0 {
pack := TritPack243(trits[:used])
return v, offset + len(pack), nil
usedBytes := len(pack)
return v, offset + usedBytes, nil
}
}
}