Merged
23 commits
5ebd78e
Add wifi-densepose-pointcloud: real-time dense point cloud from camer…
ruvnet Apr 19, 2026
6cb0859
Optimize pointcloud: larger splat voxels, smaller responses, faster f…
ruvnet Apr 19, 2026
c1336c6
Complete implementation: camera capture, WiFi CSI receiver, training …
ruvnet Apr 19, 2026
de5dc9a
Fix viewer: replace WebSocket with fetch polling
ruvnet Apr 19, 2026
f39d88e
Wire live camera into server — real-time updating point cloud
ruvnet Apr 19, 2026
0c512ed
Add MiDaS GPU depth, serial CSI reader, full sensor fusion
ruvnet Apr 19, 2026
898d90f
Complete 7-component sensor fusion pipeline (all working)
ruvnet Apr 19, 2026
ae792aa
Add brain bridge — sparse spatial observation sync every 60s
ruvnet Apr 20, 2026
4ab6935
Update README + user guide with dense point cloud features
ruvnet Apr 20, 2026
c795432
Add ruview-geo: geospatial satellite integration (11 modules, 8/8 tests)
ruvnet Apr 20, 2026
f49ecb1
Update ADR-044: add Common Crawl WET, NASA FIRMS, OpenAQ, Overture Ma…
ruvnet Apr 20, 2026
e39a35e
Fix OSM/SRTM queries, add change detection + night mode
ruvnet Apr 20, 2026
b2e3f27
Enhance viewer: skeleton overlay, weather, buildings, better camera
ruvnet Apr 20, 2026
d5c457a
Add CSI fingerprint DB + night mode detection
ruvnet Apr 20, 2026
ca3c58a
Fix ADR-044 numbering conflict, update geo README
ruvnet Apr 20, 2026
8eb808d
Clean up warnings: suppress dead_code for conditional pipeline modules
ruvnet Apr 20, 2026
8505662
Fix PR #405 blockers: async runtime panic, crate rename, path travers…
ruvnet Apr 20, 2026
4d5bdb1
csi_pipeline: rename WiFlow stub to heuristic_pose_from_amplitude, de…
ruvnet Apr 20, 2026
770788f
Extract ADR-018 parser into parser.rs + wire Fingerprint CLI
ruvnet Apr 20, 2026
d2b2cbf
stream: extract viewer HTML to viewer.html, default bind to loopback
ruvnet Apr 20, 2026
3225eee
Dead-code cleanup + tests for fusion/depth/OSM/training/fingerprinting
ruvnet Apr 20, 2026
e1843c0
Merge main into feat/realtime-dense-pointcloud
ruvnet Apr 20, 2026
0824de7
Update README + user-guide for PR #405 review-fix additions
ruvnet Apr 20, 2026
2 changes: 2 additions & 0 deletions .gitignore
@@ -250,3 +250,5 @@ v1/src/sensing/mac_wifi
# Local build scripts
firmware/esp32-csi-node/build_firmware.bat
data/
models/
demo_pointcloud.ply
demo_splats.json
43 changes: 43 additions & 0 deletions README.md
@@ -96,6 +96,47 @@ node scripts/mincut-person-counter.js --port 5006 # Correct person counting
>
---

### Real-Time Dense Point Cloud (NEW)

RuView now generates **real-time 3D point clouds** by fusing camera depth + WiFi CSI + mmWave radar. All sensors stream simultaneously into a unified spatial model.

| Sensor | Data | Integration |
|--------|------|-------------|
| **Camera** | MiDaS monocular depth (GPU) | 640×480 → 19,200+ depth points per frame |
| **ESP32 CSI** | ADR-018 binary frames (UDP) | RF tomography → 8×8×4 occupancy grid |
| **WiFlow Pose** | 17 COCO keypoints from CSI | Skeleton overlay on point cloud |
| **Vital Signs** | Breathing rate from CSI phase | Stored in ruOS brain every 60s |
| **Motion** | CSI amplitude variance | Adaptive capture rate (skip depth when still) |

**Quick start:**
```bash
cd rust-port/wifi-densepose-rs
cargo build --release -p wifi-densepose-pointcloud
./target/release/ruview-pointcloud serve --bind 127.0.0.1:9880
# Open http://localhost:9880 for live 3D viewer
```

**CLI commands:**
```bash
ruview-pointcloud demo # synthetic demo
ruview-pointcloud serve --bind 127.0.0.1:9880 # live server + Three.js viewer
ruview-pointcloud capture --output room.ply # capture to PLY
ruview-pointcloud train # depth calibration + DPO pairs
ruview-pointcloud cameras # list available cameras
ruview-pointcloud csi-test --count 100 # send test CSI frames
ruview-pointcloud fingerprint office --seconds 5 # record named CSI room fingerprint
```

The HTTP/viewer server defaults to **loopback (`127.0.0.1`)** — exposing live camera/CSI/vitals on `0.0.0.0` is an explicit opt-in. Brain URL defaults to `http://127.0.0.1:9876` and is overridable via `RUVIEW_BRAIN_URL` env var or the `--brain` flag on `serve`/`train`.

The pose overlay currently uses an **amplitude-energy heuristic** (`heuristic_pose_from_amplitude`) rather than trained WiFlow inference — real ONNX/Candle inference is tracked as a follow-up.
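
For orientation, a minimal sketch of what an amplitude-energy heuristic of this kind can look like. The scaling constant and canonical keypoint layout below are illustrative assumptions, not the crate's actual `heuristic_pose_from_amplitude`:

```rust
/// Illustrative sketch only: derive a rough 17-keypoint layout from CSI
/// amplitude energy. The real crate's heuristic may differ in constants
/// and keypoint ordering.
fn heuristic_pose_sketch(amplitudes: &[f32]) -> Vec<(f32, f32)> {
    // Mean amplitude as a crude "presence/size" signal.
    let energy = amplitudes.iter().map(|a| a.abs()).sum::<f32>()
        / amplitudes.len().max(1) as f32;
    let scale = (energy / 32.0).clamp(0.5, 1.5); // assumed normalization

    // Canonical standing pose in COCO keypoint order (nose, eyes, ears,
    // shoulders, elbows, wrists, hips, knees, ankles), normalized image
    // coordinates, scaled about the center by CSI energy.
    let canonical: [(f32, f32); 17] = [
        (0.50, 0.10), (0.48, 0.08), (0.52, 0.08), (0.45, 0.10), (0.55, 0.10),
        (0.40, 0.25), (0.60, 0.25), (0.35, 0.40), (0.65, 0.40), (0.32, 0.55),
        (0.68, 0.55), (0.44, 0.55), (0.56, 0.55), (0.43, 0.75), (0.57, 0.75),
        (0.42, 0.95), (0.58, 0.95),
    ];
    canonical
        .iter()
        .map(|&(x, y)| (0.5 + (x - 0.5) * scale, 0.5 + (y - 0.5) * scale))
        .collect()
}
```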

**Performance:** 22ms pipeline, 905 req/s API, 40K voxel room model from 20 frames.

**Brain integration:** Spatial observations (motion, vitals, skeleton, occupancy) sync to the ruOS brain every 60 seconds for agent reasoning.
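
A hedged sketch of what such a 60-second sync loop could look like, assuming `tokio`, `reqwest` (with the `json` feature), and `serde_json` as dependencies. The `/memory` path and the JSON field names are hypothetical; only the interval and default brain URL come from this README:

```rust
use std::time::Duration;

/// Illustrative 60-second brain sync loop. The `/memory` path and the JSON
/// fields are hypothetical; the interval and default brain URL match the
/// text above.
async fn brain_sync_sketch() -> Result<(), reqwest::Error> {
    let brain_url = std::env::var("RUVIEW_BRAIN_URL")
        .unwrap_or_else(|_| "http://127.0.0.1:9876".to_string());
    let client = reqwest::Client::new();
    let mut tick = tokio::time::interval(Duration::from_secs(60));
    loop {
        tick.tick().await;
        // Gather the latest motion/vitals summary (placeholder values;
        // kind names like "spatial-observation" appear in the user guide).
        let observation = serde_json::json!({
            "kind": "spatial-observation",
            "motion": 0.12,
            "breathing_rate_bpm": 14.5,
        });
        client
            .post(format!("{brain_url}/memory")) // hypothetical endpoint
            .json(&observation)
            .send()
            .await?;
    }
}
```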

See [PR #405](https://github.com/ruvnet/RuView/pull/405) for full details.

### What's New in v0.7.0

<details>
@@ -904,6 +945,8 @@ cargo add wifi-densepose-ruvector # RuVector v2.0.4 integration layer (ADR-017
| [`wifi-densepose-api`](https://crates.io/crates/wifi-densepose-api) | REST + WebSocket API layer | -- | [![crates.io](https://img.shields.io/crates/v/wifi-densepose-api.svg)](https://crates.io/crates/wifi-densepose-api) |
| [`wifi-densepose-config`](https://crates.io/crates/wifi-densepose-config) | Configuration management | -- | [![crates.io](https://img.shields.io/crates/v/wifi-densepose-config.svg)](https://crates.io/crates/wifi-densepose-config) |
| [`wifi-densepose-db`](https://crates.io/crates/wifi-densepose-db) | Database persistence (PostgreSQL, SQLite, Redis) | -- | [![crates.io](https://img.shields.io/crates/v/wifi-densepose-db.svg)](https://crates.io/crates/wifi-densepose-db) |
| `wifi-densepose-pointcloud` | Real-time dense point cloud from camera + WiFi CSI fusion (Three.js viewer, brain bridge). Workspace-only for now. | -- | — |
| `wifi-densepose-geo` | Geospatial context (Sentinel-2 tiles, SRTM elevation, OSM, weather, night-mode). Workspace-only for now. | -- | — |

All crates integrate with [RuVector v2.0.4](https://github.com/ruvnet/ruvector) — see [AI Backbone](#ai-backbone-ruvector) below.

65 changes: 65 additions & 0 deletions docs/adr/ADR-044-geospatial-satellite-integration.md
@@ -0,0 +1,65 @@
# ADR-044: Geospatial Satellite Integration

## Status
Accepted

## Context
RuView generates real-time 3D point clouds from camera + WiFi CSI, but these exist in a local coordinate frame with no geographic reference. Integrating free satellite imagery, terrain elevation, and map data provides environmental context that enables the ruOS brain to reason about the physical world beyond the room.

## Decision

### Data Sources (all free, no API keys)
| Source | Data | Resolution | Update | Format |
|--------|------|-----------|--------|--------|
| EOX Sentinel-2 Cloudless | Satellite tiles | 10m | Static mosaic | XYZ/JPEG |
| SRTM GL1 (NASA) | Elevation/DEM | 30m (1-arcsec) | Static | Binary HGT |
| Overpass API (OSM) | Buildings, roads | Vector | Real-time | JSON |
| ip-api.com | IP geolocation | ~1km | Per-request | JSON |
| Sentinel-2 STAC | Temporal satellite | 10m | Every 5 days | COG/STAC |
| Open Meteo | Weather | Point | Hourly | JSON |

### Architecture
Pure Rust implementation in `wifi-densepose-geo` crate. No GDAL/PROJ/GEOS — coordinate transforms implemented directly (~250 LOC). Tile caching on disk at `~/.local/share/ruview/geo-cache/`.
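
A sketch of the kind of transform such a layer implements: the standard WGS84 geodetic to ECEF to ENU chain. The formulas are textbook and the constants are the WGS84 ellipsoid, but this is not the crate's exact code:

```rust
/// WGS84 ellipsoid constants.
const WGS84_A: f64 = 6_378_137.0;            // semi-major axis (m)
const WGS84_E2: f64 = 6.694_379_990_14e-3;   // first eccentricity squared

/// Geodetic (lat, lon in degrees, height in m) to Earth-centered ECEF.
fn geodetic_to_ecef(lat_deg: f64, lon_deg: f64, h: f64) -> (f64, f64, f64) {
    let (lat, lon) = (lat_deg.to_radians(), lon_deg.to_radians());
    let n = WGS84_A / (1.0 - WGS84_E2 * lat.sin().powi(2)).sqrt();
    (
        (n + h) * lat.cos() * lon.cos(),
        (n + h) * lat.cos() * lon.sin(),
        (n * (1.0 - WGS84_E2) + h) * lat.sin(),
    )
}

/// East-North-Up offset of `point` relative to `origin`, both geodetic
/// (lat_deg, lon_deg, height_m).
fn enu_offset(origin: (f64, f64, f64), point: (f64, f64, f64)) -> (f64, f64, f64) {
    let (x0, y0, z0) = geodetic_to_ecef(origin.0, origin.1, origin.2);
    let (x, y, z) = geodetic_to_ecef(point.0, point.1, point.2);
    let (dx, dy, dz) = (x - x0, y - y0, z - z0);
    let (lat, lon) = (origin.0.to_radians(), origin.1.to_radians());
    let east = -lon.sin() * dx + lon.cos() * dy;
    let north = -lat.sin() * lon.cos() * dx - lat.sin() * lon.sin() * dy + lat.cos() * dz;
    let up = lat.cos() * lon.cos() * dx + lat.cos() * lon.sin() * dy + lat.sin() * dz;
    (east, north, up)
}
```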

### Coordinate System
- WGS84 for geographic coordinates
- ENU (East-North-Up) as the bridge between local sensor frame and world
- Local sensor frame: camera origin, +Z forward, +Y up

### Temporal Awareness
Nightly scheduled fetch of Sentinel-2 latest imagery + OSM diffs + weather.
Changes detected via image comparison and stored as brain memories for
contrastive learning.
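
A minimal sketch of the image-comparison step, assuming tiles are compared as equally sized raw RGB buffers; the per-pixel threshold is an illustrative assumption:

```rust
/// Per-pixel mean absolute difference between two equally sized RGB tiles.
/// Returning the fraction of "changed" pixels lets a caller decide whether
/// to store a `spatial-change` memory. Threshold values are illustrative.
fn changed_fraction(old: &[u8], new: &[u8], pixel_threshold: u8) -> f32 {
    assert_eq!(old.len(), new.len());
    let changed = old
        .chunks_exact(3)
        .zip(new.chunks_exact(3))
        .filter(|(a, b)| {
            // Average absolute difference across the R, G, B channels.
            let diff: u16 = a.iter().zip(b.iter())
                .map(|(x, y)| x.abs_diff(*y) as u16)
                .sum();
            (diff / 3) as u8 > pixel_threshold
        })
        .count();
    changed as f32 / (old.len() / 3).max(1) as f32
}
```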

### Brain Integration
Geospatial context stored as brain memories:
- `spatial-geo`: location, elevation, nearby landmarks
- `spatial-change`: detected changes in satellite/OSM data
- `spatial-weather`: current conditions + forecast
- `spatial-season`: vegetation index, snow cover, seasonal patterns
- `spatial-local`: hyperlocal web context from Common Crawl WET

### Extended Data Sources (via ruvector WET/Common Crawl)
| Source | Data | Use |
|--------|------|-----|
| Common Crawl WET | Web text near location | Local business info, reviews, events |
| Wikidata | Structured knowledge | Building names, POI descriptions |
| NASA FIRMS | Active fire (3-hour) | Safety alerts |
| USGS Earthquakes | Seismic events | Safety context |
| OpenAQ | Air quality (PM2.5) | Environmental health |
| Overture Maps | Building footprints (Meta/MS) | Higher quality than OSM |

The ruvector brain server has existing `web_ingest` + Common Crawl support.
WET files filtered by geographic URL patterns provide hyperlocal context.

## Consequences
### Positive
- Agent gains environmental awareness beyond the room
- Temporal data enables seasonal calibration of CSI sensing
- Change detection finds construction, vegetation, weather effects
- All data sources are genuinely free with no API keys

### Negative
- Initial data fetch requires internet (~2MB tiles + ~25MB DEM)
- Cached data becomes stale (mitigated by nightly refresh)
- IP geolocation has ~1km accuracy (mitigated by manual override)
@@ -1,4 +1,4 @@
# ADR-044: Provisioning Tool Enhancements
# ADR-050: Provisioning Tool Enhancements

**Status**: Proposed
**Date**: 2026-03-03
104 changes: 104 additions & 0 deletions docs/user-guide.md
@@ -536,6 +536,110 @@ Both UIs update in real-time via WebSocket and auto-detect the sensing server on

---

## Dense Point Cloud (Camera + WiFi CSI Fusion)

RuView can generate real-time 3D point clouds by fusing camera depth estimation with WiFi CSI spatial sensing, building a spatial model of the environment that updates continuously as new frames and CSI packets arrive.

### Setup

```bash
# Build the pointcloud binary
cd rust-port/wifi-densepose-rs
cargo build --release -p wifi-densepose-pointcloud

# Start the server (auto-detects camera + CSI). Loopback-only by default.
./target/release/ruview-pointcloud serve --bind 127.0.0.1:9880
```

Open `http://localhost:9880` for the interactive Three.js 3D viewer.

> **Security note.** The server exposes live camera, skeleton, vitals, and occupancy over HTTP. The `--bind` flag defaults to `127.0.0.1:9880` (loopback-only). Exposing on `0.0.0.0` or a LAN IP is opt-in — the server logs a warning when it does, but there is no auth/TLS layer. Put a reverse proxy in front if you need remote access.

> **Brain URL.** Observations are POSTed to `http://127.0.0.1:9876` by default. Override via the `RUVIEW_BRAIN_URL` environment variable or the `--brain <url>` flag on `serve` / `train`.

### Sensors

| Sensor | Auto-detected | Data |
|--------|--------------|------|
| Camera (`/dev/video0`) | Yes (Linux UVC) | RGB frames → MiDaS depth → 3D points |
| ESP32 CSI (UDP:3333) | Yes (if provisioned) | ADR-018 binary → occupancy + pose + vitals |
| MiDaS depth server (port 9885) | Optional | GPU-accelerated neural depth estimation |

### Commands

| Command | Description |
|---------|-------------|
| `ruview-pointcloud serve --bind 127.0.0.1:9880` | Start HTTP server + Three.js viewer (loopback-only by default) |
| `ruview-pointcloud demo` | Generate synthetic point cloud (no hardware needed) |
| `ruview-pointcloud capture --output room.ply` | Capture single frame to PLY file |
| `ruview-pointcloud cameras` | List available cameras |
| `ruview-pointcloud train --data-dir ./data [--brain URL]` | Depth calibration + occupancy training (writes under canonicalized `data-dir`; refuses `..` traversal) |
| `ruview-pointcloud csi-test --count 100` | Send test CSI frames (no ESP32 needed) |
| `ruview-pointcloud fingerprint <name> [--seconds 5]` | Record a named CSI room fingerprint for later matching |

### Pipeline Components

1. **ADR-018 Parser** — Decodes ESP32 CSI binary frames from UDP (magic `0xC5110001` raw CSI and `0xC5110006` feature state), extracts I/Q subcarrier amplitudes and phases. Lives in `parser.rs`; unit-tested against hand-rolled test vectors.
2. **Pose (stub)** — 17 COCO keypoint *layout* generated by `heuristic_pose_from_amplitude` from CSI amplitude energy. This is **not** the trained WiFlow model — it is a placeholder so the viewer has a skeleton to render. Wiring to real Candle/ONNX inference from the `wifi-densepose-nn` crate is a planned follow-up.
3. **Vital Signs** — Breathing rate from CSI phase analysis (peak counting on stable subcarrier)
4. **Motion Detection** — CSI amplitude variance over 20 frames, triggers adaptive capture
5. **RF Tomography** — Backprojection from per-node RSSI to 8×8×4 occupancy grid
6. **Camera Depth** — MiDaS monocular depth (GPU) with luminance+edge fallback
7. **Sensor Fusion** — Voxel-grid merging of camera depth + CSI occupancy (see the sketch after this list)
8. **Brain Bridge** — Stores spatial observations in the ruOS brain every 60 seconds
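
A simplified sketch of the voxel-grid merge in component 7, assuming 5 cm voxels and a keep-the-higher-confidence merge rule; the crate's actual weighting may differ:

```rust
use std::collections::HashMap;

/// Simplified voxel-grid fusion: camera depth points and CSI occupancy
/// points are quantized to a 5 cm grid and merged, keeping the higher
/// confidence point per voxel.
const VOXEL_SIZE_M: f32 = 0.05;

#[derive(Clone, Copy)]
struct Point {
    xyz: [f32; 3],
    confidence: f32,
}

fn fuse(camera: &[Point], csi_occupancy: &[Point]) -> Vec<Point> {
    let mut grid: HashMap<(i32, i32, i32), Point> = HashMap::new();
    for p in camera.iter().chain(csi_occupancy.iter()) {
        // Quantize to integer voxel coordinates.
        let key = (
            (p.xyz[0] / VOXEL_SIZE_M).floor() as i32,
            (p.xyz[1] / VOXEL_SIZE_M).floor() as i32,
            (p.xyz[2] / VOXEL_SIZE_M).floor() as i32,
        );
        grid.entry(key)
            .and_modify(|existing| {
                if p.confidence > existing.confidence {
                    *existing = *p;
                }
            })
            .or_insert(*p);
    }
    grid.into_values().collect()
}
```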

### API Endpoints

| Endpoint | Method | Returns |
|----------|--------|---------|
| `/health` | GET | `{"status": "ok"}` |
| `/api/status` | GET | Camera, CSI, pipeline state, vitals, motion |
| `/api/cloud` | GET | Point cloud (up to 1000 points) + pipeline data |
| `/api/splats` | GET | Gaussian splats for Three.js rendering |
| `/` | GET | Interactive Three.js 3D viewer |
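
For quick checks from another process, the endpoints above can be polled with any HTTP client. A minimal sketch using `reqwest`'s blocking API (an assumed dependency); responses are printed as raw sizes since the exact JSON shapes are not specified here:

```rust
/// Poll the local pointcloud server once and report response sizes.
/// Assumes `reqwest` with the `blocking` feature; endpoint paths are
/// taken from the table above.
fn main() -> Result<(), reqwest::Error> {
    let base = "http://127.0.0.1:9880";
    for path in ["/health", "/api/status", "/api/cloud", "/api/splats"] {
        let body = reqwest::blocking::get(format!("{base}{path}"))?.text()?;
        println!("{path}: {} bytes", body.len());
    }
    Ok(())
}
```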

### Training

The training pipeline calibrates depth estimation and occupancy detection:

```bash
ruview-pointcloud train --data-dir ~/.local/share/ruview/training --brain http://127.0.0.1:9876
```

This captures frames, runs depth calibration (grid search over scale/offset/gamma), trains occupancy thresholds, exports DPO preference pairs, and submits results to the ruOS brain.
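
A sketch of the shape of that grid search, assuming a `scale * raw^gamma + offset` depth model and a mean-absolute-error objective; the ranges, step sizes, and objective are illustrative assumptions:

```rust
/// Illustrative grid search over (scale, offset, gamma) for depth
/// calibration, where predicted = scale * raw.powf(gamma) + offset.
/// Ranges and the error metric are assumptions, not the crate's values.
fn calibrate_depth(raw: &[f32], reference: &[f32]) -> (f32, f32, f32) {
    let mut best = (1.0f32, 0.0f32, 1.0f32);
    let mut best_err = f32::INFINITY;
    for si in 0..20 {
        for oi in 0..20 {
            for gi in 0..10 {
                let (scale, offset, gamma) =
                    (0.5 + si as f32 * 0.1, -1.0 + oi as f32 * 0.1, 0.5 + gi as f32 * 0.1);
                // Mean absolute error against the reference depth map.
                let err: f32 = raw
                    .iter()
                    .zip(reference)
                    .map(|(r, t)| (scale * r.powf(gamma) + offset - t).abs())
                    .sum::<f32>()
                    / raw.len().max(1) as f32;
                if err < best_err {
                    best_err = err;
                    best = (scale, offset, gamma);
                }
            }
        }
    }
    best
}
```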

### Output Formats

- **PLY** — Standard 3D point cloud (ASCII, with RGB color; a minimal writer sketch follows below)
- **Gaussian Splats** — JSON format for Three.js rendering
- **Brain Memories** — Spatial observations stored as `spatial-observation`, `spatial-motion`, `spatial-vitals`
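
A minimal ASCII PLY writer for xyz + RGB vertices, following the standard PLY header layout (a sketch, not the crate's export code):

```rust
use std::io::{self, Write};

/// Write points as ASCII PLY with per-vertex RGB, the standard layout used
/// for `room.ply`-style exports. Field order is x y z red green blue.
fn write_ply<W: Write>(mut out: W, points: &[([f32; 3], [u8; 3])]) -> io::Result<()> {
    writeln!(out, "ply")?;
    writeln!(out, "format ascii 1.0")?;
    writeln!(out, "element vertex {}", points.len())?;
    for axis in ["x", "y", "z"] {
        writeln!(out, "property float {axis}")?;
    }
    for channel in ["red", "green", "blue"] {
        writeln!(out, "property uchar {channel}")?;
    }
    writeln!(out, "end_header")?;
    for ([x, y, z], [r, g, b]) in points {
        writeln!(out, "{x} {y} {z} {r} {g} {b}")?;
    }
    Ok(())
}
```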

### Deep Room Scan

Capture a high-quality 3D model of the room:

```bash
# Stop the live server first (frees the camera)
# Then capture 20 frames and process with MiDaS
ruview-pointcloud capture --frames 20 --output room_model.ply
```

Result: 40,000+ voxels at 5cm resolution, 12,000+ Gaussian splats.

### ESP32 Provisioning for CSI

To send CSI data to the pointcloud server:

```bash
python3 firmware/esp32-csi-node/provision.py \
--port /dev/ttyACM0 \
--ssid "YourWiFi" --password "YourPassword" \
--target-ip 192.168.1.123 --target-port 3333 \
--node-id 1
```

---

## Vital Sign Detection

The system extracts breathing rate and heart rate from CSI signal fluctuations using FFT peak detection.
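
As a rough illustration of that idea, a naive DFT peak search over a physiological frequency band; the band limits, sample rate, and lack of windowing are simplifying assumptions, not the crate's implementation:

```rust
use std::f32::consts::PI;

/// Naive DFT peak search: find the dominant frequency of a CSI amplitude
/// trace within a physiological band and convert it to breaths (or beats)
/// per minute. Band limits and sample rate below are illustrative.
fn dominant_rate_bpm(samples: &[f32], sample_hz: f32, band_hz: (f32, f32)) -> Option<f32> {
    let n = samples.len();
    if n == 0 {
        return None;
    }
    let mean = samples.iter().sum::<f32>() / n as f32;
    let mut best: Option<(f32, f32)> = None; // (magnitude, frequency)
    // Evaluate only the DFT bins that fall inside the band of interest.
    for k in 1..n / 2 {
        let freq = k as f32 * sample_hz / n as f32;
        if freq < band_hz.0 || freq > band_hz.1 {
            continue;
        }
        let (mut re, mut im) = (0.0f32, 0.0f32);
        for (i, s) in samples.iter().enumerate() {
            let angle = -2.0 * PI * k as f32 * i as f32 / n as f32;
            re += (s - mean) * angle.cos();
            im += (s - mean) * angle.sin();
        }
        let mag = (re * re + im * im).sqrt();
        if best.map_or(true, |(m, _)| mag > m) {
            best = Some((mag, freq));
        }
    }
    best.map(|(_, freq)| freq * 60.0) // Hz to events per minute
}

// Example: 0.1 to 0.5 Hz is roughly 6 to 30 breaths per minute at a 10 Hz CSI rate.
// let bpm = dominant_rate_bpm(&amplitudes, 10.0, (0.1, 0.5));
```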
73 changes: 66 additions & 7 deletions rust-port/wifi-densepose-rs/Cargo.lock
