
Implement RuView edge AI perception system#1

Draft
Copilot wants to merge 2 commits into main from copilot/add-edge-ai-perception-system

Conversation


Copilot AI commented Mar 9, 2026

RuView is an edge AI perception system that reconstructs human presence, body pose, breathing rate, and heart rate entirely from WiFi Channel State Information (CSI) — no cameras, no cloud, no labeled data. It runs on ~$1 ESP32 nodes.

Python package (ruview/)

  • csi/ — CSIFrame/CSIBuffer with ESP32 raw-byte and UDP packet parsing; CSIProcessor for DC removal, SFO phase-ramp correction (linear regression), outlier rejection, EMA smoothing
  • signal/ — Zero-phase Butterworth bandpass/lowpass filters; Welch PSD dominant-frequency estimator; PCA compression, variance/energy feature extraction
  • presence/ — Variance-ratio occupancy detector; adaptive empty-room baseline via slow EMA; calibrate() API for forced re-baseline
  • vitals/ — BreathingMonitor (0.1–0.5 Hz band) and HeartRateMonitor (0.8–2.5 Hz band); both use bandpass → Welch PSD → BPM
  • pose/ — WiFi DensePose over COCO 17-keypoint skeleton; incremental linear regression from CSI feature vectors; anatomical upright-standing prior before any training data
  • edge/ — Thread-safe EdgeNode (rolling buffer + stale-timeout status); UDPReceiver (background thread) and SerialReceiver (sync-byte + XOR-checksum framing)
  • engine.py — RuViewEngine orchestrates all subsystems; merges frames from multiple nodes sorted by timestamp
  • cli.py — ruview demo (synthetic data, no hardware) and ruview run --port / --serial
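To make the presence/ bullet above concrete, here is a minimal sketch of a variance-ratio occupancy detector with a slow-EMA empty-room baseline and a calibrate() hook. The class name, parameters, and defaults are illustrative, not the package's actual API:

```python
import statistics

class VarianceRatioPresence:
    """Illustrative variance-ratio occupancy detector with a slow-EMA
    empty-room baseline (a sketch, not RuView's real implementation)."""

    def __init__(self, threshold=3.0, baseline_alpha=0.01):
        self.threshold = threshold            # variance ratio above baseline => occupied
        self.baseline_alpha = baseline_alpha  # slow EMA tracks empty-room drift
        self.baseline_var = None

    def calibrate(self, window):
        # Forced re-baseline from a window captured in a known-empty room.
        self.baseline_var = statistics.pvariance(window)

    def update(self, window):
        var = statistics.pvariance(window)
        if self.baseline_var is None:
            self.baseline_var = var           # first window seeds the baseline
            return False
        present = var > self.threshold * self.baseline_var
        if not present:
            # Only adapt the baseline while the room looks empty,
            # so occupancy does not get absorbed into the reference.
            a = self.baseline_alpha
            self.baseline_var = (1 - a) * self.baseline_var + a * var
        return present
```

The key design point is that the baseline EMA is frozen while presence is detected; otherwise a person standing still would slowly become part of the "empty room".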

ESP32 firmware (firmware/csi_node/)

Arduino sketch using esp_wifi_set_csi_rx_cb() to capture per-packet CSI. Streams to host over UDP (8-byte node ID + double timestamp + RSSI + channel + int8 subcarrier pairs) and optionally serial (sync byte 0xAA + uint16 length + payload + XOR checksum). PlatformIO config included.
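Based on the packet layout stated above (8-byte node ID, float64 timestamp, RSSI, channel, int8 I/Q subcarrier pairs), a host-side codec could look like the sketch below. The exact field order, little-endian byte order, and struct format are inferred from the description, not taken from the firmware source, so treat them as assumptions:

```python
import struct

# Assumed layout: 8-byte node ID, float64 timestamp, int8 RSSI, uint8 channel,
# followed by interleaved int8 I/Q pairs. "<" = little-endian (ESP32 native).
HEADER = struct.Struct("<8sdbB")

def pack_csi_packet(node_id: bytes, timestamp: float,
                    rssi: int, channel: int, iq: bytes) -> bytes:
    # iq holds interleaved int8 I/Q subcarrier samples as raw bytes.
    return HEADER.pack(node_id.ljust(8, b"\x00"), timestamp, rssi, channel) + iq

def parse_csi_packet(packet: bytes):
    node_id, timestamp, rssi, channel = HEADER.unpack_from(packet)
    iq = packet[HEADER.size:]
    vals = struct.unpack(f"<{len(iq)}b", iq)       # decode as signed int8
    pairs = list(zip(vals[0::2], vals[1::2]))      # (I, Q) per subcarrier
    return node_id.rstrip(b"\x00"), timestamp, rssi, channel, pairs
```

A round trip through pack_csi_packet/parse_csi_packet recovers the node ID, timestamp, and signed subcarrier pairs, which is a convenient way to test a receiver without hardware.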

Quick start

```python
from ruview import RuViewEngine
from ruview.edge import EdgeNode, UDPReceiver

node = EdgeNode(node_id="esp32-01")
engine = RuViewEngine(nodes=[node])
engine.start()

with UDPReceiver(port=5005, nodes={node.node_id: node}):
    engine.calibrate()          # empty-room baseline
    obs = engine.observe()
    print(obs.presence.present, obs.breathing.rate_bpm, obs.heart_rate.rate_bpm)
```
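The bandpass → PSD → BPM idea behind the vitals monitors can be illustrated with a toy, numpy-only stand-in: a raw periodogram instead of Welch's method, and a hypothetical dominant_bpm() helper that is not part of the package. Synthesize a 0.25 Hz "breathing" oscillation, find the dominant frequency inside the 0.1–0.5 Hz band, and convert it to breaths per minute:

```python
import numpy as np

def dominant_bpm(signal, fs, f_lo=0.1, f_hi=0.5):
    """Toy dominant-frequency estimator (illustrative only):
    periodogram peak inside [f_lo, f_hi], converted to cycles/min."""
    signal = signal - np.mean(signal)               # DC removal
    spectrum = np.abs(np.fft.rfft(signal)) ** 2     # raw periodogram
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)        # crude "bandpass" mask
    peak = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak

fs = 10.0                                  # assumed 10 Hz CSI frame rate
t = np.arange(0, 60, 1 / fs)               # 60 s analysis window
breathing = np.sin(2 * np.pi * 0.25 * t)   # 0.25 Hz -> 15 breaths/min
print(dominant_bpm(breathing, fs))         # -> 15.0
```

The real monitors add a zero-phase Butterworth bandpass and Welch averaging for noise robustness, but the frequency-to-BPM conversion is the same.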

No-hardware demo: `ruview demo`

Original prompt

Perceive the world through signals. No cameras. No wearables. No Internet. Just physics.

π RuView is an edge AI perception system that learns directly from the environment around it.
Instead of relying on cameras or cloud models, it observes whatever signals exist in a space (WiFi, radio waves across the spectrum, motion patterns, vibration, sound, or other sensory inputs) and builds an understanding of what is happening locally.

Built on top of RuVector, the project became widely known for its implementation of WiFi DensePose — a sensing technique first explored in academic research such as Carnegie Mellon University's DensePose From WiFi work. That research demonstrated that WiFi signals can be used to reconstruct human pose.

RuView extends that concept into a practical edge system. By analyzing Channel State Information (CSI) disturbances caused by human movement, RuView reconstructs body position, breathing rate, heart rate, and presence in real time using physics-based signal processing and machine learning.

Unlike research systems that rely on synchronized cameras for training, RuView is designed to operate entirely from radio signals and self-learned embeddings at the edge.

The system runs entirely on inexpensive hardware such as an ESP32 sensor mesh (as low as ~$1 per node). Small programmable edge modules analyze signals locally and learn the RF signature of a room over time, allowing the system to separate the environment from the activity happening inside it.

Because RuView learns in proximity to the signals it observes, it improves as it operates. Each deployment develops a local model of its surroundings and continuously adapts without requiring cameras, labeled data, or cloud infrastructure.

In practice this means ordinary environments gain a new kind of spatial awareness. Rooms, buildings, and devices begin to sense presence, movement, and vital activity using the signals that already fill the space.

Built for low-power edge applications
Edge modules are small programs that run directly on the ESP32 sensor — no internet needed, no cloud fees, instant response.



Co-authored-by: attentiondotnet <566807+attentiondotnet@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Add edge AI perception system for local signal analysis" to "Implement RuView edge AI perception system" on Mar 9, 2026