# Compare commits

`162048720c...main` (11 commits)

| Author | SHA1 | Date |
|---|---|---|
|  | 50b4cb545b |  |
|  | dfbd2a2196 |  |
|  | c1f580ba16 |  |
|  | 2456194332 |  |
|  | 5672c0c22e |  |
|  | b36b1579c7 |  |
|  | 4b72b3293e |  |
|  | 58c974b535 |  |
|  | 3ad39cfaeb |  |
|  | 924d28aab0 |  |
|  | 9e3583d5f8 |  |
## CHANGELOG.md (new file, 60 lines)

```markdown
# Changelog

All notable changes to this project will be documented in this file.

## [v0.1.4] - 2026-02-06

### Added

- Device Intelligence dashboard at `/dashboard/` (htmx + Pico CSS + D3.js)
- Vendor treemap visualization (devices grouped by type and vendor)
- SSID social graph (force-directed graph linking devices by shared probed SSIDs)
- Fingerprint clusters (packed circles grouping devices by behavior)
- Intelligence API endpoints:
  - `GET /api/v1/intelligence/vendor-treemap`
  - `GET /api/v1/intelligence/ssid-graph`
  - `GET /api/v1/intelligence/fingerprint-clusters`
- Vendored static assets: Pico CSS, htmx, D3.js v7 (`make vendor`)
- Jinja2 base template with dark theme
- Dashboard and API tests (13 new tests)
- Pagination totals, request logging, data retention CLI

## [v0.1.3] - 2026-02-05

### Added

- Sensor config endpoints (GET/PUT `/api/v1/sensors/<id>/config`)
- OTA trigger endpoint (`POST /api/v1/sensors/<id>/ota`)
- Calibration trigger endpoint (`POST /api/v1/sensors/<id>/calibrate`)
- Heartbeat timeout detection (marks sensors offline)
- Sensor metrics history endpoint
- OpenAPI 3.0 spec with Swagger UI at `/api/docs`

## [v0.1.2] - 2026-02-05

### Added

- IEEE OUI database download (`make oui`)
- MAC vendor lookup utility
- BLE company_id to manufacturer mapping (30+ vendors)
- Device profile enrichment in API responses
- Export endpoints: devices.csv, devices.json, alerts.csv, probes.csv
- Auto-populate vendor field on device creation

## [v0.1.1] - 2026-02-05

### Added

- Makefile start/stop/restart/status commands
- Health endpoint with uptime tracking (`/api/v1/health`)
- CLI module (`esp32-web` command)
- Database migrations via Flask-Migrate
- Listen on all interfaces (0.0.0.0:5500)
- `make help` target

## [v0.1.0] - 2026-02-05

### Added

- Flask app factory with blueprint architecture
- SQLAlchemy 2.x models: Sensor, Device, Sighting, Alert, Event, Probe
- REST API endpoints for all models
- UDP collector with data stream parsers
- pytest fixtures and initial tests
- Containerfile for podman
- Makefile for common tasks
```
```diff
@@ -9,6 +9,8 @@ RUN pip install --no-cache-dir .
 # Copy source
 COPY src/ src/
 COPY migrations/ migrations/
+COPY static/ static/
+COPY templates/ templates/

 # Expose ports (TCP for HTTP, UDP for collector)
 EXPOSE 5500/tcp
```
## Makefile (16 changed lines)

```diff
@@ -1,4 +1,4 @@
-.PHONY: help build run dev stop logs test migrate clean install start restart status
+.PHONY: help build run dev stop logs test migrate clean install start restart status oui cleanup vendor

 APP_NAME := esp32-web
 PORT := 5500
@@ -25,6 +25,8 @@ help:
 	@echo "Development:"
 	@echo "  make install    Install with dev dependencies"
 	@echo "  make test       Run tests"
+	@echo "  make oui        Download OUI database"
+	@echo "  make cleanup    Delete expired data"
 	@echo "  make clean      Remove cache files"
 	@echo ""
 	@echo "Container:"
@@ -81,6 +83,12 @@ dev:
 test:
 	pytest -v
+
+oui:
+	flask --app src/esp32_web download-oui
+
+cleanup:
+	flask --app src/esp32_web cleanup-data

 migrate:
 	flask --app src/esp32_web db upgrade

@@ -107,6 +115,12 @@ container-stop:
 container-logs:
 	podman logs -f $(APP_NAME)
+
+vendor:
+	mkdir -p static/css/vendor static/js/vendor static/js/viz
+	curl -sLo static/css/vendor/pico.min.css https://cdn.jsdelivr.net/npm/@picocss/pico@2/css/pico.min.css
+	curl -sLo static/js/vendor/htmx.min.js https://unpkg.com/htmx.org@2.0.4/dist/htmx.min.js
+	curl -sLo static/js/vendor/d3.min.js https://cdn.jsdelivr.net/npm/d3@7/dist/d3.min.js

 clean:
 	rm -rf __pycache__ .pytest_cache .ruff_cache
 	find . -type d -name __pycache__ -exec rm -rf {} +
```
## ROADMAP.md (51 changed lines)

```diff
@@ -15,40 +15,63 @@
 - [x] Health endpoint with uptime
 - [x] Database migrations (Flask-Migrate)
 - [x] Listen on all interfaces
+- [x] make help target

-## v0.2.0 - OSINT Features
+## v0.1.2 - OSINT Features [DONE]

-- [ ] MAC vendor lookup (IEEE OUI database)
-- [ ] BLE company_id to manufacturer mapping
-- [ ] Device profile enrichment
-- [ ] Export endpoints (CSV, JSON)
+- [x] MAC vendor lookup (IEEE OUI database)
+- [x] BLE company_id to manufacturer mapping
+- [x] Device profile enrichment
+- [x] Export endpoints (CSV, JSON)

-## v0.3.0 - Fleet Management
+## v0.1.3 - Fleet Management [DONE]

-- [ ] Sensor config endpoint (GET/PUT)
-- [ ] OTA trigger endpoint
-- [ ] Calibration trigger endpoint
-- [ ] Sensor history/metrics
+- [x] Sensor config endpoint (GET/PUT)
+- [x] OTA trigger endpoint
+- [x] Calibration trigger endpoint
+- [ ] Sensor history/metrics (moved to v0.1.5)

-## v0.4.0 - Zones & Presence
+## v0.1.4 - Device Intelligence Dashboard [DONE]
+
+- [x] Base dashboard layout (htmx + Pico CSS + D3.js dark theme)
+- [x] Vendor treemap (D3 treemap by device type and vendor)
+- [x] SSID social graph (D3 force-directed, shared probed SSIDs as edges)
+- [x] Device fingerprint clusters (D3 packed circles by behavior)
+- [x] Intelligence API endpoints (3 endpoints)
+- [x] Vendored static assets (`make vendor`)
+- [x] Pagination totals, request logging, data retention
+
+## v0.1.5 - Zones & Presence

 - [ ] Zone management (assign sensors to areas)
 - [ ] Device zone tracking
 - [ ] Dwell time analysis
 - [ ] Presence history

-## v1.0.0 - Production Ready
+## v0.1.6 - Production Ready

 - [ ] Authentication (API keys or JWT)
 - [ ] Rate limiting
 - [ ] PostgreSQL support
-- [ ] Systemd service file
+- [ ] Podman container deployment (quadlet/systemd unit)
 - [ ] Production deployment guide
+
+## v0.2.0 - Visualization Dashboard
+
+- [x] Base dashboard layout (htmx + Pico CSS + D3.js) — done in v0.1.4
+- [x] Vendor treemap — done in v0.1.4
+- [x] SSID social graph — done in v0.1.4
+- [x] Device fingerprint clusters — done in v0.1.4
+- [ ] Presence timeline (device enter/leave Gantt chart)
+- [ ] Deauth attack timeline (alert overlay with source/target)

 ## Future

+- RSSI heatmap / triangulation
+- CSI radar display
+- Temporal knowledge graph
+- Entropy dashboard (ambient awareness metric)
 - WebSocket for real-time updates
-- Web dashboard (htmx + Pico CSS)
 - Home Assistant integration
 - Grafana dashboards
 - Webhook callbacks for alerts
```
## TASKS.md (61 changed lines)

```diff
@@ -1,26 +1,56 @@
 # ESP32-Web Tasks

-**Last Updated:** 2026-02-05
+**Last Updated:** 2026-02-06

-## Current Sprint: v0.2.0 — OSINT Features
+## Current Sprint: v0.1.5 — Zones & Presence

 ### P1 - High
-- [ ] Download and parse IEEE OUI database
-- [ ] MAC vendor lookup utility
-- [ ] BLE company_id mapping
-- [ ] `GET /api/v1/devices/<mac>/profile` enriched endpoint
+- [ ] Zone model (name, description, location)
+- [ ] `POST /api/v1/zones` — create zone
+- [ ] `GET /api/v1/zones` — list zones
+- [ ] `PUT /api/v1/zones/<id>` — update zone
+- [ ] Assign sensors to zones

 ### P2 - Normal
-- [ ] Export endpoints (`/api/v1/export/devices.csv`)
-- [ ] Add vendor field population on device creation
-- [ ] Sensor heartbeat timeout detection
+- [ ] Device zone tracking (which zone a device is in)
+- [ ] Dwell time analysis
+- [ ] Presence history endpoint

-### P3 - Low
-- [ ] Add pagination to all list endpoints
-- [ ] Add OpenAPI/Swagger spec
-- [ ] Add request logging middleware
+### P2 - Dashboard (v0.2.0)
+- [ ] Presence timeline (Gantt chart, low effort)
+- [ ] Deauth attack timeline (alert overlay, low effort)

-## Completed: v0.1.1 - Server Management
+## Completed: v0.1.4 — Device Intelligence Dashboard
+
+- [x] Base dashboard layout (htmx + Pico CSS + D3.js dark theme)
+- [x] Vendor treemap visualization (`/api/v1/intelligence/vendor-treemap`)
+- [x] SSID social graph visualization (`/api/v1/intelligence/ssid-graph`)
+- [x] Fingerprint clusters visualization (`/api/v1/intelligence/fingerprint-clusters`)
+- [x] Jinja2 base template with tab navigation
+- [x] Vendored static assets: Pico CSS, htmx, D3.js v7 (`make vendor`)
+- [x] Dashboard + intelligence API tests (13 new tests, 59 total)
+- [x] Pagination totals, request logging, data retention CLI
+
+## Completed: v0.1.3 — Fleet Management
+
+- [x] `GET /api/v1/sensors/<id>/config` — read sensor config
+- [x] `PUT /api/v1/sensors/<id>/config` — update sensor config
+- [x] `POST /api/v1/sensors/<id>/ota` — trigger OTA update
+- [x] `POST /api/v1/sensors/<id>/calibrate` — trigger calibration
+- [x] Sensor heartbeat timeout detection
+- [x] Sensor metrics history endpoint
+- [x] OpenAPI 3.0 spec with Swagger UI
+
+## Completed: v0.1.2 — OSINT Features
+
+- [x] IEEE OUI database download (`make oui`)
+- [x] MAC vendor lookup utility
+- [x] BLE company_id mapping (30+ vendors)
+- [x] Device profile enrichment in API
+- [x] Export endpoints (devices.csv, devices.json, alerts.csv, probes.csv)
+- [x] Auto-populate vendor on device creation
+
+## Completed: v0.1.1 — Server Management

 - [x] Makefile start/stop/restart/status commands
 - [x] Health endpoint with uptime tracking
@@ -28,7 +58,7 @@
 - [x] Initial database migration
 - [x] Listen on all interfaces (0.0.0.0:5500)

-## Completed: v0.1.0 - Project Scaffold
+## Completed: v0.1.0 — Project Scaffold

 - [x] Flask app factory pattern
 - [x] SQLAlchemy 2.x models
@@ -44,3 +74,4 @@
 - API listens on TCP 5500
 - Commands sent to sensors on UDP 5501
 - SQLite for dev, PostgreSQL for prod
+- Dashboard at `/dashboard/` with htmx tab switching
```
## TODO.md (65 changed lines)

```diff
@@ -2,38 +2,33 @@

 ## API

-- [ ] Pagination for all list endpoints
+- [x] Pagination for all list endpoints (with total count)
+- [x] Request logging middleware
+- [x] Data retention policy (auto-cleanup old records)
 - [ ] Filter by date range
 - [ ] Sort options
-- [ ] OpenAPI/Swagger spec generation
 - [ ] Rate limiting (flask-limiter)
 - [ ] API authentication (JWT or API keys)

 ## OSINT

-- [ ] IEEE OUI database download script
-- [ ] MAC vendor lookup on device creation
-- [ ] BLE company ID database
 - [ ] Device fingerprinting by advertisement patterns
 - [ ] SSID categorization (home, corporate, mobile hotspot)
+- [ ] MAC randomization detection (correlate probe bursts, RSSI, timing)
+- [ ] Device reputation scoring (randomized MAC, probe hygiene, visit frequency)
+- [ ] Organizational mapping (group devices by vendor + behavior)

 ## Collector

-- [ ] Heartbeat timeout (mark sensor offline)
 - [ ] CSI data storage (optional, high volume)
-- [ ] Data retention policy (auto-cleanup old records)
-- [ ] Metrics collection (packets/sec, errors)

 ## Fleet Management

-- [ ] Sensor config read/write
-- [ ] OTA orchestration
-- [ ] Calibration management
-- [ ] Bulk commands
+- [ ] Bulk commands (multi-sensor OTA/config)

 ## Deployment

-- [ ] Systemd service file
+- [ ] Podman quadlet (systemd integration)
 - [ ] PostgreSQL configuration
 - [ ] Nginx reverse proxy config
 - [ ] TLS setup guide
@@ -46,6 +41,50 @@
 - [ ] Integration tests with mock sensors
 - [ ] Load testing

+## Visualizations
+
+### Spatial / RF (D3.js)
+- [ ] RSSI heatmap — triangulate device positions from multi-sensor readings, animate over time
+- [ ] Sensor coverage Voronoi — show reach/overlap/blind spots
+- [ ] Channel utilization spectrogram — waterfall display per sensor
+
+### Device Intelligence
+- [x] Device fingerprint clusters — group by behavior (probes, BLE company, cadence)
+- [x] SSID social graph — devices as nodes, shared probed SSIDs as edges (reveals co-location history)
+- [ ] Probe request worldmap — map probed SSIDs to geolocations via WiGLE
+- [x] Vendor treemap — OUI + BLE company breakdown, anomaly spotting
+
+### Temporal
+- [ ] Presence timeline / Gantt — per-device strips showing enter/leave range (routines, anomalies)
+- [ ] First-seen drift — highlight novel devices vs. known regulars
+- [ ] Dwell time distributions — histogram, bimodal = passers-by vs. occupants
+
+### Purple Team
+- [ ] Deauth attack timeline — overlay alerts with source/target, correlate with device disappearances
+- [ ] Evil twin detection — flag when probed SSID appears as local AP
+- [ ] Flood intensity gauge — real-time deauth rate + historical sparklines
+- [ ] Attack surface dashboard — broadcast probes (evil twin targets), static MACs (trackable), deauth-vulnerable
+- [ ] Kill chain tracker — map events to MITRE ATT&CK for WiFi
+
+### Experimental
+- [ ] CSI radar — amplitude/phase matrix as real-time presence radar (if CSI enabled)
+- [ ] Mesh consensus view — sensor agreement graph, fork/resolve visualization
+- [ ] Temporal knowledge graph — devices/SSIDs/sensors/alerts with timestamped edges
+- [ ] Adversarial simulation replay — VCR-style event playback with what-if scenarios
+- [ ] Entropy dashboard — single ambient metric (new devices/hr, probe diversity, alert rate)
+
+### Priority picks (high value, low-medium effort)
+1. ~~Presence timeline (low effort, high value)~~ — next up
+2. ~~Deauth attack timeline (low effort, high value)~~ — next up
+3. ~~SSID social graph (medium effort, high value)~~ — done v0.1.4
+4. ~~Device fingerprint clusters (medium effort, high value)~~ — done v0.1.4
+5. RSSI heatmap / triangulation (high effort, very high value)
+
+### Tech notes
+- D3.js v7 + htmx + Pico CSS served locally from `static/vendor/`
+- Dashboard at `/dashboard/` with htmx tab switching
+- Intelligence API at `/api/v1/intelligence/*`
+
 ## Ideas

 - WebSocket for live updates
```
````diff
@@ -27,6 +27,12 @@ pytest -v                # Verbose output
 pytest -k test_sensors   # Run specific tests
 ```

+## Static Assets
+
+```bash
+make vendor   # Download Pico CSS, htmx, D3.js to static/vendor/
+```
+
 ## Container

 ```bash
@@ -66,6 +72,14 @@ curl localhost:5500/api/v1/probes/ssids

 # Stats
 curl localhost:5500/api/v1/stats
+
+# Intelligence (Device Intelligence Dashboard)
+curl localhost:5500/api/v1/intelligence/vendor-treemap
+curl "localhost:5500/api/v1/intelligence/ssid-graph?hours=24&min_shared=1&limit=200"
+curl "localhost:5500/api/v1/intelligence/fingerprint-clusters?hours=24"
+
+# Dashboard
+open http://localhost:5500/dashboard/
 ```

 ## Query Parameters
@@ -78,6 +92,8 @@ curl localhost:5500/api/v1/stats
 | offset | devices, alerts, events, probes | Skip N results |
 | ssid | probes | Filter by SSID |
 | sensor_id | alerts, events | Filter by sensor |
+| hours | intelligence/ssid-graph, intelligence/fingerprint-clusters | Time window (default: 24) |
+| min_shared | intelligence/ssid-graph | Min shared SSIDs for link (default: 1) |

 ## Files
````
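For scripting against these endpoints, the documented query parameters compose into URLs in the obvious way. A minimal sketch (the `intelligence_url` helper and the `localhost:5500` base are illustrative assumptions, not part of the project):

```python
from urllib.parse import urlencode

# Assumed dev base URL, matching the curl examples above.
BASE = "http://localhost:5500/api/v1"

def intelligence_url(endpoint: str, **params) -> str:
    """Build a query URL for an intelligence endpoint.

    Keyword arguments become query parameters, e.g. hours, min_shared, limit.
    """
    qs = urlencode(params)
    return f"{BASE}/intelligence/{endpoint}" + (f"?{qs}" if qs else "")

# Mirrors: curl "localhost:5500/api/v1/intelligence/ssid-graph?hours=24&min_shared=1&limit=200"
url = intelligence_url("ssid-graph", hours=24, min_shared=1, limit=200)
```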
```diff
@@ -1,6 +1,6 @@
 [project]
 name = "esp32-web"
-version = "0.1.1"
+version = "0.1.3"
 description = "REST API backend for ESP32 sensor fleet"
 requires-python = ">=3.11"
 dependencies = [
@@ -10,6 +10,7 @@ dependencies = [
     "flask-cors>=4.0",
     "gunicorn>=21.0",
     "python-dotenv>=1.0",
+    "pyyaml>=6.0",
 ]

 [project.optional-dependencies]
```
```diff
@@ -1,10 +1,16 @@
 """ESP32-Web Flask Application."""
+import click
+import logging
+import time
 from datetime import datetime, UTC
-from flask import Flask
+from flask import Flask, Response, request, send_from_directory
+from pathlib import Path

 from .config import Config
 from .extensions import db, migrate

+logger = logging.getLogger(__name__)
+
 # Track app start time
 _start_time = None
@@ -14,7 +20,12 @@ def create_app(config_class=Config):
     global _start_time
     _start_time = datetime.now(UTC)

-    app = Flask(__name__)
+    project_root = Path(__file__).resolve().parent.parent.parent
+    app = Flask(
+        __name__,
+        static_folder=str(project_root / 'static'),
+        template_folder=str(project_root / 'templates'),
+    )
     app.config.from_object(config_class)

     # Initialize extensions
@@ -25,6 +36,24 @@ def create_app(config_class=Config):
     from .api import bp as api_bp
     app.register_blueprint(api_bp, url_prefix='/api/v1')

+    from .dashboard import bp as dashboard_bp
+    app.register_blueprint(dashboard_bp)
+
+    # Request logging
+    @app.before_request
+    def _start_timer():
+        request._start_time = time.monotonic()
+
+    @app.after_request
+    def _log_request(response):
+        if request.path == '/health':
+            return response
+        duration_ms = (time.monotonic() - getattr(request, '_start_time', time.monotonic())) * 1000
+        logger.info('%s %s %s %.0fms %s',
+                    request.method, request.path, response.status_code,
+                    duration_ms, request.remote_addr)
+        return response
+
     # Health check with uptime
     @app.route('/health')
     def health():
@@ -40,6 +69,45 @@ def create_app(config_class=Config):
         uptime_str = f'{minutes}m{seconds}s'
         return {'status': 'ok', 'uptime': uptime_str, 'uptime_seconds': uptime_seconds}

+    # OpenAPI spec endpoints
+    @app.route('/openapi.yaml')
+    def openapi_yaml():
+        """Serve OpenAPI spec as YAML."""
+        spec_path = Path(__file__).parent / 'openapi.yaml'
+        return Response(spec_path.read_text(), mimetype='text/yaml')
+
+    @app.route('/openapi.json')
+    def openapi_json():
+        """Serve OpenAPI spec as JSON."""
+        import yaml
+        spec_path = Path(__file__).parent / 'openapi.yaml'
+        spec = yaml.safe_load(spec_path.read_text())
+        return spec
+
+    @app.route('/docs')
+    def swagger_ui():
+        """Serve Swagger UI."""
+        return '''<!DOCTYPE html>
+<html>
+<head>
+  <title>ESP32-Web API</title>
+  <link rel="stylesheet" href="https://unpkg.com/swagger-ui-dist@5/swagger-ui.css">
+</head>
+<body>
+  <div id="swagger-ui"></div>
+  <script src="https://unpkg.com/swagger-ui-dist@5/swagger-ui-bundle.js"></script>
+  <script>
+    SwaggerUIBundle({
+      url: "/openapi.json",
+      dom_id: '#swagger-ui',
+      presets: [SwaggerUIBundle.presets.apis],
+      layout: "BaseLayout"
+    });
+  </script>
+</body>
+</html>'''
+
     # Start UDP collector in non-testing mode
     if not app.config.get('TESTING'):
         from .collector import collector
@@ -47,4 +115,36 @@ def create_app(config_class=Config):
         with app.app_context():
             collector.start()

+    # Register CLI commands
+    @app.cli.command('download-oui')
+    @click.option('--path', default=None, help='Path to save OUI database')
+    def download_oui_cmd(path):
+        """Download IEEE OUI database."""
+        from .utils.oui import download_oui_db, load_oui_db, OUI_DB_PATH
+        target = Path(path) if path else OUI_DB_PATH
+        if download_oui_db(target):
+            count = load_oui_db(target)
+            click.echo(f'Downloaded and loaded {count} OUI entries')
+        else:
+            click.echo('Failed to download OUI database', err=True)
+
+    @app.cli.command('check-heartbeats')
+    def check_heartbeats_cmd():
+        """Check and update sensor heartbeat status."""
+        from .services.heartbeat import update_all_heartbeats
+        counts = update_all_heartbeats()
+        click.echo(f"Sensors: {counts['online']} online, {counts['stale']} stale, {counts['offline']} offline")
+
+    @app.cli.command('cleanup-data')
+    def cleanup_data_cmd():
+        """Delete data older than retention thresholds."""
+        from .services.retention import cleanup_old_data
+        counts = cleanup_old_data()
+        total = sum(counts.values())
+        parts = [f"{name}: {n}" for name, n in counts.items() if n > 0]
+        if parts:
+            click.echo(f"Deleted {total} rows ({', '.join(parts)})")
+        else:
+            click.echo("No expired data found")
+
     return app
```
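The request-logging hooks in `create_app` time each request with `time.monotonic()` and convert the delta to milliseconds. The same measurement pattern, isolated from Flask (the `timed` helper is a hypothetical illustration, not project code):

```python
import time

def timed(fn):
    """Run fn() and return (result, duration_ms), mirroring the
    before_request/after_request timing used in create_app."""
    start = time.monotonic()          # monotonic clock: immune to wall-clock jumps
    result = fn()
    duration_ms = (time.monotonic() - start) * 1000
    return result, duration_ms

result, ms = timed(lambda: sum(range(1000)))
```

Using `time.monotonic()` rather than `time.time()` matters here: it cannot go backwards if the system clock is adjusted mid-request.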
```diff
@@ -1,6 +1,27 @@
 """API Blueprint."""
-from flask import Blueprint
+from flask import Blueprint, request
+from ..extensions import db

 bp = Blueprint('api', __name__)

-from . import sensors, devices, alerts, events, probes, stats  # noqa: E402, F401
+
+def paginate(query, schema_fn):
+    """Apply limit/offset pagination to a query and return items with metadata.
+
+    Returns dict with 'items', 'total', 'limit', 'offset'.
+    """
+    limit = min(request.args.get('limit', 100, type=int), 1000)
+    offset = request.args.get('offset', 0, type=int)
+    total = db.session.scalar(
+        db.select(db.func.count()).select_from(query.subquery())
+    )
+    results = db.session.scalars(query.limit(limit).offset(offset)).all()
+    return {
+        'items': [schema_fn(r) for r in results],
+        'total': total,
+        'limit': limit,
+        'offset': offset,
+    }
+
+
+from . import sensors, devices, alerts, events, probes, stats, export, intelligence  # noqa: E402, F401
```
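The `paginate` helper's parameter contract: `limit` defaults to 100 and is hard-capped at 1000, `offset` defaults to 0, and the response carries `items`, `total`, `limit`, and `offset`. A pure-Python sketch of that contract against a plain list (function names here are illustrative; the real helper pulls its arguments from `request.args` and runs the query through SQLAlchemy):

```python
def clamp_page_args(raw_limit=None, raw_offset=None, default_limit=100, max_limit=1000):
    """Default limit to 100, cap it at 1000, default offset to 0."""
    limit = min(raw_limit if raw_limit is not None else default_limit, max_limit)
    offset = raw_offset if raw_offset is not None else 0
    return limit, offset

def paginate_list(items, schema_fn, raw_limit=None, raw_offset=None):
    """Same response shape as paginate(), applied to an in-memory list."""
    limit, offset = clamp_page_args(raw_limit, raw_offset)
    page = items[offset:offset + limit]   # the list analogue of .limit().offset()
    return {
        'items': [schema_fn(x) for x in page],
        'total': len(items),              # total before slicing, as in the SQL COUNT
        'limit': limit,
        'offset': offset,
    }

# A client asking for limit=2000 is clamped to 1000:
result = paginate_list(list(range(250)), str, raw_limit=2000, raw_offset=10)
```

Returning `total` alongside the page is what lets dashboard clients render "showing 10-250 of 250" without a second request.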
```diff
@@ -1,7 +1,7 @@
 """Alert endpoints."""
 from datetime import datetime, timedelta, UTC
 from flask import request
-from . import bp
+from . import bp, paginate
 from ..models import Alert
 from ..extensions import db

@@ -12,8 +12,6 @@ def list_alerts():
     alert_type = request.args.get('type')
     sensor_id = request.args.get('sensor_id', type=int)
     hours = request.args.get('hours', 24, type=int)
-    limit = min(int(request.args.get('limit', 100)), 1000)
-    offset = int(request.args.get('offset', 0))

     since = datetime.now(UTC) - timedelta(hours=hours)
     query = db.select(Alert).where(Alert.timestamp >= since).order_by(Alert.timestamp.desc())
@@ -23,7 +21,6 @@ def list_alerts():
     if sensor_id:
         query = query.where(Alert.sensor_id == sensor_id)

-    query = query.limit(limit).offset(offset)
-    alerts = db.session.scalars(query).all()
-    return {'alerts': [a.to_dict() for a in alerts], 'limit': limit, 'offset': offset}
+    result = paginate(query, Alert.to_dict)
+    return {'alerts': result['items'], 'total': result['total'],
+            'limit': result['limit'], 'offset': result['offset']}
```
```diff
@@ -1,24 +1,23 @@
 """Device endpoints."""
 from flask import request
-from . import bp
+from . import bp, paginate
 from ..models import Device, Sighting
 from ..extensions import db
+from ..services.device_service import enrich_device


 @bp.route('/devices')
 def list_devices():
     """List all devices."""
     device_type = request.args.get('type')  # 'ble' or 'wifi'
-    limit = min(int(request.args.get('limit', 100)), 1000)
-    offset = int(request.args.get('offset', 0))

     query = db.select(Device).order_by(Device.last_seen.desc())
     if device_type:
         query = query.where(Device.device_type == device_type)
-    query = query.limit(limit).offset(offset)

-    devices = db.session.scalars(query).all()
-    return {'devices': [d.to_dict() for d in devices], 'limit': limit, 'offset': offset}
+    result = paginate(query, enrich_device)
+    return {'devices': result['items'], 'total': result['total'],
+            'limit': result['limit'], 'offset': result['offset']}


 @bp.route('/devices/<mac>')
@@ -37,6 +36,6 @@ def get_device(mac):
         .limit(20)
     ).all()

-    result = device.to_dict()
+    result = enrich_device(device)
     result['sightings'] = [s.to_dict() for s in sightings]
     return result
```
@@ -1,7 +1,7 @@
 """Event endpoints."""
 from datetime import datetime, timedelta, UTC
 from flask import request
-from . import bp
+from . import bp, paginate
 from ..models import Event
 from ..extensions import db
 
@@ -12,8 +12,6 @@ def list_events():
     event_type = request.args.get('type')
     sensor_id = request.args.get('sensor_id', type=int)
     hours = request.args.get('hours', 24, type=int)
-    limit = min(int(request.args.get('limit', 100)), 1000)
-    offset = int(request.args.get('offset', 0))
 
     since = datetime.now(UTC) - timedelta(hours=hours)
     query = db.select(Event).where(Event.timestamp >= since).order_by(Event.timestamp.desc())
@@ -23,7 +21,6 @@ def list_events():
     if sensor_id:
         query = query.where(Event.sensor_id == sensor_id)
 
-    query = query.limit(limit).offset(offset)
-    events = db.session.scalars(query).all()
-    return {'events': [e.to_dict() for e in events], 'limit': limit, 'offset': offset}
+    result = paginate(query, Event.to_dict)
+    return {'events': result['items'], 'total': result['total'],
+            'limit': result['limit'], 'offset': result['offset']}
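The `paginate` helper that these endpoints now import from the blueprint package (`from . import bp, paginate`) is not shown anywhere in this diff. A minimal, framework-free sketch consistent with its call sites — it takes a select and a per-row serializer, clamps `limit`/`offset`, and returns `items`/`total`/`limit`/`offset` — might look like this; the body is an assumption, not the committed implementation, and the real helper presumably reads `limit`/`offset` from the Flask request and counts via SQLAlchemy:

```python
def paginate(rows, serialize, limit=100, offset=0, max_limit=1000):
    """Framework-free stand-in for the real paginate helper (assumed shape).

    The committed version likely takes a SQLAlchemy select, reads
    limit/offset from request.args, and issues a COUNT query for 'total'.
    """
    limit = min(int(limit), max_limit)   # clamp page size, as the old endpoints did
    offset = max(int(offset), 0)
    page = rows[offset:offset + limit]   # stand-in for .limit().offset()
    return {
        'items': [serialize(r) for r in page],
        'total': len(rows),
        'limit': limit,
        'offset': offset,
    }


result = paginate(list(range(10)), str, limit=3, offset=4)
# result == {'items': ['4', '5', '6'], 'total': 10, 'limit': 3, 'offset': 4}
```

Returning `total` alongside the page is what lets clients render page counts, which the per-endpoint `limit`/`offset` code it replaces never exposed.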
src/esp32_web/api/export.py (new file, 100 lines)
@@ -0,0 +1,100 @@
+"""Export endpoints."""
+import csv
+import io
+import json
+from flask import Response, request
+from . import bp
+from ..models import Device, Alert, Probe
+from ..extensions import db
+from ..services.device_service import enrich_device
+
+
+@bp.route('/export/devices.csv')
+def export_devices_csv():
+    """Export devices as CSV."""
+    devices = db.session.scalars(db.select(Device).order_by(Device.last_seen.desc())).all()
+
+    output = io.StringIO()
+    writer = csv.writer(output)
+    writer.writerow(['mac', 'type', 'vendor', 'name', 'company_id', 'first_seen', 'last_seen'])
+
+    for d in devices:
+        writer.writerow([
+            d.mac, d.device_type, d.vendor or '', d.name or '',
+            d.company_id or '', d.first_seen.isoformat(), d.last_seen.isoformat()
+        ])
+
+    return Response(
+        output.getvalue(),
+        mimetype='text/csv',
+        headers={'Content-Disposition': 'attachment; filename=devices.csv'}
+    )
+
+
+@bp.route('/export/devices.json')
+def export_devices_json():
+    """Export devices as JSON."""
+    devices = db.session.scalars(db.select(Device).order_by(Device.last_seen.desc())).all()
+    data = [enrich_device(d) for d in devices]
+
+    return Response(
+        json.dumps(data, indent=2),
+        mimetype='application/json',
+        headers={'Content-Disposition': 'attachment; filename=devices.json'}
+    )
+
+
+@bp.route('/export/alerts.csv')
+def export_alerts_csv():
+    """Export alerts as CSV."""
+    hours = request.args.get('hours', 24, type=int)
+    from datetime import datetime, timedelta, UTC
+    since = datetime.now(UTC) - timedelta(hours=hours)
+
+    alerts = db.session.scalars(
+        db.select(Alert).where(Alert.timestamp >= since).order_by(Alert.timestamp.desc())
+    ).all()
+
+    output = io.StringIO()
+    writer = csv.writer(output)
+    writer.writerow(['timestamp', 'sensor_id', 'type', 'source_mac', 'target_mac', 'rssi'])
+
+    for a in alerts:
+        writer.writerow([
+            a.timestamp.isoformat(), a.sensor_id, a.alert_type,
+            a.source_mac or '', a.target_mac or '', a.rssi or ''
+        ])
+
+    return Response(
+        output.getvalue(),
+        mimetype='text/csv',
+        headers={'Content-Disposition': 'attachment; filename=alerts.csv'}
+    )
+
+
+@bp.route('/export/probes.csv')
+def export_probes_csv():
+    """Export probe requests as CSV."""
+    hours = request.args.get('hours', 24, type=int)
+    from datetime import datetime, timedelta, UTC
+    since = datetime.now(UTC) - timedelta(hours=hours)
+
+    probes = db.session.scalars(
+        db.select(Probe).where(Probe.timestamp >= since).order_by(Probe.timestamp.desc())
+    ).all()
+
+    output = io.StringIO()
+    writer = csv.writer(output)
+    writer.writerow(['timestamp', 'sensor_id', 'device_id', 'ssid', 'rssi', 'channel'])
+
+    for p in probes:
+        writer.writerow([
+            p.timestamp.isoformat(), p.sensor_id, p.device_id,
+            p.ssid, p.rssi, p.channel
+        ])
+
+    return Response(
+        output.getvalue(),
+        mimetype='text/csv',
+        headers={'Content-Disposition': 'attachment; filename=probes.csv'}
+    )
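All three CSV endpoints above follow the same in-memory pattern: write rows into an `io.StringIO` via `csv.writer`, then hand the buffer's contents to a `Response` with a `Content-Disposition: attachment` header. Stripped of Flask and the ORM, the core reduces to:

```python
import csv
import io

# Sample rows standing in for the ORM objects (illustrative data only)
rows = [
    ('aa:bb:cc:dd:ee:ff', 'wifi', 'Espressif'),
    ('11:22:33:44:55:66', 'ble', ''),
]

output = io.StringIO()
writer = csv.writer(output)
writer.writerow(['mac', 'type', 'vendor'])  # header row, as each endpoint writes first
writer.writerows(rows)

body = output.getvalue()  # this string becomes the Response payload
# The first line of the payload is the header
assert body.splitlines()[0] == 'mac,type,vendor'
```

Building the whole file in memory is fine at this scale; for very large tables a streaming generator response would avoid holding the full export in RAM.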
src/esp32_web/api/intelligence.py (new file, 241 lines)
@@ -0,0 +1,241 @@
+"""Device intelligence API endpoints."""
+from collections import defaultdict
+from datetime import datetime, UTC, timedelta
+from flask import request
+from . import bp
+from ..extensions import db
+from ..models import Device, Probe, Sighting
+from ..utils.ble_companies import lookup_ble_company
+
+
+@bp.route('/intelligence/vendor-treemap')
+def vendor_treemap():
+    """Return D3-ready hierarchy of devices grouped by type and vendor."""
+    # WiFi devices grouped by vendor
+    wifi_rows = db.session.execute(
+        db.select(
+            db.func.coalesce(Device.vendor, 'Unknown'),
+            db.func.count(),
+        )
+        .where(Device.device_type == 'wifi')
+        .group_by(Device.vendor)
+    ).all()
+
+    # BLE devices grouped by company_id
+    ble_rows = db.session.execute(
+        db.select(
+            Device.company_id,
+            db.func.count(),
+        )
+        .where(Device.device_type == 'ble')
+        .group_by(Device.company_id)
+    ).all()
+
+    wifi_children = [
+        {'name': vendor, 'value': count}
+        for vendor, count in wifi_rows if count > 0
+    ]
+    ble_children = [
+        {'name': (lookup_ble_company(cid) or f'ID {cid}') if cid else 'Unknown', 'value': count}
+        for cid, count in ble_rows if count > 0
+    ]
+
+    children = []
+    if wifi_children:
+        children.append({'name': 'wifi', 'children': wifi_children})
+    if ble_children:
+        children.append({'name': 'ble', 'children': ble_children})
+
+    return {'name': 'devices', 'children': children}
+
+
+@bp.route('/intelligence/ssid-graph')
+def ssid_graph():
+    """Return force-graph data: nodes (devices) and links (shared probed SSIDs)."""
+    hours = max(1, request.args.get('hours', 24, type=int))
+    min_shared = max(1, request.args.get('min_shared', 1, type=int))
+    limit = max(1, min(request.args.get('limit', 200, type=int), 500))
+
+    cutoff = datetime.now(UTC) - timedelta(hours=hours)
+
+    # Get distinct (device_id, ssid) pairs in time window
+    rows = db.session.execute(
+        db.select(Probe.device_id, Probe.ssid)
+        .where(Probe.timestamp >= cutoff)
+        .distinct()
+    ).all()
+
+    if not rows:
+        return {'nodes': [], 'links': [], 'ssids': []}
+
+    # Build mappings
+    device_ssids: dict[int, set[str]] = defaultdict(set)
+    ssid_devices: dict[str, set[int]] = defaultdict(set)
+    for device_id, ssid in rows:
+        device_ssids[device_id].add(ssid)
+        ssid_devices[ssid].add(device_id)
+
+    # Rank devices by probe diversity, cap at limit
+    top_devices = sorted(device_ssids.keys(),
+                         key=lambda d: len(device_ssids[d]),
+                         reverse=True)[:limit]
+    top_set = set(top_devices)
+
+    # Find device pairs sharing >= min_shared SSIDs
+    links = []
+    seen_pairs: set[tuple[int, int]] = set()
+    for ssid, devices in ssid_devices.items():
+        device_list = [d for d in devices if d in top_set]
+        for i, d1 in enumerate(device_list):
+            for d2 in device_list[i + 1:]:
+                pair = (min(d1, d2), max(d1, d2))
+                if pair not in seen_pairs:
+                    seen_pairs.add(pair)
+                    shared = device_ssids[d1] & device_ssids[d2]
+                    if len(shared) >= min_shared:
+                        links.append({
+                            'source_id': pair[0],
+                            'target_id': pair[1],
+                            'shared_ssids': sorted(shared),
+                            'weight': len(shared),
+                        })
+
+    # Collect device IDs that appear in at least one link
+    linked_ids = set()
+    for link in links:
+        linked_ids.add(link['source_id'])
+        linked_ids.add(link['target_id'])
+
+    # Also include isolated nodes from top devices
+    node_ids = linked_ids | top_set
+
+    # Fetch device details
+    devices = db.session.scalars(
+        db.select(Device).where(Device.id.in_(node_ids))
+    ).all()
+    device_map = {d.id: d for d in devices}
+
+    nodes = []
+    for did in node_ids:
+        d = device_map.get(did)
+        if d:
+            nodes.append({
+                'id': d.mac,
+                'device_id': d.id,
+                'vendor': d.vendor or 'Unknown',
+                'type': d.device_type,
+                'ssid_count': len(device_ssids.get(did, set())),
+            })
+
+    # Map device IDs to MACs in links
+    out_links = []
+    for link in links:
+        src = device_map.get(link['source_id'])
+        tgt = device_map.get(link['target_id'])
+        if src and tgt:
+            out_links.append({
+                'source': src.mac,
+                'target': tgt.mac,
+                'shared_ssids': link['shared_ssids'],
+                'weight': link['weight'],
+            })
+
+    # SSID summary
+    ssid_summary = sorted(
+        [{'ssid': s, 'device_count': len(ds)} for s, ds in ssid_devices.items()],
+        key=lambda x: x['device_count'],
+        reverse=True,
+    )[:50]
+
+    return {'nodes': nodes, 'links': out_links, 'ssids': ssid_summary}
+
+
+@bp.route('/intelligence/fingerprint-clusters')
+def fingerprint_clusters():
+    """Group active devices by behavior: (type, vendor, activity level)."""
+    hours = max(1, request.args.get('hours', 24, type=int))
+    cutoff = datetime.now(UTC) - timedelta(hours=hours)
+
+    # Probe counts per device
+    probe_counts = dict(db.session.execute(
+        db.select(Probe.device_id, db.func.count())
+        .where(Probe.timestamp >= cutoff)
+        .group_by(Probe.device_id)
+    ).all())
+
+    # Sighting counts and avg RSSI per device
+    sighting_stats = {
+        row[0]: (row[1], row[2])
+        for row in db.session.execute(
+            db.select(
+                Sighting.device_id,
+                db.func.count(),
+                db.func.avg(Sighting.rssi),
+            )
+            .where(Sighting.timestamp >= cutoff)
+            .group_by(Sighting.device_id)
+        ).all()
+    }
+
+    # Get all active device IDs
+    active_ids = set(probe_counts.keys()) | set(sighting_stats.keys())
+    if not active_ids:
+        return {'clusters': [], 'total_devices': 0, 'total_clusters': 0}
+
+    # Fetch devices
+    devices = db.session.scalars(
+        db.select(Device).where(Device.id.in_(active_ids))
+    ).all()
+
+    # Bucket activity level
+    def activity_bucket(count: int) -> str:
+        if count > 20:
+            return 'High'
+        if count >= 5:
+            return 'Medium'
+        return 'Low'
+
+    # Group by (device_type, vendor, activity_bucket)
+    clusters: dict[tuple[str, str, str], list] = defaultdict(list)
+    for d in devices:
+        pc = probe_counts.get(d.id, 0)
+        sc, avg_rssi = sighting_stats.get(d.id, (0, None))
+        total_activity = pc + sc
+        bucket = activity_bucket(total_activity)
+        vendor = d.vendor or 'Unknown'
+        key = (d.device_type, vendor, bucket)
+        clusters[key].append({
+            'mac': d.mac,
+            'probe_count': pc,
+            'sighting_count': sc,
+            'avg_rssi': round(avg_rssi) if avg_rssi is not None else None,
+        })
+
+    # Build response
+    result = []
+    for idx, ((dtype, vendor, bucket), devs) in enumerate(
+        sorted(clusters.items(), key=lambda x: len(x[1]), reverse=True)
+    ):
+        probe_total = sum(d['probe_count'] for d in devs)
+        sight_total = sum(d['sighting_count'] for d in devs)
+        rssi_vals = [d['avg_rssi'] for d in devs if d['avg_rssi'] is not None]
+        result.append({
+            'id': idx,
+            'label': f'{vendor} {dtype.upper()} - {bucket} Activity',
+            'device_type': dtype,
+            'vendor': vendor,
+            'activity': bucket,
+            'device_count': len(devs),
+            'devices': devs,
+            'centroid': {
+                'probe_rate': round(probe_total / len(devs), 1) if devs else 0,
+                'sighting_rate': round(sight_total / len(devs), 1) if devs else 0,
+                'avg_rssi': round(sum(rssi_vals) / len(rssi_vals)) if rssi_vals else None,
+            },
+        })
+
+    return {
+        'clusters': result,
+        'total_devices': len(active_ids),
+        'total_clusters': len(result),
+    }
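The pairing logic in `ssid_graph` walks each SSID's device set and links every device pair whose probed-SSID sets intersect in at least `min_shared` names, deduplicating pairs as it goes. Isolated from the database, the same algorithm on toy data (device IDs and SSID names here are illustrative):

```python
from collections import defaultdict

# (device_id, ssid) observations, as the distinct() query would return them
rows = [(1, 'HomeNet'), (1, 'CoffeeShop'), (2, 'HomeNet'),
        (2, 'CoffeeShop'), (3, 'Airport')]

device_ssids = defaultdict(set)
ssid_devices = defaultdict(set)
for device_id, ssid in rows:
    device_ssids[device_id].add(ssid)
    ssid_devices[ssid].add(device_id)

links, seen = [], set()
for ssid, devices in ssid_devices.items():
    ds = sorted(devices)
    for i, d1 in enumerate(ds):
        for d2 in ds[i + 1:]:
            pair = (min(d1, d2), max(d1, d2))
            if pair not in seen:          # each pair is tested exactly once
                seen.add(pair)
                shared = device_ssids[d1] & device_ssids[d2]
                if len(shared) >= 2:      # min_shared = 2
                    links.append({'pair': pair, 'shared': sorted(shared)})

# Devices 1 and 2 share two SSIDs; device 3 stays isolated
assert links == [{'pair': (1, 2), 'shared': ['CoffeeShop', 'HomeNet']}]
```

Iterating per SSID rather than over all device pairs keeps the cost proportional to the size of each SSID's device set, which is why the endpoint also caps the candidate set at `limit` devices first.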
@@ -2,7 +2,7 @@
 from datetime import datetime, timedelta, UTC
 from flask import request
 from sqlalchemy import func
-from . import bp
+from . import bp, paginate
 from ..models import Probe, Device
 from ..extensions import db
 
@@ -12,8 +12,6 @@ def list_probes():
     """List probe requests."""
     ssid = request.args.get('ssid')
     hours = request.args.get('hours', 24, type=int)
-    limit = min(int(request.args.get('limit', 100)), 1000)
-    offset = int(request.args.get('offset', 0))
 
     since = datetime.now(UTC) - timedelta(hours=hours)
     query = db.select(Probe).where(Probe.timestamp >= since).order_by(Probe.timestamp.desc())
@@ -21,10 +19,9 @@ def list_probes():
     if ssid:
         query = query.where(Probe.ssid == ssid)
 
-    query = query.limit(limit).offset(offset)
-    probes = db.session.scalars(query).all()
-    return {'probes': [p.to_dict() for p in probes], 'limit': limit, 'offset': offset}
+    result = paginate(query, Probe.to_dict)
+    return {'probes': result['items'], 'total': result['total'],
+            'limit': result['limit'], 'offset': result['offset']}
 
 
 @bp.route('/probes/ssids')
@@ -1,16 +1,21 @@
 """Sensor endpoints."""
+import json
 import socket
+from datetime import datetime, timedelta, UTC
 from flask import request, current_app
-from . import bp
+from . import bp, paginate
-from ..models import Sensor
+from ..models import Sensor, Event, Sighting, Alert
 from ..extensions import db
+from ..services.heartbeat import get_heartbeat_summary, update_all_heartbeats
 
 
 @bp.route('/sensors')
 def list_sensors():
     """List all sensors."""
-    sensors = db.session.scalars(db.select(Sensor).order_by(Sensor.hostname)).all()
-    return {'sensors': [s.to_dict() for s in sensors]}
+    query = db.select(Sensor).order_by(Sensor.hostname)
+    result = paginate(query, Sensor.to_dict)
+    return {'sensors': result['items'], 'total': result['total'],
+            'limit': result['limit'], 'offset': result['offset']}
 
 
 @bp.route('/sensors/<hostname>')
@@ -37,7 +42,8 @@ def send_command(hostname):
 
     # Whitelist allowed commands
     allowed = ('STATUS', 'REBOOT', 'IDENTIFY', 'BLE', 'ADAPTIVE', 'RATE', 'POWER',
-               'CSIMODE', 'PRESENCE', 'CALIBRATE', 'CHANSCAN')
+               'CSIMODE', 'PRESENCE', 'CALIBRATE', 'CHANSCAN', 'OTA', 'TARGET',
+               'THRESHOLD', 'SCANRATE', 'PROBERATE', 'POWERSAVE', 'FLOODTHRESH')
     if not any(command.startswith(a) for a in allowed):
         return {'error': 'Command not allowed'}, 403
 
@@ -51,3 +57,233 @@ def send_command(hostname):
         return {'error': f'Socket error: {e}'}, 500
 
     return {'status': 'sent', 'command': command}
+
+
+def _send_command_with_response(ip: str, command: str, timeout: float = 2.0) -> str | None:
+    """Send UDP command and wait for response."""
+    try:
+        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
+        sock.settimeout(timeout)
+        sock.sendto(command.encode(), (ip, current_app.config['SENSOR_CMD_PORT']))
+        data, _ = sock.recvfrom(1400)
+        sock.close()
+        return data.decode('utf-8', errors='replace').strip()
+    except socket.timeout:
+        return None
+    except socket.error:
+        return None
+
+
+def _parse_status_response(response: str) -> dict:
+    """Parse STATUS response into dict."""
+    result = {}
+    if not response or not response.startswith('OK STATUS'):
+        return result
+    # Parse key=value pairs
+    for part in response.split():
+        if '=' in part:
+            key, value = part.split('=', 1)
+            # Try to convert to appropriate type
+            if value.isdigit():
+                result[key] = int(value)
+            elif value.replace('.', '').replace('-', '').isdigit():
+                try:
+                    result[key] = float(value)
+                except ValueError:
+                    result[key] = value
+            elif value in ('on', 'true'):
+                result[key] = True
+            elif value in ('off', 'false'):
+                result[key] = False
+            else:
+                result[key] = value
+    return result
+
+
+@bp.route('/sensors/<hostname>/config')
+def get_sensor_config(hostname):
+    """Get sensor configuration by querying STATUS."""
+    sensor = db.session.scalar(db.select(Sensor).where(Sensor.hostname == hostname))
+    if not sensor:
+        return {'error': 'Sensor not found'}, 404
+
+    response = _send_command_with_response(sensor.ip, 'STATUS')
+    if not response:
+        return {'error': 'Sensor not responding'}, 504
+
+    config = _parse_status_response(response)
+    config['hostname'] = sensor.hostname
+    config['ip'] = sensor.ip
+
+    # Store config in database
+    sensor.config_json = json.dumps(config)
+    db.session.commit()
+
+    return {'config': config}
+
+
+@bp.route('/sensors/<hostname>/config', methods=['PUT'])
+def update_sensor_config(hostname):
+    """Update sensor configuration."""
+    sensor = db.session.scalar(db.select(Sensor).where(Sensor.hostname == hostname))
+    if not sensor:
+        return {'error': 'Sensor not found'}, 404
+
+    data = request.get_json()
+    if not data:
+        return {'error': 'No configuration provided'}, 400
+
+    results = {}
+    errors = []
+
+    # Map config keys to commands
+    config_commands = {
+        'rate': lambda v: f'RATE {v}',
+        'power': lambda v: f'POWER {v}',
+        'adaptive': lambda v: f'ADAPTIVE {"ON" if v else "OFF"}',
+        'threshold': lambda v: f'THRESHOLD {v}',
+        'ble': lambda v: f'BLE {"ON" if v else "OFF"}',
+        'csi_mode': lambda v: f'CSIMODE {v.upper()}',
+        'presence': lambda v: f'PRESENCE {"ON" if v else "OFF"}',
+        'powersave': lambda v: f'POWERSAVE {"ON" if v else "OFF"}',
+        'chanscan': lambda v: f'CHANSCAN {"ON" if v else "OFF"}',
+    }
+
+    for key, value in data.items():
+        if key not in config_commands:
+            errors.append(f'Unknown config key: {key}')
+            continue
+
+        command = config_commands[key](value)
+        try:
+            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
+            sock.settimeout(2.0)
+            sock.sendto(command.encode(), (sensor.ip, current_app.config['SENSOR_CMD_PORT']))
+            sock.close()
+            results[key] = 'ok'
+        except socket.error as e:
+            errors.append(f'{key}: {e}')
+
+    return {'results': results, 'errors': errors}
+
+
+@bp.route('/sensors/<hostname>/ota', methods=['POST'])
+def trigger_ota(hostname):
+    """Trigger OTA update on sensor."""
+    sensor = db.session.scalar(db.select(Sensor).where(Sensor.hostname == hostname))
+    if not sensor:
+        return {'error': 'Sensor not found'}, 404
+
+    data = request.get_json()
+    if not data or 'url' not in data:
+        return {'error': 'Missing OTA URL'}, 400
+
+    url = data['url']
+    if not url.startswith(('http://', 'https://')):
+        return {'error': 'Invalid URL scheme'}, 400
+
+    command = f'OTA {url}'
+    try:
+        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
+        sock.settimeout(2.0)
+        sock.sendto(command.encode(), (sensor.ip, current_app.config['SENSOR_CMD_PORT']))
+        sock.close()
+    except socket.error as e:
+        return {'error': f'Socket error: {e}'}, 500
+
+    return {'status': 'ota_triggered', 'url': url}
+
+
+@bp.route('/sensors/<hostname>/calibrate', methods=['POST'])
+def trigger_calibrate(hostname):
+    """Trigger baseline calibration on sensor."""
+    sensor = db.session.scalar(db.select(Sensor).where(Sensor.hostname == hostname))
+    if not sensor:
+        return {'error': 'Sensor not found'}, 404
+
+    data = request.get_json() or {}
+    seconds = data.get('seconds', 10)
+
+    if not isinstance(seconds, int) or seconds < 3 or seconds > 60:
+        return {'error': 'seconds must be 3-60'}, 400
+
+    command = f'CALIBRATE {seconds}'
+    try:
+        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
+        sock.settimeout(2.0)
+        sock.sendto(command.encode(), (sensor.ip, current_app.config['SENSOR_CMD_PORT']))
+        sock.close()
+    except socket.error as e:
+        return {'error': f'Socket error: {e}'}, 500
+
+    return {'status': 'calibration_started', 'seconds': seconds}
+
+
+@bp.route('/sensors/heartbeat')
+def get_heartbeat_status():
+    """Get heartbeat status for all sensors."""
+    return get_heartbeat_summary()
+
+
+@bp.route('/sensors/heartbeat', methods=['POST'])
+def refresh_heartbeats():
+    """Update heartbeat status for all sensors."""
+    counts = update_all_heartbeats()
+    return {
+        'status': 'updated',
+        'online': counts['online'],
+        'stale': counts['stale'],
+        'offline': counts['offline']
+    }
+
+
+@bp.route('/sensors/<hostname>/metrics')
+def get_sensor_metrics(hostname):
+    """Get sensor activity metrics and recent events."""
+    sensor = db.session.scalar(db.select(Sensor).where(Sensor.hostname == hostname))
+    if not sensor:
+        return {'error': 'Sensor not found'}, 404
+
+    # Time range (default: last 24 hours)
+    hours = request.args.get('hours', 24, type=int)
+    if hours < 1 or hours > 168:  # max 1 week
+        hours = 24
+    since = datetime.now(UTC) - timedelta(hours=hours)
+
+    # Count activity
+    sightings_count = db.session.scalar(
+        db.select(db.func.count(Sighting.id))
+        .where(Sighting.sensor_id == sensor.id)
+        .where(Sighting.timestamp >= since)
+    ) or 0
+
+    alerts_count = db.session.scalar(
+        db.select(db.func.count(Alert.id))
+        .where(Alert.sensor_id == sensor.id)
+        .where(Alert.timestamp >= since)
+    ) or 0
+
+    events_count = db.session.scalar(
+        db.select(db.func.count(Event.id))
+        .where(Event.sensor_id == sensor.id)
+        .where(Event.timestamp >= since)
+    ) or 0
+
+    # Recent events (last 20)
+    recent_events = db.session.scalars(
+        db.select(Event)
+        .where(Event.sensor_id == sensor.id)
+        .order_by(Event.timestamp.desc())
+        .limit(20)
+    ).all()
+
+    return {
+        'hostname': sensor.hostname,
+        'hours': hours,
+        'activity': {
+            'sightings': sightings_count,
+            'alerts': alerts_count,
+            'events': events_count,
+        },
+        'recent_events': [e.to_dict() for e in recent_events]
+    }
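The type coercion in `_parse_status_response` above is easy to exercise standalone. The key names in this sample payload are illustrative, not taken from the firmware; note that negative integers come back as floats, because `str.isdigit` rejects the minus sign and the value falls through to the `float()` branch:

```python
def parse_status(response: str) -> dict:
    """Same key=value coercion as _parse_status_response in the diff above."""
    result = {}
    if not response or not response.startswith('OK STATUS'):
        return result
    for part in response.split():
        if '=' in part:
            key, value = part.split('=', 1)
            if value.isdigit():
                result[key] = int(value)                 # plain unsigned integers
            elif value.replace('.', '').replace('-', '').isdigit():
                try:
                    result[key] = float(value)           # decimals and negatives
                except ValueError:
                    result[key] = value
            elif value in ('on', 'true'):
                result[key] = True
            elif value in ('off', 'false'):
                result[key] = False
            else:
                result[key] = value                      # anything else stays a string
    return result


parsed = parse_status('OK STATUS rate=5 rssi=-67 adaptive=on mode=raw')
# parsed == {'rate': 5, 'rssi': -67.0, 'adaptive': True, 'mode': 'raw'}
```

A payload that does not start with `OK STATUS` parses to an empty dict, which is what makes the `504` guard in `get_sensor_config` behave sensibly for malformed replies.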
@@ -6,6 +6,7 @@ from datetime import datetime, UTC
 
 from ..extensions import db
 from ..models import Sensor, Device, Sighting, Alert, Event, Probe
+from ..utils.oui import lookup_vendor
 
 log = logging.getLogger(__name__)
 
@@ -29,11 +30,15 @@ def get_or_create_device(mac: str, device_type: str) -> Device:
     mac = mac.lower()
     device = db.session.scalar(db.select(Device).where(Device.mac == mac))
     if not device:
-        device = Device(mac=mac, device_type=device_type)
+        vendor = lookup_vendor(mac)
+        device = Device(mac=mac, device_type=device_type, vendor=vendor)
         db.session.add(device)
         db.session.flush()
     else:
         device.last_seen = datetime.now(UTC)
+        # Update vendor if not set
+        if not device.vendor:
+            device.vendor = lookup_vendor(mac)
     return device
@@ -13,6 +13,12 @@ class Config:
     SENSOR_CMD_PORT = int(os.environ.get('CMD_PORT', 5501))
     SENSOR_TIMEOUT = int(os.environ.get('SENSOR_TIMEOUT', 60))
 
+    # Data retention (days)
+    RETENTION_SIGHTINGS_DAYS = int(os.environ.get('RETENTION_SIGHTINGS_DAYS', 14))
+    RETENTION_PROBES_DAYS = int(os.environ.get('RETENTION_PROBES_DAYS', 14))
+    RETENTION_EVENTS_DAYS = int(os.environ.get('RETENTION_EVENTS_DAYS', 60))
+    RETENTION_ALERTS_DAYS = int(os.environ.get('RETENTION_ALERTS_DAYS', 365))
+
 
 class TestConfig(Config):
     """Testing configuration."""
src/esp32_web/dashboard/__init__.py (new file, 40 lines)
@@ -0,0 +1,40 @@
+"""Dashboard blueprint."""
+from flask import Blueprint, render_template
+from ..extensions import db
+from ..models import Device, Sensor
+
+bp = Blueprint('dashboard', __name__, url_prefix='/dashboard')
+
+
+@bp.route('/')
+def index():
+    """Render the main dashboard page."""
+    total_devices = db.session.scalar(
+        db.select(db.func.count()).select_from(Device)
+    )
+    total_sensors = db.session.scalar(
+        db.select(db.func.count()).select_from(Sensor)
+    )
+    return render_template(
+        'dashboard/index.html',
+        total_devices=total_devices,
+        total_sensors=total_sensors,
+    )
+
+
+@bp.route('/tab/vendor-treemap')
+def tab_vendor_treemap():
+    """Return vendor treemap partial."""
+    return render_template('dashboard/partials/vendor_treemap.html')
+
+
+@bp.route('/tab/ssid-graph')
+def tab_ssid_graph():
+    """Return SSID graph partial."""
+    return render_template('dashboard/partials/ssid_graph.html')
+
+
+@bp.route('/tab/fingerprint-clusters')
+def tab_fingerprint_clusters():
+    """Return fingerprint clusters partial."""
+    return render_template('dashboard/partials/fingerprint_clusters.html')
1040
src/esp32_web/openapi.yaml
Normal file
File diff suppressed because it is too large
33
src/esp32_web/services/device_service.py
Normal file
@@ -0,0 +1,33 @@
"""Device enrichment service."""
from ..models import Device
from ..utils.oui import lookup_vendor
from ..utils.ble_companies import lookup_ble_company


def enrich_device(device: Device) -> dict:
    """Enrich device with vendor and company info."""
    data = device.to_dict()

    # Add vendor from OUI if not set
    if not device.vendor:
        vendor = lookup_vendor(device.mac)
        if vendor:
            data['vendor'] = vendor

    # Add BLE company name if company_id present
    if device.company_id:
        company_name = lookup_ble_company(device.company_id)
        if company_name:
            data['company_name'] = company_name

    return data


def get_vendor_for_mac(mac: str) -> str | None:
    """Get vendor for MAC address."""
    return lookup_vendor(mac)


def get_company_for_id(company_id: int) -> str | None:
    """Get company name for BLE company ID."""
    return lookup_ble_company(company_id)
83
src/esp32_web/services/heartbeat.py
Normal file
@@ -0,0 +1,83 @@
"""Sensor heartbeat service."""
from datetime import datetime, UTC, timedelta
from ..extensions import db
from ..models import Sensor


# Default thresholds in seconds
ONLINE_THRESHOLD = 60   # < 1 minute = online
STALE_THRESHOLD = 300   # 1-5 minutes = stale
                        # > 5 minutes = offline


def check_sensor_status(sensor: Sensor, now: datetime | None = None) -> str:
    """Determine sensor status based on last_seen timestamp."""
    if now is None:
        now = datetime.now(UTC)

    # Handle timezone-naive datetimes from DB
    last_seen = sensor.last_seen
    if last_seen.tzinfo is None:
        last_seen = last_seen.replace(tzinfo=UTC)

    delta = (now - last_seen).total_seconds()

    if delta < ONLINE_THRESHOLD:
        return 'online'
    elif delta < STALE_THRESHOLD:
        return 'stale'
    else:
        return 'offline'


def update_all_heartbeats() -> dict:
    """Update status for all sensors based on last_seen.

    Returns dict with counts: {'online': n, 'stale': n, 'offline': n}
    """
    now = datetime.now(UTC)
    sensors = db.session.scalars(db.select(Sensor)).all()

    counts = {'online': 0, 'stale': 0, 'offline': 0}

    for sensor in sensors:
        new_status = check_sensor_status(sensor, now)
        if sensor.status != new_status:
            sensor.status = new_status
        counts[new_status] += 1

    db.session.commit()
    return counts


def get_heartbeat_summary() -> dict:
    """Get summary of sensor heartbeat status."""
    now = datetime.now(UTC)
    sensors = db.session.scalars(db.select(Sensor)).all()

    summary = {
        'total': len(sensors),
        'online': 0,
        'stale': 0,
        'offline': 0,
        'sensors': []
    }

    for sensor in sensors:
        status = check_sensor_status(sensor, now)
        summary[status] += 1

        # Handle timezone-naive datetimes from DB
        last_seen = sensor.last_seen
        if last_seen.tzinfo is None:
            last_seen = last_seen.replace(tzinfo=UTC)

        summary['sensors'].append({
            'hostname': sensor.hostname,
            'ip': sensor.ip,
            'status': status,
            'last_seen': last_seen.isoformat(),
            'seconds_ago': int((now - last_seen).total_seconds())
        })

    return summary
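A standalone sketch of the threshold logic above (thresholds copied from the service; the `Sensor` model is replaced by a bare timestamp for illustration):

```python
from datetime import datetime, timedelta, timezone

# Thresholds mirrored from heartbeat.py
ONLINE_THRESHOLD = 60    # seconds
STALE_THRESHOLD = 300

def status_for(last_seen: datetime, now: datetime) -> str:
    # Naive timestamps from the DB are treated as UTC, as in the service
    if last_seen.tzinfo is None:
        last_seen = last_seen.replace(tzinfo=timezone.utc)
    delta = (now - last_seen).total_seconds()
    if delta < ONLINE_THRESHOLD:
        return 'online'
    if delta < STALE_THRESHOLD:
        return 'stale'
    return 'offline'

now = datetime.now(timezone.utc)
print(status_for(now - timedelta(seconds=30), now))   # online
print(status_for(now - timedelta(minutes=2), now))    # stale
print(status_for(now - timedelta(minutes=10), now))   # offline
```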
35
src/esp32_web/services/retention.py
Normal file
@@ -0,0 +1,35 @@
"""Data retention service."""
from datetime import datetime, UTC, timedelta
from flask import current_app
from ..extensions import db
from ..models import Sighting, Probe, Event, Alert


def cleanup_old_data() -> dict:
    """Delete rows older than configured retention periods.

    Returns dict with counts deleted per table.
    """
    now = datetime.now(UTC)
    counts = {}

    tables = [
        ('sightings', Sighting, Sighting.timestamp,
         current_app.config['RETENTION_SIGHTINGS_DAYS']),
        ('probes', Probe, Probe.timestamp,
         current_app.config['RETENTION_PROBES_DAYS']),
        ('events', Event, Event.timestamp,
         current_app.config['RETENTION_EVENTS_DAYS']),
        ('alerts', Alert, Alert.timestamp,
         current_app.config['RETENTION_ALERTS_DAYS']),
    ]

    for name, model, ts_col, days in tables:
        cutoff = now - timedelta(days=days)
        result = db.session.execute(
            db.delete(model).where(ts_col < cutoff)
        )
        counts[name] = result.rowcount

    db.session.commit()
    return counts
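The cutoff arithmetic used by `cleanup_old_data` can be checked in isolation (retention periods are the defaults from the config section above; the run time is a hypothetical date):

```python
from datetime import datetime, timedelta, timezone

# Default retention periods (days), as in the config section above
retention = {'sightings': 14, 'probes': 14, 'events': 60, 'alerts': 365}

now = datetime(2026, 2, 6, tzinfo=timezone.utc)  # hypothetical run time
cutoffs = {name: now - timedelta(days=days) for name, days in retention.items()}

print(cutoffs['sightings'].date())  # 2026-01-23
print(cutoffs['alerts'].date())     # 2025-02-06
```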
@@ -1 +1,5 @@
"""Utility modules."""
from .oui import lookup_vendor, load_oui_db, download_oui_db
from .ble_companies import lookup_ble_company

__all__ = ['lookup_vendor', 'load_oui_db', 'download_oui_db', 'lookup_ble_company']
46
src/esp32_web/utils/ble_companies.py
Normal file
@@ -0,0 +1,46 @@
"""BLE Company ID lookup."""

# Common BLE Company IDs (Bluetooth SIG assigned)
# https://www.bluetooth.com/specifications/assigned-numbers/company-identifiers/
BLE_COMPANIES: dict[int, str] = {
    0x0000: 'Ericsson Technology Licensing',
    0x0001: 'Nokia Mobile Phones',
    0x0002: 'Intel Corp.',
    0x0003: 'IBM Corp.',
    0x0004: 'Toshiba Corp.',
    0x0006: 'Microsoft',
    0x000A: 'Qualcomm',
    0x000D: 'Texas Instruments',
    0x000F: 'Broadcom',
    0x0010: 'Mitel Semiconductor',
    0x0013: 'Atmel',
    0x001D: 'Qualcomm Technologies',
    0x004C: 'Apple, Inc.',
    0x0059: 'Nordic Semiconductor',
    0x005D: 'Realtek Semiconductor',
    0x0075: 'Samsung Electronics',
    0x0087: 'Garmin International',
    0x00D2: 'Dialog Semiconductor',
    0x00E0: 'Google',
    0x0157: 'Anhui Huami Information Technology',  # Xiaomi/Amazfit
    0x0171: 'Amazon.com Services',
    0x01A3: 'Facebook Technologies',
    0x0224: 'Xiaomi Inc.',
    0x02E5: 'Shenzhen Goodix Technology',
    0x0310: 'Tile, Inc.',
    0x038F: 'Bose Corporation',
    0x0499: 'Ruuvi Innovations',
    0x04E7: 'Sonos, Inc.',
    0x0591: 'Espressif Inc.',
    0x0822: 'Nothing Technology',
}


def lookup_ble_company(company_id: int) -> str | None:
    """Lookup BLE company by ID."""
    return BLE_COMPANIES.get(company_id)


def get_all_companies() -> dict[int, str]:
    """Get all known BLE companies."""
    return BLE_COMPANIES.copy()
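Since the table keys are plain integers, lookups behave identically whether the caller passes the ID in hex or decimal notation. A minimal sketch with a three-entry subset of the table above:

```python
# Subset of the BLE_COMPANIES table above
BLE_COMPANIES = {
    0x004C: 'Apple, Inc.',
    0x0059: 'Nordic Semiconductor',
    0x0591: 'Espressif Inc.',
}

def lookup_ble_company(company_id):
    return BLE_COMPANIES.get(company_id)

print(lookup_ble_company(0x004C))  # Apple, Inc.
print(lookup_ble_company(76))      # Apple, Inc. (0x004C == 76)
print(lookup_ble_company(0xFFFF))  # None
```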
87
src/esp32_web/utils/oui.py
Normal file
@@ -0,0 +1,87 @@
"""MAC OUI vendor lookup."""
import csv
import logging
import os
from pathlib import Path

log = logging.getLogger(__name__)

# OUI database path
OUI_DB_PATH = Path(os.environ.get('OUI_DB_PATH', '/var/lib/esp32-web/oui.csv'))
OUI_DB_URL = 'https://standards-oui.ieee.org/oui/oui.csv'

# In-memory cache
_oui_cache: dict[str, str] = {}
_loaded = False


def _normalize_mac(mac: str) -> str:
    """Extract OUI prefix (first 6 hex chars) from MAC."""
    clean = mac.upper().replace(':', '').replace('-', '').replace('.', '')
    return clean[:6] if len(clean) >= 6 else ''


def load_oui_db(path: Path | None = None) -> int:
    """Load OUI database from CSV file."""
    global _oui_cache, _loaded

    if path is None:
        path = OUI_DB_PATH

    if not path.exists():
        log.warning('OUI database not found: %s', path)
        return 0

    _oui_cache.clear()
    count = 0

    try:
        with open(path, 'r', encoding='utf-8') as f:
            reader = csv.DictReader(f)
            for row in reader:
                # IEEE CSV format: Registry,Assignment,Organization Name,...
                assignment = row.get('Assignment', '').strip().upper()
                org = row.get('Organization Name', '').strip()
                if assignment and org:
                    _oui_cache[assignment] = org
                    count += 1
    except Exception as e:
        log.exception('Error loading OUI database: %s', e)
        return 0

    _loaded = True
    log.info('Loaded %d OUI entries from %s', count, path)
    return count


def lookup_vendor(mac: str) -> str | None:
    """Lookup vendor by MAC address."""
    if not _loaded:
        load_oui_db()

    oui = _normalize_mac(mac)
    if not oui:
        return None

    return _oui_cache.get(oui)


def download_oui_db(path: Path | None = None) -> bool:
    """Download OUI database from IEEE."""
    import urllib.request

    if path is None:
        path = OUI_DB_PATH

    path.parent.mkdir(parents=True, exist_ok=True)

    try:
        log.info('Downloading OUI database from %s', OUI_DB_URL)
        urllib.request.urlretrieve(OUI_DB_URL, path)
        log.info('OUI database saved to %s', path)
        return True
    except Exception as e:
        log.exception('Error downloading OUI database: %s', e)
        return False
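`_normalize_mac` accepts colon, dash, and Cisco dot notation; a standalone copy of that helper shows the behavior:

```python
def normalize_oui(mac: str) -> str:
    """Extract the 6-hex-char OUI prefix from a MAC in any common notation."""
    clean = mac.upper().replace(':', '').replace('-', '').replace('.', '')
    return clean[:6] if len(clean) >= 6 else ''

print(normalize_oui('3c:71:bf:aa:bb:cc'))  # 3C71BF
print(normalize_oui('3C71.BFAA.BBCC'))     # 3C71BF
print(normalize_oui('3C-71-BF-AA-BB-CC'))  # 3C71BF
print(normalize_oui('bad'))                # '' (too short)
```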
149
static/css/main.css
Normal file
@@ -0,0 +1,149 @@
/* ESP32-Web Dashboard — dark theme overrides for Pico CSS */
:root {
    --pico-font-family: system-ui, -apple-system, sans-serif;
}

/* Tab navigation */
.tab-nav {
    display: flex;
    gap: 0;
    border-bottom: 2px solid var(--pico-muted-border-color);
    margin-bottom: 1.5rem;
}

.tab-nav a {
    padding: 0.6rem 1.2rem;
    text-decoration: none;
    color: var(--pico-muted-color);
    border-bottom: 2px solid transparent;
    margin-bottom: -2px;
    transition: color 0.2s, border-color 0.2s;
}

.tab-nav a:hover {
    color: var(--pico-primary);
}

.tab-nav a.active {
    color: var(--pico-primary);
    border-bottom-color: var(--pico-primary);
}

/* Viz containers */
.viz-container {
    position: relative;
    width: 100%;
    min-height: 400px;
}

.viz-container svg {
    width: 100%;
    height: 100%;
}

/* Loading state */
.viz-loading {
    display: flex;
    align-items: center;
    justify-content: center;
    min-height: 400px;
    color: var(--pico-muted-color);
}

/* Treemap */
.treemap-cell {
    stroke: var(--pico-background-color);
    stroke-width: 1px;
}

.treemap-label {
    fill: #fff;
    font-size: 11px;
    pointer-events: none;
}

/* Force graph */
.graph-node {
    stroke: var(--pico-background-color);
    stroke-width: 1.5px;
    cursor: grab;
}

.graph-node:active {
    cursor: grabbing;
}

.graph-link {
    stroke: var(--pico-muted-border-color);
    stroke-opacity: 0.5;
}

.graph-label {
    font-size: 10px;
    fill: var(--pico-muted-color);
    pointer-events: none;
}

/* Pack layout */
.cluster-circle {
    stroke: var(--pico-muted-border-color);
    stroke-width: 1px;
    fill-opacity: 0.7;
}

.cluster-label {
    font-size: 11px;
    fill: #fff;
    pointer-events: none;
    text-anchor: middle;
}

/* D3 tooltip */
.d3-tooltip {
    position: absolute;
    padding: 0.5rem 0.75rem;
    background: var(--pico-card-background-color, #1a1a2e);
    border: 1px solid var(--pico-muted-border-color);
    border-radius: 4px;
    font-size: 0.8rem;
    color: var(--pico-color);
    pointer-events: none;
    z-index: 10;
    max-width: 300px;
}

/* Stats row */
.stats-row {
    display: flex;
    gap: 1rem;
    flex-wrap: wrap;
    margin-bottom: 1.5rem;
}

.stat-card {
    flex: 1;
    min-width: 120px;
    text-align: center;
    padding: 1rem;
}

.stat-card .stat-value {
    font-size: 2rem;
    font-weight: 700;
    color: var(--pico-primary);
}

.stat-card .stat-label {
    font-size: 0.85rem;
    color: var(--pico-muted-color);
}

/* Empty state */
.viz-empty {
    display: flex;
    align-items: center;
    justify-content: center;
    min-height: 300px;
    color: var(--pico-muted-color);
    font-style: italic;
}
4
static/css/vendor/pico.min.css
vendored
Normal file
File diff suppressed because one or more lines are too long
58
static/js/main.js
Normal file
@@ -0,0 +1,58 @@
/* ESP32-Web Dashboard — tab switching & shared utilities */

document.addEventListener('DOMContentLoaded', () => {
    initTabs();
});

function initTabs() {
    const tabs = document.querySelectorAll('.tab-nav a');
    tabs.forEach(tab => {
        tab.addEventListener('click', (e) => {
            e.preventDefault();
            if (tab.classList.contains('active')) return;
            tabs.forEach(t => t.classList.remove('active'));
            tab.classList.add('active');
            // htmx handles the AJAX load via hx-get on the element
        });
    });

    // Load first tab on page load
    const first = document.querySelector('.tab-nav a');
    if (first) {
        first.click();
    }
}

/* Shared D3 tooltip */
function createTooltip() {
    let tip = document.querySelector('.d3-tooltip');
    if (!tip) {
        tip = document.createElement('div');
        tip.className = 'd3-tooltip';
        tip.style.display = 'none';
        document.body.appendChild(tip);
    }
    return {
        show(html, event) {
            tip.innerHTML = html;
            tip.style.display = 'block';
            tip.style.left = (event.pageX + 12) + 'px';
            tip.style.top = (event.pageY - 12) + 'px';
        },
        move(event) {
            tip.style.left = (event.pageX + 12) + 'px';
            tip.style.top = (event.pageY - 12) + 'px';
        },
        hide() {
            tip.style.display = 'none';
        }
    };
}

/* Responsive SVG dimensions from container */
function getVizSize(container) {
    return {
        width: Math.max(container.clientWidth || container.parentElement.clientWidth, 300),
        height: Math.max(container.clientHeight, 400)
    };
}
2
static/js/vendor/d3.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
1
static/js/vendor/htmx.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
110
static/js/viz/fingerprint_clusters.js
Normal file
@@ -0,0 +1,110 @@
/* Fingerprint Clusters — D3 packed circles grouping devices by behavior */
function renderFingerprintClusters(selector, apiUrl) {
    const container = document.querySelector(selector);
    const tooltip = createTooltip();
    const { width, height } = getVizSize(container);

    fetch(apiUrl + '?hours=24')
        .then(r => r.json())
        .then(data => {
            if (!data.clusters || data.clusters.length === 0) {
                container.innerHTML = '<div class="viz-empty">No active devices in the last 24 hours</div>';
                return;
            }

            // Build hierarchy for d3.pack
            const hierarchy = {
                name: 'root',
                children: data.clusters.map(c => ({
                    name: c.label,
                    value: c.device_count,
                    cluster: c,
                })),
            };

            const activityColor = d3.scaleOrdinal()
                .domain(['Low', 'Medium', 'High'])
                .range(['#495057', '#f59f00', '#ff6b6b']);

            const root = d3.hierarchy(hierarchy)
                .sum(d => d.value || 0)
                .sort((a, b) => b.value - a.value);

            d3.pack()
                .size([width, height])
                .padding(8)(root);

            const svg = d3.select(selector)
                .append('svg')
                .attr('viewBox', `0 0 ${width} ${height}`)
                .attr('preserveAspectRatio', 'xMidYMid meet');

            const leaves = svg.selectAll('g')
                .data(root.leaves())
                .join('g')
                .attr('transform', d => `translate(${d.x},${d.y})`);

            leaves.append('circle')
                .attr('class', 'cluster-circle')
                .attr('r', d => d.r)
                .attr('fill', d => activityColor(d.data.cluster.activity))
                .on('mouseover', (event, d) => {
                    const c = d.data.cluster;
                    tooltip.show(
                        `<strong>${c.label}</strong><br>` +
                        `Devices: ${c.device_count}<br>` +
                        `Avg probe rate: ${c.centroid.probe_rate}/device<br>` +
                        `Avg sighting rate: ${c.centroid.sighting_rate}/device<br>` +
                        `Avg RSSI: ${c.centroid.avg_rssi !== null ? c.centroid.avg_rssi + ' dBm' : 'N/A'}`,
                        event
                    );
                    d3.select(event.target).attr('fill-opacity', 1).attr('stroke', '#fff');
                })
                .on('mousemove', (event) => {
                    tooltip.move(event);
                })
                .on('mouseout', (event) => {
                    tooltip.hide();
                    d3.select(event.target).attr('fill-opacity', 0.7).attr('stroke', null);
                });

            // Labels for circles big enough
            leaves.append('text')
                .attr('class', 'cluster-label')
                .attr('dy', '-0.3em')
                .text(d => {
                    if (d.r < 25) return '';
                    const name = d.data.cluster.vendor;
                    const maxChars = Math.floor(d.r * 2 / 7);
                    return name.length > maxChars ? name.slice(0, maxChars - 1) + '\u2026' : name;
                });

            leaves.append('text')
                .attr('class', 'cluster-label')
                .attr('dy', '1em')
                .style('font-size', '10px')
                .style('opacity', 0.8)
                .text(d => d.r >= 20 ? d.data.cluster.device_count : '');

            // Legend
            const legend = svg.append('g')
                .attr('transform', `translate(${width - 140}, 20)`);

            ['Low', 'Medium', 'High'].forEach((level, i) => {
                const g = legend.append('g')
                    .attr('transform', `translate(0, ${i * 22})`);
                g.append('circle')
                    .attr('r', 6)
                    .attr('fill', activityColor(level));
                g.append('text')
                    .attr('x', 14)
                    .attr('y', 4)
                    .attr('fill', '#adb5bd')
                    .style('font-size', '12px')
                    .text(level + ' Activity');
            });
        })
        .catch(err => {
            container.innerHTML = `<div class="viz-empty">Failed to load data: ${err.message}</div>`;
        });
}
110
static/js/viz/ssid_graph.js
Normal file
@@ -0,0 +1,110 @@
/* SSID Social Graph — D3 force-directed graph linking devices by shared probed SSIDs */
function renderSsidGraph(selector, apiUrl) {
    const container = document.querySelector(selector);
    const tooltip = createTooltip();
    const { width, height } = getVizSize(container);

    fetch(apiUrl + '?hours=24&min_shared=1&limit=200')
        .then(r => r.json())
        .then(data => {
            if (!data.nodes || data.nodes.length === 0) {
                container.innerHTML = '<div class="viz-empty">No probe data available</div>';
                return;
            }

            const color = d3.scaleOrdinal()
                .domain(['wifi', 'ble'])
                .range(['#4dabf7', '#69db7c']);

            const sizeScale = d3.scaleSqrt()
                .domain([1, d3.max(data.nodes, d => d.ssid_count) || 1])
                .range([4, 16]);

            const svg = d3.select(selector)
                .append('svg')
                .attr('viewBox', `0 0 ${width} ${height}`)
                .attr('preserveAspectRatio', 'xMidYMid meet');

            const simulation = d3.forceSimulation(data.nodes)
                .force('link', d3.forceLink(data.links).id(d => d.id).distance(80))
                .force('charge', d3.forceManyBody().strength(-100))
                .force('center', d3.forceCenter(width / 2, height / 2))
                .force('collision', d3.forceCollide().radius(d => sizeScale(d.ssid_count) + 2));

            const link = svg.append('g')
                .selectAll('line')
                .data(data.links)
                .join('line')
                .attr('class', 'graph-link')
                .attr('stroke-width', d => Math.max(1, d.weight));

            const node = svg.append('g')
                .selectAll('circle')
                .data(data.nodes)
                .join('circle')
                .attr('class', 'graph-node')
                .attr('r', d => sizeScale(d.ssid_count))
                .attr('fill', d => color(d.type))
                .on('mouseover', (event, d) => {
                    tooltip.show(
                        `<strong>${d.id}</strong><br>` +
                        `Vendor: ${d.vendor}<br>` +
                        `Type: ${d.type}<br>` +
                        `Probed SSIDs: ${d.ssid_count}`,
                        event
                    );
                    d3.select(event.target).attr('stroke', '#fff').attr('stroke-width', 3);
                })
                .on('mousemove', (event) => {
                    tooltip.move(event);
                })
                .on('mouseout', (event) => {
                    tooltip.hide();
                    d3.select(event.target).attr('stroke', null).attr('stroke-width', 1.5);
                })
                .call(drag(simulation));

            // Link hover
            link.on('mouseover', (event, d) => {
                const ssids = d.shared_ssids.slice(0, 5).join(', ');
                const more = d.shared_ssids.length > 5 ? ` (+${d.shared_ssids.length - 5} more)` : '';
                tooltip.show(
                    `<strong>Shared SSIDs (${d.weight})</strong><br>${ssids}${more}`,
                    event
                );
            }).on('mousemove', (event) => tooltip.move(event))
                .on('mouseout', () => tooltip.hide());

            simulation.on('tick', () => {
                link
                    .attr('x1', d => d.source.x)
                    .attr('y1', d => d.source.y)
                    .attr('x2', d => d.target.x)
                    .attr('y2', d => d.target.y);
                node
                    .attr('cx', d => d.x)
                    .attr('cy', d => d.y);
            });

            function drag(sim) {
                return d3.drag()
                    .on('start', (event, d) => {
                        if (!event.active) sim.alphaTarget(0.3).restart();
                        d.fx = d.x;
                        d.fy = d.y;
                    })
                    .on('drag', (event, d) => {
                        d.fx = event.x;
                        d.fy = event.y;
                    })
                    .on('end', (event, d) => {
                        if (!event.active) sim.alphaTarget(0);
                        d.fx = null;
                        d.fy = null;
                    });
            }
        })
        .catch(err => {
            container.innerHTML = `<div class="viz-empty">Failed to load data: ${err.message}</div>`;
        });
}
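The links this graph consumes are pairwise intersections of per-device probed-SSID sets. A sketch of how a backend could derive them (sample MACs and SSIDs are made up; `min_shared` matches the query parameter in the fetch above):

```python
from itertools import combinations

# Hypothetical probed-SSID sets keyed by device MAC
probes = {
    'aa:bb:cc:00:00:01': {'HomeWiFi', 'CafeNet', 'Airport_Free'},
    'aa:bb:cc:00:00:02': {'HomeWiFi', 'CafeNet'},
    'aa:bb:cc:00:00:03': {'OfficeAP'},
}

min_shared = 1
links = []
for a, b in combinations(sorted(probes), 2):
    shared = sorted(probes[a] & probes[b])  # SSIDs both devices probed
    if len(shared) >= min_shared:
        links.append({'source': a, 'target': b,
                      'weight': len(shared), 'shared_ssids': shared})

print(links)
# [{'source': 'aa:bb:cc:00:00:01', 'target': 'aa:bb:cc:00:00:02',
#   'weight': 2, 'shared_ssids': ['CafeNet', 'HomeWiFi']}]
```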
95
static/js/viz/vendor_treemap.js
Normal file
@@ -0,0 +1,95 @@
/* Vendor Treemap — D3 treemap of device vendors by type */
function renderVendorTreemap(selector, apiUrl) {
    const container = document.querySelector(selector);
    const tooltip = createTooltip();
    const { width, height } = getVizSize(container);

    fetch(apiUrl)
        .then(r => r.json())
        .then(data => {
            if (!data.children || data.children.length === 0) {
                container.innerHTML = '<div class="viz-empty">No device data available</div>';
                return;
            }

            const color = d3.scaleOrdinal()
                .domain(['wifi', 'ble'])
                .range(['#4dabf7', '#69db7c']);

            const root = d3.hierarchy(data)
                .sum(d => d.value || 0)
                .sort((a, b) => b.value - a.value);

            d3.treemap()
                .size([width, height])
                .padding(2)
                .round(true)(root);

            const svg = d3.select(selector)
                .append('svg')
                .attr('viewBox', `0 0 ${width} ${height}`)
                .attr('preserveAspectRatio', 'xMidYMid meet');

            const leaves = svg.selectAll('g')
                .data(root.leaves())
                .join('g')
                .attr('transform', d => `translate(${d.x0},${d.y0})`);

            leaves.append('rect')
                .attr('class', 'treemap-cell')
                .attr('width', d => Math.max(0, d.x1 - d.x0))
                .attr('height', d => Math.max(0, d.y1 - d.y0))
                .attr('fill', d => {
                    const type = d.parent ? d.parent.data.name : 'wifi';
                    return color(type);
                })
                .attr('fill-opacity', 0.8)
                .on('mouseover', (event, d) => {
                    const type = d.parent ? d.parent.data.name : '';
                    tooltip.show(
                        `<strong>${d.data.name}</strong><br>` +
                        `Type: ${type}<br>` +
                        `Devices: ${d.value}`,
                        event
                    );
                    d3.select(event.target).attr('fill-opacity', 1);
                })
                .on('mousemove', (event) => {
                    tooltip.move(event);
                })
                .on('mouseout', (event) => {
                    tooltip.hide();
                    d3.select(event.target).attr('fill-opacity', 0.8);
                });

            // Labels only for cells big enough
            leaves.append('text')
                .attr('class', 'treemap-label')
                .attr('x', 4)
                .attr('y', 14)
                .text(d => {
                    const w = d.x1 - d.x0;
                    const h = d.y1 - d.y0;
                    if (w < 40 || h < 18) return '';
                    const name = d.data.name;
                    const maxChars = Math.floor(w / 7);
                    return name.length > maxChars ? name.slice(0, maxChars - 1) + '\u2026' : name;
                });

            leaves.append('text')
                .attr('class', 'treemap-label')
                .attr('x', 4)
                .attr('y', 28)
                .style('font-size', '10px')
                .style('opacity', 0.8)
                .text(d => {
                    const w = d.x1 - d.x0;
                    const h = d.y1 - d.y0;
                    if (w < 30 || h < 32) return '';
                    return d.value;
                });
        })
        .catch(err => {
            container.innerHTML = `<div class="viz-empty">Failed to load data: ${err.message}</div>`;
        });
}
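The treemap's label truncation assumes roughly 7 px per character. The same rule expressed in Python for clarity (the per-character width is an assumption, not a measured glyph size):

```python
def fit_label(name: str, cell_width_px: float, char_px: int = 7) -> str:
    """Truncate a label to fit a cell, appending an ellipsis, as the treemap does."""
    max_chars = int(cell_width_px // char_px)
    if len(name) <= max_chars:
        return name
    return name[:max_chars - 1] + '\u2026'

print(fit_label('Espressif Inc.', 140))                # fits unchanged
print(fit_label('Samsung Electronics Co., Ltd.', 70))  # truncated with ellipsis
```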
30
templates/base.html
Normal file
@@ -0,0 +1,30 @@
<!DOCTYPE html>
<html lang="en" data-theme="dark">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>{% block title %}ESP32-Web{% endblock %}</title>
  <link rel="stylesheet" href="{{ url_for('static', filename='css/vendor/pico.min.css') }}">
  <link rel="stylesheet" href="{{ url_for('static', filename='css/main.css') }}">
  {% block head %}{% endblock %}
</head>
<body>
  <nav class="container-fluid">
    <ul>
      <li><strong>ESP32-Web</strong></li>
    </ul>
    <ul>
      <li><a href="{{ url_for('dashboard.index') }}">Dashboard</a></li>
      <li><a href="/docs">API Docs</a></li>
      <li><a href="/health">Health</a></li>
    </ul>
  </nav>
  <main class="container">
    {% block content %}{% endblock %}
  </main>
  <script src="{{ url_for('static', filename='js/vendor/htmx.min.js') }}"></script>
  <script src="{{ url_for('static', filename='js/vendor/d3.min.js') }}"></script>
  <script src="{{ url_for('static', filename='js/main.js') }}"></script>
  {% block scripts %}{% endblock %}
</body>
</html>
19
templates/dashboard/index.html
Normal file
@@ -0,0 +1,19 @@
{% extends "base.html" %}
{% block title %}Device Intelligence — ESP32-Web{% endblock %}

{% block content %}
<hgroup>
  <h2>Device Intelligence</h2>
  <p>{{ total_devices }} devices tracked across {{ total_sensors }} sensors</p>
</hgroup>

<nav class="tab-nav">
  <a href="#" hx-get="{{ url_for('dashboard.tab_vendor_treemap') }}" hx-target="#tab-content">Vendor Treemap</a>
  <a href="#" hx-get="{{ url_for('dashboard.tab_ssid_graph') }}" hx-target="#tab-content">SSID Graph</a>
  <a href="#" hx-get="{{ url_for('dashboard.tab_fingerprint_clusters') }}" hx-target="#tab-content">Fingerprint Clusters</a>
</nav>

<div id="tab-content">
  <div class="viz-loading" aria-busy="true">Loading...</div>
</div>
{% endblock %}
templates/dashboard/partials/fingerprint_clusters.html (new file, 3 lines)
@@ -0,0 +1,3 @@
<div class="viz-container" id="fingerprint-clusters"></div>
<script src="{{ url_for('static', filename='js/viz/fingerprint_clusters.js') }}"></script>
<script>renderFingerprintClusters('#fingerprint-clusters', '{{ url_for("api.fingerprint_clusters") }}');</script>
templates/dashboard/partials/ssid_graph.html (new file, 3 lines)
@@ -0,0 +1,3 @@
<div class="viz-container" id="ssid-graph"></div>
<script src="{{ url_for('static', filename='js/viz/ssid_graph.js') }}"></script>
<script>renderSsidGraph('#ssid-graph', '{{ url_for("api.ssid_graph") }}');</script>
templates/dashboard/partials/vendor_treemap.html (new file, 3 lines)
@@ -0,0 +1,3 @@
<div class="viz-container" id="vendor-treemap"></div>
<script src="{{ url_for('static', filename='js/viz/vendor_treemap.js') }}"></script>
<script>renderVendorTreemap('#vendor-treemap', '{{ url_for("api.vendor_treemap") }}');</script>
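Each partial pairs an empty container with a render call pointed at one of the intelligence endpoints. The vendor-treemap endpoint serves a `d3.hierarchy`-compatible tree; as a rough illustration of that payload shape (inferred from the test suite — `build_treemap` and its tuple input format are hypothetical, not the project's actual code):

```python
from collections import Counter


def build_treemap(devices):
    """Group (device_type, vendor) pairs into a d3.hierarchy-style tree.

    devices: iterable of (device_type, vendor) tuples; vendor may be None.
    Returns {'name': 'devices', 'children': [...]} where each child is a
    device-type node whose children are {'name': vendor, 'value': count}.
    """
    counts = Counter((dtype, vendor or 'Unknown') for dtype, vendor in devices)
    by_type = {}
    for (dtype, vendor), n in sorted(counts.items()):
        by_type.setdefault(dtype, []).append({'name': vendor, 'value': n})
    return {
        'name': 'devices',
        'children': [{'name': t, 'children': kids} for t, kids in by_type.items()],
    }
```

An empty input yields `{'name': 'devices', 'children': []}`, which matches what the empty-database treemap test asserts.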
tests/test_api/test_export.py (new file, 17 lines)
@@ -0,0 +1,17 @@
"""Export endpoint tests."""


def test_export_devices_csv_empty(client):
    """Test exporting devices CSV when empty."""
    response = client.get('/api/v1/export/devices.csv')
    assert response.status_code == 200
    assert response.content_type == 'text/csv; charset=utf-8'
    assert b'mac,type,vendor' in response.data


def test_export_devices_json_empty(client):
    """Test exporting devices JSON when empty."""
    response = client.get('/api/v1/export/devices.json')
    assert response.status_code == 200
    assert response.content_type == 'application/json'
    assert response.json == []
tests/test_api/test_intelligence.py (new file, 136 lines)
@@ -0,0 +1,136 @@
"""Intelligence API tests."""
from datetime import datetime, UTC, timedelta
from esp32_web.extensions import db
from esp32_web.models import Device, Probe, Sighting, Sensor


def _seed_devices(app):
    """Create test devices, probes, and sightings."""
    with app.app_context():
        sensor = Sensor(hostname='test-sensor', ip='192.168.1.1')
        db.session.add(sensor)
        db.session.flush()

        d1 = Device(mac='aa:bb:cc:dd:ee:01', device_type='wifi',
                    vendor='Apple, Inc.', first_seen=datetime.now(UTC),
                    last_seen=datetime.now(UTC))
        d2 = Device(mac='aa:bb:cc:dd:ee:02', device_type='wifi',
                    vendor='Samsung', first_seen=datetime.now(UTC),
                    last_seen=datetime.now(UTC))
        d3 = Device(mac='aa:bb:cc:dd:ee:03', device_type='ble',
                    company_id=0x004C, first_seen=datetime.now(UTC),
                    last_seen=datetime.now(UTC))
        db.session.add_all([d1, d2, d3])
        db.session.flush()

        now = datetime.now(UTC)
        # Both d1 and d2 probe "HomeNet"
        db.session.add_all([
            Probe(device_id=d1.id, sensor_id=sensor.id, ssid='HomeNet',
                  rssi=-50, channel=6, timestamp=now),
            Probe(device_id=d2.id, sensor_id=sensor.id, ssid='HomeNet',
                  rssi=-60, channel=6, timestamp=now),
            Probe(device_id=d1.id, sensor_id=sensor.id, ssid='Office',
                  rssi=-55, channel=1, timestamp=now),
        ])
        # Sightings
        db.session.add_all([
            Sighting(device_id=d1.id, sensor_id=sensor.id, rssi=-50, timestamp=now),
            Sighting(device_id=d3.id, sensor_id=sensor.id, rssi=-70, timestamp=now),
        ])
        db.session.commit()


# -- Vendor Treemap --

def test_vendor_treemap_empty(client):
    """Test treemap with no devices."""
    response = client.get('/api/v1/intelligence/vendor-treemap')
    assert response.status_code == 200
    assert response.json['name'] == 'devices'
    assert response.json['children'] == []


def test_vendor_treemap_with_data(client, app):
    """Test treemap returns grouped vendor data."""
    _seed_devices(app)
    response = client.get('/api/v1/intelligence/vendor-treemap')
    assert response.status_code == 200
    data = response.json
    assert data['name'] == 'devices'
    assert len(data['children']) > 0

    # Find wifi group
    wifi = next((c for c in data['children'] if c['name'] == 'wifi'), None)
    assert wifi is not None
    assert len(wifi['children']) >= 1

    # Find ble group
    ble = next((c for c in data['children'] if c['name'] == 'ble'), None)
    assert ble is not None
    assert len(ble['children']) >= 1


# -- SSID Graph --

def test_ssid_graph_empty(client):
    """Test graph with no probes."""
    response = client.get('/api/v1/intelligence/ssid-graph')
    assert response.status_code == 200
    assert response.json['nodes'] == []
    assert response.json['links'] == []


def test_ssid_graph_with_data(client, app):
    """Test graph returns nodes and links for shared SSIDs."""
    _seed_devices(app)
    response = client.get('/api/v1/intelligence/ssid-graph?hours=24')
    assert response.status_code == 200
    data = response.json
    assert len(data['nodes']) >= 2
    # d1 and d2 share "HomeNet" → at least one link
    assert len(data['links']) >= 1
    link = data['links'][0]
    assert 'HomeNet' in link['shared_ssids']
    assert link['weight'] >= 1

    # SSID summary present
    assert len(data['ssids']) >= 1


def test_ssid_graph_params(client, app):
    """Test graph with custom parameters."""
    _seed_devices(app)
    response = client.get('/api/v1/intelligence/ssid-graph?hours=1&min_shared=5&limit=10')
    assert response.status_code == 200
    # With min_shared=5, no pairs should match
    assert response.json['links'] == []


# -- Fingerprint Clusters --

def test_fingerprint_clusters_empty(client):
    """Test clusters with no activity."""
    response = client.get('/api/v1/intelligence/fingerprint-clusters')
    assert response.status_code == 200
    assert response.json['clusters'] == []
    assert response.json['total_devices'] == 0


def test_fingerprint_clusters_with_data(client, app):
    """Test clusters group devices by behavior."""
    _seed_devices(app)
    response = client.get('/api/v1/intelligence/fingerprint-clusters?hours=24')
    assert response.status_code == 200
    data = response.json
    assert data['total_devices'] >= 2
    assert data['total_clusters'] >= 1

    # Each cluster has expected structure
    cluster = data['clusters'][0]
    assert 'label' in cluster
    assert 'device_count' in cluster
    assert 'devices' in cluster
    assert 'centroid' in cluster
    assert 'probe_rate' in cluster['centroid']
    assert 'sighting_rate' in cluster['centroid']
tests/test_api/test_pagination.py (new file, 128 lines)
@@ -0,0 +1,128 @@
"""Pagination tests for all list endpoints."""
from datetime import datetime, UTC
from esp32_web.extensions import db
from esp32_web.models import Sensor, Device, Alert, Event, Probe


def _create_sensors(app, n):
    with app.app_context():
        for i in range(n):
            db.session.add(Sensor(hostname=f'sensor-{i:03d}', ip=f'192.168.1.{i}'))
        db.session.commit()


def test_sensors_pagination_defaults(client, app):
    """Sensors endpoint returns total, limit, offset."""
    _create_sensors(app, 3)
    resp = client.get('/api/v1/sensors')
    assert resp.status_code == 200
    assert resp.json['total'] == 3
    assert resp.json['limit'] == 100
    assert resp.json['offset'] == 0
    assert len(resp.json['sensors']) == 3


def test_sensors_pagination_limit(client, app):
    """Sensors limit param restricts returned items."""
    _create_sensors(app, 5)
    resp = client.get('/api/v1/sensors?limit=2')
    assert resp.json['total'] == 5
    assert resp.json['limit'] == 2
    assert len(resp.json['sensors']) == 2


def test_sensors_pagination_offset(client, app):
    """Sensors offset param skips items."""
    _create_sensors(app, 5)
    resp = client.get('/api/v1/sensors?limit=2&offset=3')
    assert resp.json['total'] == 5
    assert resp.json['offset'] == 3
    assert len(resp.json['sensors']) == 2


def test_sensors_pagination_max_limit(client, app):
    """Limit is capped at 1000."""
    _create_sensors(app, 1)
    resp = client.get('/api/v1/sensors?limit=5000')
    assert resp.json['limit'] == 1000


def test_devices_pagination(client, app):
    """Devices endpoint includes total count."""
    with app.app_context():
        for i in range(3):
            db.session.add(Device(
                mac=f'aa:bb:cc:dd:ee:{i:02x}',
                device_type='wifi',
                last_seen=datetime.now(UTC),
            ))
        db.session.commit()

    resp = client.get('/api/v1/devices?limit=2')
    assert resp.status_code == 200
    assert resp.json['total'] == 3
    assert len(resp.json['devices']) == 2


def test_alerts_pagination(client, app):
    """Alerts endpoint includes total count."""
    with app.app_context():
        sensor = Sensor(hostname='s1', ip='10.0.0.1')
        db.session.add(sensor)
        db.session.flush()
        for _ in range(4):
            db.session.add(Alert(
                sensor_id=sensor.id,
                alert_type='deauth',
                timestamp=datetime.now(UTC),
            ))
        db.session.commit()

    resp = client.get('/api/v1/alerts?limit=2&hours=1')
    assert resp.status_code == 200
    assert resp.json['total'] == 4
    assert len(resp.json['alerts']) == 2


def test_events_pagination(client, app):
    """Events endpoint includes total count."""
    with app.app_context():
        sensor = Sensor(hostname='s1', ip='10.0.0.1')
        db.session.add(sensor)
        db.session.flush()
        for _ in range(3):
            db.session.add(Event(
                sensor_id=sensor.id,
                event_type='presence',
                timestamp=datetime.now(UTC),
            ))
        db.session.commit()

    resp = client.get('/api/v1/events?hours=1')
    assert resp.status_code == 200
    assert resp.json['total'] == 3


def test_probes_pagination(client, app):
    """Probes endpoint includes total count."""
    with app.app_context():
        sensor = Sensor(hostname='s1', ip='10.0.0.1')
        device = Device(mac='aa:bb:cc:dd:ee:ff', device_type='wifi',
                        last_seen=datetime.now(UTC))
        db.session.add_all([sensor, device])
        db.session.flush()
        for _ in range(3):
            db.session.add(Probe(
                device_id=device.id,
                sensor_id=sensor.id,
                ssid='TestNet',
                rssi=-50,
                channel=6,
                timestamp=datetime.now(UTC),
            ))
        db.session.commit()

    resp = client.get('/api/v1/probes?hours=1&limit=2')
    assert resp.status_code == 200
    assert resp.json['total'] == 3
    assert len(resp.json['probes']) == 2
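The defaults these tests pin down (limit 100, cap at 1000, offset 0) suggest a small clamping helper shared by the list endpoints; a sketch under those assumptions (`paginate_args` is a hypothetical name, not necessarily the project's):

```python
def paginate_args(args, default_limit=100, max_limit=1000):
    """Extract clamped (limit, offset) from a query-arg mapping.

    Mirrors the behaviour the tests assert: limit defaults to 100 and is
    capped at 1000; offset defaults to 0 and is never negative.
    Non-numeric values fall back to the defaults.
    """
    try:
        limit = int(args.get('limit', default_limit))
    except (TypeError, ValueError):
        limit = default_limit
    try:
        offset = int(args.get('offset', 0))
    except (TypeError, ValueError):
        offset = 0
    limit = max(1, min(limit, max_limit))
    offset = max(0, offset)
    return limit, offset
```

Returning the clamped values (rather than the raw query args) in the response body is what lets a test like `test_sensors_pagination_max_limit` observe `limit == 1000` after requesting 5000.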
@@ -1,11 +1,17 @@
 """Sensor API tests."""
+from unittest.mock import patch, MagicMock
+from esp32_web.extensions import db
+from esp32_web.models import Sensor, Event
+
+
 def test_list_sensors_empty(client):
     """Test listing sensors when empty."""
     response = client.get('/api/v1/sensors')
     assert response.status_code == 200
-    assert response.json == {'sensors': []}
+    assert response.json['sensors'] == []
+    assert response.json['total'] == 0
+    assert response.json['limit'] == 100
+    assert response.json['offset'] == 0
 
 
 def test_get_sensor_not_found(client):
@@ -18,4 +24,317 @@ def test_health_check(client):
     """Test health endpoint."""
     response = client.get('/health')
     assert response.status_code == 200
-    assert response.json == {'status': 'ok'}
+    assert response.json['status'] == 'ok'
+    assert 'uptime' in response.json
+    assert 'uptime_seconds' in response.json
+
+
+# Fleet Management Tests
+
+def test_get_config_not_found(client):
+    """Test getting config for non-existent sensor."""
+    response = client.get('/api/v1/sensors/nonexistent/config')
+    assert response.status_code == 404
+
+
+def test_get_config_timeout(client, app):
+    """Test config endpoint when sensor times out."""
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.commit()
+
+    with patch('esp32_web.api.sensors.socket.socket') as mock_socket:
+        mock_sock = MagicMock()
+        mock_sock.recvfrom.side_effect = TimeoutError()
+        mock_socket.return_value = mock_sock
+
+        response = client.get('/api/v1/sensors/test-sensor/config')
+        assert response.status_code == 504
+        assert 'not responding' in response.json['error']
+
+
+def test_get_config_success(client, app):
+    """Test successful config retrieval."""
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.commit()
+
+    with patch('esp32_web.api.sensors.socket.socket') as mock_socket:
+        mock_sock = MagicMock()
+        mock_sock.recvfrom.return_value = (
+            b'OK STATUS rate=10 power=20 adaptive=on presence=off',
+            ('192.168.1.100', 5501)
+        )
+        mock_socket.return_value = mock_sock
+
+        response = client.get('/api/v1/sensors/test-sensor/config')
+        assert response.status_code == 200
+        assert response.json['config']['rate'] == 10
+        assert response.json['config']['power'] == 20
+        assert response.json['config']['adaptive'] is True
+        assert response.json['config']['presence'] is False
+
+
+def test_update_config_not_found(client):
+    """Test updating config for non-existent sensor."""
+    response = client.put('/api/v1/sensors/nonexistent/config',
+                          json={'rate': 5})
+    assert response.status_code == 404
+
+
+def test_update_config_unknown_key(client, app):
+    """Test updating config with unknown key."""
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.commit()
+
+    with patch('esp32_web.api.sensors.socket.socket'):
+        response = client.put('/api/v1/sensors/test-sensor/config',
+                              json={'invalid_key': 123})
+        assert response.status_code == 200
+        assert 'Unknown config key' in response.json['errors'][0]
+
+
+def test_update_config_success(client, app):
+    """Test successful config update."""
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.commit()
+
+    with patch('esp32_web.api.sensors.socket.socket') as mock_socket:
+        mock_sock = MagicMock()
+        mock_socket.return_value = mock_sock
+
+        response = client.put('/api/v1/sensors/test-sensor/config',
+                              json={'rate': 5, 'adaptive': True})
+        assert response.status_code == 200
+        assert response.json['results']['rate'] == 'ok'
+        assert response.json['results']['adaptive'] == 'ok'
+
+
+def test_trigger_ota_not_found(client):
+    """Test OTA trigger for non-existent sensor."""
+    response = client.post('/api/v1/sensors/nonexistent/ota',
+                           json={'url': 'http://example.com/fw.bin'})
+    assert response.status_code == 404
+
+
+def test_trigger_ota_missing_url(client, app):
+    """Test OTA trigger without URL."""
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.commit()
+
+    response = client.post('/api/v1/sensors/test-sensor/ota', json={})
+    assert response.status_code == 400
+    assert 'Missing OTA URL' in response.json['error']
+
+
+def test_trigger_ota_invalid_url(client, app):
+    """Test OTA trigger with invalid URL scheme."""
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.commit()
+
+    response = client.post('/api/v1/sensors/test-sensor/ota',
+                           json={'url': 'ftp://example.com/fw.bin'})
+    assert response.status_code == 400
+    assert 'Invalid URL scheme' in response.json['error']
+
+
+def test_trigger_ota_success(client, app):
+    """Test successful OTA trigger."""
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.commit()
+
+    with patch('esp32_web.api.sensors.socket.socket') as mock_socket:
+        mock_sock = MagicMock()
+        mock_socket.return_value = mock_sock
+
+        response = client.post('/api/v1/sensors/test-sensor/ota',
+                               json={'url': 'https://example.com/fw.bin'})
+        assert response.status_code == 200
+        assert response.json['status'] == 'ota_triggered'
+        assert response.json['url'] == 'https://example.com/fw.bin'
+
+
+def test_trigger_calibrate_not_found(client):
+    """Test calibration trigger for non-existent sensor."""
+    response = client.post('/api/v1/sensors/nonexistent/calibrate', json={})
+    assert response.status_code == 404
+
+
+def test_trigger_calibrate_invalid_seconds(client, app):
+    """Test calibration with invalid seconds."""
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.commit()
+
+    response = client.post('/api/v1/sensors/test-sensor/calibrate',
+                           json={'seconds': 100})
+    assert response.status_code == 400
+    assert 'seconds must be 3-60' in response.json['error']
+
+
+def test_trigger_calibrate_success(client, app):
+    """Test successful calibration trigger."""
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.commit()
+
+    with patch('esp32_web.api.sensors.socket.socket') as mock_socket:
+        mock_sock = MagicMock()
+        mock_socket.return_value = mock_sock
+
+        response = client.post('/api/v1/sensors/test-sensor/calibrate',
+                               json={'seconds': 15})
+        assert response.status_code == 200
+        assert response.json['status'] == 'calibration_started'
+        assert response.json['seconds'] == 15
+
+
+def test_trigger_calibrate_default_seconds(client, app):
+    """Test calibration with default seconds."""
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.commit()
+
+    with patch('esp32_web.api.sensors.socket.socket') as mock_socket:
+        mock_sock = MagicMock()
+        mock_socket.return_value = mock_sock
+
+        response = client.post('/api/v1/sensors/test-sensor/calibrate',
+                               json={})
+        assert response.status_code == 200
+        assert response.json['seconds'] == 10
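The mocked UDP reply `OK STATUS rate=10 power=20 adaptive=on presence=off` implies the server splits key=value tokens, parsing numbers to ints and `on`/`off` to booleans before returning them as the sensor's config. A hedged sketch of such a parser (the real code in `esp32_web.api.sensors` may differ; `parse_status_reply` is an illustrative name):

```python
def parse_status_reply(raw):
    """Parse a sensor 'OK STATUS key=value ...' UDP reply into a config dict.

    Numeric values become ints; 'on'/'off' become booleans; anything else
    stays a string. Raises ValueError if the reply is not an OK STATUS line.
    """
    parts = raw.decode('ascii', errors='replace').split()
    if parts[:2] != ['OK', 'STATUS']:
        raise ValueError(f'unexpected reply: {raw!r}')
    config = {}
    for token in parts[2:]:
        key, _, value = token.partition('=')
        if value in ('on', 'off'):
            config[key] = (value == 'on')
        elif value.lstrip('-').isdigit():
            config[key] = int(value)
        else:
            config[key] = value
    return config
```

Applied to the mocked reply, this yields exactly the `rate`/`power`/`adaptive`/`presence` values `test_get_config_success` asserts.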
+
+
+# Heartbeat Tests
+
+def test_heartbeat_status_empty(client):
+    """Test heartbeat status with no sensors."""
+    response = client.get('/api/v1/sensors/heartbeat')
+    assert response.status_code == 200
+    assert response.json['total'] == 0
+    assert response.json['sensors'] == []
+
+
+def test_heartbeat_status_with_sensors(client, app):
+    """Test heartbeat status with sensors."""
+    from datetime import datetime, UTC, timedelta
+
+    with app.app_context():
+        # Online sensor (just now)
+        s1 = Sensor(hostname='sensor-online', ip='192.168.1.1',
+                    last_seen=datetime.now(UTC))
+        # Stale sensor (3 minutes ago)
+        s2 = Sensor(hostname='sensor-stale', ip='192.168.1.2',
+                    last_seen=datetime.now(UTC) - timedelta(minutes=3))
+        # Offline sensor (10 minutes ago)
+        s3 = Sensor(hostname='sensor-offline', ip='192.168.1.3',
+                    last_seen=datetime.now(UTC) - timedelta(minutes=10))
+        db.session.add_all([s1, s2, s3])
+        db.session.commit()
+
+    response = client.get('/api/v1/sensors/heartbeat')
+    assert response.status_code == 200
+    assert response.json['total'] == 3
+    assert response.json['online'] == 1
+    assert response.json['stale'] == 1
+    assert response.json['offline'] == 1
+
+
+def test_refresh_heartbeats(client, app):
+    """Test refreshing heartbeat status."""
+    from datetime import datetime, UTC, timedelta
+
+    with app.app_context():
+        # Offline sensor but status still says 'online'
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.1',
+                        last_seen=datetime.now(UTC) - timedelta(minutes=10),
+                        status='online')
+        db.session.add(sensor)
+        db.session.commit()
+
+    response = client.post('/api/v1/sensors/heartbeat')
+    assert response.status_code == 200
+    assert response.json['status'] == 'updated'
+    assert response.json['offline'] == 1
+
+    # Verify status was updated
+    with app.app_context():
+        sensor = db.session.scalar(db.select(Sensor).where(Sensor.hostname == 'test-sensor'))
+        assert sensor.status == 'offline'
+
+
+# Metrics Tests
+
+def test_metrics_not_found(client):
+    """Test metrics for non-existent sensor."""
+    response = client.get('/api/v1/sensors/nonexistent/metrics')
+    assert response.status_code == 404
+
+
+def test_metrics_empty(client, app):
+    """Test metrics for sensor with no activity."""
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.commit()
+
+    response = client.get('/api/v1/sensors/test-sensor/metrics')
+    assert response.status_code == 200
+    assert response.json['hostname'] == 'test-sensor'
+    assert response.json['hours'] == 24
+    assert response.json['activity']['sightings'] == 0
+    assert response.json['activity']['alerts'] == 0
+    assert response.json['activity']['events'] == 0
+    assert response.json['recent_events'] == []
+
+
+def test_metrics_with_events(client, app):
+    """Test metrics with sensor events."""
+    from datetime import datetime, UTC
+
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.flush()
+
+        # Add some events
+        event1 = Event(sensor_id=sensor.id, event_type='presence',
+                       payload_json='{"state": "detected"}',
+                       timestamp=datetime.now(UTC))
+        event2 = Event(sensor_id=sensor.id, event_type='calibration',
+                       payload_json='{"nsub": 52}',
+                       timestamp=datetime.now(UTC))
+        db.session.add_all([event1, event2])
+        db.session.commit()
+
+    response = client.get('/api/v1/sensors/test-sensor/metrics')
+    assert response.status_code == 200
+    assert response.json['activity']['events'] == 2
+    assert len(response.json['recent_events']) == 2
+
+
+def test_metrics_custom_hours(client, app):
+    """Test metrics with custom time range."""
+    with app.app_context():
+        sensor = Sensor(hostname='test-sensor', ip='192.168.1.100')
+        db.session.add(sensor)
+        db.session.commit()
+
+    response = client.get('/api/v1/sensors/test-sensor/metrics?hours=48')
+    assert response.status_code == 200
+    assert response.json['hours'] == 48
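The heartbeat tests imply three states derived from `last_seen` age: a sensor seen just now is online, one seen 3 minutes ago is stale, and one seen 10 minutes ago is offline. A sketch of that classification, assuming cutoffs of 2 and 5 minutes (the actual thresholds are not shown in this diff):

```python
from datetime import datetime, timedelta, timezone

ONLINE_WINDOW = timedelta(minutes=2)   # assumed: younger than this → online
OFFLINE_AFTER = timedelta(minutes=5)   # assumed: older than this → offline


def heartbeat_state(last_seen, now=None):
    """Classify a sensor as 'online', 'stale', or 'offline' by last_seen age."""
    now = now or datetime.now(timezone.utc)
    age = now - last_seen
    if age < ONLINE_WINDOW:
        return 'online'
    if age < OFFLINE_AFTER:
        return 'stale'
    return 'offline'
```

Any pair of cutoffs with 2-minute < stale boundary ≤ 3-minute < offline boundary ≤ 10-minute would satisfy these tests; the values above are one consistent choice.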
tests/test_dashboard/__init__.py (new file, 0 lines)
tests/test_dashboard/test_views.py (new file, 29 lines)
@@ -0,0 +1,29 @@
"""Dashboard view tests."""


def test_dashboard_index(client):
    """Test main dashboard page loads."""
    response = client.get('/dashboard/')
    assert response.status_code == 200
    assert b'Device Intelligence' in response.data


def test_dashboard_tab_vendor_treemap(client):
    """Test vendor treemap partial loads."""
    response = client.get('/dashboard/tab/vendor-treemap')
    assert response.status_code == 200
    assert b'vendor-treemap' in response.data


def test_dashboard_tab_ssid_graph(client):
    """Test SSID graph partial loads."""
    response = client.get('/dashboard/tab/ssid-graph')
    assert response.status_code == 200
    assert b'ssid-graph' in response.data


def test_dashboard_tab_fingerprint_clusters(client):
    """Test fingerprint clusters partial loads."""
    response = client.get('/dashboard/tab/fingerprint-clusters')
    assert response.status_code == 200
    assert b'fingerprint-clusters' in response.data
tests/test_services/__init__.py (new file, 0 lines)

tests/test_services/test_retention.py (new file, 127 lines)
@@ -0,0 +1,127 @@
|
|||||||
|
"""Data retention service tests."""
|
||||||
|
from datetime import datetime, UTC, timedelta
|
||||||
|
from esp32_web.extensions import db
|
||||||
|
from esp32_web.models import Sensor, Device, Sighting, Probe, Event, Alert
|
||||||
|
from esp32_web.services.retention import cleanup_old_data
|
||||||
|
|
||||||
|
|
||||||
|
def _setup_sensor_and_device(app):
|
||||||
|
"""Create a sensor and device for FK references."""
|
||||||
|
with app.app_context():
|
||||||
|
sensor = Sensor(hostname='s1', ip='10.0.0.1')
|
||||||
|
device = Device(mac='aa:bb:cc:dd:ee:ff', device_type='wifi',
|
||||||
|
last_seen=datetime.now(UTC))
|
||||||
|
db.session.add_all([sensor, device])
|
||||||
|
db.session.commit()
|
||||||
|
return sensor.id, device.id
|
||||||
|
|
||||||
|
|
||||||
|
def test_cleanup_deletes_old_sightings(app):
|
||||||
|
"""Sightings older than retention period are deleted."""
|
||||||
|
sensor_id, device_id = _setup_sensor_and_device(app)
|
||||||
|
with app.app_context():
|
||||||
|
# Old sighting (30 days ago, retention=14)
|
||||||
|
db.session.add(Sighting(
|
||||||
|
device_id=device_id, sensor_id=sensor_id, rssi=-50,
|
||||||
|
timestamp=datetime.now(UTC) - timedelta(days=30),
|
||||||
|
))
|
||||||
|
# Recent sighting (1 day ago)
|
||||||
|
db.session.add(Sighting(
|
||||||
|
device_id=device_id, sensor_id=sensor_id, rssi=-60,
|
||||||
|
timestamp=datetime.now(UTC) - timedelta(days=1),
|
||||||
|
))
|
||||||
|
db.session.commit()
|
||||||
|
|
||||||
|
counts = cleanup_old_data()
|
||||||
|
assert counts['sightings'] == 1
|
||||||
|
|
||||||
|
remaining = db.session.scalar(
|
||||||
|
db.select(db.func.count(Sighting.id))
|
||||||
|
)
|
||||||
|
assert remaining == 1
|
||||||
|
|
||||||
|
|
||||||
|
def test_cleanup_deletes_old_probes(app):
    """Probes older than retention period are deleted."""
    sensor_id, device_id = _setup_sensor_and_device(app)
    with app.app_context():
        db.session.add(Probe(
            device_id=device_id, sensor_id=sensor_id,
            ssid='OldNet', rssi=-50, channel=6,
            timestamp=datetime.now(UTC) - timedelta(days=30),
        ))
        db.session.add(Probe(
            device_id=device_id, sensor_id=sensor_id,
            ssid='NewNet', rssi=-40, channel=1,
            timestamp=datetime.now(UTC) - timedelta(days=1),
        ))
        db.session.commit()

        counts = cleanup_old_data()
        assert counts['probes'] == 1

        remaining = db.session.scalar(
            db.select(db.func.count(Probe.id))
        )
        assert remaining == 1


def test_cleanup_deletes_old_events(app):
    """Events older than 60 days are deleted."""
    sensor_id, _ = _setup_sensor_and_device(app)
    with app.app_context():
        db.session.add(Event(
            sensor_id=sensor_id, event_type='presence',
            timestamp=datetime.now(UTC) - timedelta(days=90),
        ))
        db.session.add(Event(
            sensor_id=sensor_id, event_type='presence',
            timestamp=datetime.now(UTC) - timedelta(days=10),
        ))
        db.session.commit()

        counts = cleanup_old_data()
        assert counts['events'] == 1

        remaining = db.session.scalar(
            db.select(db.func.count(Event.id))
        )
        assert remaining == 1


def test_cleanup_deletes_old_alerts(app):
    """Alerts older than 365 days are deleted."""
    sensor_id, _ = _setup_sensor_and_device(app)
    with app.app_context():
        db.session.add(Alert(
            sensor_id=sensor_id, alert_type='deauth',
            timestamp=datetime.now(UTC) - timedelta(days=400),
        ))
        db.session.add(Alert(
            sensor_id=sensor_id, alert_type='deauth',
            timestamp=datetime.now(UTC) - timedelta(days=100),
        ))
        db.session.commit()

        counts = cleanup_old_data()
        assert counts['alerts'] == 1

        remaining = db.session.scalar(
            db.select(db.func.count(Alert.id))
        )
        assert remaining == 1


def test_cleanup_no_expired_data(app):
    """Cleanup with no expired data returns zero counts."""
    with app.app_context():
        counts = cleanup_old_data()
        assert all(v == 0 for v in counts.values())


def test_cleanup_cli_command(app):
    """CLI command runs and outputs results."""
    runner = app.test_cli_runner()
    result = runner.invoke(args=['cleanup-data'])
    assert result.exit_code == 0
    assert 'No expired data found' in result.output
1 tests/test_utils/__init__.py Normal file
@@ -0,0 +1 @@
"""Utils tests."""
25 tests/test_utils/test_ble_companies.py Normal file
@@ -0,0 +1,25 @@
"""BLE company lookup tests."""
from esp32_web.utils.ble_companies import lookup_ble_company, get_all_companies


def test_lookup_apple():
    """Test Apple company ID lookup."""
    assert lookup_ble_company(0x004C) == 'Apple, Inc.'


def test_lookup_google():
    """Test Google company ID lookup."""
    assert lookup_ble_company(0x00E0) == 'Google'


def test_lookup_unknown():
    """Test unknown company ID lookup."""
    assert lookup_ble_company(0xFFFF) is None


def test_get_all_companies():
    """Test getting all companies."""
    companies = get_all_companies()
    assert isinstance(companies, dict)
    assert len(companies) > 0
    assert 0x004C in companies
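The contract these tests pin down is a plain mapping from 16-bit Bluetooth SIG company identifiers to names, with `None` for unknown IDs. A minimal stand-in illustrating that contract (the table here is a tiny illustrative subset, not the project's real data, and this is not the module's actual implementation):

```python
# Illustrative subset of Bluetooth SIG company identifiers. The real
# module presumably carries a much larger table.
BLE_COMPANIES = {
    0x004C: 'Apple, Inc.',
    0x00E0: 'Google',
}


def lookup_ble_company(company_id):
    """Return the company name for a BLE company ID, or None if unknown."""
    return BLE_COMPANIES.get(company_id)
```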
18 tests/test_utils/test_oui.py Normal file
@@ -0,0 +1,18 @@
"""OUI lookup tests."""
from esp32_web.utils.oui import _normalize_mac, lookup_vendor


def test_normalize_mac():
    """Test MAC normalization."""
    assert _normalize_mac('aa:bb:cc:dd:ee:ff') == 'AABBCC'
    assert _normalize_mac('AA-BB-CC-DD-EE-FF') == 'AABBCC'
    assert _normalize_mac('aabbccddeeff') == 'AABBCC'
    assert _normalize_mac('short') == ''


def test_lookup_vendor_no_db():
    """Test vendor lookup without database."""
    # Result depends on whether the OUI database has been downloaded,
    # so accept either None or a vendor string.
    result = lookup_vendor('aa:bb:cc:dd:ee:ff')
    assert result is None or isinstance(result, str)