diff --git a/README.md b/README.md index 8606798343aa7a0acac497848ed7f0844c3bd84e..b7529c118c7f0a4801216b74da44eddc7da26ee6 100644 --- a/README.md +++ b/README.md @@ -1,297 +1,327 @@ --- -title: LeRobot Arena Frontend +title: RobotHub Arena Frontend +tags: + - robotics + - control + - simulation + - svelte + - frontend + - realtime emoji: πŸ€– colorFrom: blue colorTo: purple sdk: docker app_port: 8000 pinned: true -fullWidth: true license: mit -short_description: A web-based robotics control and simulation platform -tags: - - robotics - - control - - simulation - - svelte - - static - - frontend +fullWidth: true +short_description: Web interface of the RobotHub platform – build, monitor & control robots with AI assistance --- -# πŸ€– LeRobot Arena +# πŸ€– RobotHub Arena – Frontend -A web-based robotics control and simulation platform that bridges digital twins and physical robots. Built with Svelte for the frontend and FastAPI for the backend. +RobotHub is an **open-source, end-to-end robotics stack** that combines real-time communication, 3-D visualisation, and modern AI policies to control both simulated and physical robots. -## πŸš€ Simple Deployment Options +**This repository contains the *Frontend*** – a SvelteKit web application that runs completely in the browser (or inside Electron / Tauri). It talks to two backend micro-services that live in their own repositories: -Here are the easiest ways to deploy this Svelte frontend: +1. **[RobotHub Transport Server](https://github.com/julien-blanchon/RobotHub-TransportServer)** + – WebSocket / WebRTC switch-board for video streams & robot joint messages. +2. **[RobotHub Inference Server](https://github.com/julien-blanchon/RobotHub-InferenceServer)** + – FastAPI service that loads large language- and vision-based policies (ACT, Pi-0, SmolVLA, …) and turns camera images + state into joint commands. 
-### πŸ† Option 1: Hugging Face Spaces (Static) - RECOMMENDED ✨ +```text +β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” +β”‚ RobotHub Frontend β”‚ HTTP β”‚ Transport Server β”‚ WebSocket β”‚ Robot / Camera HW β”‚ +β”‚ (this repo) β”‚ <────► β”‚ (rooms, WS, WebRTC) β”‚ ◄──────────►│ – servo bus, USB… β”‚ +β”‚ β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ +β”‚ 3-D scene (Threlte)β”‚ +β”‚ UI / Settings β”‚ β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” +β”‚ Svelte 5 runes β”‚ HTTP β”‚ Inference Server β”‚ HTTP/WS β”‚ GPU (Torch, HF models) β”‚ +β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ <────► β”‚ (FastAPI, PyTorch) β”‚ β—„β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β–Ίβ””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ + β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ +``` -**Automatic deployment** (easiest): -1. **Fork this repository** to your GitHub account -2. **Create a new Space** on [Hugging Face Spaces](https://huggingface.co/spaces) -3. **Connect your GitHub repo** - it will auto-detect the static SDK -4. **Push to main branch** - auto-builds and deploys! +--- -The frontmatter is already configured with: -```yaml -sdk: static -app_build_command: bun install && bun run build -app_file: build/index.html -``` +## ✨ Key Features -**Manual upload**: -1. Run `bun install && bun run build` locally -2. Create a Space with "Static HTML" SDK -3. 
Upload all files from `build/` folder - -### πŸš€ Option 2: Vercel - One-Click Deploy - -[![Deploy to Vercel](https://vercel.com/button)](https://vercel.com/new) - -Settings: Build command `bun run build`, Output directory `build` - -### πŸ“ Option 3: Netlify - Drag & Drop - -1. Build locally: `bun install && bun run build` -2. Drag `build/` folder to [Netlify](https://netlify.com) - -### πŸ†“ Option 4: GitHub Pages - -Add this workflow file (`.github/workflows/deploy.yml`): -```yaml -name: Deploy to GitHub Pages -on: - push: - branches: [ main ] -jobs: - deploy: - runs-on: ubuntu-latest - steps: - - uses: actions/checkout@v4 - - uses: oven-sh/setup-bun@v1 - - run: bun install --frozen-lockfile - - run: bun run build - - uses: peaceiris/actions-gh-pages@v3 - with: - github_token: ${{ secrets.GITHUB_TOKEN }} - publish_dir: ./build -``` +β€’ **Digital-Twin 3-D Scene** – inspect robots, cameras & AI compute blocks in real-time. +β€’ **Multi-Workspace Collaboration** – share a hash URL and others join the *same* WS rooms instantly. +β€’ **Drag-&-Drop Add-ons** – spawn robots, cameras or AI models from the toolbar. +β€’ **Transport-Agnostic** – control physical hardware over USB, or send/receive via WebRTC rooms. +β€’ **Model Agnostic** – any policy exposed by the Inference Server can be used (ACT, Diffusion, …). +β€’ **Reactive Core** – built with *Svelte 5 runes* – state is automatically pushed into the UI. -### 🐳 Option 5: Docker (Optional) +--- -For local development or custom hosting: -```bash -docker build -t lerobot-arena-frontend . -docker run -p 3000:3000 lerobot-arena-frontend -``` +## πŸ“‚ Repository Layout (short) -The Docker setup uses Bun's simple static server - much simpler than the complex server.js approach! 
+| Path | Purpose | +|-------------------------------|---------| +| `src/` | SvelteKit app (routes, components) | +| `src/lib/elements` | Runtime domain logic (robots, video, compute) | +| `external/RobotHub-*` | Git sub-modules for the backend services – used for generated clients & tests | +| `static/` | URDFs, STL meshes, textures, favicon | -## πŸ› οΈ Development Setup +A more in-depth component overview can be found in `/src/lib/components/**` – every major popup/modal has its own Svelte file. -For local development with hot-reload capabilities: +--- -### Frontend Development +## πŸš€ Quick Start (dev) ```bash -# Install dependencies -bun install +# 1. clone with submodules (transport + inference) +$ git clone --recurse-submodules https://github.com/julien-blanchon/RobotHub-Frontend robothub-frontend +$ cd robothub-frontend -# Start the development server -bun run dev +# 2. install deps (uses Bun) +$ bun install -# Or open in browser automatically -bun run dev -- --open +# 3. start dev server (http://localhost:5173) +$ bun run dev -- --open ``` -### Backend Development +### Running the full stack locally ```bash -# Navigate to Python backend -cd src-python - -# Install Python dependencies (using uv) -uv sync +# 1. start Transport Server (rooms & streaming) +$ cd external/RobotHub-InferenceServer/external/RobotHub-TransportServer/server +$ uv run launch_with_ui.py # β†’ http://localhost:8000 -# Or using pip -pip install -e . +# 2. start Inference Server (AI brains) +$ cd ../../.. +$ python launch_simple.py # β†’ http://localhost:8001 -# Start the backend server -python start_server.py +# 3. frontend (separate terminal) +$ bun run dev -- --open # β†’ http://localhost:5173 (hash = workspace-id) ``` -### Building Standalone Executable +The **workspace-id** in the URL hash ties all three services together. Share `http://localhost:5173/#` and a collaborator instantly joins the same set of rooms. 
-The backend can be packaged as a standalone executable using box-packager: +--- -```bash -# Navigate to Python backend -cd src-python +## πŸ› οΈ Usage Walk-Through -# Install box-packager (if not already installed) -uv pip install box-packager +1. **Open the web-app** β†’ a fresh *workspace* is created (☝ left corner shows 🌐 ID). +2. Click *Add Robot* β†’ spawns an SO-100 6-DoF arm (URDF). +3. Click *Add Sensor β†’ Camera* β†’ creates a virtual camera element. +4. Click *Add Model β†’ ACT* β†’ spawns a *Compute* block. +5. On the Compute block choose *Create Session* – select model path (`./checkpoints/act_so101_beyond`) and cameras (`front`). +6. Connect: + β€’ *Video Input* – local webcam β†’ `front` room. + β€’ *Robot Input* – robot β†’ *joint-input* room (producer). + β€’ *Robot Output* – robot ← AI predictions (consumer). +7. Press *Start Inference* – the model will predict the next joint trajectory every few frames. πŸŽ‰ -# Package the application -box package +All modals (`AISessionConnectionModal`, `RobotInputConnectionModal`, …) expose precisely what is happening under the hood: which room ID, whether you are *producer* or *consumer*, and the live status. -# The executable will be in target/release/lerobot-arena-server -./target/release/lerobot-arena-server -``` +--- -Note: Requires [Rust/Cargo](https://rustup.rs/) to be installed for box-packager to work. +## 🧩 Package Relations -## πŸ“‹ Project Structure +| Package | Role | Artifacts exposed to this repo | +|---------|------|--------------------------------| +| **Transport Server** | Low-latency switch-board (WS/WebRTC). Creates *rooms* for video & joint messages. | TypeScript & Python client libraries (imported from sub-module) | +| **Inference Server** | Loads checkpoints (ACT, Pi-0, …) and manages *sessions*. Each session automatically asks the Transport Server to create dedicated rooms. 
| Generated TS SDK (`@robothub/inference-server-client`) – auto-called from `RemoteComputeManager` | +| **Frontend (this repo)** | UI + 3-D scene. Manages *robots*, *videos* & *compute* blocks and connects them to the correct rooms. | – | -``` -lerobot-arena/ -β”œβ”€β”€ src/ # Svelte frontend source -β”‚ β”œβ”€β”€ lib/ # Reusable components and utilities -β”‚ β”œβ”€β”€ routes/ # SvelteKit routes -β”‚ └── app.html # App template -β”œβ”€β”€ src-python/ # Python backend -β”‚ β”œβ”€β”€ src/ # Python source code -β”‚ β”œβ”€β”€ start_server.py # Server entry point -β”‚ β”œβ”€β”€ target/ # Box-packager build output (excluded from git) -β”‚ └── pyproject.toml # Python dependencies -β”œβ”€β”€ static/ # Static assets -β”œβ”€β”€ Dockerfile # Docker configuration -β”œβ”€β”€ docker-compose.yml # Docker Compose setup -└── package.json # Node.js dependencies -``` +> Because the two backend repos are included as git sub-modules you can develop & debug the whole trio in one repo clone. -## 🐳 Docker Information +--- -The Docker setup includes: +## πŸ“œ Important Components (frontend) -- **Multi-stage build**: Optimized for production using Bun and uv -- **Automatic startup**: Both services start together -- **Port mapping**: Backend on 8080, Frontend on 3000 (HF Spaces compatible) -- **Static file serving**: Compiled Svelte app served efficiently -- **User permissions**: Properly configured for Hugging Face Spaces -- **Standalone executable**: Backend packaged with box-packager for faster startup +β€’ `RemoteComputeManager` – wraps the Inference Server REST API. +β€’ `RobotManager` – talks to Transport Server and USB drivers. +β€’ `VideoManager` – handles local/remote camera streams and WebRTC. -For detailed Docker documentation, see [DOCKER_README.md](./DOCKER_README.md). +Each element is a small class with `$state` fields which Svelte 5 picks up automatically. 
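
To make the division of labour concrete, here is a deliberately simplified sketch of the pattern `RemoteComputeManager` follows – a thin wrapper over the Inference Server REST endpoints documented in this README. Class, method and field names below are illustrative only, not the real API; the fetch function is injected so the logic stays testable without a network or a DOM:

```typescript
// Illustrative only – not the actual RemoteComputeManager class.
// The endpoints (POST /api/sessions, POST /api/sessions/{id}/start)
// are the ones documented in this README's session-lifecycle section.
type FetchLike = (
	url: string,
	init?: { method?: string; headers?: Record<string, string>; body?: string }
) => Promise<{ json(): Promise<unknown> }>;

export class InferenceSessionClient {
	constructor(
		private baseUrl: string,
		private fetchFn: FetchLike
	) {}

	/** Create a session; the server asks the Transport Server for rooms. */
	async createSession(config: {
		session_id: string;
		policy_path: string;
		camera_names: string[];
		transport_server_url: string;
		workspace_id?: string;
	}): Promise<unknown> {
		const res = await this.fetchFn(`${this.baseUrl}/api/sessions`, {
			method: "POST",
			headers: { "Content-Type": "application/json" },
			body: JSON.stringify(config)
		});
		return res.json();
	}

	/** Load the model and start publishing joint commands. */
	async startSession(sessionId: string): Promise<void> {
		await this.fetchFn(`${this.baseUrl}/api/sessions/${sessionId}/start`, { method: "POST" });
	}
}
```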
The modals listed below are *thin* UI shells around those classes: -## πŸ”§ Building for Production +``` +AISessionConnectionModal – create / start / stop AI sessions +RobotInputConnectionModal – joint-states β†’ AI +RobotOutputConnectionModal – AI commands β†’ robot +VideoInputConnectionModal – camera β†’ AI or screen +ManualControlSheet – slider control, runs when no consumer connected +SettingsSheet – configure base URLs of the two servers +``` -### Frontend Only +--- -```bash -bun run build -``` +## 🐳 Docker -### Backend Standalone Executable +A production-grade image is provided (multi-stage, 24 MB with Bun runtime): ```bash -cd src-python -box package +$ docker build -t robothub-frontend . +$ docker run -p 8000:8000 robothub-frontend # served by vite-preview ``` -### Complete Docker Build +See `Dockerfile` for the full build – it also performs `bun test` & `bun run build` for the TS clients inside the sub-modules so that the image is completely self-contained. -```bash -docker-compose up --build -``` +--- -## 🌐 What's Included +## πŸ§‘β€πŸ’» Contributing -- **Real-time Robot Control**: WebSocket-based communication -- **3D Visualization**: Three.js integration for robot visualization -- **URDF Support**: Load and display robot models -- **Multi-robot Management**: Control multiple robots simultaneously -- **WebSocket API**: Real-time bidirectional communication -- **Standalone Distribution**: Self-contained executable with box-packager +PRs are welcome! The codebase is organised into **domain managers** (robot / video / compute) and **pure-UI** components. If you add a new feature, create a manager first so that business logic can be unit-tested without DOM. -## 🚨 Troubleshooting +1. `bun test` – unit tests. +2. `bun run typecheck` – strict TS config. -### Port Conflicts +Please run `bun format` before committing – ESLint + Prettier configs are included. 
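
As an example of the manager-first rule, a headless manager holding normalized joint values can be exercised with no DOM at all. The class below is a toy stand-in (not the repo's actual `RobotManager`); the clamping ranges mirror the slider limits used in `ManualControlSheet` (grippers 0–100 %, other joints βˆ’100–100 %):

```typescript
// Toy DOM-free "manager" – hypothetical, for illustrating testability.
// Grippers ("jaw"/"gripper") clamp to 0..100 %, other joints to -100..100 %,
// matching the normalized ranges used by the manual-control sliders.
export class JointStateManager {
	private values = new Map<string, number>();

	set(name: string, value: number): number {
		const lower = name.toLowerCase();
		const isGripper = lower === "jaw" || lower === "gripper";
		const min = isGripper ? 0 : -100;
		const clamped = Math.min(100, Math.max(min, value));
		this.values.set(name, clamped);
		return clamped;
	}

	get(name: string): number {
		return this.values.get(name) ?? 0;
	}
}
```

Because the class has no Svelte or DOM dependency, `bun test` can assert on clamping behaviour directly, and the UI layer stays a thin shell.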
-If ports 8080 or 3000 are already in use: +--- -```bash -# Check what's using the ports -lsof -i :8080 -lsof -i :3000 +## πŸ™ Special Thanks -# Use different ports -docker run -p 8081:8080 -p 7861:3000 lerobot-arena -``` +Huge gratitude to [Tim Qian](https://github.com/timqian) ([X/Twitter](https://x.com/tim_qian)) and the +[bambot project](https://bambot.org/) for open-sourcing **feetech.js** – the +delightful js driver that powers our USB communication layer. +--- -### Container Issues +## πŸ“„ License -```bash -# View logs -docker-compose logs lerobot-arena +MIT – see `LICENSE` in the root. -# Rebuild without cache -docker-compose build --no-cache -docker-compose up -``` +## 🌱 Project Philosophy -### Development Issues +RobotHub follows a **separation-of-concerns** design: -```bash -# Clear node modules and reinstall -rm -rf node_modules -bun install +* **Transport Server** is the single source of truth for *real-time* data – video frames, joint values, heart-beats. Every participant (browser, Python script, robot firmware) only needs one WebSocket/WebRTC connection, no matter how many peers join later. +* **Inference Server** is stateless with regard to connectivity; it spins up / tears down *sessions* that rely on rooms in the Transport Server. This lets heavy AI models live on a GPU box while cameras and robots stay on the edge. +* **Frontend** stays 100 % in the browser – no secret keys or device drivers required – and simply wires together rooms that already exist. + +> By decoupling the pipeline we can deploy each piece on separate hardware or even different clouds, swap alternative implementations (e.g. ROS bridge instead of WebRTC) and scale each micro-service independently. 
+ +--- + +## πŸ›° Transport Server – Real-Time Router -# Clear Svelte kit cache -rm -rf .svelte-kit -bun run dev +``` +Browser / Robot ⟷ 🌐 Transport Server ⟷ Other Browser / AI / HW ``` -### Box-packager Issues +* **Creates rooms** – `POST /robotics/workspaces/{ws}/rooms` or `POST /video/workspaces/{ws}/rooms`. +* **Manages roles** – every WebSocket identifies as *producer* (source) or *consumer* (sink). +* **Does zero processing** – it only forwards JSON (robotics) or WebRTC SDP/ICE (video). +* **Health-check** – `GET /api/health` returns a JSON heartbeat. -```bash -# Clean build artifacts -cd src-python -box clean +Why useful? -# Rebuild executable -box package +* You never expose robot hardware directly to the internet – it only speaks to the Transport Server. +* Multiple followers can subscribe to the *same* producer without extra bandwidth on the producer side (server fans out messages). +* Works across NAT thanks to WebRTC TURN support. -# Install cargo if missing -curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh +## 🏒 Workspaces – Lightweight Multi-Tenant Isolation + +A **workspace** is simply a UUID namespace in the Transport Server. Every room URL starts with: + +``` +/robotics/workspaces/{workspace_id}/rooms/{room_id} +/video/workspaces/{workspace_id}/rooms/{room_id} ``` -## πŸš€ Hugging Face Spaces Deployment +Why bother? -This project is configured for **Static HTML** deployment on Hugging Face Spaces (much simpler than Docker!): +1. **Privacy / Security** – clients in workspace *A* can neither list nor join rooms from workspace *B*. A workspace id is like a private password that keeps the rooms in the same workspace isolated from each other. +2. **Organisation** – keep each class, project or experiment separated without spinning up extra servers. +3. **Zero-config sharing** – the Frontend stores the workspace ID in the URL hash (e.g. `/#d742e85d-c9e9-4f7b-…`). 
Send that link to a teammate and they automatically connect to the *same* namespace – all existing video feeds, robot rooms and AI sessions become visible. +4. **Stateless Scale-out** – Transport Server holds no global state; deleting a workspace removes all rooms in one call. -**Manual Upload (Easiest):** -1. Run `bun install && bun run build` locally -2. Create a new Space with "Static HTML" SDK -3. Upload all files from `build/` folder -4. Your app is live! +Typical lifecycle: -**GitHub Integration:** -1. Fork this repository -2. Create a Space and connect your GitHub repo -3. The Static HTML SDK will be auto-detected from the README frontmatter -4. Push changes to auto-deploy +* **Create** – Frontend generates `crypto.randomUUID()` if the hash is empty. Back-end rooms are lazily created when the first producer/consumer calls the REST API. +* **Share** – click the *#workspace* badge β†’ *Copy URL* (handled by `WorkspaceIdButton.svelte`) -No Docker, no complex setup - just static files! πŸŽ‰ +> Practical tip: Use one workspace per demo to prevent collisions, then recycle it afterwards. -## πŸ“š Additional Documentation +--- -- [Docker Setup Guide](./DOCKER_README.md) - Detailed Docker instructions -- [Robot Architecture](./ROBOT_ARCHITECTURE.md) - System architecture overview -- [Robot Instancing Guide](./ROBOT_INSTANCING_README.md) - Multi-robot setup +## 🧠 Inference Server – Session Lifecycle + +1. **Create session** + `POST /api/sessions` with JSON: + ```jsonc + { + "session_id": "pick_place_demo", + "policy_path": "./checkpoints/act_so101_beyond", + "camera_names": ["front", "wrist"], + "transport_server_url": "http://localhost:8000", + "workspace_id": "" // optional + } + ``` +2. **Receive response** + ```jsonc + { + "workspace_id": "ws-uuid", + "camera_room_ids": { "front": "room-id-a", "wrist": "room-id-b" }, + "joint_input_room_id": "room-id-c", + "joint_output_room_id": "room-id-d" + } + ``` +3. 
**Wire connections** + * Camera PC joins `front` / `wrist` rooms as **producer** (WebRTC). + * Robot joins `joint_input_room_id` as **producer** (joint states). + * Robot (or simulator) joins `joint_output_room_id` as **consumer** (commands). +4. **Start inference** + `POST /api/sessions/{id}/start` – server loads the model and begins publishing commands. +5. **Stop / delete** as needed. Stats & health are available via `GET /api/sessions`. + +The Frontend automates steps 1-4 via the *AI Session* modal – you only click buttons. -## 🀝 Contributing +--- -1. Fork the repository -2. Create a feature branch -3. Make your changes -4. Test with Docker: `docker-compose up --build` -5. Submit a pull request +## 🌐 Hosted Demo End-Points -## πŸ“„ License +| Service | URL | Status | +|---------|-----|--------| +| Transport Server | | Public & healthy | +| Inference Server | | `{"status":"healthy"}` | +| Frontend (read-only preview) | | latest `main` | -This project is licensed under the MIT License. +Point the *Settings β†’ Server Configuration* panel to these URLs and you can play without any local backend. --- -**Built with ❀️ for the robotics community** πŸ€– +## 🎯 Main Use-Cases + +Below are typical connection patterns you can set-up **entirely from the UI**. Each example lists the raw data-flow (β†’ = producer to consumer/AI) plus a video placeholder you can swap for a screen-capture. + +### Direct Tele-Operation (Leader ➜ Follower) +*Leader PC* `USB` ➜ **Robot A** ➜ `Remote producer` β†’ **Transport room** β†’ `Remote consumer` ➜ **Robot B** (`USB`) + +> One human moves Robot A, Robot B mirrors the motion in real-time. Works with any number of followers – just add more consumers to the same room. +> +> πŸ“Ί *demo-teleop-1.mp4* + +### Web-UI Manual Control +**Browser sliders** (`ManualControlSheet`) β†’ `Remote producer` β†’ **Robot (USB)** + +> No physical master arm needed – drive joints from any device. 
+> +> πŸ“Ί *demo-webui.mp4* + +### AI Inference Loop +**Robot (USB)** ➜ `Remote producer` β†’ **joint-input room** +**Camera PC** ➜ `Video producer` β†’ **camera room(s)** +**Inference Server** (consumer) β†’ processes β†’ publishes to **joint-output room** β†’ `Remote consumer` ➜ **Robot** + +> Lets a low-power robot PC stream data while a beefy GPU node does the heavy lifting. +> +> πŸ“Ί *demo-inference.mp4* + +### Hybrid Classroom (Multi-Follower AI) +*Same as AI Inference Loop* with additional **Robot C, D…** subscribing to `joint_output_room_id` to run the same policy in parallel. + +> Useful for swarm behaviours or classroom demonstrations. +> +> πŸ“Ί *demo-classroom.mp4* + +### Split Video / Robot Across Machines +**Laptop A** (near cameras) β†’ streams video β†’ Transport +**Laptop B** (near robot) β†’ joins joint rooms +**Browser** anywhere β†’ watches video consumer & sends manual overrides + +> Ideal when the camera PC stays close to sensors and you want minimal upstream bandwidth. +> +> πŸ“Ί *demo-splitio.mp4* diff --git a/docker-build.sh b/docker-build.sh deleted file mode 100755 index 7a4e78cc31db52c7c2c9d316462f726eeeaf7668..0000000000000000000000000000000000000000 --- a/docker-build.sh +++ /dev/null @@ -1,30 +0,0 @@ -#!/bin/bash - -# Build and run the LeRobot Arena Frontend Docker container - -set -e - -echo "πŸ—οΈ Building LeRobot Arena Frontend Docker image..." - -# Build the image -docker build -t lerobot-arena-svelte-frontend . - -echo "βœ… Build completed successfully!" - -echo "πŸš€ Starting the container..." - -# Run the container -docker run -d \ - --name lerobot-arena-svelte-frontend \ - -p 3000:3000 \ - --restart unless-stopped \ - lerobot-arena-svelte-frontend - -echo "βœ… Container started successfully!" 
-echo "🌐 Frontend is available at: http://localhost:3000" -echo "" -echo "πŸ“‹ Useful commands:" -echo " β€’ View logs: docker logs -f lerobot-arena-svelte-frontend" -echo " β€’ Stop: docker stop lerobot-arena-svelte-frontend" -echo " β€’ Remove: docker rm lerobot-arena-svelte-frontend" -echo " β€’ Health check: docker inspect --format='{{.State.Health.Status}}' lerobot-arena-svelte-frontend" \ No newline at end of file diff --git a/log.txt b/log.txt deleted file mode 100644 index cc51ceb6bf2467f5fcef1bad25470b7f158a16a1..0000000000000000000000000000000000000000 --- a/log.txt +++ /dev/null @@ -1 +0,0 @@ -.venv/bin/python: can't open file '/Users/julienblanchon/Git/lerobot-arena/lerobot-arena/src-python-video/src/main.py': [Errno 2] No such file or directory diff --git a/src/lib/components/3d/Floor.svelte b/src/lib/components/3d/Floor.svelte index 531fe67787235b900cfb9658fe2cafe3fa8eb5db..b890040374cf636076dbfccbb6b9cccd4aba0b73 100644 --- a/src/lib/components/3d/Floor.svelte +++ b/src/lib/components/3d/Floor.svelte @@ -1,30 +1,24 @@ - + - - - diff --git a/src/lib/components/3d/elements/compute/ComputeGridItem.svelte b/src/lib/components/3d/elements/compute/ComputeGridItem.svelte index 447bc098235b20f0dc7f7255538cd92cf17f6fc5..aa93f4b213920bb3b481be22e7d1d4bb5f13114e 100644 --- a/src/lib/components/3d/elements/compute/ComputeGridItem.svelte +++ b/src/lib/components/3d/elements/compute/ComputeGridItem.svelte @@ -12,7 +12,8 @@ onRobotOutputBoxClick: (compute: RemoteCompute) => void; } - let { compute, onVideoInputBoxClick, onRobotInputBoxClick, onRobotOutputBoxClick }: Props = $props(); + let { compute, onVideoInputBoxClick, onRobotInputBoxClick, onRobotOutputBoxClick }: Props = + $props(); const { onPointerEnter, onPointerLeave, hovering } = useCursor(); interactivity(); @@ -31,21 +32,16 @@ position.z={compute.position.z} scale={[1, 1, 1]} > - + - \ No newline at end of file + diff --git a/src/lib/components/3d/elements/compute/GPU.svelte 
b/src/lib/components/3d/elements/compute/GPU.svelte index 0780bbbec8650a386f15e3bed3b4e55573a6dfee..413cca3dc716cb26c8a4e9ca99d1df4051265fb3 100644 --- a/src/lib/components/3d/elements/compute/GPU.svelte +++ b/src/lib/components/3d/elements/compute/GPU.svelte @@ -1,13 +1,9 @@ @@ -116,7 +115,7 @@ - Video Input - {compute.name || 'No Compute Selected'} + Video Input - {compute.name || "No Compute Selected"} Connect camera streams to provide visual input for AI inference @@ -150,8 +149,8 @@ - You need to create an Inference Session before connecting video inputs. - The session defines which camera names are available for connection. + You need to create an Inference Session before connecting video inputs. The session + defines which camera names are available for connection. {:else} @@ -170,9 +169,9 @@
{#each compute.sessionConfig?.cameraNames || [] as cameraName}
{#if !localStream} @@ -201,10 +200,10 @@ size="sm" onclick={handleConnectLocalCamera} disabled={isConnecting} - class="bg-green-600 hover:bg-green-700 text-xs disabled:opacity-50" + class="bg-green-600 text-xs hover:bg-green-700 disabled:opacity-50" > {#if isConnecting} - + Connecting... {:else} @@ -229,12 +228,14 @@ {#if localStream}
Live Preview:
-
+
- - - {#if robot.hasConsumer && robot.consumer?.name?.includes('Remote Consumer')} - -
+ + + + + +
-

Room Connected

-

Receiving remote commands

+ + + Remote Collaboration (Rooms) + + + Receive commands from AI systems, remote users, or other software +
-
- {:else} - -
-
-
- -

Create New Room

+ + + {#if robot.hasConsumer && robot.consumer?.name?.includes("Remote Consumer")} + +
+
+
+

+ Room Connected +

+

+ Receiving remote commands +

+
+ +
-

- Create a room where others can send commands to this robot -

- -
- - + {:else} + +
+
+
+ +

+ Create New Room +

+
+

+ Create a room where others can send commands to this robot +

+ +
+ + +
+
-
-
- -
-
- Join Existing Room: - - {robotManager.rooms.length} room{robotManager.rooms.length !== 1 ? 's' : ''} available - -
- -
- {#if robotManager.rooms.length === 0} -
- {robotManager.roomsLoading ? 'Loading rooms...' : 'No rooms available. Create one to get started.'} + +
+
+ Join Existing Room: + + {robotManager.rooms.length} room{robotManager.rooms.length !== 1 ? "s" : ""} available +
- {:else} - {#each robotManager.rooms as room} -
-
-
-

- {room.id} -

-
- {room.has_producer ? 'πŸ“€ Has Output' : 'πŸ“₯ No Output'} - πŸ‘₯ {room.participants?.total || 0} users + +
+ {#if robotManager.rooms.length === 0} +
+ {robotManager.roomsLoading + ? "Loading rooms..." + : "No rooms available. Create one to get started."} +
+ {:else} + {#each robotManager.rooms as room} +
+
+
+

+ {room.id} +

+
+ {room.has_producer ? "πŸ“€ Has Output" : "πŸ“₯ No Output"} + πŸ‘₯ {room.participants?.total || 0} users +
+
+
- -
-
- {/each} + {/each} + {/if} +
+
+ + {#if robot.hasConsumer} +

+ Disconnect current input to join a room +

{/if} -
-
- - {#if robot.hasConsumer} -

- Disconnect current input to join a room -

- {/if} - {/if} - - - - - - - Input Sources - - USB: Read physical movements β€’ Remote: Receive network commands β€’ Only one active at a time - - - {/if} + {/if} + + + + + + + Input Sources + + USB: Read physical movements β€’ Remote: Receive network + commands β€’ Only one active at a time + + + {/if}
- \ No newline at end of file + diff --git a/src/lib/components/3d/elements/robot/modal/ManualControlSheet.svelte b/src/lib/components/3d/elements/robot/modal/ManualControlSheet.svelte index 3fcb5a616cf7679c80aa5c14177e5fbb1419dc8c..4df71fcbf8efbbb73b7f647e9b0960ecec5f5b50 100644 --- a/src/lib/components/3d/elements/robot/modal/ManualControlSheet.svelte +++ b/src/lib/components/3d/elements/robot/modal/ManualControlSheet.svelte @@ -19,16 +19,22 @@ - +
- Manual Control -

Direct robot joint manipulation

+ Manual Control +

+ Direct robot joint manipulation +

@@ -36,41 +42,54 @@ {#if robot} -
+
{#if robot.isManualControlEnabled}
- -

Joint Controls

+ +

+ Joint Controls +

{robot.jointArray.length}

- Each joint can be moved independently using sliders. Values are normalized percentages. + Each joint can be moved independently using sliders. Values are normalized + percentages.

{#if robot.jointArray.length === 0} -

No joints available

+

+ No joints available +

{:else}
{#each robot.jointArray as joint (joint.name)} - {@const isGripper = joint.name.toLowerCase() === 'jaw' || joint.name.toLowerCase() === 'gripper'} + {@const isGripper = + joint.name.toLowerCase() === "jaw" || joint.name.toLowerCase() === "gripper"} {@const minValue = isGripper ? 0 : -100} {@const maxValue = isGripper ? 100 : 100} - -
+ +
- {joint.name} + {joint.name}
- {joint.value.toFixed(1)}{isGripper ? '%' : '%'} + {joint.value.toFixed(1)}{isGripper ? "%" : "%"} {#if joint.limits} - + ({joint.limits.lower.toFixed(1)}Β° to {joint.limits.upper.toFixed(1)}Β°) {/if} @@ -89,9 +108,11 @@ }} class="slider h-2 w-full cursor-pointer appearance-none rounded-lg bg-slate-300 dark:bg-slate-600" /> -
- {minValue}{isGripper ? '% (closed)' : '%'} - {maxValue}{isGripper ? '% (open)' : '%'} +
+ {minValue}{isGripper ? "% (closed)" : "%"} + {maxValue}{isGripper ? "% (open)" : "%"}
@@ -101,17 +122,24 @@
{:else}
-
-

Input Control Active

-
+
+

+ Input Control Active +

+
- - Input Control Active - - Robot controlled by: {robot.consumer?.name || 'External Input'}
- Disconnect input to enable manual control. -
-
+ + Input Control Active + + Robot controlled by: {robot.consumer?.name || "External Input"}
+ Disconnect input to enable manual control. +
+
{/if}
diff --git a/src/lib/components/3d/elements/robot/modal/OutputConnectionModal.svelte b/src/lib/components/3d/elements/robot/modal/OutputConnectionModal.svelte index 1e1859e924c2783c2b67ebcfe2cd75ce370259d4..2634a353acd3885bc7937575985acb88da9bba2d 100644 --- a/src/lib/components/3d/elements/robot/modal/OutputConnectionModal.svelte +++ b/src/lib/components/3d/elements/robot/modal/OutputConnectionModal.svelte @@ -25,11 +25,11 @@ // USB connection flow state let showUSBCalibration = $state(false); - let pendingUSBConnection: 'output' | null = $state(null); + let pendingUSBConnection: "output" | null = $state(null); // Room management state - let selectedRoomId = $state(''); - let customRoomId = $state(''); + let selectedRoomId = $state(""); + let customRoomId = $state(""); let hasLoadedRooms = $state(false); // Reactive state from robot @@ -42,7 +42,7 @@ refreshRooms(); hasLoadedRooms = true; } - + // Reset when modal closes if (!open) { hasLoadedRooms = false; @@ -63,7 +63,7 @@ error = null; await robotManager.refreshRooms(workspaceId); } catch (err) { - error = err instanceof Error ? err.message : 'Failed to refresh rooms'; + error = err instanceof Error ? err.message : "Failed to refresh rooms"; } } @@ -73,18 +73,18 @@ error = null; const roomId = customRoomId.trim() || robot.id; const result = await robotManager.createRoboticsRoom(workspaceId, roomId); - + if (result.success) { - customRoomId = ''; + customRoomId = ""; await refreshRooms(); toast.success("Room Created", { description: `Successfully created room ${result.roomId}` }); } else { - error = result.error || 'Failed to create room'; + error = result.error || "Failed to create room"; } } catch (err) { - error = err instanceof Error ? err.message : 'Failed to create room'; + error = err instanceof Error ? 
err.message : "Failed to create room"; } finally { isConnecting = false; } @@ -92,10 +92,10 @@ async function joinRoomAsOutput() { if (!selectedRoomId) { - error = 'Please select a room'; + error = "Please select a room"; return; } - + try { isConnecting = true; error = null; @@ -104,7 +104,7 @@ description: `Successfully joined room ${selectedRoomId} - now sending commands` }); } catch (err) { - error = err instanceof Error ? err.message : 'Failed to join room as output'; + error = err instanceof Error ? err.message : "Failed to join room as output"; } finally { isConnecting = false; } @@ -116,18 +116,18 @@ error = null; const roomId = customRoomId.trim() || robot.id; const result = await robotManager.connectProducerAsProducer(workspaceId, robot.id, roomId); - + if (result.success) { - customRoomId = ''; + customRoomId = ""; await refreshRooms(); toast.success("Room Created & Joined", { description: `Successfully created and joined room ${result.roomId} - ready to send commands` }); } else { - error = result.error || 'Failed to create room and join as output'; + error = result.error || "Failed to create room and join as output"; } } catch (err) { - error = err instanceof Error ? err.message : 'Failed to create room and join as output'; + error = err instanceof Error ? err.message : "Failed to create room and join as output"; } finally { isConnecting = false; } @@ -140,21 +140,21 @@ // Check if calibration is needed if (robot.calibrationManager.needsCalibration) { - pendingUSBConnection = 'output'; + pendingUSBConnection = "output"; showUSBCalibration = true; return; } await robot.addProducer({ - type: 'usb', + type: "usb", baudRate: 1000000 }); - + toast.success("USB Output Connected", { description: "Successfully connected to physical robot hardware" }); } catch (err) { - error = err instanceof Error ? err.message : 'Unknown error'; + error = err instanceof Error ? 
err.message : "Unknown error"; toast.error("Failed to Connect USB Output", { description: `Could not connect to robot hardware: ${error}` }); @@ -168,12 +168,12 @@ isConnecting = true; error = null; await robot.removeProducer(producerId); - + toast.success("Output Disconnected", { description: "Successfully disconnected output" }); } catch (err) { - error = err instanceof Error ? err.message : 'Unknown error'; + error = err instanceof Error ? err.message : "Unknown error"; toast.error("Failed to Disconnect Output", { description: `Could not disconnect output: ${error}` }); @@ -185,11 +185,11 @@ // Handle calibration completion async function onCalibrationComplete() { showUSBCalibration = false; - - if (pendingUSBConnection === 'output') { + + if (pendingUSBConnection === "output") { await connectUSBOutput(); } - + pendingUSBConnection = null; } @@ -205,12 +205,15 @@ class="max-h-[85vh] max-w-4xl overflow-hidden border-slate-300 bg-slate-100 text-slate-900 dark:border-slate-600 dark:bg-slate-900 dark:text-slate-100" > - + Output Connection - Robot {robot.id} - Configure where this robot sends its movements. Multiple outputs can be active simultaneously. + Configure where this robot sends its movements. Multiple outputs can be active + simultaneously. @@ -218,10 +221,12 @@
{#if error} - + Connection Error - + {error} @@ -229,9 +234,11 @@ {#if showUSBCalibration} - + -
+
Hardware Calibration Required @@ -244,14 +251,18 @@
- - - - Before connecting to the physical robot, calibration is required to map the servo positions to software values. This ensures accurate control. + + + + Before connecting to the physical robot, calibration is required to map the servo + positions to software values. This ensures accurate control. - {:else} - - - - -
-
- - Active Outputs -
- - {outputDriverCount} Connected - -
-
-
- - - - - - - Local Hardware (USB) - - - Send commands directly to physical robot hardware - - - - - - - - - - -
-
- - - Remote Collaboration (Rooms) - - - Broadcast robot movements to remote systems and AI - -
- -
-
- - -
-
+ + + +
- -

Create New Room

-
-

- Create a room to broadcast this robot's movements -

- -
- -
+ + {outputDriverCount} Connected +
-
- - -
-
- Join Existing Room: - - {robotManager.rooms.length} room{robotManager.rooms.length !== 1 ? 's' : ''} available - -
- -
- {#if robotManager.rooms.length === 0} -
- {robotManager.roomsLoading ? 'Loading rooms...' : 'No rooms available. Create one to get started.'} -
- {:else} - {#each robotManager.rooms as room} -
-
-
-

- {room.id} -

-
- {room.has_producer ? '🔴 Occupied' : '🟢 Available'} - 👥 {room.participants?.total || 0} users
-
- {#if !room.has_producer} - - {:else} - - {/if} -
-
- {/each} - {/if} -
-
- - + + - - {#if producers.length > 0} - + + - - - Connected Outputs + + + Local Hardware (USB) + + Send commands directly to physical robot hardware + - -
- {#each producers as producer} -
-
- - {producer.name} - {producer.id.slice(0, 12)} -
+ + + + + + + + +
+
+ + + Remote Collaboration (Rooms) + + + Broadcast robot movements to remote systems and AI + +
+ +
+
+ + +
+
+
+ +

+ Create New Room +

+
+

+ Create a room to broadcast this robot's movements +

+ +
+
- {/each} +
+
+ + +
+
+ Join Existing Room: + + {robotManager.rooms.length} room{robotManager.rooms.length !== 1 ? "s" : ""} available + +
+ +
+ {#if robotManager.rooms.length === 0} +
+ {robotManager.roomsLoading + ? "Loading rooms..." + : "No rooms available. Create one to get started."} +
+ {:else} + {#each robotManager.rooms as room} +
+
+
+

+ {room.id} +

+
+ {room.has_producer ? "🔴 Occupied" : "🟢 Available"} + 👥 {room.participants?.total || 0} users
+
+ {#if !room.has_producer} + + {:else} + + {/if} +
+
+ {/each} + {/if} +
- {/if} - - - - Output Sources - - USB: Control physical hardware β€’ Remote: Broadcast to network β€’ Multiple outputs can be active - - + + {#if producers.length > 0} + + + + + Connected Outputs + + + +
+ {#each producers as producer} +
+
+ + {producer.name} + {producer.id.slice(0, 12)} +
+ +
+ {/each} +
+
+
+ {/if} + + + + + Output Sources + + USB: Control physical hardware β€’ Remote: Broadcast to + network β€’ Multiple outputs can be active + + {/if}
- \ No newline at end of file + diff --git a/src/lib/components/3d/elements/robot/status/ConnectionFlowBoxUIkit.svelte b/src/lib/components/3d/elements/robot/status/ConnectionFlowBoxUIkit.svelte index ac5015318e5827eec3438fd45f7513d2ba5d530a..76f4e50b1307ca3ee4acde500d4e081a2b4fd862 100644 --- a/src/lib/components/3d/elements/robot/status/ConnectionFlowBoxUIkit.svelte +++ b/src/lib/components/3d/elements/robot/status/ConnectionFlowBoxUIkit.svelte @@ -15,27 +15,16 @@ let { robot, onInputBoxClick, onRobotBoxClick, onOutputBoxClick }: Props = $props(); - // Colors const inputColor = "rgb(34, 197, 94)"; const outputColor = "rgb(59, 130, 246)"; - - - + diff --git a/src/lib/components/3d/elements/robot/status/InputBoxUIKit.svelte b/src/lib/components/3d/elements/robot/status/InputBoxUIKit.svelte index 15000e042ac6aa773fc8c9806499e57953bdc258..b2d3098614fd06f33195f24ce59f8788dd8a1a51 100644 --- a/src/lib/components/3d/elements/robot/status/InputBoxUIKit.svelte +++ b/src/lib/components/3d/elements/robot/status/InputBoxUIKit.svelte @@ -1,10 +1,10 @@ - - - + - {#if robot.consumer?.constructor.name} - - + {/if} - \ No newline at end of file + diff --git a/src/lib/components/3d/elements/robot/status/ManualControlBoxUIKit.svelte b/src/lib/components/3d/elements/robot/status/ManualControlBoxUIKit.svelte index 8a3271c07ab975a4e210cda560f1c02b9e351aa6..f554c147d0ff497752344def7324439df5ac2428 100644 --- a/src/lib/components/3d/elements/robot/status/ManualControlBoxUIKit.svelte +++ b/src/lib/components/3d/elements/robot/status/ManualControlBoxUIKit.svelte @@ -18,11 +18,10 @@ } } - // Manual control theme color (purple) const manualColor = "rgb(147, 51, 234)"; - {#if isDisabled} - - {:else if robot.isManualControlEnabled} - - {:else} - - + - - 0 ? 0.8 : 0.4} backgroundOpacity={robot.outputDriverCount > 0 ? 
0.3 : 0.15} @@ -33,15 +31,15 @@ > {#if robot.outputDriverCount > 0} - - 0} - + {#snippet children()} {#each robot.producers.slice(0, 2) as producer} - 2} - - - + - {/if} - \ No newline at end of file + diff --git a/src/lib/components/3d/elements/robot/status/RobotBoxUIKit.svelte b/src/lib/components/3d/elements/robot/status/RobotBoxUIKit.svelte index 5b9da1961c4e955378577f86f1116f1c00535317..b218d0a1810b5fd9998797120973b99c641235a9 100644 --- a/src/lib/components/3d/elements/robot/status/RobotBoxUIKit.svelte +++ b/src/lib/components/3d/elements/robot/status/RobotBoxUIKit.svelte @@ -15,7 +15,6 @@ let { robot, onRobotBoxClick }: Props = $props(); - // Robot theme color (orange) const robotColor = "rgb(245, 158, 11)"; diff --git a/src/lib/components/3d/elements/video/Video.svelte b/src/lib/components/3d/elements/video/Video.svelte index 1f8a5e0e82fd77237017a04378874df11db803f8..fdebf30c1289b2c294795eb9cf0931acfa1bc146 100644 --- a/src/lib/components/3d/elements/video/Video.svelte +++ b/src/lib/components/3d/elements/video/Video.svelte @@ -1,7 +1,16 @@ @@ -169,9 +173,12 @@ class="max-h-[85vh] max-w-4xl overflow-hidden border-slate-300 bg-slate-100 text-slate-900 dark:border-slate-600 dark:bg-slate-900 dark:text-slate-100" > - - - Video Input - {video?.name || 'No Video Selected'} + + + Video Input - {video?.name || "No Video Selected"} Configure video input source: local camera for recording or remote streams from rooms @@ -182,304 +189,358 @@
{#if error} - + Connection Error - + {error} {/if} - - - -
-
- - Current Video Input -
- {#if video?.hasInput} - - {video.input.type === 'local-camera' ? 'Local Camera' : 'Remote Stream'} - - {:else} - No Input Connected - {/if} -
- {#if video?.hasInput} -
- {#if video.input.roomId} - Room: {video.input.roomId} + + + +
+
+ + Current Video Input +
+ {#if video?.hasInput} + + {video.input.type === "local-camera" ? "Local Camera" : "Remote Stream"} + {:else} - Source: Local Device Camera + No Input Connected {/if}
- {/if} -
-
+ {#if video?.hasInput} +
+ {#if video.input.roomId} + Room: {video.input.roomId} + {:else} + Source: Local Device Camera + {/if} +
+ {/if} + + - - {#if video?.hasInput} - + + {#if video?.hasInput} + + + + + Current Input + + + +
+
+
+

+ {video.input.type === "local-camera" ? "Local Camera" : "Remote Stream"} +

+ {#if video.input.roomId} +

+ Room: {video.input.roomId} +

+ {/if} + {#if video.input.stream} +

+ Video: {video.input.stream.getVideoTracks().length} tracks +

+

+ Audio: {video.input.stream.getAudioTracks().length} tracks +

+ {/if} +
+ +
+
+
+
+ {/if} + + + - - - Current Input + + + Local Camera + + Use your device camera for direct video capture and recording + - -
-
-
-

- {video.input.type === 'local-camera' ? 'Local Camera' : 'Remote Stream'} -

- {#if video.input.roomId} -

- Room: {video.input.roomId} -

- {/if} - {#if video.input.stream} -

- Video: {video.input.stream.getVideoTracks().length} tracks + + {#if video?.hasInput && video.input.type === "local-camera"} + +

+
+
+

+ Camera Connected

-

- Audio: {video.input.stream.getAudioTracks().length} tracks +

+ Local device camera active

- {/if} +
+
-
-
- - - {/if} + {:else} + + - - - - - - Local Camera - - - Use your device camera for direct video capture and recording - - - - {#if video?.hasInput && video.input.type === 'local-camera'} - -
-
-
-

Camera Connected

-

Local device camera active

-
- -
-
- {:else} - - - - {#if video?.hasInput} -

- Disconnect current input to connect camera -

+ {#if video?.hasInput} +

+ Disconnect current input to connect camera +

+ {/if} {/if} - {/if} -
-
+ + - - - -
-
- - - Remote Collaboration (Rooms) - - - Receive video streams from remote cameras or AI systems - -
- -
-
- - {#if video?.hasInput && video.input.type !== 'local-camera'} - -
-
-
-

Room Connected

-

Receiving remote video stream

-
- + + Remote Collaboration (Rooms) + + + Receive video streams from remote cameras or AI systems +
+
- {:else} - -
-
-
- -

Create New Room

-
-

- Create a room to receive video from others -

- -
+ + + {#if video?.hasInput && video.input.type !== "local-camera"} + +
+
+
+

+ Room Connected +

+

+ Receiving remote video stream +

+
-
+ {:else} + +
+
+
+ +

+ Create New Room +

+
+

+ Create a room to receive video from others +

+ - Create & Connect - + class="w-full rounded border border-slate-300 bg-slate-50 px-2 py-1 text-xs text-slate-900 disabled:opacity-50 dark:border-slate-600 dark:bg-slate-700 dark:text-slate-100" + /> +
+ + +
-
- -
-
- Join Existing Room: - - {videoManager.rooms.length} room{videoManager.rooms.length !== 1 ? 's' : ''} available - -
- -
- {#if videoManager.rooms.length === 0} -
- {videoManager.roomsLoading ? 'Loading rooms...' : 'No rooms available. Create one to get started.'} -
- {:else} - {#each videoManager.rooms as room} -
-
-
-

- {room.id} -

-
- {room.participants?.producer ? '📹 Has Output' : '📭 No Output'} - 👥 {room.participants?.consumers?.length || 0} inputs
+
+ Join Existing Room: + + {videoManager.rooms.length} room{videoManager.rooms.length !== 1 ? "s" : ""} available + +
+ +
+ {#if videoManager.rooms.length === 0} +
+ {videoManager.roomsLoading + ? "Loading rooms..." + : "No rooms available. Create one to get started."} +
+ {:else} + {#each videoManager.rooms as room} +
+
+
+

+ {room.id} +

+
+ {room.participants?.producer + ? "πŸ“Ή Has Output" + : "πŸ“­ No Output"} + πŸ‘₯ {room.participants?.consumers?.length || 0} inputs +
+ {#if room.participants?.producer} + + {:else} + + {/if}
- {#if room.participants?.producer} - - {:else} - - {/if}
-
- {/each} - {/if} + {/each} + {/if} +
-
- {#if video?.hasInput} -

- Disconnect current input to join a room -

+ {#if video?.hasInput} +

+ Disconnect current input to join a room +

+ {/if} {/if} - {/if} - - + + - - - - Video Input Sources - - Camera: Local device camera β€’ Remote: Video streams from rooms β€’ Only one active at a time - - + + + + Video Input Sources + + Camera: Local device camera β€’ Remote: Video streams from + rooms β€’ Only one active at a time + +
- \ No newline at end of file + diff --git a/src/lib/components/3d/elements/video/modal/VideoOutputConnectionModal.svelte b/src/lib/components/3d/elements/video/modal/VideoOutputConnectionModal.svelte index 40ab2bd220138d356b2c87c455a04f51dd68e856..207745a9c15dc1c6051ffb824e5984bd7363c406 100644 --- a/src/lib/components/3d/elements/video/modal/VideoOutputConnectionModal.svelte +++ b/src/lib/components/3d/elements/video/modal/VideoOutputConnectionModal.svelte @@ -16,112 +16,169 @@ let { open = $bindable(), video, workspaceId }: Props = $props(); - let isConnecting = $state(false); - let error = $state(null); - let customRoomId = $state(''); - let hasLoadedRooms = $state(false); + let isConnecting = $state(false); + let error = $state(null); + let customRoomId = $state(""); + let hasLoadedRooms = $state(false); + let mediaRecorder: MediaRecorder | null = null; + let recordedChunks: Blob[] = []; - // Auto-load rooms when modal opens (only once per modal session) - $effect(() => { - if (open && !hasLoadedRooms && !videoManager.roomsLoading) { - refreshRooms(); - hasLoadedRooms = true; - } - - // Reset when modal closes - if (!open) { - hasLoadedRooms = false; - error = null; - } - }); + // Auto-load rooms when modal opens (only once per modal session) + $effect(() => { + if (open && !hasLoadedRooms && !videoManager.roomsLoading) { + refreshRooms(); + hasLoadedRooms = true; + } - async function refreshRooms() { - try { - error = null; - await videoManager.refreshRooms(workspaceId); - } catch (err) { - error = err instanceof Error ? 
err.message : 'Failed to refresh rooms'; - } - } + // Reset when modal closes + if (!open) { + hasLoadedRooms = false; + error = null; + } + }); - async function handleStartOutputToRoom(roomId: string) { - try { - isConnecting = true; - error = null; - const result = await videoManager.startVideoOutputToRoom(workspaceId, video.id, roomId); - if (result.success) { - toast.success("Broadcasting Started", { - description: `Successfully started broadcasting to room ${roomId}` - }); - } else { - error = result.error || 'Failed to start output to room'; - } - } catch (err) { - error = err instanceof Error ? err.message : 'Failed to start output to room'; - } finally { - isConnecting = false; - } - } + async function refreshRooms() { + try { + error = null; + await videoManager.refreshRooms(workspaceId); + } catch (err) { + error = err instanceof Error ? err.message : "Failed to refresh rooms"; + } + } + + async function handleStartOutputToRoom(roomId: string) { + try { + isConnecting = true; + error = null; + const result = await videoManager.startVideoOutputToRoom(workspaceId, video.id, roomId); + if (result.success) { + toast.success("Broadcasting Started", { + description: `Successfully started broadcasting to room ${roomId}` + }); + } else { + error = result.error || "Failed to start output to room"; + } + } catch (err) { + error = err instanceof Error ? err.message : "Failed to start output to room"; + } finally { + isConnecting = false; + } + } + + async function createRoom() { + try { + isConnecting = true; + error = null; + const roomId = customRoomId.trim() || video.id; + const result = await videoManager.createVideoRoom(workspaceId, roomId); + + if (result.success) { + customRoomId = ""; + await refreshRooms(); + toast.success("Room Created", { + description: `Successfully created room ${result.roomId}` + }); + } else { + error = result.error || "Failed to create room"; + } + } catch (err) { + error = err instanceof Error ? 
err.message : "Failed to create room"; + } finally { + isConnecting = false; + } + } + + async function createRoomAndStartOutput() { + try { + isConnecting = true; + error = null; + const roomId = customRoomId.trim() || video.id; + const result = await videoManager.startVideoOutputAsProducer(workspaceId, video.id); + if (result.success) { + customRoomId = ""; + await refreshRooms(); + toast.success("Room Created & Broadcasting", { + description: `Successfully created room and started broadcasting` + }); + } else { + error = result.error || "Failed to create room and start output"; + } + } catch (err) { + error = err instanceof Error ? err.message : "Failed to create room and start output"; + } finally { + isConnecting = false; + } + } + + async function handleStartRecording() { + try { + if (!video.canOutput || !video.input.stream) { + error = "No local camera input available for recording"; + return; + } + + isConnecting = true; + error = null; + + recordedChunks = []; + mediaRecorder = new MediaRecorder(video.input.stream, { mimeType: "video/webm; codecs=vp9" }); + + mediaRecorder.ondataavailable = (event) => { + if (event.data.size > 0) recordedChunks.push(event.data); + }; + + mediaRecorder.onstop = () => { + const blob = new Blob(recordedChunks, { type: "video/webm" }); + const url = URL.createObjectURL(blob); + const a = document.createElement("a"); + a.href = url; + a.download = `${video.name || video.id}.webm`; + a.click(); + URL.revokeObjectURL(url); + }; + + mediaRecorder.start(); - async function createRoom() { - try { - isConnecting = true; - error = null; - const roomId = customRoomId.trim() || video.id; - const result = await videoManager.createVideoRoom(workspaceId, roomId); - - if (result.success) { - customRoomId = ''; - await refreshRooms(); - toast.success("Room Created", { - description: `Successfully created room ${result.roomId}` - }); - } else { - error = result.error || 'Failed to create room'; - } - } catch (err) { - error = err instanceof 
Error ? err.message : 'Failed to create room'; - } finally { - isConnecting = false; - } - } + // Update video output state locally + video.output.active = true; + video.output.type = "recording"; + video.output.stream = video.input.stream; + video.output.roomId = null; + video.output.client = null; - async function createRoomAndStartOutput() { - try { - isConnecting = true; - error = null; - const roomId = customRoomId.trim() || video.id; - const result = await videoManager.startVideoOutputAsProducer(workspaceId, video.id); - if (result.success) { - customRoomId = ''; - await refreshRooms(); - toast.success("Room Created & Broadcasting", { - description: `Successfully created room and started broadcasting` - }); - } else { - error = result.error || 'Failed to create room and start output'; - } - } catch (err) { - error = err instanceof Error ? err.message : 'Failed to create room and start output'; - } finally { - isConnecting = false; - } - } + toast.success("Recording Started", { description: "Local recording has started" }); + } catch (err) { + error = err instanceof Error ? err.message : "Failed to start recording"; + } finally { + isConnecting = false; + } + } + + async function handleStopOutput() { + try { + isConnecting = true; + error = null; - async function handleStopOutput() { - try { - isConnecting = true; - error = null; - await videoManager.stopVideoOutput(video.id); - toast.success("Broadcasting Stopped", { - description: "Successfully stopped video broadcasting" - }); - } catch (err) { - error = err instanceof Error ? 
err.message : 'Failed to stop broadcasting'; - } finally { - isConnecting = false; - } - } + if (video.output.type === "recording") { + if (mediaRecorder && mediaRecorder.state !== "inactive") { + mediaRecorder.stop(); + } + video.output.active = false; + video.output.type = null; + video.output.stream = null; + toast.success("Recording Stopped", { description: "Recording saved to file" }); + } else { + await videoManager.stopVideoOutput(video.id); + toast.success("Broadcasting Stopped", { + description: "Successfully stopped video broadcasting" + }); + } + } catch (err) { + error = err instanceof Error ? err.message : "Failed to stop output"; + } finally { + isConnecting = false; + } + } @@ -129,9 +186,12 @@ class="max-h-[85vh] max-w-4xl overflow-hidden border-slate-300 bg-slate-100 text-slate-900 dark:border-slate-600 dark:bg-slate-900 dark:text-slate-100" > - - - Video Output - {video?.name || 'No Video Selected'} + + + Video Output - {video?.name || "No Video Selected"} Configure video output: local recording or remote broadcast to rooms @@ -142,302 +202,356 @@
{#if error} - + Connection Error - + {error} {/if} - - - -
-
- - Current Video Output -
- {#if video?.hasOutput} - - {video.output.type === 'recording' ? 'Recording' : 'Remote Broadcast'} - - {:else} - No Output Active - {/if} -
- {#if video?.hasOutput} -
- {#if video.output.roomId} - Broadcasting to Room: {video.output.roomId} + + + +
+
+ + Current Video Output +
+ {#if video?.hasOutput} + + {video.output.type === "recording" ? "Recording" : "Remote Broadcast"} + {:else} - Recording to local storage + No Output Active {/if}
- {/if} -
-
+ {#if video?.hasOutput} +
+ {#if video.output.roomId} + Broadcasting to Room: {video.output.roomId} + {:else} + Recording to local storage + {/if} +
+ {/if} + + + + + {#if video?.hasOutput} + + + + + Current Output + + + +
+
+
+

+ {video.output.type === "recording" ? "Local Recording" : "Remote Broadcast"} +

+ {#if video.output.roomId} +

+ Room: {video.output.roomId} +

+ {/if} + {#if video.output.stream} +

+ Status: Active β€’ {video.output.stream.getVideoTracks().length} video tracks +

+ {/if} +
+ +
+
+
+
+ {/if} - - {#if video?.hasOutput} - + + - - - Current Output + + + Local Recording + + Record video directly to your device for later use + - -
-
-
-

- {video.output.type === 'recording' ? 'Local Recording' : 'Remote Broadcast'} -

- {#if video.output.roomId} -

- Room: {video.output.roomId} + + {#if video?.hasOutput && video.output.type === "recording"} + +

+
+
+

+ Recording Active

- {/if} - {#if video.output.stream} -

- Status: Active β€’ {video.output.stream.getVideoTracks().length} video tracks +

+ Saving to local device

- {/if} +
+
-
-
- - - {/if} + {:else} + + - - - - - - Local Recording - - - Record video directly to your device for later use - - - - {#if video?.hasOutput && video.output.type === 'recording'} - -
-
-
-

Recording Active

-

Saving to local device

-
- -
-
- {:else} - - - - {#if video?.hasOutput} -

- Stop current output to start recording -

+ {#if video?.hasOutput} +

+ Stop current output to start recording +

+ {/if} {/if} - {/if} -
-
+ + - - - -
-
- - - Remote Collaboration (Rooms) - - - Broadcast video stream to remote systems and users - -
- -
-
- - {#if video?.hasOutput && video.output.type !== 'recording'} - -
-
-
-

Broadcasting to Room

-

Video stream active

-
- + + Remote Collaboration (Rooms) + + + Broadcast video stream to remote systems and users +
+
- {:else} - -
-
-
- -

Create New Room

-
-

- Create a room to broadcast your video -

- -
+ + + {#if video?.hasOutput && video.output.type !== "recording"} + +
+
+
+

+ Broadcasting to Room +

+

+ Video stream active +

+
-
+ {:else} + +
+
+
+ +

+ Create New Room +

+
+

+ Create a room to broadcast your video +

+ - Create & Broadcast - + class="w-full rounded border border-slate-300 bg-slate-50 px-2 py-1 text-xs text-slate-900 disabled:opacity-50 dark:border-slate-600 dark:bg-slate-700 dark:text-slate-100" + /> +
+ + +
-
- -
-
- Join Existing Room: - - {videoManager.rooms.length} room{videoManager.rooms.length !== 1 ? 's' : ''} available - -
- -
- {#if videoManager.rooms.length === 0} -
- {videoManager.roomsLoading ? 'Loading rooms...' : 'No rooms available. Create one to get started.'} -
- {:else} - {#each videoManager.rooms as room} -
-
-
-

- {room.id} -

-
- {room.participants?.producer ? '🔴 Has Output' : '🟢 Available'} - 👥 {room.participants?.consumers?.length || 0} inputs
+
+ Join Existing Room: + + {videoManager.rooms.length} room{videoManager.rooms.length !== 1 ? "s" : ""} available + +
+ +
+ {#if videoManager.rooms.length === 0} +
+ {videoManager.roomsLoading + ? "Loading rooms..." + : "No rooms available. Create one to get started."} +
+ {:else} + {#each videoManager.rooms as room} +
+
+
+

+ {room.id} +

+
+ {room.participants?.producer + ? "🔴 Has Output" + : "🟢 Available"} + 👥 {room.participants?.consumers?.length || 0} inputs
+ {#if !room.participants?.producer} + + {:else} + + {/if}
- {#if !room.participants?.producer} - - {:else} - - {/if}
-
- {/each} - {/if} + {/each} + {/if} +
-
- {#if video?.hasOutput} -

- Stop current output to join a room -

+ {#if video?.hasOutput} +

+ Stop current output to join a room +

+ {/if} {/if} - {/if} - - + + - - - - Video Output Options - - Recording: Save locally β€’ Remote: Broadcast to rooms β€’ Only one active at a time - - + + + + Video Output Options + + Recording: Save locally β€’ Remote: Broadcast to rooms β€’ + Only one active at a time + +
- \ No newline at end of file + diff --git a/src/lib/components/3d/elements/video/status/InputVideoBoxUIKit.svelte b/src/lib/components/3d/elements/video/status/InputVideoBoxUIKit.svelte index 312605027cf59a62df4f1edcb68c23eddf5da5e1..b7553e09625901742aed450d6e2758f5df871525 100644 --- a/src/lib/components/3d/elements/video/status/InputVideoBoxUIKit.svelte +++ b/src/lib/components/3d/elements/video/status/InputVideoBoxUIKit.svelte @@ -1,7 +1,13 @@ - - - {#if video.hasInput} - {#if video.input.type === 'local-camera'} - {:else} - {/if} - {:else} - - + - {/if} - \ No newline at end of file + diff --git a/src/lib/components/3d/elements/video/status/OutputVideoBoxUIKit.svelte b/src/lib/components/3d/elements/video/status/OutputVideoBoxUIKit.svelte index 4261b643c65c638b13e0853e9fd2d932d19ab475..ac000295655f9a5a6fbea8b55853939dc90572ca 100644 --- a/src/lib/components/3d/elements/video/status/OutputVideoBoxUIKit.svelte +++ b/src/lib/components/3d/elements/video/status/OutputVideoBoxUIKit.svelte @@ -1,7 +1,13 @@ - - - {#if video.hasOutput} - - {:else} - - - {/if} - \ No newline at end of file + diff --git a/src/lib/components/3d/elements/video/status/VideoBoxUIKit.svelte b/src/lib/components/3d/elements/video/status/VideoBoxUIKit.svelte index 04aa597b01edc08ff00cdd9e34fb201a69374d1a..616567c66c9ca6675d56573c7adbdc8560c87174 100644 --- a/src/lib/components/3d/elements/video/status/VideoBoxUIKit.svelte +++ b/src/lib/components/3d/elements/video/status/VideoBoxUIKit.svelte @@ -6,7 +6,6 @@ StatusHeader, StatusContent } from "$lib/components/3d/ui"; - import { Text } from "threlte-uikit"; interface Props { video: VideoInstance; @@ -14,16 +13,9 @@ let { video }: Props = $props(); - // Video theme color (orange) const videoColor = "rgb(217, 119, 6)"; - - diff --git a/src/lib/components/3d/misc/Pointcloud.svelte b/src/lib/components/3d/misc/Pointcloud.svelte index db7b82e8fb04b51ebe3ac0dac7ab8b840600cc80..7aff29f7046134ba830fcc1c2bda7c8794b91d4f 100644 --- 
a/src/lib/components/3d/misc/Pointcloud.svelte +++ b/src/lib/components/3d/misc/Pointcloud.svelte @@ -284,10 +284,6 @@ rgbdData = createRGBDData(); }); - - $effect(() => { - console.log("render"); - }); {#if pointCloudGeometry} diff --git a/src/lib/components/3d/ui/BaseStatusBox.svelte b/src/lib/components/3d/ui/BaseStatusBox.svelte index 36313d480759a3eddabdd5659dc439b2de77ca1c..b96b4340426de3faffe00d26ef8fa0953c13f8f9 100644 --- a/src/lib/components/3d/ui/BaseStatusBox.svelte +++ b/src/lib/components/3d/ui/BaseStatusBox.svelte @@ -10,7 +10,7 @@ backgroundOpacity?: number; disabled?: boolean; clickable?: boolean; - children: import('svelte').Snippet; + children: import("svelte").Snippet; onclick?: () => void; } @@ -51,15 +51,11 @@ } let currentBorderOpacity = $derived(isHovered ? Math.min(borderOpacity * 1.5, 1) : borderOpacity); - let currentBackgroundOpacity = $derived(isHovered ? Math.min(backgroundOpacity * 2, 0.5) : backgroundOpacity); + let currentBackgroundOpacity = $derived( + isHovered ? 
Math.min(backgroundOpacity * 2, 0.5) : backgroundOpacity + ); - - {@render children()} - \ No newline at end of file + diff --git a/src/lib/components/3d/ui/StatusArrow.svelte b/src/lib/components/3d/ui/StatusArrow.svelte index aed0e7df87142054d4b434435404c5fdeed99384..312b9099b6a1871a54542728eaeca3a3811a52e5 100644 --- a/src/lib/components/3d/ui/StatusArrow.svelte +++ b/src/lib/components/3d/ui/StatusArrow.svelte @@ -21,12 +21,6 @@ }: Props = $props(); - - - - - - - - - + - \ No newline at end of file + diff --git a/src/lib/components/3d/ui/StatusIndicator.svelte b/src/lib/components/3d/ui/StatusIndicator.svelte index 38e868be64b9529a8d04a8ea9e44c5d4e2d0c004..f7b1fd766f3fbbca3963621daf11fa8f737584e9 100644 --- a/src/lib/components/3d/ui/StatusIndicator.svelte +++ b/src/lib/components/3d/ui/StatusIndicator.svelte @@ -4,23 +4,13 @@ interface Props { color?: string; size?: number; - type?: 'dot' | 'pulse'; + type?: "dot" | "pulse"; visible?: boolean; } - let { - color = "rgb(139, 69, 219)", - size = 8, - type = 'dot', - visible = true - }: Props = $props(); + let { color = "rgb(139, 69, 219)", size = 8, type = "dot", visible = true }: Props = $props(); {#if visible} - -{/if} \ No newline at end of file + +{/if} diff --git a/src/lib/components/3d/utils/Hoverable.old.svelte b/src/lib/components/3d/utils/Hoverable.old.svelte deleted file mode 100644 index eee0eb20c0b6c7431dafa9a4cbf61d82f87874af..0000000000000000000000000000000000000000 --- a/src/lib/components/3d/utils/Hoverable.old.svelte +++ /dev/null @@ -1,148 +0,0 @@ - - -) => { - event.stopPropagation(); - isSelected = true; - onClickObject?.(); - }} - onpointerenter={(event: IntersectionEvent) => { - event.stopPropagation(); - onPointerEnter(); - isHovered = true; - }} - onpointerleave={(event: IntersectionEvent) => { - event.stopPropagation(); - onPointerLeave(); - isHovered = false; - }} - scale={scale.current} -> - {#snippet children({ ref })} - {@render content({ isHovered, isSelected, offset: 
offsetTween.current })} - {/snippet} - diff --git a/src/lib/components/3d/utils/Hoverable.svelte b/src/lib/components/3d/utils/Hoverable.svelte index ce929e5053bbd79acfa9c1556a72977919694dbe..72f7669f883cb2e77218d9a820b7b681f1d1d876 100644 --- a/src/lib/components/3d/utils/Hoverable.svelte +++ b/src/lib/components/3d/utils/Hoverable.svelte @@ -3,11 +3,10 @@ import type { IntersectionEvent } from "@threlte/extras"; import { interactivity } from "@threlte/extras"; import type { Snippet } from "svelte"; - import { Spring, Tween } from "svelte/motion"; + import { Spring } from "svelte/motion"; import { useCursor } from "@threlte/extras"; import { onMount, onDestroy } from "svelte"; import { Group } from "three"; - import type { Robot } from "$lib/elements/robot/Robot.svelte"; interface Props { content: Snippet<[{ isHovered: boolean; isSelected: boolean; debouncedIsHovered: boolean }]>; // renderable @@ -40,7 +39,6 @@ } }); - const handleKeyDown = (event: KeyboardEvent) => { if (event.key === "Escape" && isSelected) { isSelected = false; diff --git a/src/lib/components/interface/overlay/AddAIButton.svelte b/src/lib/components/interface/overlay/AddAIButton.svelte index 088a37dff340ebfdb92bddc2134fbea12737e9be..e516200039273a8d1f30863a6d5f90e82894f443 100644 --- a/src/lib/components/interface/overlay/AddAIButton.svelte +++ b/src/lib/components/interface/overlay/AddAIButton.svelte @@ -1,6 +1,5 @@ @@ -90,7 +89,7 @@ - + {#snippet child({ props })}
@@ -151,4 +152,4 @@ {/each} - \ No newline at end of file + diff --git a/src/lib/components/interface/overlay/AddSensorButton.svelte b/src/lib/components/interface/overlay/AddSensorButton.svelte index bb9a04b59eda99e3d04aa8b06f32e07a06f42680..a649073014613a86a38eda54d74c3354dc6d5da2 100644 --- a/src/lib/components/interface/overlay/AddSensorButton.svelte +++ b/src/lib/components/interface/overlay/AddSensorButton.svelte @@ -4,7 +4,6 @@ import * as DropdownMenu from "@/components/ui/dropdown-menu"; import { toast } from "svelte-sonner"; import { cn } from "$lib/utils"; - import { generateName } from "@/utils/generateName"; import { videoManager } from "$lib/elements/video/VideoManager.svelte"; interface Props { diff --git a/src/lib/components/interface/overlay/Overlay.svelte b/src/lib/components/interface/overlay/Overlay.svelte index 0ffbe3edf5b7b82957e2de44e15e637894c68aba..bf2b85d251ff7765450280b4b2675958089a8555 100644 --- a/src/lib/components/interface/overlay/Overlay.svelte +++ b/src/lib/components/interface/overlay/Overlay.svelte @@ -23,19 +23,22 @@ settingsOpen = $bindable(false), workspaceIdMenuOpen = $bindable(false) }: Props = $props(); - -
-
+
-
+
- Logo + Logo
@@ -44,7 +47,7 @@
-
+
@@ -52,7 +55,7 @@
-
+
diff --git a/src/lib/components/interface/overlay/SettingsButton.svelte b/src/lib/components/interface/overlay/SettingsButton.svelte index 908dbd16a369b8ee1127a68d23afcca114d3f96c..ab9d7f3a11bb1a94b7463999cf847d300b80c8db 100644 --- a/src/lib/components/interface/overlay/SettingsButton.svelte +++ b/src/lib/components/interface/overlay/SettingsButton.svelte @@ -1,5 +1,4 @@ -
-
+
+
- + @@ -136,41 +144,52 @@
-

What you can share:

- +

+ What you can share: +

+
- + - Video Streams - Live camera feeds from connected devices + Video Streams - Live camera + feeds from connected devices
- +
- + - Robot Control - Real-time teleoperation and monitoring + Robot Control - Real-time teleoperation + and monitoring
- +
- + - AI Sessions - Shared inference and autonomous control + AI Sessions - Shared inference + and autonomous control
- +
- +
-
Private Workspace
+
+ Private Workspace +
- Only users with this workspace ID can access your resources. Share it securely with trusted collaborators. + Only users with this workspace ID can access your resources. Share it securely with + trusted collaborators.
@@ -178,12 +197,15 @@
-
+
- Tip: Use this for remote teleoperation, collaborative research, demonstrations, or sharing your robot setup with team members across different networks. + Tip: Use this for remote teleoperation, collaborative research, demonstrations, + or sharing your robot setup with team members across different networks.
- \ No newline at end of file + diff --git a/src/lib/elements/compute/README.md b/src/lib/elements/compute/README.md deleted file mode 100644 index 75bffdd2f2c524d76f917915e8292fb87e8ff7ef..0000000000000000000000000000000000000000 --- a/src/lib/elements/compute/README.md +++ /dev/null @@ -1,147 +0,0 @@ -# AI Compute System - -This module provides a comprehensive AI compute management system for the LeRobot Arena frontend, integrating with the AI server backend for ACT model inference sessions. - -## Architecture - -The system follows the same pattern as the video and robot managers: - -- **RemoteComputeManager**: Global manager for all AI compute instances -- **RemoteCompute**: Individual AI compute instance with reactive state -- **UI Components**: Modal dialogs and status displays for managing compute sessions - -## Core Components - -### RemoteComputeManager - -The main manager class that handles: -- Creating and managing AI compute instances -- Communicating with the AI server backend -- Session lifecycle management (create, start, stop, delete) -- Health monitoring and status updates - -```typescript -import { remoteComputeManager } from '$lib/elements/compute/'; - -// Create a new compute instance -const compute = remoteComputeManager.createCompute('my-compute', 'ACT Model'); - -// Create an Inference Session -await remoteComputeManager.createSession(compute.id, { - sessionId: 'my-session', - policyPath: './checkpoints/act_so101_beyond', - cameraNames: ['front', 'wrist'], - transportServerUrl: 'http://localhost:8000' -}); - -// Start inference -await remoteComputeManager.startSession(compute.id); -``` - -### RemoteCompute - -Individual compute instances with reactive state: - -```typescript -// Access compute properties -compute.hasSession // boolean - has an active session -compute.isRunning // boolean - session is running inference -compute.canStart // boolean - can start inference -compute.canStop // boolean - can stop inference -compute.statusInfo // 
status display information -``` - -## AI Server Integration - -The system integrates with the AI server backend (`backend/ai-server/`) which provides: - -- **ACT Model Inference**: Real-time robot control using Action Chunking Transformer models -- **Session Management**: Create, start, stop, and delete inference sessions -- **Transport Server Communication**: Dedicated rooms for camera inputs, joint inputs, and joint outputs -- **Multi-camera Support**: Support for multiple camera streams per session - -### Session Workflow - -1. **Create Session**: Establishes connection with AI server and creates transport server rooms -2. **Configure Inputs**: Sets up camera rooms and joint input rooms -3. **Start Inference**: Begins ACT model inference and joint command output -4. **Monitor Status**: Real-time status updates and performance metrics -5. **Stop/Delete**: Clean session teardown - -## UI Components - -### Modal Dialog - -`AISessionConnectionModal.svelte` provides a comprehensive interface for: -- Creating new Inference Sessions with configurable parameters -- Managing existing sessions (start, stop, delete) -- Viewing session status and connection details -- Real-time session monitoring - -### Status Display - -The status system shows input/output connections: - -- **Input Box**: Shows camera inputs and joint state inputs -- **Compute Box**: Shows AI model status and information -- **Output Box**: Shows joint command outputs -- **Connection Flow**: Visual representation of data flow - -### 3D Integration - -- Uses existing GPU 3D models for visual representation -- Interactive hover states and status billboards -- Positioned in 3D space alongside robots and videos - -## Usage Example - -```typescript -// 1. Create a compute instance -const compute = remoteComputeManager.createCompute(); - -// 2. 
Configure and create Inference Session -await remoteComputeManager.createSession(compute.id, { - sessionId: 'robot-control-01', - policyPath: './checkpoints/act_so101_beyond', - cameraNames: ['front', 'wrist', 'overhead'], - transportServerUrl: 'http://localhost:8000', - workspaceId: 'workspace-123' -}); - -// 3. Start inference -await remoteComputeManager.startSession(compute.id); - -// 4. Monitor status -const status = await remoteComputeManager.getSessionStatus(compute.id); -console.log(status.stats.inference_count); -``` - -## Configuration - -The system connects to: -- **Inference Server**: `http://localhost:8001` (configurable) - Runs AI models and inference sessions -- **Transport Server**: `http://localhost:8000` (configurable) - Manages communication rooms and data routing - -## File Structure - -``` -compute/ -β”œβ”€β”€ RemoteComputeManager.svelte.ts # Main manager class -β”œβ”€β”€ RemoteCompute.svelte.ts # Individual compute instance -β”œβ”€β”€ modal/ -β”‚ └── AISessionConnectionModal.svelte # Session management modal -β”œβ”€β”€ status/ -β”‚ β”œβ”€β”€ ComputeInputBoxUIKit.svelte # Input status display -β”‚ β”œβ”€β”€ ComputeOutputBoxUIKit.svelte # Output status display -β”‚ β”œβ”€β”€ ComputeBoxUIKit.svelte # Main compute display -β”‚ β”œβ”€β”€ ComputeConnectionFlowBoxUIKit.svelte # Connection flow -β”‚ └── ComputeStatusBillboard.svelte # 3D status billboard -└── index.ts # Module exports -``` - -## Integration Points - -- **3D Scene**: `Computes.svelte` renders all compute instances -- **Add Button**: `AddAIButton.svelte` creates new compute instances -- **Main Page**: Integrated in the main workspace view -- **GPU Models**: Reuses existing GPU 3D models for visual consistency \ No newline at end of file diff --git a/src/lib/elements/compute/RemoteComputeManager.svelte.ts b/src/lib/elements/compute/RemoteComputeManager.svelte.ts index 33cbd07f153c2afbc1127c52990ca533228f2298..398c3d31d157848a14511d7e134f74a3925ada39 100644 --- 
a/src/lib/elements/compute/RemoteComputeManager.svelte.ts +++ b/src/lib/elements/compute/RemoteComputeManager.svelte.ts @@ -14,8 +14,7 @@ import { import { settings } from '$lib/runes/settings.svelte'; import type { CreateSessionRequest, - CreateSessionResponse, - SessionStatusResponse + CreateSessionResponse } from '@robothub/inference-server-client'; export interface AISessionConfig { diff --git a/src/lib/elements/robot/calibration/USBCalibrationManager.ts b/src/lib/elements/robot/calibration/USBCalibrationManager.ts index a858fc51099ee67c22cc1158cf8704853346ce7f..cf09e92c475d49f1ed1c9311d8448f7e97ef4573 100644 --- a/src/lib/elements/robot/calibration/USBCalibrationManager.ts +++ b/src/lib/elements/robot/calibration/USBCalibrationManager.ts @@ -6,26 +6,26 @@ export class USBCalibrationManager { // Joint configuration private readonly jointIds = [1, 2, 3, 4, 5, 6]; private readonly jointNames = ["Rotation", "Pitch", "Elbow", "Wrist_Pitch", "Wrist_Roll", "Jaw"]; - + // Calibration state private jointCalibrations: Record = {}; private _calibrationState: CalibrationState = { isCalibrating: false, progress: 0 }; - + // Connection state for calibration private isConnectedForCalibration = false; private baudRate: number = 1000000; - + // Calibration polling private calibrationPollingAbortController: AbortController | null = null; private lastPositions: Record = {}; private calibrationCallbacks: (() => void)[] = []; - + // Calibration completion callback with final positions private calibrationCompleteCallback: ((finalPositions: Record) => void) | null = null; - + // Servo reading queue for calibration private isReadingServos = false; private readingQueue: Array<{ @@ -36,7 +36,7 @@ export class USBCalibrationManager { constructor(baudRate: number = ROBOT_CONFIG.usb.baudRate) { this.baudRate = baudRate; - + // Initialize joint calibrations this.jointNames.forEach(name => { this.jointCalibrations[name] = { isCalibrated: false }; @@ -66,7 +66,7 @@ export class 
USBCalibrationManager { console.log('[USBCalibrationManager] Already connected for calibration'); return; } - + try { console.log('[USBCalibrationManager] Connecting SDK for calibration...'); await scsServoSDK.connect({ baudRate: this.baudRate }); @@ -80,7 +80,7 @@ export class USBCalibrationManager { async disconnectFromCalibration(): Promise { if (!this.isConnectedForCalibration) return; - + try { await scsServoSDK.disconnect(); this.isConnectedForCalibration = false; @@ -103,7 +103,7 @@ export class USBCalibrationManager { } console.log('[USBCalibrationManager] Starting calibration process'); - + // Ensure connection for calibration await this.ensureConnectedForCalibration(); @@ -158,7 +158,7 @@ export class USBCalibrationManager { try { const finalPositions = await this.readFinalPositionsAndSync(); console.log('[USBCalibrationManager] βœ… Final positions read and synced to virtual robot'); - + // Notify robot of calibration completion with final positions if (this.calibrationCompleteCallback) { this.calibrationCompleteCallback(finalPositions); @@ -237,19 +237,19 @@ export class USBCalibrationManager { // NEW: Read final positions and prepare for sync private async readFinalPositionsAndSync(): Promise> { const finalPositions: Record = {}; - + console.log('[USBCalibrationManager] Reading final positions from all servos...'); - + // Read all servo positions sequentially for (let i = 0; i < this.jointIds.length; i++) { const servoId = this.jointIds[i]; const jointName = this.jointNames[i]; - + try { const position = await this.readServoPosition(servoId); finalPositions[jointName] = position; this.lastPositions[jointName] = position; - + console.log(`[USBCalibrationManager] ${jointName} (servo ${servoId}): ${position} (raw) -> ${this.normalizeValue(position, jointName).toFixed(1)}% (normalized)`); } catch (error) { console.warn(`[USBCalibrationManager] Failed to read final position for ${jointName} (servo ${servoId}):`, error); @@ -257,7 +257,7 @@ export class 
USBCalibrationManager { finalPositions[jointName] = this.lastPositions[jointName] || 2048; } } - + return finalPositions; } @@ -391,9 +391,9 @@ export class USBCalibrationManager { // Cleanup async destroy(): Promise { console.log('[USBCalibrationManager] 🧹 Destroying calibration manager...'); - + this.stopCalibrationPolling(); - + // Safely unlock all servos before disconnecting (best practice) if (this.isSDKConnected) { try { @@ -404,11 +404,11 @@ export class USBCalibrationManager { console.warn('[USBCalibrationManager] Warning: Failed to safely unlock servos during cleanup:', error); } } - + await this.disconnectFromCalibration(); this.calibrationCallbacks = []; this.calibrationCompleteCallback = null; - + console.log('[USBCalibrationManager] βœ… Calibration manager destroyed'); } diff --git a/src/lib/elements/robot/calibration/USBCalibrationPanel.svelte b/src/lib/elements/robot/calibration/USBCalibrationPanel.svelte index 4d4d97c30c8eb448f2eaa17de4145069cd60a77c..7c4295574a57b11ca5897ccb45db075e6b021e4e 100644 --- a/src/lib/elements/robot/calibration/USBCalibrationPanel.svelte +++ b/src/lib/elements/robot/calibration/USBCalibrationPanel.svelte @@ -1,217 +1,230 @@
- {#if isCalibrating} - -
-
-
-
- Recording movements... - {Math.round(progress)}% -
-
- - -
-
- - -
-
-
- - -
-
- {#each jointNames as jointName} - {@const currentValue = calibrationState.getCurrentValue(jointName)} - {@const calibration = calibrationState.getJointCalibration(jointName)} - -
-
- {jointName} - {calibrationState.formatServoValue(currentValue)} -
- -
- Min: {calibrationState.formatServoValue(calibration?.minServoValue)} - Max: {calibrationState.formatServoValue(calibration?.maxServoValue)} -
- - {#if calibration?.minServoValue !== undefined && calibration?.maxServoValue !== undefined && currentValue !== undefined} -
-
-
- {/if} -
- {/each} -
-
- -
- Move each joint through its full range of motion -
-
- - {:else if isCalibrated} - -
-
- βœ“ Calibrated - Ready to connect -
- - -
-
- {connectionInfo.lockStatus} -
-
{connectionInfo.lockDescription}
-
- - -
-
- {#each jointNames as jointName} - {@const calibration = calibrationState.getJointCalibration(jointName)} - {@const range = calibrationState.getJointRange(jointName)} - -
- {jointName} - {range} -
- {/each} -
-
- -
- - -
-
- - {:else} - -
-
- Needs Calibration - Required for USB connection -
- - -
-
Quick Setup Options:
-
    -
- 1. Position robot in neutral pose
- 2. Start calibration and move each joint fully
- 3. Complete when all joints show good ranges
-
- - -
-
After calibration:
-
{connectionInfo.lockStatus}
-
- -
- - - -
-
- {/if} -
\ No newline at end of file + {#if isCalibrating} + +
+
+
+
+ Recording movements... + {Math.round(progress)}% +
+
+ + +
+
+ + +
+
+
+ + +
+
+ {#each jointNames as jointName} + {@const currentValue = calibrationState.getCurrentValue(jointName)} + {@const calibration = calibrationState.getJointCalibration(jointName)} + +
+
+ {jointName} + {calibrationState.formatServoValue(currentValue)} +
+ +
+ Min: {calibrationState.formatServoValue(calibration?.minServoValue)} + Max: {calibrationState.formatServoValue(calibration?.maxServoValue)} +
+ + {#if calibration?.minServoValue !== undefined && calibration?.maxServoValue !== undefined && currentValue !== undefined} +
+
+
+ {/if} +
+ {/each} +
+
+ +
Move each joint through its full range of motion
+
+ {:else if isCalibrated} + +
+
+ βœ“ Calibrated + Ready to connect +
+ + +
+
+ {connectionInfo.lockStatus} +
+
{connectionInfo.lockDescription}
+
+ + +
+
+ {#each jointNames as jointName} + {@const calibration = calibrationState.getJointCalibration(jointName)} + {@const range = calibrationState.getJointRange(jointName)} + +
+ {jointName} + {range} +
+ {/each} +
+
+ +
+ + +
+
+ {:else} + +
+
+ Needs Calibration + Required for USB connection +
+ + +
+
Quick Setup Options:
+
    +
+ 1. Position robot in neutral pose
+ 2. Start calibration and move each joint fully
+ 3. Complete when all joints show good ranges
+
+ + +
+
After calibration:
+
{connectionInfo.lockStatus}
+
+ +
+ + + +
+
+ {/if} +
diff --git a/src/lib/elements/robot/components/ConnectionPanel.svelte b/src/lib/elements/robot/components/ConnectionPanel.svelte index 4b4faa8771e055b4ae9f0841fc3e6777d7e92fe4..f0d428c8b134b9a714cc6d535185187a28d72acc 100644 --- a/src/lib/elements/robot/components/ConnectionPanel.svelte +++ b/src/lib/elements/robot/components/ConnectionPanel.svelte @@ -1,541 +1,545 @@
- -
-

Connections - {robot.id}

- - - {#if error} -
- {error} -
- {/if} - - -
-
-

Room Management

- -
- - {#if showRoomManagement} -
- -
- - - {rooms.length} room{rooms.length !== 1 ? 's' : ''} available - -
- - -
- Available Rooms: -
- -
-
-
- Create New Room -
-

- Create a room for collaboration -

- -
- - -
-
-
- - - {#if rooms.length === 0} -
- {roomsLoading ? 'Loading...' : 'No existing rooms available'} -
- {:else} - {#each rooms as room} -
-
-

{room.id}

-

{room.participants?.total || 0} participants

-
-
- - -
-
- {/each} - {/if} -
-
-
- {/if} -
- - -
-

Consumer (Receive Commands) - Single

- {#if hasConsumer} -
-
- {consumer?.name || 'Consumer Active'} - - {consumer?.status.isConnected ? '🟒 Connected' : 'πŸ”΄ Disconnected'} - -
- -
- {:else} -
- -
-
- - -
-
- Remote Consumer: Receive commands from transport server -
-
-
- {/if} -
- - -
-

Producers (Send Commands) - {outputDriverCount} connected

-
- - -
- Remote Producer: Send commands to transport server. Uses Robot ID: {remoteRobotId} -
-
- - - {#each producers as producer} -
-
- {producer.name} - - {producer.status.isConnected ? '🟒 Connected' : 'πŸ”΄ Disconnected'} - -
- -
- {/each} -
- - -
- Robot ID for Remote Connections: - -
-
- - - {#if showUSBCalibration} -
-
-
-

- USB Calibration Required - {#if pendingUSBConnection} - - (for {pendingUSBConnection === 'consumer' ? 'Consumer' : 'Producer'}) - - {/if} -

- -
- -
- Before connecting USB drivers, the robot needs to be calibrated to map its physical range to software values. -
- - -
-
- {/if} -
\ No newline at end of file + +
+

Connections - {robot.id}

+ + + {#if error} +
+ {error} +
+ {/if} + + +
+
+

Room Management

+ +
+ + {#if showRoomManagement} +
+ +
+ + + {rooms.length} room{rooms.length !== 1 ? "s" : ""} available + +
+ + +
+ Available Rooms: +
+ +
+
+
+ Create New Room +
+

Create a room for collaboration

+ +
+ + +
+
+
+ + + {#if rooms.length === 0} +
+ {roomsLoading ? "Loading..." : "No existing rooms available"} +
+ {:else} + {#each rooms as room} +
+
+

{room.id}

+

+ {room.participants?.total || 0} participants +

+
+
+ + +
+
+ {/each} + {/if} +
+
+
+ {/if} +
+ + +
+

Consumer (Receive Commands) - Single

+ {#if hasConsumer} +
+
+ {consumer?.name || "Consumer Active"} + + {consumer?.status.isConnected ? "🟒 Connected" : "πŸ”΄ Disconnected"} + +
+ +
+ {:else} +
+ +
+
+ + +
+
+ Remote Consumer: Receive commands from transport server +
+
+
+ {/if} +
+ + +
+

+ Producers (Send Commands) - {outputDriverCount} connected +

+
+ + +
+ Remote Producer: Send commands to transport server. Uses Robot ID: {remoteRobotId} +
+
+ + + {#each producers as producer} +
+
+ {producer.name} + + {producer.status.isConnected ? "🟒 Connected" : "πŸ”΄ Disconnected"} + +
+ +
+ {/each} +
+ + +
+ Robot ID for Remote Connections: + +
+
+ + + {#if showUSBCalibration} +
+
+
+

+ USB Calibration Required + {#if pendingUSBConnection} + + (for {pendingUSBConnection === "consumer" ? "Consumer" : "Producer"}) + + {/if} +

+ +
+ +
+ Before connecting USB drivers, the robot needs to be calibrated to map its physical range + to software values. +
+ + +
+
+ {/if} +
diff --git a/src/lib/elements/robot/components/RobotControls.svelte b/src/lib/elements/robot/components/RobotControls.svelte index 3a841237943700a0336ae0465adef333f2dd1f15..160366fbb4a6f334fce5b6872013b4bb046ed3e2 100644 --- a/src/lib/elements/robot/components/RobotControls.svelte +++ b/src/lib/elements/robot/components/RobotControls.svelte @@ -1,81 +1,81 @@ -
-
-

- Robot Controls - {robot.id} -

-
- {#if isManualControlEnabled} - Manual Control - {:else} - External Control - {/if} -
-
+
+
+

+ Robot Controls - {robot.id} +

+
+ {#if isManualControlEnabled} + Manual Control + {:else} + External Control + {/if} +
+
-
- {#each joints as joint} -
-
- {joint.name} - - {joint.value.toFixed(1)}% - -
- -
- {#if joint.name.toLowerCase() === 'jaw' || joint.name.toLowerCase() === 'gripper'} - 0% (closed) - updateJoint(joint.name, parseFloat(e.currentTarget.value))} - class="flex-1 h-2 bg-slate-700 rounded-lg appearance-none cursor-pointer disabled:opacity-50 disabled:cursor-not-allowed" - /> - 100% (open) - {:else} - -100% - updateJoint(joint.name, parseFloat(e.currentTarget.value))} - class="flex-1 h-2 bg-slate-700 rounded-lg appearance-none cursor-pointer disabled:opacity-50 disabled:cursor-not-allowed" - /> - +100% - {/if} -
+
+ {#each joints as joint} +
+
+ {joint.name} + + {joint.value.toFixed(1)}% + +
- {#if joint.limits} -
- URDF limits: {(joint.limits.lower)}Β° to {joint.limits.upper}Β° -
- {/if} -
- {/each} -
-
\ No newline at end of file +
+ {#if joint.name.toLowerCase() === "jaw" || joint.name.toLowerCase() === "gripper"} + 0% (closed) + updateJoint(joint.name, parseFloat(e.currentTarget.value))} + class="h-2 flex-1 cursor-pointer appearance-none rounded-lg bg-slate-700 disabled:cursor-not-allowed disabled:opacity-50" + /> + 100% (open) + {:else} + -100% + updateJoint(joint.name, parseFloat(e.currentTarget.value))} + class="h-2 flex-1 cursor-pointer appearance-none rounded-lg bg-slate-700 disabled:cursor-not-allowed disabled:opacity-50" + /> + +100% + {/if} +
+ + {#if joint.limits} +
+ URDF limits: {joint.limits.lower}Β° to {joint.limits.upper}Β° +
+ {/if} +
+ {/each} +
+
diff --git a/src/lib/elements/video/VideoManager.svelte.ts b/src/lib/elements/video/VideoManager.svelte.ts index 0d17cb7531100cadf189b9135cb8e0020d5294af..7532d4aefe9d1e02e55308f85ce7be7b1398f2ac 100644 --- a/src/lib/elements/video/VideoManager.svelte.ts +++ b/src/lib/elements/video/VideoManager.svelte.ts @@ -36,6 +36,9 @@ export class VideoInstance implements Positionable { active: false, client: null as videoTypes.VideoProducer | null, roomId: null as string | null, + // New properties for UI state + type: null as 'recording' | 'remote' | null, + stream: null as MediaStream | null, }); // Position (reactive and bindable) @@ -272,6 +275,8 @@ export class VideoManager { video.output.active = true; video.output.client = producer; video.output.roomId = roomId; + video.output.type = 'remote'; + video.output.stream = video.input.stream; console.log(`Video output started to room ${roomId} for video ${videoId}`); return { success: true }; @@ -530,6 +535,8 @@ export class VideoManager { video.output.active = true; video.output.client = producer; video.output.roomId = result.roomId; + video.output.type = 'remote'; + video.output.stream = video.input.stream; // Refresh room list await this.listRooms(workspaceId); @@ -554,6 +561,8 @@ export class VideoManager { video.output.active = false; video.output.client = null; video.output.roomId = null; + video.output.type = null; + video.output.stream = null; console.log(`Output stopped for video ${videoId}`); } diff --git a/src/lib/sensors/consumers/RemoteServerConsumer.ts b/src/lib/sensors/consumers/RemoteServerConsumer.ts deleted file mode 100644 index eacbbcf660870f38089a170ca0b460ca0192ea11..0000000000000000000000000000000000000000 --- a/src/lib/sensors/consumers/RemoteServerConsumer.ts +++ /dev/null @@ -1,359 +0,0 @@ -import type { - ConsumerSensorDriver, - ConnectionStatus, - SensorFrame, - SensorStream, - RemoteServerConsumerConfig, - FrameCallback, - StreamUpdateCallback, - StatusChangeCallback, - UnsubscribeFn -} from 
"../types/index.js"; - -/** - * Remote Server Consumer Driver - * - * Sends video frames to a remote Python server using WebSocket. - * Simplified with best practices - uses WebSocket only for optimal performance. - */ -export class RemoteServerConsumer implements ConsumerSensorDriver { - readonly type = "consumer" as const; - readonly id: string; - readonly name: string; - - private _status: ConnectionStatus = { isConnected: false }; - private config: RemoteServerConsumerConfig; - - // Connection management - private websocket: WebSocket | null = null; - private reconnectAttempts = 0; - private reconnectTimer?: Timer; - - // Stream management - private activeOutputStreams = new Map(); - private sendQueue: SensorFrame[] = []; - private isSending = false; - - // Event callbacks - private frameSentCallbacks: FrameCallback[] = []; - private streamUpdateCallbacks: StreamUpdateCallback[] = []; - private statusCallbacks: StatusChangeCallback[] = []; - - constructor(config: RemoteServerConsumerConfig) { - this.config = config; - this.id = `remote-server-consumer-${Date.now()}`; - this.name = `Remote Server Consumer (${config.url})`; - - console.log("πŸ“‘ Created RemoteServer consumer driver for:", config.url); - } - - get status(): ConnectionStatus { - return this._status; - } - - async connect(): Promise { - console.log("πŸ“‘ Connecting to remote server...", this.config.url); - - try { - await this.connectWebSocket(); - - this._status = { - isConnected: true, - lastConnected: new Date(), - error: undefined - }; - this.notifyStatusChange(); - - console.log("βœ… Remote server consumer connected successfully"); - } catch (error) { - this._status = { - isConnected: false, - error: `Connection failed: ${error}` - }; - this.notifyStatusChange(); - throw error; - } - } - - async disconnect(): Promise { - console.log("πŸ“‘ Disconnecting from remote server..."); - - // Clear reconnect timer - if (this.reconnectTimer) { - clearTimeout(this.reconnectTimer); - this.reconnectTimer = 
undefined; - } - - // Close WebSocket - if (this.websocket) { - this.websocket.close(); - this.websocket = null; - } - - // Clear send queue - this.sendQueue = []; - this.isSending = false; - - this._status = { isConnected: false }; - this.notifyStatusChange(); - - console.log("βœ… Remote server consumer disconnected"); - } - - async sendFrame(frame: SensorFrame): Promise { - if (!this._status.isConnected) { - throw new Error("Cannot send frame: consumer not connected"); - } - - // Add to send queue - this.sendQueue.push(frame); - - // Process queue if not already sending - if (!this.isSending) { - await this.processSendQueue(); - } - } - - async sendFrames(frames: SensorFrame[]): Promise { - if (!this._status.isConnected) { - throw new Error("Cannot send frames: consumer not connected"); - } - - // Add all frames to queue - this.sendQueue.push(...frames); - - // Process queue if not already sending - if (!this.isSending) { - await this.processSendQueue(); - } - } - - async startOutputStream(stream: SensorStream): Promise { - console.log("πŸ“‘ Starting output stream:", stream.id); - - this.activeOutputStreams.set(stream.id, stream); - this.notifyStreamUpdate(stream); - - // Send stream start message to server - await this.sendControlMessage({ - type: "stream_start", - streamId: stream.id, - streamConfig: stream.config - }); - } - - async stopOutputStream(streamId: string): Promise { - console.log("πŸ“‘ Stopping output stream:", streamId); - - const stream = this.activeOutputStreams.get(streamId); - if (stream) { - stream.active = false; - stream.endTime = new Date(); - this.activeOutputStreams.delete(streamId); - this.notifyStreamUpdate(stream); - - // Send stream stop message to server - await this.sendControlMessage({ - type: "stream_stop", - streamId - }); - } - } - - getActiveOutputStreams(): SensorStream[] { - return Array.from(this.activeOutputStreams.values()); - } - - // Event subscription methods - onFrameSent(callback: FrameCallback): UnsubscribeFn { - 
this.frameSentCallbacks.push(callback); - return () => { - const index = this.frameSentCallbacks.indexOf(callback); - if (index >= 0) { - this.frameSentCallbacks.splice(index, 1); - } - }; - } - - onStreamUpdate(callback: StreamUpdateCallback): UnsubscribeFn { - this.streamUpdateCallbacks.push(callback); - return () => { - const index = this.streamUpdateCallbacks.indexOf(callback); - if (index >= 0) { - this.streamUpdateCallbacks.splice(index, 1); - } - }; - } - - onStatusChange(callback: StatusChangeCallback): UnsubscribeFn { - this.statusCallbacks.push(callback); - return () => { - const index = this.statusCallbacks.indexOf(callback); - if (index >= 0) { - this.statusCallbacks.splice(index, 1); - } - }; - } - - // Private connection methods - private async connectWebSocket(): Promise { - return new Promise((resolve, reject) => { - const wsUrl = this.config.url.replace(/^http/, "ws") + "/video-stream"; - - this.websocket = new WebSocket(wsUrl); - this.websocket.binaryType = "arraybuffer"; - - this.websocket.onopen = () => { - console.log("βœ… WebSocket connected to remote server"); - this.reconnectAttempts = 0; - resolve(); - }; - - this.websocket.onclose = (event) => { - console.log("πŸ”Œ WebSocket disconnected:", event.code, event.reason); - this.handleConnectionLoss(); - }; - - this.websocket.onerror = (error) => { - console.error("❌ WebSocket error:", error); - reject(new Error("WebSocket connection failed")); - }; - - this.websocket.onmessage = (event) => { - this.handleServerMessage(event.data); - }; - }); - } - - private async processSendQueue(): Promise { - if (this.isSending || this.sendQueue.length === 0) { - return; - } - - this.isSending = true; - - try { - while (this.sendQueue.length > 0) { - const frame = this.sendQueue.shift()!; - await this.transmitFrame(frame); - this.notifyFrameSent(frame); - } - } catch (error) { - console.error("❌ Error processing send queue:", error); - this._status.error = `Send error: ${error}`; - this.notifyStatusChange(); 
- } finally { - this.isSending = false; - } - } - - private async transmitFrame(frame: SensorFrame): Promise { - if (!this.websocket || this.websocket.readyState !== WebSocket.OPEN) { - throw new Error("WebSocket not available for transmission"); - } - - // Prepare metadata header - const header = JSON.stringify({ - type: "video_frame", - timestamp: frame.timestamp, - frameType: frame.type, - metadata: frame.metadata, - streamId: this.config.streamId - }); - - const headerBuffer = new TextEncoder().encode(header); - const headerLengthBuffer = new Uint32Array([headerBuffer.length]).buffer; // 4-byte length prefix - - let dataBuffer: ArrayBuffer; - if (frame.data instanceof Blob) { - dataBuffer = await frame.data.arrayBuffer(); - } else { - dataBuffer = frame.data as ArrayBuffer; - } - - // Concatenate: [length][header][data] - const packet = new Uint8Array(headerLengthBuffer.byteLength + headerBuffer.byteLength + dataBuffer.byteLength); - packet.set(new Uint8Array(headerLengthBuffer), 0); - packet.set(new Uint8Array(headerBuffer), headerLengthBuffer.byteLength); - packet.set(new Uint8Array(dataBuffer), headerLengthBuffer.byteLength + headerBuffer.byteLength); - - this.websocket.send(packet.buffer); - } - - private async sendControlMessage(message: Record): Promise { - if (this.websocket && this.websocket.readyState === WebSocket.OPEN) { - this.websocket.send(JSON.stringify(message)); - } - } - - private handleServerMessage(data: string | ArrayBuffer): void { - try { - const message = typeof data === "string" ? JSON.parse(data) : data; - console.log("πŸ“¨ Received server message:", message); - - // Handle server responses, status updates, etc. 
- if (message.type === "status") { - this._status.bitrate = message.bitrate; - this._status.frameRate = message.frameRate; - this.notifyStatusChange(); - } - } catch (error) { - console.error("❌ Error parsing server message:", error); - } - } - - private handleConnectionLoss(): void { - this._status.isConnected = false; - this._status.error = "Connection lost"; - this.notifyStatusChange(); - - // Attempt reconnection - const maxRetries = this.config.retryAttempts || 5; - const retryDelay = this.config.retryDelay || 2000; - - if (this.reconnectAttempts < maxRetries) { - this.reconnectAttempts++; - console.log(`πŸ”„ Attempting reconnection ${this.reconnectAttempts}/${maxRetries} in ${retryDelay}ms`); - - this.reconnectTimer = setTimeout(async () => { - try { - await this.connect(); - } catch (error) { - console.error("❌ Reconnection failed:", error); - } - }, retryDelay); - } else { - console.error("❌ Max reconnection attempts reached"); - } - } - - private notifyFrameSent(frame: SensorFrame): void { - this.frameSentCallbacks.forEach((callback) => { - try { - callback(frame); - } catch (error) { - console.error("Error in frame sent callback:", error); - } - }); - } - - private notifyStreamUpdate(stream: SensorStream): void { - this.streamUpdateCallbacks.forEach((callback) => { - try { - callback(stream); - } catch (error) { - console.error("Error in stream update callback:", error); - } - }); - } - - private notifyStatusChange(): void { - this.statusCallbacks.forEach((callback) => { - try { - callback(this._status); - } catch (error) { - console.error("Error in status change callback:", error); - } - }); - } -} \ No newline at end of file diff --git a/src/lib/sensors/consumers/WebRTCConsumer.ts b/src/lib/sensors/consumers/WebRTCConsumer.ts deleted file mode 100644 index f73f8f64250f03d0526c18dc0c2236bdfed611b8..0000000000000000000000000000000000000000 --- a/src/lib/sensors/consumers/WebRTCConsumer.ts +++ /dev/null @@ -1,200 +0,0 @@ -import type { - 
ConsumerSensorDriver, - ConnectionStatus, - SensorFrame, - SensorStream, - FrameCallback, - StreamUpdateCallback, - StatusChangeCallback, - UnsubscribeFn -} from "../types/index.js"; - -export interface WebRTCConsumerConfig { - type: "webrtc-consumer"; - signalingUrl: string; // ws://host:port/signaling - streamId?: string; -} - -export class WebRTCConsumer implements ConsumerSensorDriver { - readonly type = "consumer" as const; - readonly id: string; - readonly name = "WebRTC Consumer"; - - private config: WebRTCConsumerConfig; - private _status: ConnectionStatus = { isConnected: false }; - - private pc: RTCPeerConnection | null = null; - private dc: RTCDataChannel | null = null; - private signaling?: WebSocket; - - private frameSentCallbacks: FrameCallback[] = []; - private streamUpdateCallbacks: StreamUpdateCallback[] = []; - private statusCallbacks: StatusChangeCallback[] = []; - - private activeStreams = new Map<string, SensorStream>(); - private sendQueue: SensorFrame[] = []; - private isSending = false; - - private readonly BUFFER_WATERMARK = 4 * 1024 * 1024; // 4 MB - - constructor(config: WebRTCConsumerConfig) { - this.config = config; - this.id = `webrtc-consumer-${Date.now()}`; - } - - get status(): ConnectionStatus { - return this._status; - } - - async connect(): Promise<void> { - // open signaling - this.signaling = new WebSocket(this.config.signalingUrl); - await new Promise<void>((res, rej) => { - this.signaling!.onopen = () => res(); - this.signaling!.onerror = rej; - }); - - // create pc - this.pc = new RTCPeerConnection({ - iceServers: [{ urls: "stun:stun.l.google.com:19302" }] - }); - - // datachannel - this.dc = this.pc.createDataChannel("video", { - ordered: false, - maxRetransmits: 0 - }); - this.dc.binaryType = "arraybuffer"; - this.dc.onopen = () => { - this._status = { isConnected: true, lastConnected: new Date() }; - this.notifyStatus(); - this.flushQueue(); - }; - this.dc.onclose = () => { - this._status = { isConnected: false, error: "DC closed" }; -
this.notifyStatus(); - }; - - // ICE - Trickle-ICE for faster startup - this.pc.onicecandidate = (ev) => { - if (ev.candidate) { - this.signaling!.send(JSON.stringify({ - type: "ice", - candidate: ev.candidate.toJSON() - })); - } else { - // Send end-of-candidates marker - this.signaling!.send(JSON.stringify({ - type: "ice", - candidate: { end: true } - })); - } - }; - - // signaling messages - this.signaling.onmessage = async (ev) => { - const msg = JSON.parse(ev.data); - if (msg.type === "answer") { - await this.pc!.setRemoteDescription({ type: "answer", sdp: msg.sdp }); - } else if (msg.type === "ice") { - // Handle end-of-candidates marker - if (msg.candidate?.end) { - // ICE gathering complete on remote side - return; - } - await this.pc!.addIceCandidate(msg.candidate); - } - }; - - // create offer immediately (trickle-ICE) - const offer = await this.pc.createOffer(); - await this.pc.setLocalDescription(offer); - this.signaling.send(JSON.stringify({ type: "offer", sdp: offer.sdp })); - } - - async disconnect(): Promise<void> { - if (this.dc) this.dc.close(); - if (this.pc) this.pc.close(); - if (this.signaling) this.signaling.close(); - this._status = { isConnected: false }; - this.notifyStatus(); - } - - // ConsumerSensorDriver impl - async sendFrame(frame: SensorFrame): Promise<void> { - if (!this.dc || this.dc.readyState !== "open") { - throw new Error("DataChannel not open"); - } - this.sendQueue.push(frame); - this.flushQueue(); - } - - async sendFrames(frames: SensorFrame[]): Promise<void> { - this.sendQueue.push(...frames); - this.flushQueue(); - } - - async startOutputStream(stream: SensorStream): Promise<void> { - this.activeStreams.set(stream.id, stream); - this.notifyStream(stream); - } - async stopOutputStream(streamId: string): Promise<void> { - const s = this.activeStreams.get(streamId); - if (s) { - s.active = false; this.activeStreams.delete(streamId); this.notifyStream(s); - } - } - getActiveOutputStreams(): SensorStream[] { return Array.from(this.activeStreams.values());
} - - // no-op for onFrameSent etc. - onFrameSent(cb: FrameCallback): UnsubscribeFn { this.frameSentCallbacks.push(cb); return () => this.pull(this.frameSentCallbacks, cb); } - onStreamUpdate(cb: StreamUpdateCallback): UnsubscribeFn { this.streamUpdateCallbacks.push(cb); return () => this.pull(this.streamUpdateCallbacks, cb); } - onStatusChange(cb: StatusChangeCallback): UnsubscribeFn { this.statusCallbacks.push(cb); return () => this.pull(this.statusCallbacks, cb); } - - // helpers - private flushQueue() { - if (!this.dc || this.dc.readyState !== "open") return; - while (this.sendQueue.length && this.dc.bufferedAmount < this.BUFFER_WATERMARK) { - const frame = this.sendQueue.shift()!; - const packet = this.frameToPacket(frame); - this.dc.send(packet); - this.frameSentCallbacks.forEach((c) => c(frame)); - } - } - - private frameToPacket(frame: SensorFrame): ArrayBuffer { - const headerObj = { - type: "video_frame", - timestamp: frame.timestamp, - frameType: frame.type, - metadata: frame.metadata, - streamId: this.config.streamId - }; - const headerJson = JSON.stringify(headerObj); - const headerBuf = new TextEncoder().encode(headerJson); - const lenBuf = new Uint32Array([headerBuf.length]).buffer; - let dataBuf: ArrayBuffer; - if (frame.data instanceof Blob) { - // this is sync because MediaRecorder gives Blob slices pre-gathered - // but we need async arrayBuffer β€” already spec returns Promise - return frame.data.arrayBuffer().then((buf) => { - const out = new Uint8Array(lenBuf.byteLength + headerBuf.length + buf.byteLength); - out.set(new Uint8Array(lenBuf), 0); - out.set(headerBuf, lenBuf.byteLength); - out.set(new Uint8Array(buf), lenBuf.byteLength + headerBuf.length); - return out.buffer; - }) as unknown as ArrayBuffer; // caller handles async in flushQueue loop by awaiting? 
For now assume ArrayBuffer path (MediaRecorder provides ArrayBuffer in config) - } else { - dataBuf = frame.data as ArrayBuffer; - } - const out = new Uint8Array(lenBuf.byteLength + headerBuf.length + dataBuf.byteLength); - out.set(new Uint8Array(lenBuf), 0); - out.set(headerBuf, lenBuf.byteLength); - out.set(new Uint8Array(dataBuf), lenBuf.byteLength + headerBuf.length); - return out.buffer; - } - - private pull<T>(arr: T[], item: T) { const i = arr.indexOf(item); if (i >= 0) arr.splice(i, 1); } - private notifyStream(s: SensorStream) { this.streamUpdateCallbacks.forEach((c) => c(s)); } - private notifyStatus() { this.statusCallbacks.forEach((c) => c(this._status)); } -} \ No newline at end of file diff --git a/src/lib/sensors/consumers/index.ts b/src/lib/sensors/consumers/index.ts deleted file mode 100644 index 8179ba9a3acc34029d67f0d40a08305b26027224..0000000000000000000000000000000000000000 --- a/src/lib/sensors/consumers/index.ts +++ /dev/null @@ -1,8 +0,0 @@ -/** - * Consumer Sensor Drivers - Main Export - * - * Central export point for all consumer sensor driver implementations - */ - -export { RemoteServerConsumer } from "./RemoteServerConsumer.js"; -export { WebRTCConsumer } from "./WebRTCConsumer.js"; \ No newline at end of file diff --git a/src/lib/sensors/index.ts b/src/lib/sensors/index.ts deleted file mode 100644 index 7398d64346585435754aea5d7415dee19b11484c..0000000000000000000000000000000000000000 --- a/src/lib/sensors/index.ts +++ /dev/null @@ -1,14 +0,0 @@ -/** - * Sensor Drivers - Main Export - * - * Central export point for all sensor driver types and implementations - */ - -// Types -export type * from "./types/index.js"; - -// Producer drivers -export * from "./producers/index.js"; - -// Consumer drivers -export * from "./consumers/index.js"; \ No newline at end of file diff --git a/src/lib/sensors/producers/MediaRecorderProducer.ts b/src/lib/sensors/producers/MediaRecorderProducer.ts deleted file mode 100644 index
43f6a8d05bc9d6091fdb67885434164accdeb6e4..0000000000000000000000000000000000000000 --- a/src/lib/sensors/producers/MediaRecorderProducer.ts +++ /dev/null @@ -1,381 +0,0 @@ -import type { - ProducerSensorDriver, - ConnectionStatus, - SensorFrame, - SensorStream, - VideoStreamConfig, - MediaRecorderProducerConfig, - FrameCallback, - StreamUpdateCallback, - StatusChangeCallback, - UnsubscribeFn -} from "../types/index.js"; - -/** - * MediaRecorder Producer Driver - * - * Captures video/audio from browser MediaDevices using MediaRecorder API. - * Simplified with best practices - uses WebM format and optimized settings. - */ -export class MediaRecorderProducer implements ProducerSensorDriver { - readonly type = "producer" as const; - readonly id: string; - readonly name: string; - - private _status: ConnectionStatus = { isConnected: false }; - private config: MediaRecorderProducerConfig; - - // MediaRecorder state - private mediaStream: MediaStream | null = null; - private mediaRecorder: MediaRecorder | null = null; - private recordingDataChunks: Blob[] = []; - - // Stream management - private activeStreams = new Map<string, SensorStream>(); - - // Event callbacks - private frameCallbacks: FrameCallback[] = []; - private streamUpdateCallbacks: StreamUpdateCallback[] = []; - private statusCallbacks: StatusChangeCallback[] = []; - - constructor(config: MediaRecorderProducerConfig) { - this.config = config; - this.id = `media-recorder-${Date.now()}`; - this.name = "MediaRecorder Producer"; - - console.log("πŸŽ₯ Created MediaRecorder producer driver"); - } - - get status(): ConnectionStatus { - return this._status; - } - - async connect(): Promise<void> { - console.log("πŸŽ₯ Connecting MediaRecorder producer..."); - - try { - // Check if browser supports MediaRecorder - if (!MediaRecorder.isTypeSupported) { - throw new Error("MediaRecorder not supported in this browser"); - } - - // Test basic media access - const testStream = await navigator.mediaDevices.getUserMedia({ - video: true, - audio: true -
}); - - // Close test stream immediately - testStream.getTracks().forEach(track => track.stop()); - - this._status = { - isConnected: true, - lastConnected: new Date(), - error: undefined - }; - this.notifyStatusChange(); - - console.log("βœ… MediaRecorder producer connected successfully"); - } catch (error) { - this._status = { - isConnected: false, - error: `Connection failed: ${error}` - }; - this.notifyStatusChange(); - throw error; - } - } - - async disconnect(): Promise<void> { - console.log("πŸŽ₯ Disconnecting MediaRecorder producer..."); - - // Stop all active streams - for (const streamId of this.activeStreams.keys()) { - await this.stopStream(streamId); - } - - this._status = { isConnected: false }; - this.notifyStatusChange(); - - console.log("βœ… MediaRecorder producer disconnected"); - } - - async startStream(config: VideoStreamConfig): Promise<SensorStream> { - if (!this._status.isConnected) { - throw new Error("Cannot start stream: producer not connected"); - } - - console.log("πŸŽ₯ Starting MediaRecorder stream...", config); - - try { - // Prepare media constraints with best practices - const constraints: MediaStreamConstraints = { - video: { - width: config.width || 1280, - height: config.height || 720, - frameRate: config.frameRate || 30, - facingMode: config.facingMode || "user", - ...(config.deviceId && { deviceId: config.deviceId }) - }, - audio: true, - ...this.config.constraints - }; - - // Get media stream - this.mediaStream = await navigator.mediaDevices.getUserMedia(constraints); - - // Create MediaRecorder with optimized WebM settings - const mimeType = this.getBestWebMType(); - this.mediaRecorder = new MediaRecorder(this.mediaStream, { - mimeType, - videoBitsPerSecond: this.config.videoBitsPerSecond || 2500000, - audioBitsPerSecond: this.config.audioBitsPerSecond || 128000 - }); - - // Create stream object - const stream: SensorStream = { - id: `stream-${Date.now()}`, - name: `MediaRecorder Stream ${config.width}x${config.height}`, - type: "video", -
config, - active: true, - startTime: new Date(), - totalFrames: 0 - }; - - this.activeStreams.set(stream.id, stream); - - // Set up MediaRecorder event handlers - this.setupMediaRecorderEvents(stream); - - // Start recording with optimized interval - const recordingInterval = this.config.recordingInterval || 100; - this.mediaRecorder.start(recordingInterval); - - // Update status with stream info - this._status.frameRate = config.frameRate; - this._status.bitrate = this.config.videoBitsPerSecond; - this.notifyStatusChange(); - - this.notifyStreamUpdate(stream); - - console.log(`βœ… MediaRecorder stream started: ${stream.id}`); - return stream; - - } catch (error) { - console.error("❌ Failed to start MediaRecorder stream:", error); - throw error; - } - } - - async stopStream(streamId: string): Promise<void> { - console.log(`πŸŽ₯ Stopping MediaRecorder stream: ${streamId}`); - - const stream = this.activeStreams.get(streamId); - if (!stream) { - throw new Error(`Stream not found: ${streamId}`); - } - - try { - // Stop MediaRecorder - if (this.mediaRecorder && this.mediaRecorder.state !== "inactive") { - this.mediaRecorder.stop(); - } - - // Stop media stream tracks - if (this.mediaStream) { - this.mediaStream.getTracks().forEach(track => track.stop()); - this.mediaStream = null; - } - - // Update stream - stream.active = false; - stream.endTime = new Date(); - - this.activeStreams.delete(streamId); - this.notifyStreamUpdate(stream); - - console.log(`βœ… MediaRecorder stream stopped: ${streamId}`); - - } catch (error) { - console.error(`❌ Failed to stop stream ${streamId}:`, error); - throw error; - } - } - - async pauseStream(streamId: string): Promise<void> { - console.log(`⏸️ Pausing MediaRecorder stream: ${streamId}`); - - const stream = this.activeStreams.get(streamId); - if (!stream) { - throw new Error(`Stream not found: ${streamId}`); - } - - if (this.mediaRecorder && this.mediaRecorder.state === "recording") { - this.mediaRecorder.pause(); -
this.notifyStreamUpdate(stream); - } - } - - async resumeStream(streamId: string): Promise<void> { - console.log(`▢️ Resuming MediaRecorder stream: ${streamId}`); - - const stream = this.activeStreams.get(streamId); - if (!stream) { - throw new Error(`Stream not found: ${streamId}`); - } - - if (this.mediaRecorder && this.mediaRecorder.state === "paused") { - this.mediaRecorder.resume(); - this.notifyStreamUpdate(stream); - } - } - - getActiveStreams(): SensorStream[] { - return Array.from(this.activeStreams.values()); - } - - // Event subscription methods - onFrame(callback: FrameCallback): UnsubscribeFn { - this.frameCallbacks.push(callback); - return () => { - const index = this.frameCallbacks.indexOf(callback); - if (index >= 0) { - this.frameCallbacks.splice(index, 1); - } - }; - } - - onStreamUpdate(callback: StreamUpdateCallback): UnsubscribeFn { - this.streamUpdateCallbacks.push(callback); - return () => { - const index = this.streamUpdateCallbacks.indexOf(callback); - if (index >= 0) { - this.streamUpdateCallbacks.splice(index, 1); - } - }; - } - - onStatusChange(callback: StatusChangeCallback): UnsubscribeFn { - this.statusCallbacks.push(callback); - return () => { - const index = this.statusCallbacks.indexOf(callback); - if (index >= 0) { - this.statusCallbacks.splice(index, 1); - } - }; - } - - // Private helper methods - private setupMediaRecorderEvents(stream: SensorStream): void { - if (!this.mediaRecorder) return; - - this.mediaRecorder.ondataavailable = (event) => { - if (event.data && event.data.size > 0) { - this.recordingDataChunks.push(event.data); - - // Create frame from chunk - const frame: SensorFrame = { - timestamp: Date.now(), - type: "video", - data: event.data, - metadata: { - width: stream.config.width, - height: stream.config.height, - frameRate: stream.config.frameRate, - codec: "webm", - bitrate: this.config.videoBitsPerSecond - } - }; - - // Update stream stats - stream.totalFrames = (stream.totalFrames || 0) + 1; - - // Notify frame
callbacks - this.notifyFrame(frame); - } - }; - - this.mediaRecorder.onstop = () => { - console.log("πŸŽ₯ MediaRecorder stopped"); - - // Create final frame with complete recording - if (this.recordingDataChunks.length > 0) { - const finalBlob = new Blob(this.recordingDataChunks, { - type: "video/webm" - }); - - const finalFrame: SensorFrame = { - timestamp: Date.now(), - type: "video", - data: finalBlob, - metadata: { - width: stream.config.width, - height: stream.config.height, - codec: "webm", - isComplete: true, - totalSize: finalBlob.size - } - }; - - this.notifyFrame(finalFrame); - } - - // Clear chunks - this.recordingDataChunks = []; - }; - - this.mediaRecorder.onerror = (event) => { - console.error("❌ MediaRecorder error:", event); - this._status.error = "Recording error occurred"; - this.notifyStatusChange(); - }; - } - - private getBestWebMType(): string { - // Best WebM types in order of preference - const types = [ - "video/webm;codecs=vp9,opus", - "video/webm;codecs=vp8,opus", - "video/webm" - ]; - - for (const type of types) { - if (MediaRecorder.isTypeSupported(type)) { - return type; - } - } - - return "video/webm"; // Fallback - } - - private notifyFrame(frame: SensorFrame): void { - this.frameCallbacks.forEach((callback) => { - try { - callback(frame); - } catch (error) { - console.error("Error in frame callback:", error); - } - }); - } - - private notifyStreamUpdate(stream: SensorStream): void { - this.streamUpdateCallbacks.forEach((callback) => { - try { - callback(stream); - } catch (error) { - console.error("Error in stream update callback:", error); - } - }); - } - - private notifyStatusChange(): void { - this.statusCallbacks.forEach((callback) => { - try { - callback(this._status); - } catch (error) { - console.error("Error in status change callback:", error); - } - }); - } -} \ No newline at end of file diff --git a/src/lib/sensors/producers/MediaRecorderProducer.ts.recommendation.md 
b/src/lib/sensors/producers/MediaRecorderProducer.ts.recommendation.md deleted file mode 100644 index 30e1bbdc50355292699f5e29e34c0f152c645015..0000000000000000000000000000000000000000 --- a/src/lib/sensors/producers/MediaRecorderProducer.ts.recommendation.md +++ /dev/null @@ -1,663 +0,0 @@ -# MediaRecorderProducer.ts Performance Optimization Recommendations - -## Current Analysis -The MediaRecorderProducer manages video/audio capture using the MediaRecorder API. While functional, the current implementation has performance bottlenecks in memory management, frame processing, and resource cleanup that need optimization for high-performance video streaming. - -## Critical Performance Issues - -### 1. **Memory Leak in Blob Accumulation** -- **Problem**: `recordingDataChunks` array grows unbounded during recording -- **Impact**: Memory usage increases continuously, causing browser crashes -- **Solution**: Implement chunk processing and circular buffer management - -### 2. **Inefficient Frame Processing** -- **Problem**: No frame skipping or quality adaptation based on performance -- **Impact**: Performance degradation under high load or low-end devices -- **Solution**: Implement adaptive frame rate and quality control - -### 3. **Blocking Stream Operations** -- **Problem**: Stream start/stop operations can block the main thread -- **Impact**: UI freezes during media operations -- **Solution**: Use async operations with proper task scheduling - -### 4. **No Connection Pooling** -- **Problem**: New MediaStream created for each connection attempt -- **Impact**: Unnecessary resource allocation and slower startup -- **Solution**: Implement stream reuse and connection pooling - -## Recommended Optimizations - -### 1. 
**Implement Memory-Efficient Chunk Management** -```typescript -interface OptimizedMediaRecorderProducer extends ProducerSensorDriver { - // Performance configuration - private readonly maxChunkBufferSize: number; - private readonly chunkProcessingInterval: number; - private readonly memoryThresholdMB: number; - - // Optimized state management - private chunkBuffer: CircularBuffer<Blob>; - private frameProcessor: FrameProcessor; - private memoryMonitor: MemoryMonitor; - private performanceMetrics: ProducerMetrics; -} - -class CircularBuffer<T> { - private buffer: T[]; - private head = 0; - private tail = 0; - private size = 0; - - constructor(private capacity: number) { - this.buffer = new Array(capacity); - } - - push(item: T): T | null { - const evicted = this.size === this.capacity ? this.buffer[this.tail] : null; - - this.buffer[this.head] = item; - this.head = (this.head + 1) % this.capacity; - - if (this.size === this.capacity) { - this.tail = (this.tail + 1) % this.capacity; - } else { - this.size++; - } - - return evicted; - } - - toArray(): T[] { - const result: T[] = []; - for (let i = 0; i < this.size; i++) { - const index = (this.tail + i) % this.capacity; - result.push(this.buffer[index]); - } - return result; - } - - clear(): void { - this.head = 0; - this.tail = 0; - this.size = 0; - } -} - -export class OptimizedMediaRecorderProducer implements ProducerSensorDriver { - readonly type = "producer" as const; - readonly id: string; - readonly name: string; - - private _status: ConnectionStatus = { isConnected: false }; - private config: MediaRecorderProducerConfig; - - // Optimized state management - private mediaStream: MediaStream | null = null; - private mediaRecorder: MediaRecorder | null = null; - - // Memory-efficient chunk management - private chunkBuffer: CircularBuffer<Blob>; - private chunkProcessingInterval: number | null = null; - private lastChunkProcessTime = 0; - - // Performance monitoring - private frameProcessor: FrameProcessor; - private
memoryMonitor: MemoryMonitor; - private performanceMetrics: ProducerMetrics; - - // Configuration constants - private readonly maxChunkBufferSize = 10; // Maximum chunks in buffer - private readonly chunkProcessingIntervalMs = 50; // Process chunks every 50ms - private readonly memoryThresholdMB = 100; // Alert at 100MB memory usage - - constructor(config: MediaRecorderProducerConfig) { - this.config = config; - this.id = `optimized-media-recorder-${Date.now()}`; - this.name = "Optimized MediaRecorder Producer"; - - // Initialize optimized components - this.chunkBuffer = new CircularBuffer<Blob>(this.maxChunkBufferSize); - this.frameProcessor = new FrameProcessor(); - this.memoryMonitor = new MemoryMonitor(this.memoryThresholdMB); - this.performanceMetrics = new ProducerMetrics(); - - this.startPerformanceMonitoring(); - } -} -``` - -### 2. **Implement Adaptive Frame Processing** -```typescript -interface FrameProcessingConfig { - targetFPS: number; - qualityThreshold: number; - adaptiveQuality: boolean; - maxProcessingTime: number; // ms -} - -class FrameProcessor { - private config: FrameProcessingConfig; - private frameDropCount = 0; - private lastFrameTime = 0; - private processingTimes: number[] = []; - private currentQuality = 1.0; - - // Frame rate control - private targetFrameInterval: number; - private lastFrameProcessed = 0; - - constructor(config: FrameProcessingConfig) { - this.config = config; - this.targetFrameInterval = 1000 / config.targetFPS; - } - - shouldProcessFrame(timestamp: number): boolean { - // Frame rate limiting - if (timestamp - this.lastFrameProcessed < this.targetFrameInterval) { - return false; - } - - // Performance-based frame skipping - const avgProcessingTime = this.getAverageProcessingTime(); - if (avgProcessingTime > this.config.maxProcessingTime) { - this.frameDropCount++; - - // Adaptive quality reduction - if (this.config.adaptiveQuality && this.frameDropCount > 5) { - this.reduceQuality(); - this.frameDropCount = 0; - } - -
return false; - } - - this.lastFrameProcessed = timestamp; - return true; - } - - processFrame(blob: Blob, timestamp: number): Promise<SensorFrame> { - const startTime = performance.now(); - - return new Promise<SensorFrame>((resolve, reject) => { - // Use transferable objects for better performance - const reader = new FileReader(); - - reader.onload = () => { - try { - const arrayBuffer = reader.result as ArrayBuffer; - - // Create optimized frame object - const frame: SensorFrame = { - id: `frame-${timestamp}`, - type: "video", - data: arrayBuffer, - timestamp, - size: blob.size, - metadata: { - quality: this.currentQuality, - processingTime: performance.now() - startTime, - frameDropCount: this.frameDropCount - } - }; - - // Track processing time - this.recordProcessingTime(performance.now() - startTime); - - resolve(frame); - } catch (error) { - reject(error); - } - }; - - reader.onerror = () => reject(reader.error); - reader.readAsArrayBuffer(blob); - }); - } - - private recordProcessingTime(time: number): void { - this.processingTimes.push(time); - - // Keep only recent measurements - if (this.processingTimes.length > 30) { - this.processingTimes.shift(); - } - } - - private getAverageProcessingTime(): number { - if (this.processingTimes.length === 0) return 0; - - const sum = this.processingTimes.reduce((a, b) => a + b, 0); - return sum / this.processingTimes.length; - } - - private reduceQuality(): void { - this.currentQuality = Math.max(0.3, this.currentQuality * 0.8); - console.warn(`πŸŽ₯ Reducing video quality to ${(this.currentQuality * 100).toFixed(0)}%`); - } - - getCurrentQuality(): number { - return this.currentQuality; - } - - resetQuality(): void { - this.currentQuality = 1.0; - this.frameDropCount = 0; - } -} -``` - -### 3.
**Add Performance Memory Monitoring** -```typescript -interface MemoryStats { - usedJSHeapSize: number; - totalJSHeapSize: number; - jsHeapSizeLimit: number; - chunkBufferSize: number; - activeStreams: number; -} - -class MemoryMonitor { - private memoryThresholdMB: number; - private checkInterval: number | null = null; - private lastWarningTime = 0; - private readonly warningCooldown = 30000; // 30 seconds - - constructor(thresholdMB: number) { - this.memoryThresholdMB = thresholdMB * 1024 * 1024; // Convert to bytes - this.startMonitoring(); - } - - startMonitoring(): void { - this.checkInterval = setInterval(() => { - this.checkMemoryUsage(); - }, 5000) as any; // Check every 5 seconds - } - - stopMonitoring(): void { - if (this.checkInterval) { - clearInterval(this.checkInterval); - this.checkInterval = null; - } - } - - private checkMemoryUsage(): void { - const memoryInfo = this.getMemoryInfo(); - - if (memoryInfo.usedJSHeapSize > this.memoryThresholdMB) { - const now = Date.now(); - - if (now - this.lastWarningTime > this.warningCooldown) { - console.warn('🚨 High memory usage detected:', { - used: `${(memoryInfo.usedJSHeapSize / 1024 / 1024).toFixed(1)}MB`, - total: `${(memoryInfo.totalJSHeapSize / 1024 / 1024).toFixed(1)}MB`, - limit: `${(memoryInfo.jsHeapSizeLimit / 1024 / 1024).toFixed(1)}MB` - }); - - this.lastWarningTime = now; - - // Trigger garbage collection if available - this.requestGarbageCollection(); - } - } - } - - getMemoryInfo(): MemoryStats { - const performance = window.performance as any; - const memoryInfo = performance.memory || { - usedJSHeapSize: 0, - totalJSHeapSize: 0, - jsHeapSizeLimit: 0 - }; - - return { - usedJSHeapSize: memoryInfo.usedJSHeapSize, - totalJSHeapSize: memoryInfo.totalJSHeapSize, - jsHeapSizeLimit: memoryInfo.jsHeapSizeLimit, - chunkBufferSize: 0, // Will be updated by producer - activeStreams: 0 // Will be updated by producer - }; - } - - private requestGarbageCollection(): void { - // Request garbage collection if 
available (Chrome DevTools) - if ('gc' in window) { - (window as any).gc(); - } - - // Also manually trigger some cleanup - this.manualCleanup(); - } - - private manualCleanup(): void { - // Force cleanup of any large objects - if (typeof window !== 'undefined') { - // Clear any cached data - setTimeout(() => { - // This gives time for the GC to run - }, 100); - } - } -} -``` - -### 4. **Optimize Stream Management with Connection Pooling** -```typescript -interface StreamPool { - availableStreams: Map<string, MediaStream>; - activeConnections: Map<string, { stream: MediaStream; lastUsed: number }>; - maxPoolSize: number; - streamTTL: number; // Time to live in ms -} - -class OptimizedStreamManager { - private streamPool: StreamPool; - private cleanupInterval: number | null = null; - - constructor() { - this.streamPool = { - availableStreams: new Map(), - activeConnections: new Map(), - maxPoolSize: 5, - streamTTL: 300000 // 5 minutes - }; - - this.startPoolCleanup(); - } - - async getOptimizedStream(config: VideoStreamConfig): Promise<MediaStream> { - const configKey = this.getConfigKey(config); - - // Try to reuse existing stream - const cachedStream = this.streamPool.availableStreams.get(configKey); - if (cachedStream && this.isStreamValid(cachedStream)) { - this.streamPool.availableStreams.delete(configKey); - this.streamPool.activeConnections.set(configKey, { - stream: cachedStream, - lastUsed: Date.now() - }); - - console.log(`♻️ Reusing cached media stream: ${configKey}`); - return cachedStream; - } - - // Create new stream with optimized constraints - const optimizedConstraints = this.optimizeConstraints(config); - const stream = await navigator.mediaDevices.getUserMedia(optimizedConstraints); - - this.streamPool.activeConnections.set(configKey, { - stream, - lastUsed: Date.now() - }); - - console.log(`πŸ†• Created new media stream: ${configKey}`); - return stream; - } - - releaseStream(stream: MediaStream, config: VideoStreamConfig): void { - const configKey = this.getConfigKey(config); - const connection =
this.streamPool.activeConnections.get(configKey); - - if (connection && connection.stream === stream) { - this.streamPool.activeConnections.delete(configKey); - - // Add to available pool if under limit - if (this.streamPool.availableStreams.size < this.streamPool.maxPoolSize) { - this.streamPool.availableStreams.set(configKey, stream); - console.log(`πŸ“¦ Cached media stream: ${configKey}`); - } else { - this.stopStream(stream); - console.log(`πŸ—‘οΈ Disposed media stream: ${configKey}`); - } - } - } - - private optimizeConstraints(config: VideoStreamConfig): MediaStreamConstraints { - // Optimize constraints based on device capabilities - const isHighEnd = this.isHighEndDevice(); - - return { - video: { - width: { ideal: config.width, max: isHighEnd ? 1920 : 1280 }, - height: { ideal: config.height, max: isHighEnd ? 1080 : 720 }, - frameRate: { ideal: config.frameRate, max: isHighEnd ? 60 : 30 }, - facingMode: config.facingMode || "user", - ...(config.deviceId && { deviceId: config.deviceId }) - }, - audio: { - echoCancellation: true, - noiseSuppression: true, - autoGainControl: true, - sampleRate: { ideal: 48000 } - } - }; - } - - private isHighEndDevice(): boolean { - // Simple heuristic for device capability detection - const memory = (navigator as any).deviceMemory || 4; - const cores = navigator.hardwareConcurrency || 4; - - return memory >= 8 && cores >= 8; - } - - private getConfigKey(config: VideoStreamConfig): string { - return `${config.width}x${config.height}@${config.frameRate}fps`; - } - - private isStreamValid(stream: MediaStream): boolean { - return stream.active && stream.getTracks().every(track => track.readyState === 'live'); - } - - private stopStream(stream: MediaStream): void { - stream.getTracks().forEach(track => { - track.stop(); - }); - } - - private startPoolCleanup(): void { - this.cleanupInterval = setInterval(() => { - this.cleanupExpiredStreams(); - }, 60000) as any; // Cleanup every minute - } - - private cleanupExpiredStreams(): 
void { - const now = Date.now(); - - for (const [key, connection] of this.streamPool.activeConnections) { - if (now - connection.lastUsed > this.streamPool.streamTTL) { - this.streamPool.activeConnections.delete(key); - this.stopStream(connection.stream); - console.log(`🧹 Cleaned up expired stream: ${key}`); - } - } - - // Also cleanup available streams - for (const [key, stream] of this.streamPool.availableStreams) { - if (!this.isStreamValid(stream)) { - this.streamPool.availableStreams.delete(key); - this.stopStream(stream); - console.log(`🧹 Cleaned up invalid stream: ${key}`); - } - } - } - - destroy(): void { - if (this.cleanupInterval) { - clearInterval(this.cleanupInterval); - } - - // Clean up all streams - for (const connection of this.streamPool.activeConnections.values()) { - this.stopStream(connection.stream); - } - - for (const stream of this.streamPool.availableStreams.values()) { - this.stopStream(stream); - } - - this.streamPool.activeConnections.clear(); - this.streamPool.availableStreams.clear(); - } -} -``` - -### 5. 
**Implement Performance Metrics Tracking** -```typescript -interface ProducerMetrics { - totalFramesProcessed: number; - totalFramesDropped: number; - averageProcessingTime: number; - memoryUsageTrend: number[]; - qualityAdaptations: number; - connectionReuses: number; - streamCreations: number; - lastResetTime: number; -} - -class ProducerMetricsTracker { - private metrics: ProducerMetrics; - private reportingInterval: number | null = null; - - constructor() { - this.metrics = { - totalFramesProcessed: 0, - totalFramesDropped: 0, - averageProcessingTime: 0, - memoryUsageTrend: [], - qualityAdaptations: 0, - connectionReuses: 0, - streamCreations: 0, - lastResetTime: Date.now() - }; - - this.startReporting(); - } - - recordFrameProcessed(processingTime: number): void { - this.metrics.totalFramesProcessed++; - - // Update rolling average - const weight = 0.1; - this.metrics.averageProcessingTime = - this.metrics.averageProcessingTime * (1 - weight) + - processingTime * weight; - } - - recordFrameDropped(): void { - this.metrics.totalFramesDropped++; - } - - recordQualityAdaptation(): void { - this.metrics.qualityAdaptations++; - } - - recordConnectionReuse(): void { - this.metrics.connectionReuses++; - } - - recordStreamCreation(): void { - this.metrics.streamCreations++; - } - - updateMemoryUsage(memoryMB: number): void { - this.metrics.memoryUsageTrend.push(memoryMB); - - // Keep only recent memory samples - if (this.metrics.memoryUsageTrend.length > 60) { - this.metrics.memoryUsageTrend.shift(); - } - } - - getMetrics(): ProducerMetrics { - return { ...this.metrics }; - } - - getPerformanceScore(): number { - const frameRate = this.metrics.totalFramesProcessed / - ((Date.now() - this.metrics.lastResetTime) / 1000); - const dropRate = this.metrics.totalFramesDropped / - Math.max(1, this.metrics.totalFramesProcessed); - const reuseRate = this.metrics.connectionReuses / - Math.max(1, this.metrics.streamCreations); - - // Calculate weighted score (0-100) - const 
frameRateScore = Math.min(100, frameRate * 3.33); // 30fps = 100 - const dropRateScore = Math.max(0, 100 - dropRate * 500); // 20% drops = 0 - const reuseScore = reuseRate * 100; - - return (frameRateScore * 0.5 + dropRateScore * 0.3 + reuseScore * 0.2); - } - - private startReporting(): void { - this.reportingInterval = setInterval(() => { - const score = this.getPerformanceScore(); - console.log(`πŸ“Š MediaRecorder Performance Score: ${score.toFixed(1)}/100`, { - framesProcessed: this.metrics.totalFramesProcessed, - framesDropped: this.metrics.totalFramesDropped, - avgProcessingTime: `${this.metrics.averageProcessingTime.toFixed(2)}ms`, - qualityAdaptations: this.metrics.qualityAdaptations, - connectionReuses: this.metrics.connectionReuses - }); - }, 30000) as any; // Report every 30 seconds - } - - reset(): void { - this.metrics = { - totalFramesProcessed: 0, - totalFramesDropped: 0, - averageProcessingTime: 0, - memoryUsageTrend: [], - qualityAdaptations: 0, - connectionReuses: 0, - streamCreations: 0, - lastResetTime: Date.now() - }; - } - - destroy(): void { - if (this.reportingInterval) { - clearInterval(this.reportingInterval); - } - } -} -``` - -## Performance Metrics Impact - -| Optimization | Memory Usage | Frame Rate | CPU Usage | Connection Speed | -|--------------|--------------|------------|-----------|------------------| -| Chunk Management | -70% | +15% | -20% | +10% | -| Frame Processing | -30% | +40% | -35% | +25% | -| Memory Monitoring | -50% | +20% | -15% | N/A | -| Stream Pooling | -40% | +30% | -25% | +60% | -| Metrics Tracking | +5% | +10% | +5% | +15% | - -## Implementation Priority - -1. **Critical**: Implement chunk buffer management to prevent memory leaks -2. **High**: Add adaptive frame processing and quality control -3. **High**: Implement stream pooling and connection reuse -4. **Medium**: Add memory monitoring and cleanup -5. **Low**: Implement comprehensive performance metrics - -## Testing Recommendations - -1. 
Test memory usage over extended recording periods (>1 hour) -2. Monitor frame processing under high CPU load conditions -3. Test stream reuse efficiency with multiple connections -4. Verify adaptive quality works on low-end devices -5. Test cleanup and garbage collection effectiveness - -## Mobile-Specific Optimizations - -- Reduce maximum video resolution on mobile devices -- Implement more aggressive frame dropping on touch devices -- Use hardware acceleration when available -- Implement battery-aware quality adjustments - -## Additional Notes - -- Consider using OffscreenCanvas for frame processing in Web Workers -- Implement WebCodecs API support for better performance on supported browsers -- Add support for adaptive bitrate streaming -- Consider implementing custom video codecs for specific use cases \ No newline at end of file diff --git a/src/lib/sensors/producers/index.ts b/src/lib/sensors/producers/index.ts deleted file mode 100644 index 5ce56ab282edc16237732943c1d7a6367a228b45..0000000000000000000000000000000000000000 --- a/src/lib/sensors/producers/index.ts +++ /dev/null @@ -1,7 +0,0 @@ -/** - * Producer Sensor Drivers - Main Export - * - * Central export point for all producer sensor driver implementations - */ - -export { MediaRecorderProducer } from "./MediaRecorderProducer.js"; \ No newline at end of file diff --git a/src/lib/sensors/types/base.ts b/src/lib/sensors/types/base.ts deleted file mode 100644 index 19c0356e2ef0b2dbbb02d4cd0c3f26b7f19eb025..0000000000000000000000000000000000000000 --- a/src/lib/sensors/types/base.ts +++ /dev/null @@ -1,21 +0,0 @@ -/** - * Base Sensor Driver Interfaces - * - * Core contracts that all sensor driver implementations must follow - */ - -import type { ConnectionStatus, StatusChangeCallback, UnsubscribeFn } from "./core.js"; - -/** - * Base sensor driver interface - common functionality for all sensor drivers - */ -export interface BaseSensorDriver { - readonly id: string; - readonly type: "producer" | 
"consumer"; - readonly name: string; - readonly status: ConnectionStatus; - - connect(): Promise<void>; - disconnect(): Promise<void>; - onStatusChange(callback: StatusChangeCallback): UnsubscribeFn; -} \ No newline at end of file diff --git a/src/lib/sensors/types/consumer.ts b/src/lib/sensors/types/consumer.ts deleted file mode 100644 index b20f06e0875374a4c4766afa1973b4f50f2f7ed2..0000000000000000000000000000000000000000 --- a/src/lib/sensors/types/consumer.ts +++ /dev/null @@ -1,58 +0,0 @@ -/** - * Consumer Sensor Driver Interfaces & Configurations - * - * Consumer drivers send sensor data to various destinations - * Examples: Remote server, local storage - */ - -import type { BaseSensorDriver } from "./base.js"; -import type { - SensorFrame, - SensorStream, - FrameCallback, - StreamUpdateCallback, - UnsubscribeFn -} from "./core.js"; - -/** - * Consumer Driver - Sends sensor data to various destinations - */ -export interface ConsumerSensorDriver extends BaseSensorDriver { - readonly type: "consumer"; - - // Data transmission - sendFrame(frame: SensorFrame): Promise<void>; - sendFrames(frames: SensorFrame[]): Promise<void>; - - // Stream management - startOutputStream(stream: SensorStream): Promise<void>; - stopOutputStream(streamId: string): Promise<void>; - getActiveOutputStreams(): SensorStream[]; - - // Event callbacks - onFrameSent(callback: FrameCallback): UnsubscribeFn; - onStreamUpdate(callback: StreamUpdateCallback): UnsubscribeFn; -} - -/** - * Consumer driver configuration types - simplified with best practices - */ -export interface RemoteServerConsumerConfig { - type: "remote-server"; - url: string; - apiKey?: string; - streamId?: string; - retryAttempts?: number; - retryDelay?: number; -} - -export interface LocalStorageConsumerConfig { - type: "local-storage"; - directory?: string; - filename?: string; - autoUpload?: boolean; -} - -export type ConsumerSensorDriverConfig = - | RemoteServerConsumerConfig - | LocalStorageConsumerConfig; \ No newline at end of file diff --git 
a/src/lib/sensors/types/core.ts b/src/lib/sensors/types/core.ts deleted file mode 100644 index ac703e58bdef361fa9e3b84d885b479a7cee2215..0000000000000000000000000000000000000000 --- a/src/lib/sensors/types/core.ts +++ /dev/null @@ -1,65 +0,0 @@ -/** - * Core Sensor Types - * - * Fundamental types and interfaces used across all sensor driver implementations - */ - -export interface ConnectionStatus { - isConnected: boolean; - lastConnected?: Date; - error?: string; - bitrate?: number; // For video streams - frameRate?: number; // For video streams -} - -/** - * Sensor data frame for video streams - */ -export interface SensorFrame { - timestamp: number; - type: "video" | "audio" | "data"; - data: ArrayBuffer | Blob; - metadata?: { - width?: number; - height?: number; - frameRate?: number; - codec?: string; - bitrate?: number; - format?: string; - [key: string]: unknown; - }; -} - -/** - * Video stream configuration - */ -export interface VideoStreamConfig { - width?: number; - height?: number; - frameRate?: number; - bitrate?: number; - codec?: string; - facingMode?: "user" | "environment"; - deviceId?: string; -} - -/** - * Sensor stream for continuous data flow - */ -export interface SensorStream { - id: string; - name: string; - type: "video" | "audio" | "data"; - config: VideoStreamConfig; - active: boolean; - startTime?: Date; - endTime?: Date; - totalFrames?: number; -} - -// Callback types -export type StreamUpdateCallback = (stream: SensorStream) => void; -export type FrameCallback = (frame: SensorFrame) => void; -export type StatusChangeCallback = (status: ConnectionStatus) => void; -export type ErrorCallback = (error: string) => void; -export type UnsubscribeFn = () => void; \ No newline at end of file diff --git a/src/lib/sensors/types/index.ts b/src/lib/sensors/types/index.ts deleted file mode 100644 index 7989d6533b597806494b28a680326e047e30d270..0000000000000000000000000000000000000000 --- a/src/lib/sensors/types/index.ts +++ /dev/null @@ -1,37 +0,0 @@ 
-/** - * Sensor Driver Types - Main Export - * - * Central export point for all sensor driver types and interfaces - */ - -// Core types -export type { - ConnectionStatus, - SensorFrame, - VideoStreamConfig, - SensorStream, - StreamUpdateCallback, - FrameCallback, - StatusChangeCallback, - ErrorCallback, - UnsubscribeFn -} from "./core.js"; - -// Base interfaces -export type { BaseSensorDriver } from "./base.js"; - -// Producer drivers -export type { - ProducerSensorDriver, - MediaRecorderProducerConfig, - NetworkStreamProducerConfig, - ProducerSensorDriverConfig -} from "./producer.js"; - -// Consumer drivers -export type { - ConsumerSensorDriver, - RemoteServerConsumerConfig, - LocalStorageConsumerConfig, - ConsumerSensorDriverConfig -} from "./consumer.js"; \ No newline at end of file diff --git a/src/lib/sensors/types/producer.ts b/src/lib/sensors/types/producer.ts deleted file mode 100644 index 6051109b0097bef8dfc58031b5a55dc3fae67bc7..0000000000000000000000000000000000000000 --- a/src/lib/sensors/types/producer.ts +++ /dev/null @@ -1,58 +0,0 @@ -/** - * Producer Sensor Driver Interfaces & Configurations - * - * Producer drivers capture sensor data (video, audio, etc.) 
- * Examples: MediaRecorder, network stream - */ - -import type { BaseSensorDriver } from "./base.js"; -import type { - SensorStream, - VideoStreamConfig, - FrameCallback, - StreamUpdateCallback, - UnsubscribeFn -} from "./core.js"; - -/** - * Producer Driver - Captures sensor data from various sources - */ -export interface ProducerSensorDriver extends BaseSensorDriver { - readonly type: "producer"; - - // Stream management - startStream(config: VideoStreamConfig): Promise<SensorStream>; - stopStream(streamId: string): Promise<void>; - pauseStream(streamId: string): Promise<void>; - resumeStream(streamId: string): Promise<void>; - getActiveStreams(): SensorStream[]; - - // Event callbacks - onFrame(callback: FrameCallback): UnsubscribeFn; - onStreamUpdate(callback: StreamUpdateCallback): UnsubscribeFn; -} - -/** - * Producer driver configuration types - simplified with best practices - */ -export interface MediaRecorderProducerConfig { - type: "media-recorder"; - constraints?: MediaStreamConstraints; - videoBitsPerSecond?: number; - audioBitsPerSecond?: number; - recordingInterval?: number; // ms between frame captures -} - -export interface NetworkStreamProducerConfig { - type: "network-stream"; - url: string; - credentials?: { - username?: string; - password?: string; - token?: string; - }; -} - -export type ProducerSensorDriverConfig = - | MediaRecorderProducerConfig - | NetworkStreamProducerConfig; \ No newline at end of file diff --git a/src/lib/utils/config.ts b/src/lib/utils/config.ts deleted file mode 100644 index 51452f3b88a8979594c4ba240c49082e6da30b25..0000000000000000000000000000000000000000 --- a/src/lib/utils/config.ts +++ /dev/null @@ -1,106 +0,0 @@ -/** - * Configuration utilities for environment-specific URLs - */ - -// Check if we're running in browser -const isBrowser = typeof window !== "undefined"; - -/** - * Get the SPACE_HOST from various sources - */ -function getSpaceHost(): string | undefined { - if (!isBrowser) return undefined; - - // Check window.SPACE_HOST (injected 
by container) - if ((window as unknown as { SPACE_HOST?: string }).SPACE_HOST) { - return (window as unknown as { SPACE_HOST: string }).SPACE_HOST; - } - - // Check if current hostname looks like HF Spaces - const hostname = window.location.hostname; - if (hostname.includes("hf.space") || hostname.includes("huggingface.co")) { - return hostname; - } - - return undefined; -} - -/** - * Get the base URL for API requests - */ -export function getApiBaseUrl(): string { - if (!isBrowser) return "http://localhost:7860"; - - // Check for Hugging Face Spaces - const spaceHost = getSpaceHost(); - if (spaceHost) { - return `https://${spaceHost}`; - } - - // In browser, check current location - const { protocol, hostname, port } = window.location; - - // If we're on the same host and port, use same origin (both frontend and backend on same server) - if (hostname === "localhost" || hostname === "127.0.0.1") { - // In development, frontend might be on 5173 and backend on 7860 - if (port === "5173" || port === "5174") { - return "http://localhost:8000"; - } - // If frontend is served from backend (port 7860), use same origin - return `${protocol}//${hostname}:${port}`; - } - - // For production, use same origin (both served from same FastAPI server) - return `${protocol}//${hostname}${port ? 
`:${port}` : ""}`; -} - -/** - * Get the WebSocket URL for real-time connections - */ -export function getWebSocketBaseUrl(): string { - if (!isBrowser) return "ws://localhost:7860"; - - // Check for Hugging Face Spaces - const spaceHost = getSpaceHost(); - if (spaceHost) { - return `wss://${spaceHost}`; - } - - const { protocol, hostname, port } = window.location; - - // If we're on localhost - if (hostname === "localhost" || hostname === "127.0.0.1") { - // In development, frontend might be on 5173 and backend on 7860 - if (port === "5173" || port === "5174") { - return "ws://localhost:8000"; - } - // If frontend is served from backend (port 7860), use same origin - const wsProtocol = protocol === "https:" ? "wss:" : "ws:"; - return `${wsProtocol}//${hostname}:${port}`; - } - - // For HTTPS sites, use WSS; for HTTP sites, use WS - const wsProtocol = protocol === "https:" ? "wss:" : "ws:"; - - // For production, use same origin (both served from same FastAPI server) - return `${wsProtocol}//${hostname}${port ? `:${port}` : ""}`; -} - -/** - * Get environment info for debugging - */ -export function getEnvironmentInfo() { - if (!isBrowser) return { env: "server", hostname: "unknown" }; - - const { protocol, hostname, port } = window.location; - const spaceHost = getSpaceHost(); - - return { - env: spaceHost ? "huggingface-spaces" : hostname === "localhost" ? 
"local" : "production", - hostname, - port, - protocol, - spaceHost, - apiBaseUrl: getApiBaseUrl(), - }; -} diff --git a/src/lib/utils/icon.ts b/src/lib/utils/icon.ts index 3cd1c86f9ceb275f86326d964b50bd1154ecaf95..c17b134847f7094ada631544ca62640c18cc1d07 100644 --- a/src/lib/utils/icon.ts +++ b/src/lib/utils/icon.ts @@ -7,7 +7,7 @@ interface Icon { // const plusIcon = "data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIHdpZHRoPSIyNCIgaGVpZ2h0PSIyNCIgdmlld0JveD0iMCAwIDI0IDI0Ij48cGF0aCBmaWxsPSIjMDAwIiBkPSJNMTkgMTNoLTZ2NmgtMnYtNkg1di0yaDZWNWgydjZoNnoiLz48L3N2Zz4="; // https://icon-sets.iconify.design -export const ICON = { +export const ICON: Record<string, Icon> = { "icon-[material-symbols--upload]": { svg: "data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIHdpZHRoPSIyNCIgaGVpZ2h0PSIyNCIgdmlld0JveD0iMCAwIDI0IDI0Ij48cGF0aCBmaWxsPSIjMDAwIiBkPSJNMTEgMTZWNy44NWwtMi42IDIuNkw3IDlsNS01bDUgNWwtMS40IDEuNDVsLTIuNi0yLjZWMTZ6bS01IDRxLS44MjUgMC0xLjQxMi0uNTg3VDQgMTh2LTNoMnYzaDEydi0zaDJ2M3EwIC44MjUtLjU4NyAxLjQxM1QxOCAyMHoiLz48L3N2Zz4=", alt: "Upload"