Initial commit
docs/README.md (new file, 113 lines)
# Vuer Documentation Skill

This directory contains comprehensive documentation for Vuer, a lightweight 3D visualization toolkit for robotics and VR applications.

**Version:** v0.0.67
**Source:** https://docs.vuer.ai/

## Documentation Structure

### Guides (2/2 complete)

- [Introduction](guides/introduction.md) - Overview and key features
- [Getting Started](guides/getting-started.md) - Installation and setup

### Tutorials

#### Vuer Basics (5/5 complete)

- [Setting Up Your First Scene](tutorials/basics/setting-a-scene.md) - Create basic 3D scenes
- [Async Programming](tutorials/basics/async-programming.md) - Handle parallel routines and callbacks
- [Simple Life Cycle](tutorials/basics/simple-life-cycle.md) - CRUD operations for components
- [SSL Proxy for WebXR](tutorials/basics/ssl-proxy-webxr.md) - Set up secure connections for VR
- [Serving Dynamic Content](tutorials/basics/serving-dynamic-content.md) - Add custom HTML routes

#### Robotics Visualization (4/4 complete)

- [Using URDF](tutorials/robotics/using-urdf.md) - Load and display robot models
- [MIT Mini Cheetah](tutorials/robotics/mini-cheetah.md) - Animated quadruped robot
- [Unitree Go1 with Stairs](tutorials/robotics/go1-stairs.md) - Complex scene with fog effects
- [Camera Frustums](tutorials/robotics/camera-frustums.md) - Visualize camera viewpoints

#### Virtual Cameras (3/6 complete)

- [Recording Camera Movements](tutorials/camera/recording-camera-movements.md) - Capture user camera movements
- [Manipulating Camera Pose](tutorials/camera/manipulating-camera-pose.md) - Programmatic camera control
- [Grab Render from Virtual Camera](tutorials/camera/grab-render-virtual-camera.md) - Capture rendered images
- Collecting Render from Multiple Browser Sessions - *Not yet fetched*
- Transforming Points using Camera Matrix - *Not yet fetched*
- Render Queue - *Not yet fetched*

#### Physics in Mixed Reality (3/4 complete)

- [MuJoCo WASM](tutorials/physics/mujoco-wasm.md) - Browser-based physics simulation
- [MoCap Control](tutorials/physics/mocap-control.md) - VR motion controller integration
- [Hand Control](tutorials/physics/hand-control.md) - VR hand tracking with MuJoCo
- MuJoCo Gallery - *Not yet fetched*

### API Documentation

- Python API reference - *Not yet fetched*
- Component schemas - *Not yet fetched*
- Event types - *Not yet fetched*

## Completion Status

**Completed and Saved:** 16 pages
**Remaining to fetch:** ~10 pages
**Total Estimated:** 25+ pages

## What is Vuer?

Vuer is a lightweight visualization toolkit for interacting with dynamic 3D and robotics data. Key features:

- **Lightweight performance** - Efficient 3D rendering
- **VR and AR compatibility** - Works with virtual and augmented reality devices
- **WebSocket-based** - Real-time communication between Python and browser
- **Robotics-focused** - URDF support, physics simulation, camera tools
- **Extensible** - Custom components and handlers
- **MIT License** - Free and open source

## Quick Start

```bash
pip install 'vuer[all]==0.0.67'
```

```python
from vuer import Vuer
from vuer.schemas import Scene, Box

app = Vuer()

@app.spawn
async def main(session):
    session.set @ Scene(
        Box(
            args=[0.1, 0.1, 0.1],
            position=[0, 0, 0],
            key="box",
        ),
    )

app.run()
```

## Use Cases

- **Robotics Visualization** - Display robot models, trajectories, sensor data
- **VR/AR Applications** - Interactive 3D environments
- **Data Visualization** - 3D plots, point clouds, meshes
- **Physics Simulation** - MuJoCo integration for browser-based physics
- **Camera Calibration** - Visualize camera frustums and capture renders
- **Motion Capture** - VR controller and hand tracking

## Development

The project emerged from research at MIT and UCSD, with contributors specializing in robotics, computer vision, and computer graphics.

## Next Steps

1. Read the [Getting Started Guide](guides/getting-started.md)
2. Follow the [Vuer Basics tutorials](tutorials/basics/)
3. Explore [Robotics examples](tutorials/robotics/)
4. Try [Camera tutorials](tutorials/camera/)
5. Experiment with [Physics simulation](tutorials/physics/)

## Source

All documentation fetched from: https://docs.vuer.ai/

docs/guides/getting-started.md (new file, 104 lines)
# Getting Started with Vuer

**Version:** v0.0.67

## Environment Setup

### Create a Conda Environment

```bash
conda create -n vuer python=3.8
conda activate vuer
```

## Installation

### Latest PyPI Version

Install the latest version with all dependencies:

```bash
pip install -U 'vuer[all]==0.0.67'
```

### For Development

If you're contributing to Vuer, use an editable installation:

```bash
pip install -e '.[all]'
```

## Key Learning Pathways

### 1. Vuer Basics Tutorial
Learn foundational concepts for building 3D visualizations with Vuer.

### 2. Tutorial for Roboticists
A specialized tutorial for robotics applications, including URDF loading and robot visualization.

### 3. Example Gallery
An extensive collection of examples demonstrating various capabilities.

## VR/AR Headset Access

To access Vuer visualizations on VR/AR headsets:

1. **Install ngrok** to convert local WebSocket connections to secure connections
2. Local WebSocket: `ws://localhost:8012`
3. Secure WebSocket via ngrok: `wss://xxxxx.ngrok.io`
4. Access the visualization through a query parameter

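Put together, the headset workflow looks roughly like the following sketch (the ngrok subdomain is a placeholder, and `your_example.py` stands in for whichever Vuer script you are running):

```shell
# 1. Start the Vuer app locally; it serves a WebSocket at ws://localhost:8012
#      python your_example.py

# 2. In another terminal, tunnel the port through ngrok to get TLS:
#      ngrok http 8012
#    ngrok prints a forwarding URL such as https://xxxx.ngrok.io

# 3. Open the client in the headset browser, passing the secure
#    WebSocket endpoint as a query parameter:
#      https://vuer.ai?ws=wss://xxxx.ngrok.io
```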
## Running Examples

### Clone and Setup

```bash
# Clone the repository
git clone https://github.com/vuer-ai/vuer.git
cd vuer

# Install with example dependencies
pip install -U 'vuer[example]==0.0.67'

# Download 3D assets using git LFS
git lfs pull
```

### Execute Examples

Navigate to the examples directory and run Python files:

```bash
cd docs/examples
python your_example.py
```

### Apple Silicon Compatibility

**Important:** Apple Silicon users should install a specific version of open3d:

```bash
pip install open3d==0.15.1
```

This is due to compatibility issues with newer versions on Apple Silicon.

## Building Documentation

For contributors working on documentation:

```bash
make docs
```

## Next Steps

- Explore the [Vuer Basics Tutorial](../tutorials/basics/)
- Check out the [Robotics Tutorial](../tutorials/robotics/)
- Browse the [Example Gallery](../examples/)
- Read the [API Documentation](../api/)

## Source

Documentation: https://docs.vuer.ai/en/latest/quick_start.html

docs/guides/introduction.md (new file, 59 lines)
# Vuer: 3D Visualization Toolkit

**Version:** v0.0.67

## What is Vuer?

Vuer is a lightweight visualization toolkit for interacting with dynamic 3D and robotics data. The framework emphasizes accessibility, supporting virtual and augmented reality experiences while remaining compatible with mobile devices.

## Key Features

- **Lightweight performance** - Efficient 3D rendering and visualization
- **VR and AR compatibility** - Works with virtual and augmented reality devices
- **Community support** - Active community and documentation
- **Extensible and customizable** - Build custom visualizations and components
- **MIT open-source license** - Free to use and modify

## Installation

Install the framework via pip:

```bash
pip install 'vuer[all]==0.0.67'
```

## Quick Example

Here's a minimal application skeleton; the full documentation example extends it to load a URDF file and visualize it in the browser:

```python
from vuer import Vuer

# Create the Vuer app
app = Vuer()

@app.spawn
async def main(session):
    # Your visualization code here
    pass

app.run()
```

## Development & Expertise

The project emerged from research at MIT and UCSD, with contributors specializing in robotics, computer vision, and computer graphics.

## Available Resources

- **Tutorials** - Learn the basics and advanced topics
  - Vuer Basics
  - Tutorial for Roboticists
  - Camera tutorials
  - Physics and MuJoCo integration
- **Example Gallery** - Diverse use cases and demonstrations
- **API Documentation** - Comprehensive reference for components and data types

## Source

Documentation available at: https://docs.vuer.ai/

docs/tutorials/basics/async-programming.md (new file, 90 lines)
# Async Programming in Vuer

## Overview

Vuer supports asynchronous programming patterns for handling parallel routines and callbacks. This tutorial demonstrates creating a server with background tasks running concurrently.

## Key Components

### Server Setup
The framework uses `Vuer()` to instantiate a server, with configuration passed as query parameters such as `reconnect=True` and `collapseMenu=True`.

### Main Function Decorator
The `@app.spawn(start=True)` decorator marks the entry point as an async function and starts the application immediately.

### Task Management
Sessions provide a `spawn_task()` method for launching background operations. Tasks can be cancelled with `.cancel()` when no longer needed.

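The mechanics underneath are plain asyncio: a task runs concurrently with the main coroutine until it is cancelled. Here is a Vuer-free sketch of that lifecycle (the `ticker` function and `counts` list are invented for illustration; they are not part of the Vuer API):

```python
import asyncio

async def ticker(counts: list):
    """Background task: record a tick periodically until cancelled."""
    while True:
        counts.append(len(counts))
        await asyncio.sleep(0.01)

async def main():
    counts = []
    # Launch the background task; it runs concurrently with this coroutine.
    task = asyncio.create_task(ticker(counts))

    # Do "main loop" work without blocking the ticker.
    await asyncio.sleep(0.05)

    # Cancel the task once it is no longer needed, freeing resources.
    task.cancel()
    return counts

counts = asyncio.run(main())
print(len(counts) > 0)  # → True: the ticker ran while main() was sleeping
```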
## Code Pattern Example

The tutorial shows a main loop that:
- Spawns independent background tasks
- Updates scene objects continuously using `sess.upsert`
- Manages task lifecycle by cancelling long-running operations after conditions are met
- Uses `await sleep()` for non-blocking delays

## Complete Example

```python
from vuer import Vuer
from vuer.schemas import Scene, Box
import asyncio
import numpy as np

app = Vuer(
    queries=dict(
        reconnect=True,
        collapseMenu=True,
    ),
)

async def background_task(session):
    """A background task that runs independently"""
    count = 0
    while True:
        print(f"Background task running: {count}")
        count += 1
        await asyncio.sleep(1.0)

@app.spawn(start=True)
async def main(session):
    # Spawn a background task
    task = session.spawn_task(background_task(session))

    # Main animation loop
    for i in range(100):
        theta = i * 0.1
        x = 0.5 * np.cos(theta)
        z = 0.5 * np.sin(theta)

        # Update the box position
        session.upsert @ Box(
            args=[0.1, 0.1, 0.1],
            position=[x, 0.05, z],
            color="red",
            materialType="standard",
            key="animated-box",
        )

        await asyncio.sleep(0.05)

    # Cancel the background task when done
    task.cancel()

app.run()
```

## Practical Features

The demonstration animates a red box moving in a circular path while background tasks execute independently, illustrating how multiple asynchronous operations can coexist within a single VuerSession without blocking the main rendering loop.

## Best Practices

1. **Use `session.spawn_task()`** for background operations
2. **Always use `await asyncio.sleep()`** for delays to avoid blocking
3. **Cancel tasks** when they're no longer needed to free resources
4. **Use `session.upsert`** for updating scene components

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/basics/async_programming.html

docs/tutorials/basics/serving-dynamic-content.md (new file, 145 lines)
# Serving Dynamic HTML Content in Vuer

## Overview

Vuer allows you to serve dynamic content by adding custom route handlers to your application server.

## Implementation Method

You can register a dynamic HTML handler using the `add_route()` method.

## Basic Example

```python
from vuer import Vuer

app = Vuer()

counter = 0

def dynamic_html_handler():
    global counter
    counter += 1
    template = f"""
    <!DOCTYPE html>
    <html>
    <head><title>Dynamic HTML</title></head>
    <body>
        <h1>Counter Value: {counter}</h1>
    </body>
    </html>
    """
    return template

app.add_route("/dynamic", dynamic_html_handler, method="GET")
app.run()
```

## Usage

After starting the server, visit `http://localhost:8012/dynamic` to see the dynamically generated content. The counter value will update on each page reload.

## Key Parameters

- **Route path**: Specify the URL endpoint (e.g., "/dynamic")
- **Handler function**: Returns HTML content as a string
- **HTTP method**: Specify the request method (e.g., "GET")

## Advanced Example with JSON Response

```python
from vuer import Vuer
import json
import time

app = Vuer()

def json_api_handler():
    data = {
        "status": "success",
        "data": {
            "message": "Hello from Vuer!",
            "timestamp": time.time(),
        },
    }
    return json.dumps(data)

app.add_route("/api/data", json_api_handler, method="GET")
app.run()
```

## Multiple Routes

You can add multiple routes to your application:

```python
from vuer import Vuer

app = Vuer()

def home_handler():
    return "<h1>Home Page</h1>"

def about_handler():
    return "<h1>About Page</h1>"

def api_handler():
    return '{"status": "ok"}'

app.add_route("/", home_handler, method="GET")
app.add_route("/about", about_handler, method="GET")
app.add_route("/api/status", api_handler, method="GET")

app.run()
```

## Best Practices

1. **Return strings**: Handler functions should return HTML or text as strings
2. **Use templates**: Consider using template engines for complex HTML
3. **Handle errors**: Add error handling in your route handlers
4. **Set content types**: For JSON responses, consider setting appropriate headers

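The "use templates" advice can be illustrated with Python's built-in `string.Template`, independent of any Vuer API. The `PAGE` template and `render_page` helper below are invented for this sketch; a handler would simply return the resulting string:

```python
from string import Template

# A reusable page template; $title and $body are substitution slots.
PAGE = Template("""\
<!DOCTYPE html>
<html>
<head><title>$title</title></head>
<body>$body</body>
</html>
""")

def render_page(title: str, body: str) -> str:
    """Return the filled-in HTML page as a string."""
    return PAGE.substitute(title=title, body=body)

html = render_page("Stats", "<h1>Vuer Statistics</h1>")
print("<title>Stats</title>" in html)  # → True
```

Keeping markup in a template rather than inline f-strings makes it easier to reuse pages across several handlers.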
## Combining with 3D Scenes

You can serve both dynamic HTML content and 3D scenes from the same Vuer application:

```python
from vuer import Vuer
from vuer.schemas import Scene, Box
import asyncio

app = Vuer()

# Add a dynamic HTML route
def stats_handler():
    return """
    <html>
    <body>
        <h1>Vuer Statistics</h1>
        <p>Server running on port 8012</p>
    </body>
    </html>
    """

app.add_route("/stats", stats_handler, method="GET")

# Add a 3D scene
@app.spawn
async def main(session):
    session.set @ Scene(
        Box(
            args=[0.1, 0.1, 0.1],
            position=[0, 0, 0],
            key="box",
        ),
    )

    while True:
        await asyncio.sleep(1.0)

app.run()
```

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/basics/adding_html_handler.html

docs/tutorials/basics/setting-a-scene.md (new file, 111 lines)
# Setting Up Your First Scene in Vuer

## Overview
This tutorial guides you through creating a basic 3D scene using Vuer, a Python framework for building interactive 3D visualizations. The example demonstrates how to establish a server and add scene components, lighting, and interactive elements.

## Step 1: Initialize the Vuer Server

Begin by importing and instantiating the application:

```python
from vuer import Vuer

app = Vuer(
    queries=dict(
        reconnect=True,
        collapseMenu=True,
    ),
)
```

The `queries` parameter configures the scene via URL parameters. Launch the server with `app.run()`, which prints the local connection URL.

## Step 2: Create an Async Session

The framework uses WebSocket sessions to connect clients with the Python server. Bind an async function to handle each session using the spawn decorator:

```python
from vuer import VuerSession

@app.spawn
async def session(sess: VuerSession):
    print("Example: we have started a websocket session!")
```

## Step 3: Build Your Scene

Within the session, construct the scene by setting a Scene object containing various components:

```python
sess.set @ Scene(
    Box(
        args=[0.1, 0.1, 0.1, 101, 101, 101],
        position=[0, 0.05, 0],
        color="red",
        materialType="standard",
        material=dict(color="#23aaff"),
        key="fox-1",
    ),
    ...
)
```

## Key Components

### Box
A 3D cube primitive with position, color, and material properties.

### SpotLight
Provides directional lighting within the scene; here it is wrapped in a Movable component to allow interactive repositioning.

### Movable
Enables user interaction, allowing scene elements to be dragged and manipulated in the 3D viewport.

## Essential Pattern

Always include `await asyncio.sleep(0.0)` after scene modifications to ensure proper asynchronous handling and client synchronization.

## Complete Example

```python
from vuer import Vuer
from vuer.schemas import Scene, Box, SpotLight, Movable
import asyncio

app = Vuer(
    queries=dict(
        reconnect=True,
        collapseMenu=True,
    ),
)

@app.spawn
async def session(sess):
    print("Example: we have started a websocket session!")

    sess.set @ Scene(
        Box(
            args=[0.1, 0.1, 0.1, 101, 101, 101],
            position=[0, 0.05, 0],
            color="red",
            materialType="standard",
            material=dict(color="#23aaff"),
            key="fox-1",
        ),
        Movable(
            SpotLight(
                intensity=3.0,
                distance=10.0,
                decay=0.0,
                position=[0, 2, 0],
                key="spotlight",
            ),
        ),
    )

    await asyncio.sleep(0.0)

app.run()
```

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/basics/setting_a_scene.html

docs/tutorials/basics/simple-life-cycle.md (new file, 102 lines)
# Vuer Component Life Cycle

## Core Concept

This tutorial demonstrates the basic **CRUD operations** for components in Vuer: adding a component, updating it in place, and removing it.

## Life Cycle Operations

### 1. Adding Components
Create new components using `session.upsert`:

```python
session.upsert @ Obj(
    src="/static/model.obj",
    position=[0, 0, 0],
    key="my-model",
)
```

### 2. Updating Components
Modify existing components by reusing the same key:

```python
session.upsert @ Obj(
    src="/static/model.obj",
    position=[1, 0, 0],  # Changed position
    key="my-model",  # Same key updates the existing component
)
```

### 3. Removing Components
Delete components with `session.remove`:

```python
session.remove @ "my-model"
```

## Complete Example

```python
from vuer import Vuer
from vuer.schemas import Obj
import asyncio

app = Vuer()

@app.spawn
async def main(session):
    # Create/Update: toggle a wireframe mesh
    for i in range(80):
        wireframe = i % 2 == 0

        # Toggle wireframe on and off
        session.upsert @ Obj(
            src="/static/model.obj",
            position=[0, 0, 0],
            wireframe=wireframe,
            key="primary-mesh",
        )

        await asyncio.sleep(0.01)

    # Add a second component
    session.upsert @ Obj(
        src="/static/model.obj",
        position=[1, 0, 0],
        wireframe=True,
        key="secondary-mesh",
    )

    await asyncio.sleep(0.8)

    # Remove the second component
    session.remove @ "secondary-mesh"

    # Keep the session alive
    while True:
        await asyncio.sleep(1.0)

app.run()
```

## Key Patterns

### Using Keys
- **Unique keys** identify components in the scene
- **Reusing a key** updates the existing component
- **Different keys** create new components

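The key semantics are essentially those of a dictionary indexed by `key`. The toy `SceneStore` class below is invented purely to illustrate that behavior; it is not the Vuer API:

```python
class SceneStore:
    """Toy model of key-based component management (not the Vuer API)."""

    def __init__(self):
        self.components = {}

    def upsert(self, key, **props):
        # Same key: update in place. New key: create a new component.
        self.components[key] = props

    def remove(self, key):
        self.components.pop(key, None)

store = SceneStore()
store.upsert("my-model", position=[0, 0, 0])
store.upsert("my-model", position=[1, 0, 0])  # update, not a duplicate
store.upsert("second", position=[2, 0, 0])    # new component
print(len(store.components))                  # → 2
store.remove("second")
print(store.components["my-model"]["position"])  # → [1, 0, 0]
```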
### Update Frequency
The example toggles the wireframe every 0.01 seconds (100 times per second), demonstrating how Vuer handles rapid updates efficiently.

### Timing
Components can be added and removed at any time during the session, allowing for dynamic scene management.

## Visual Effect

This creates an animated effect in which the mesh alternates between solid and wireframe rendering, demonstrating how components can be dynamically managed throughout an application's runtime.

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/basics/simple_life_cycle.html

docs/tutorials/basics/ssl-proxy-webxr.md (new file, 121 lines)
# Setting Up SSL Proxy for WebXR

## Overview

Vuer requires secure connections for WebXR functionality. Both the web client and the WebSocket connection must use TLS/SSL encryption.

## Key Configuration Points

### WebSocket Endpoint Setup
Pass the secure WebSocket URL to the web client via a query parameter:
```
https://vuer.ai?ws=wss://xxxxx.ngrok.io
```

### Static File Serving
Update component source paths to use the correct HTTPS domain. For example, change:

```python
# Before (insecure)
src='http://localhost:8012/static/urdf/robot.urdf'
```

To:

```python
# After (secure)
src='https://<your-domain-with-ssl>/static/urdf/robot.urdf'
```

## Recommended Proxy Solutions

### Option 1: ngrok (Recommended)

ngrok converts local HTTP/WebSocket connections to secure HTTPS/WSS:

- `ws://localhost:8012` → `wss://xxxx.ngrok.io`
- `http://localhost:8012/static/` → `https://xxxx.ngrok.io/static/`

**Installation:**
Visit [ngrok's website](https://ngrok.com) for installation instructions.

**Usage:**
```bash
ngrok http 8012
```

This provides a secure URL like `https://xxxx.ngrok.io` that you can use for WebXR.

### Option 2: localtunnel

A free alternative that requires a passcode.

**Installation:**
```bash
npm install -g localtunnel
```

**Usage:**
```bash
lt --port 8012
```

**Documentation:** https://localtunnel.me

### Option 3: Self-Signed or Let's Encrypt Certificate

Generate and use your own SSL certificate.

**Launch Vuer with Certificate:**
```bash
vuer --cert cert.pem --key key.pem --port 8012
```

**Generate Certificate:**
Follow Let's Encrypt's localhost certificate guide for implementation details.

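For local testing, a self-signed certificate can also be generated with a standard `openssl` one-liner (this is an assumption about your setup, not part of the Vuer docs; browsers will warn about self-signed certificates, so prefer Let's Encrypt for public domains):

```shell
# Self-signed certificate for localhost, valid for one year (testing only)
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout key.pem -out cert.pem -days 365 \
  -subj "/CN=localhost"
```

The resulting `cert.pem` and `key.pem` can then be passed to `vuer --cert cert.pem --key key.pem --port 8012` as shown above.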
## Complete Example with ngrok

```python
import asyncio

from vuer import Vuer
from vuer.schemas import Scene, Urdf

app = Vuer()

@app.spawn
async def main(session):
    # Use the ngrok HTTPS domain for static files
    session.set @ Scene(
        Urdf(
            src='https://xxxx.ngrok.io/static/urdf/robot.urdf',
            position=[0, 0, 0],
            key="robot",
        ),
    )

    while True:
        await asyncio.sleep(1.0)

app.run()
```

Then access your app at:
```
https://vuer.ai?ws=wss://xxxx.ngrok.io
```

## Troubleshooting

### WebSocket Connection Fails
- Ensure you're using `wss://` (not `ws://`)
- Verify the ngrok tunnel is running
- Check firewall settings

### Static Files Not Loading
- Confirm the HTTPS domain is correct
- Verify static files are being served
- Check the browser console for mixed-content warnings

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/basics/ssl_proxy_webxr.html

docs/tutorials/camera/grab-render-virtual-camera.md (new file, 235 lines)
# Collecting Render from Virtual Cameras

## Overview
This tutorial covers two methods for capturing rendered images from virtual cameras in Vuer, with a focus on the recommended `ondemand` approach.

## Methods Overview

### Method 1: Frame/Time Mode (Legacy)
Uses event handlers to collect rendered images. Only supported in `stream='frame'` or `stream='time'` mode.

**Limitations:**
- Less backend control
- Continuous rendering even when not needed
- Higher resource usage

### Method 2: OnDemand Mode (Recommended)
Uses a synchronous `grab_render` RPC API. Only available in `stream='ondemand'` mode.

**Advantages:**
- Superior backend control
- Renders only when explicitly requested
- Lower computational overhead
- Support for depth rendering

## Complete Example: OnDemand Mode

```python
import asyncio
import numpy as np
from vuer import Vuer, VuerSession
from vuer.schemas import Scene, CameraView, DefaultScene, Box, Urdf
from PIL import Image

app = Vuer()

@app.spawn(start=True)
async def main(session: VuerSession):
    # Set up the scene
    session.set @ Scene(
        DefaultScene(),

        # Add some objects to render
        Box(
            args=[1, 1, 1],
            position=[0, 0.5, 0],
            color="red",
            key="box",
        ),

        Urdf(
            src="/static/robot.urdf",
            position=[2, 0, 0],
            key="robot",
        ),

        # Configure the camera with ondemand streaming
        CameraView(
            key="ego",
            fov=50,
            width=640,
            height=480,
            position=[0, 2, 5],
            rotation=[0, 0, 0],
            stream="ondemand",
            renderDepth=True,  # Enable depth rendering
            near=0.1,
            far=100,
        ),
    )

    # Wait for the scene to initialize
    await asyncio.sleep(0.5)

    # Capture renders from different positions
    for i in range(10):
        # Update the camera position
        x = 5 * np.cos(i * 0.2)
        z = 5 * np.sin(i * 0.2)

        session.update @ CameraView(
            key="ego",
            position=[x, 2, z],
            rotation=[0, i * 0.2, 0],
        )

        # Small delay for the camera update
        await asyncio.sleep(0.1)

        # Grab the render
        result = session.grab_render(downsample=1, key="ego")

        if result:
            # Process the RGB image
            rgb_data = result.get("rgb")
            if rgb_data:
                # Convert to a numpy array
                img_array = np.frombuffer(rgb_data, dtype=np.uint8)
                img_array = img_array.reshape((480, 640, 3))

                # Save the image
                img = Image.fromarray(img_array)
                img.save(f"render_{i:03d}.png")
                print(f"Saved render_{i:03d}.png")

            # Process the depth map
            depth_data = result.get("depth")
            if depth_data:
                depth_array = np.frombuffer(depth_data, dtype=np.float32)
                depth_array = depth_array.reshape((480, 640))

                # Save the depth map
                depth_img = Image.fromarray(
                    (depth_array * 255).astype(np.uint8)
                )
                depth_img.save(f"depth_{i:03d}.png")
                print(f"Saved depth_{i:03d}.png")

    print("Finished capturing renders")

    # Keep the session alive
    while True:
        await asyncio.sleep(1.0)

app.run()
```

## Key API: `grab_render()`

```python
result = session.grab_render(downsample=1, key="ego")
```

### Parameters
- **downsample**: Downsample factor (1 = no downsampling, 2 = half resolution)
- **key**: Camera key to capture from

### Returns
A dictionary containing:
- **rgb**: RGB image data as bytes
- **depth**: Depth map data as a float32 array (if `renderDepth=True`)

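The byte-decoding step in the example above is plain NumPy and can be exercised on its own. Here the raw buffers are faked with zeros in place of a real `grab_render` result, just to show the `frombuffer`/`reshape` arithmetic:

```python
import numpy as np

WIDTH, HEIGHT = 640, 480

# Stand-in for result["rgb"]: raw uint8 bytes, 3 channels per pixel.
rgb_data = bytes(WIDTH * HEIGHT * 3)
img_array = np.frombuffer(rgb_data, dtype=np.uint8).reshape((HEIGHT, WIDTH, 3))
print(img_array.shape)  # → (480, 640, 3)

# Stand-in for result["depth"]: raw float32 bytes, one value per pixel.
depth_data = np.zeros(WIDTH * HEIGHT, dtype=np.float32).tobytes()
depth_array = np.frombuffer(depth_data, dtype=np.float32).reshape((HEIGHT, WIDTH))
print(depth_array.shape)  # → (480, 640)
```

Note the row-major order: the buffer is reshaped as `(height, width, channels)`, not `(width, height, channels)`.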
## Depth Rendering
|
||||
|
||||
Enable depth map capture by setting `renderDepth=True`:
|
||||
|
||||
```python
|
||||
CameraView(
|
||||
key="ego",
|
||||
renderDepth=True,
|
||||
stream="ondemand",
|
||||
# ... other parameters
|
||||
)
|
||||
```
|
||||
|
||||
**Benefits:**
|
||||
- Captures depth without changing object materials
|
||||
- Available since 2024 update
|
||||
- Minimal computational overhead
|
||||
|
||||
## Legacy Method: Event Handler
|
||||
|
||||
For `stream='frame'` or `stream='time'` mode:
|
||||
|
||||
```python
|
||||
async def handle_camera_view(event, session):
|
||||
"""Handle CAMERA_VIEW events"""
|
||||
if event.key != "ego":
|
||||
return
|
||||
|
||||
# Access rendered image
|
||||
image_data = event.value.get("image")
|
||||
|
||||
# Process image data
|
||||
print(f"Received image: {len(image_data)} bytes")
|
||||
|
||||
app.add_handler("CAMERA_VIEW", handle_camera_view)
|
||||
```
|
||||
|
||||
## Multi-Camera Capture
|
||||
|
||||
Capture from multiple cameras:
|
||||
|
||||
```python
|
||||
@app.spawn(start=True)
|
||||
async def main(session: VuerSession):
|
||||
# Set up multiple cameras
|
||||
session.set @ Scene(
|
||||
DefaultScene(),
|
||||
|
||||
CameraView(
|
||||
key="front-camera",
|
||||
position=[0, 1, 5],
|
||||
stream="ondemand",
|
||||
width=640,
|
||||
height=480,
|
||||
),
|
||||
|
||||
CameraView(
|
||||
key="top-camera",
|
||||
position=[0, 10, 0],
|
||||
rotation=[-1.57, 0, 0],
|
||||
stream="ondemand",
|
||||
width=640,
|
||||
height=480,
|
||||
),
|
||||
)
|
||||
|
||||
await asyncio.sleep(0.5)
|
||||
|
||||
# Capture from both cameras
|
||||
front_render = session.grab_render(key="front-camera")
|
||||
top_render = session.grab_render(key="top-camera")
|
||||
|
||||
# Process renders...
|
||||
```
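
The `# Process renders...` step can be sketched as a small helper that turns raw RGB bytes into a numpy array. The payload layout assumed here (tightly packed 8-bit RGB at the camera resolution) is an assumption, so check what `grab_render` actually returns in your Vuer version before relying on it:

```python
import numpy as np

def rgb_bytes_to_array(rgb_bytes: bytes, width: int, height: int) -> np.ndarray:
    """Interpret raw RGB bytes as an (H, W, 3) uint8 array.

    Assumes tightly packed 8-bit RGB; if the payload is PNG-encoded instead,
    decode it with PIL's Image.open on an io.BytesIO.
    """
    arr = np.frombuffer(rgb_bytes, dtype=np.uint8)
    return arr.reshape((height, width, 3))

# Example with synthetic data at the 640x480 resolution used above:
fake_payload = bytes(640 * 480 * 3)
img = rgb_bytes_to_array(fake_payload, 640, 480)
print(img.shape)  # (480, 640, 3)
```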

## Best Practices

1. **Use ondemand mode** - More efficient for programmatic rendering
2. **Enable depth rendering** - Get depth maps without material changes
3. **Add small delays** - Wait for camera updates before grabbing
4. **Set appropriate resolution** - Balance quality and performance
5. **Use downsampling** - Reduce data size when full resolution isn't needed

## Performance Considerations

The ondemand approach:

- Minimizes computational overhead
- Only renders when explicitly requested
- Ideal for resource-constrained applications
- Perfect for dataset generation and batch processing

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/camera/grab_render_virtual_camera.html
212
docs/tutorials/camera/manipulating-camera-pose.md
Normal file
@@ -0,0 +1,212 @@
# Manipulating Camera Pose in Vuer

## Overview

This tutorial demonstrates how to programmatically control virtual camera positions and orientations within the Vuer framework, along with tracking user interactions.

## Key Concepts

### CameraView Component

Virtual cameras in Vuer are controlled through the `CameraView` component with these parameters:

- **fov**: Field of view in degrees
- **width, height**: Resolution in pixels
- **position**: Camera position `[x, y, z]`
- **rotation**: Camera rotation `[x, y, z]`
- **matrix**: 4x4 transformation matrix (alternative to position/rotation)
- **stream**: Streaming mode (`"time"`, `"frame"`, or `"ondemand"`)
- **fps**: Frame rate for streaming
- **near, far**: Clipping planes

## Complete Example

```python
import asyncio
import pickle
from vuer import Vuer, VuerSession
from vuer.events import ClientEvent
from vuer.schemas import Scene, CameraView, DefaultScene, Urdf
from ml_logger import ML_Logger

# Initialize logger
logger = ML_Logger(root=".", prefix="assets")

# Load pre-recorded camera matrices
with open("assets/camera_movement.pkl", "rb") as f:
    data = pickle.load(f)
matrices = [item["matrix"] for item in data]

app = Vuer()

# Event handler to track camera movements
async def track_movement(event: ClientEvent, sess: VuerSession):
    """Capture camera movement events"""
    if event.key != "ego":
        return

    logger.log(**event.value, flush=True, silent=True)
    print(f"Camera moved: {event.value['position']}")

app.add_handler("CAMERA_MOVE", track_movement)

@app.spawn(start=True)
async def main(proxy: VuerSession):
    # Set up the scene
    proxy.set @ Scene(
        DefaultScene(),

        # Add a robot for reference
        Urdf(
            src="/static/robot.urdf",
            position=[0, 0, 0],
            key="robot",
        ),
    )

    # Animate camera through recorded positions
    for i in range(len(matrices)):
        proxy.update @ [
            CameraView(
                key="ego",
                fov=50,
                width=320,
                height=240,
                matrix=matrices[i % len(matrices)],
                stream="time",
                fps=30,
                near=0.1,
                far=100,
            ),
        ]

        await asyncio.sleep(0.033)  # 30 FPS

    # Keep session alive
    while True:
        await asyncio.sleep(1.0)

app.run()
```

## Dynamic Camera Control Methods

### Method 1: Using Transformation Matrix

```python
session.update @ CameraView(
    key="ego",
    matrix=[
        [1, 0, 0, x],
        [0, 1, 0, y],
        [0, 0, 1, z],
        [0, 0, 0, 1],
    ],
)
```
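
Hand-writing matrices like the one above gets tedious; a look-at helper can generate them. This is a sketch assuming the same row layout as the literal above (rotation in the top-left 3x3, translation in the last column) and a camera that looks down its local -Z axis, as in three.js:

```python
import numpy as np

def look_at_matrix(position, target, up=(0.0, 1.0, 0.0)):
    """Build a 4x4 camera pose matrix whose local -Z axis points at `target`.

    Assumes `up` is not parallel to the viewing direction.
    """
    eye = np.asarray(position, dtype=float)
    fwd = eye - np.asarray(target, dtype=float)  # camera +Z points away from target
    fwd /= np.linalg.norm(fwd)
    right = np.cross(np.asarray(up, dtype=float), fwd)
    right /= np.linalg.norm(right)
    true_up = np.cross(fwd, right)

    m = np.eye(4)
    m[:3, 0] = right    # local X
    m[:3, 1] = true_up  # local Y
    m[:3, 2] = fwd      # local Z
    m[:3, 3] = eye      # translation
    return m

m = look_at_matrix([0, 0, 5], [0, 0, 0])
print(np.round(m, 3))
```

Pass `m.tolist()` as the `matrix` argument of `CameraView`.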

### Method 2: Using Position and Rotation

```python
session.update @ CameraView(
    key="ego",
    position=[x, y, z],
    rotation=[rx, ry, rz],  # Euler angles in radians
)
```

### Method 3: Animated Camera Path

```python
import math

for i in range(360):
    theta = math.radians(i)
    radius = 5

    # Circular orbit
    x = radius * math.cos(theta)
    z = radius * math.sin(theta)

    session.update @ CameraView(
        key="ego",
        position=[x, 2, z],
        rotation=[0, theta, 0],
    )

    await asyncio.sleep(0.033)  # 30 FPS
```

## Replaying Recorded Movements

Load and replay pre-recorded camera movements:

```python
import pickle

# Load recorded movements
with open("assets/camera_movement.pkl", "rb") as f:
    movements = pickle.load(f)

# Replay movements
for movement in movements:
    session.update @ CameraView(
        key="ego",
        matrix=movement["matrix"],
        fov=50,
        width=320,
        height=240,
    )

    await asyncio.sleep(0.033)  # 30 FPS
```

## Event Handling

Track user-initiated camera movements:

```python
async def track_movement(event: ClientEvent, sess: VuerSession):
    """Log user camera movements"""
    if event.key != "ego":
        return

    # Access camera data
    position = event.value.get("position")
    rotation = event.value.get("rotation")
    matrix = event.value.get("matrix")

    print(f"Position: {position}")
    print(f"Rotation: {rotation}")

    # Save to logger
    logger.log(**event.value, flush=True, silent=True)

app.add_handler("CAMERA_MOVE", track_movement)
```

## Streaming Modes

### "time" Mode

Continuous streaming at the specified FPS:

```python
CameraView(stream="time", fps=30)
```

### "frame" Mode

Streams individual frames on demand.

### "ondemand" Mode

Only renders when explicitly requested (most efficient):

```python
CameraView(stream="ondemand")
```

## Best Practices

1. **Use matrices for complex movements** - More precise than position/rotation
2. **Track user movements** - Enable interactive camera control
3. **Set appropriate FPS** - Balance smoothness and performance
4. **Use clipping planes** - Optimize rendering with near/far settings
5. **Use ondemand mode** - Save resources when continuous streaming isn't needed

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/camera/move_camera.html
157
docs/tutorials/camera/recording-camera-movements.md
Normal file
@@ -0,0 +1,157 @@
# Recording Camera Movements in Vuer

## Overview

This tutorial demonstrates how to capture user camera movements in a Vuer application and save them to a file for later programmatic control.

## Purpose

Record camera movements to produce a camera movement file (`assets/camera_movement.pkl`) that can be used to:

- Replay camera movements
- Control camera movements programmatically
- Analyze user navigation patterns

## Complete Example

```python
import os
import asyncio
from vuer import Vuer, VuerSession
from vuer.events import ClientEvent
from vuer.schemas import Scene, CameraView, DefaultScene
from ml_logger import ML_Logger

# Initialize logger
logger = ML_Logger(root=os.getcwd(), prefix="assets")

app = Vuer()

# Event handler for camera movements
async def track_movement(event: ClientEvent, sess: VuerSession):
    """Capture and log camera movement events"""
    if event.key != "ego":
        return

    print("camera moved", event.value["matrix"])

    # Save camera data to file
    logger.log(**event.value, flush=True, file="camera_movement.pkl")

# Register the event handler
app.add_handler("CAMERA_MOVE", track_movement)

@app.spawn(start=True)
async def main(session: VuerSession):
    # Set up the scene
    session.set @ Scene(
        DefaultScene(),

        # Configure the camera view
        CameraView(
            fov=50,
            width=320,
            height=240,
            position=[0, 2, 5],
            rotation=[0, 0, 0],
            key="ego",
        ),
    )

    # Keep session alive
    while True:
        await asyncio.sleep(1.0)

app.run()
```

## Key Components

### 1. Event Handler Setup

Create an event listener for camera movement events:

```python
async def track_movement(event: ClientEvent, sess: VuerSession):
    if event.key != "ego":
        return
    print("camera moved", event.value["matrix"])
```

The handler:

- Filters for the ego camera (`event.key != "ego"`)
- Accesses movement data via `event.value["matrix"]`
- Can process or log the camera transformation matrix

### 2. Initialize Logger

Uses ML-Logger to persist camera data to disk:

```python
from ml_logger import ML_Logger
logger = ML_Logger(root=os.getcwd(), prefix="assets")
```

### 3. Register Handler

Connect the handler to the app:

```python
app.add_handler("CAMERA_MOVE", track_movement)
```

### 4. Configure Camera View

The scene includes a `CameraView` component with:

- **fov**: Field of view in degrees
- **width, height**: Resolution
- **position**: Initial camera position `[x, y, z]`
- **rotation**: Initial camera rotation `[x, y, z]`
- **key**: Unique identifier (used to filter events)

## Saving Camera Data

Camera movement data is saved using:

```python
logger.log(**event.value, flush=True, file="camera_movement.pkl")
```

This creates a persistent record in `assets/camera_movement.pkl`.

## Data Format

The `event.value` dictionary typically contains:

- **matrix**: 4x4 transformation matrix
- **position**: Camera position `[x, y, z]`
- **rotation**: Camera rotation (quaternion or Euler angles)
- **timestamp**: Event timestamp
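
Each recorded `matrix` embeds the camera position. Under the row layout used elsewhere in these docs (a nested 4x4 with translation in the last column), it can be pulled out like this; the layout is an assumption, so verify it against your actual recorded data:

```python
def position_from_matrix(matrix):
    """Extract the [x, y, z] translation from a 4x4 pose matrix.

    Assumes a nested row-major matrix with translation in the last column.
    """
    return [matrix[0][3], matrix[1][3], matrix[2][3]]

# Hypothetical recorded pose for illustration:
pose = [
    [1, 0, 0, 0.5],
    [0, 1, 0, 2.0],
    [0, 0, 1, 5.0],
    [0, 0, 0, 1.0],
]
print(position_from_matrix(pose))  # [0.5, 2.0, 5.0]
```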

## Usage in Subsequent Tutorials

The recorded camera movements can be loaded and replayed:

```python
import pickle

# Load recorded movements
with open("assets/camera_movement.pkl", "rb") as f:
    movements = pickle.load(f)

# Replay movements
for movement in movements:
    session.update @ CameraView(
        matrix=movement["matrix"],
        key="ego",
    )
    await asyncio.sleep(0.033)  # 30 FPS
```

## Installation Requirements

```bash
pip install ml-logger
```

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/camera/record_camera_movement.html
334
docs/tutorials/physics/hand-control.md
Normal file
@@ -0,0 +1,334 @@
# MuJoCo VR Hand Control

## Overview

This tutorial demonstrates how to control virtual hands in MuJoCo by leveraging mocap (motion capture) points that track user hand poses in VR environments.

## Important Requirement

**SSL/HTTPS Required**: VR hand tracking requires secure connections. Use ngrok or localtunnel to set up SSL.

See the [SSL Proxy WebXR tutorial](../basics/ssl-proxy-webxr.md) for setup instructions.

## Mocap Point API

The implementation uses **XR Hand Naming Conventions** to link mocap bodies with hand joints.

### Naming Format

```
"{joint}-{left | right}"
```

Examples:

- `wrist-right`
- `middle-finger-phalanx-proximal-right`
- `thumb-tip-left`

## Hand Joint Mapping

The system defines **25 distinct hand joints** (indexed 0-24):

### Joint Index Reference

```python
HAND_JOINTS = {
    0: "wrist",
    # Thumb (1-4)
    1: "thumb-metacarpal",
    2: "thumb-phalanx-proximal",
    3: "thumb-phalanx-distal",
    4: "thumb-tip",
    # Index finger (5-9)
    5: "index-finger-metacarpal",
    6: "index-finger-phalanx-proximal",
    7: "index-finger-phalanx-intermediate",
    8: "index-finger-phalanx-distal",
    9: "index-finger-tip",
    # Middle finger (10-14)
    10: "middle-finger-metacarpal",
    11: "middle-finger-phalanx-proximal",
    12: "middle-finger-phalanx-intermediate",
    13: "middle-finger-phalanx-distal",
    14: "middle-finger-tip",
    # Ring finger (15-19)
    15: "ring-finger-metacarpal",
    16: "ring-finger-phalanx-proximal",
    17: "ring-finger-phalanx-intermediate",
    18: "ring-finger-phalanx-distal",
    19: "ring-finger-tip",
    # Pinky finger (20-24)
    20: "pinky-finger-metacarpal",
    21: "pinky-finger-phalanx-proximal",
    22: "pinky-finger-phalanx-intermediate",
    23: "pinky-finger-phalanx-distal",
    24: "pinky-finger-tip",
}
```
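
The naming format and the joint table combine mechanically. A small sketch (using an abbreviated copy of the table above) that expands joint names into mocap body names for both hands:

```python
# Abbreviated copy of the HAND_JOINTS table above
HAND_JOINTS = {0: "wrist", 4: "thumb-tip", 9: "index-finger-tip"}

def mocap_body_names(joints=HAND_JOINTS):
    """Expand joint names into '<joint>-<side>' mocap body names."""
    return [
        f"{name}-{side}"
        for _, name in sorted(joints.items())
        for side in ("left", "right")
    ]

print(mocap_body_names())
# ['wrist-left', 'wrist-right', 'thumb-tip-left', 'thumb-tip-right',
#  'index-finger-tip-left', 'index-finger-tip-right']
```

With the full 25-entry table this yields the 50 body names a two-hand scene would define.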

## Complete Example

```python
import asyncio
from vuer import Vuer, VuerSession
from vuer.schemas import (
    Scene, Fog, Sphere,
    MuJoCo, ContribLoader,
    Hands,
)
from vuer.events import ClientEvent

app = Vuer()

# Assets for hand simulation
HAND_ASSETS = [
    "/static/mujoco/hands/scene.xml",
    "/static/mujoco/hands/left_hand.xml",
    "/static/mujoco/hands/right_hand.xml",
    "/static/mujoco/hands/palm.obj",
    "/static/mujoco/hands/finger.obj",
]

@app.add_handler("ON_MUJOCO_FRAME")
async def on_mujoco_frame(event: ClientEvent, sess: VuerSession):
    """Handle physics updates"""
    print("ON_MUJOCO_FRAME", event.value)

    # Access mocap data
    mocap_pos = event.value.get("mocap_pos")
    mocap_quat = event.value.get("mocap_quat")

    # Process hand tracking data
    if mocap_pos and mocap_quat:
        # Update hand positions based on tracking
        pass

@app.spawn(start=True)
async def main(session: VuerSession):
    # Load MuJoCo library
    session.upsert @ ContribLoader(
        library="@vuer-ai/mujoco-ts",
        version="0.0.24",
        entry="dist/index.umd.js",
        key="mujoco-loader",
    )

    await asyncio.sleep(2.0)

    # Set up scene with hands
    session.set @ Scene(
        bgChildren=[
            # MuJoCo default styling
            Fog(color=0x2C3F57, near=10, far=20),

            # Add VR hands
            Hands(),

            # Background sphere
            Sphere(
                args=[50, 10, 10],
                materialType="basic",
                material=dict(color=0x2C3F57, side=1),
            ),
        ],

        # Initialize MuJoCo with hand model
        MuJoCo(
            src="/static/mujoco/hands/scene.xml",
            assets=HAND_ASSETS,
            scale=0.1,
            key="hand-sim",
        ),
    )

    # Keep session alive
    while True:
        await asyncio.sleep(1.0)

app.run()
```

## Key Components

### Hands Component

```python
Hands()
```

This enables VR hand tracking, providing position and orientation data for all 25 hand joints.

### Mocap Bodies in MuJoCo XML

In your MuJoCo scene XML, define mocap bodies using the XR hand naming convention:

```xml
<mujoco>
  <worldbody>
    <!-- Right hand mocap bodies -->
    <body name="wrist-right" mocap="true">
      <geom type="sphere" size="0.02" rgba="1 0 0 0.5"/>
    </body>

    <body name="thumb-tip-right" mocap="true">
      <geom type="sphere" size="0.01" rgba="0 1 0 0.5"/>
    </body>

    <body name="index-finger-tip-right" mocap="true">
      <geom type="sphere" size="0.01" rgba="0 0 1 0.5"/>
    </body>

    <!-- Add more joints as needed -->

    <!-- Left hand mocap bodies -->
    <body name="wrist-left" mocap="true">
      <geom type="sphere" size="0.02" rgba="1 0 0 0.5"/>
    </body>

    <!-- Add left hand joints -->
  </worldbody>
</mujoco>
```

## Accessing Hand Data

### Method 1: Event Handler

```python
@app.add_handler("ON_MUJOCO_FRAME")
async def on_frame(event: ClientEvent, sess: VuerSession):
    # Get mocap positions (3D coordinates)
    mocap_pos = event.value.get("mocap_pos")

    # Get mocap quaternions (orientations)
    mocap_quat = event.value.get("mocap_quat")

    if mocap_pos:
        # mocap_pos is a list of [x, y, z] positions
        # Order matches the mocap body order in the XML
        wrist_pos = mocap_pos[0]
        thumb_tip_pos = mocap_pos[1]
        # etc.

        print(f"Wrist position: {wrist_pos}")
```

### Method 2: Direct Hand Tracking

```python
from vuer.events import ClientEvent

@app.add_handler("HAND_MOVE")
async def on_hand_move(event: ClientEvent, sess: VuerSession):
    """Handle hand tracking events directly"""
    hand_data = event.value

    # Access hand side
    side = hand_data.get("side")  # "left" or "right"

    # Access joint positions
    joints = hand_data.get("joints")  # List of 25 joint positions

    print(f"{side} hand moved")
    print(f"Wrist: {joints[0]}")
    print(f"Index tip: {joints[9]}")
```

## Creating Hand-Object Interactions

Note that in MuJoCo XML, `<equality>` is a top-level element (a sibling of `<worldbody>`, not a child of it):

```xml
<mujoco>
  <worldbody>
    <!-- Hand mocap body -->
    <body name="index-finger-tip-right" mocap="true"/>

    <!-- Graspable object -->
    <body name="cube" pos="0 0 0.5">
      <freejoint/>
      <geom type="box" size="0.05 0.05 0.05" rgba="1 1 0 1"/>
    </body>
  </worldbody>

  <!-- Equality constraint to connect finger to object -->
  <equality>
    <weld body1="index-finger-tip-right" body2="cube" active="false"/>
  </equality>
</mujoco>
```

The weld starts inactive (`active="false"`); switch it on at runtime (for example, when a pinch is detected) by toggling the constraint's `eq_active` flag in the simulation.

## Example: Pinch Detection

```python
import numpy as np

@app.add_handler("ON_MUJOCO_FRAME")
async def detect_pinch(event: ClientEvent, sess: VuerSession):
    mocap_pos = event.value.get("mocap_pos")

    if mocap_pos and len(mocap_pos) >= 10:
        # Get thumb tip and index finger tip positions
        thumb_tip = np.array(mocap_pos[4])  # Index 4
        index_tip = np.array(mocap_pos[9])  # Index 9

        # Calculate distance
        distance = np.linalg.norm(thumb_tip - index_tip)

        # Detect pinch
        if distance < 0.02:  # 2 cm threshold
            print("Pinch detected!")
            # Trigger grasp action
```

## VR Access

1. Start the server:
   ```bash
   python your_script.py
   ```

2. Set up ngrok:
   ```bash
   ngrok http 8012
   ```

3. Access via VR headset:
   ```
   https://vuer.ai?ws=wss://xxxxx.ngrok.io
   ```

4. Enable hand tracking in your VR headset settings

## Best Practices

1. **Use the XR naming convention** - Follow the exact joint naming format
2. **Define only the mocap bodies you need** - Only tracked joints need mocap bodies
3. **Set an appropriate scale** - Scale the simulation for VR comfort (e.g., 0.1)
4. **Handle both hands** - Create separate mocap bodies for left and right
5. **Test the joint mapping** - Verify each joint is tracking correctly

## Troubleshooting

### Hands not tracking

- Verify SSL is properly set up
- Check that hand tracking is enabled in the VR headset
- Confirm the `Hands()` component is in the scene

### Mocap bodies not moving

- Verify mocap body names match the XR convention exactly
- Check that `mocap="true"` is set in the XML
- Ensure body names include the `-left` or `-right` suffix

### Poor tracking accuracy

- Calibrate the VR headset's hand tracking
- Ensure good lighting conditions
- Check for hand occlusion issues

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/physics/mocap_hand_control.html
285
docs/tutorials/physics/mocap-control.md
Normal file
@@ -0,0 +1,285 @@
# MuJoCo Motion Capture Control

## Overview

This tutorial demonstrates implementing mocap (motion capture) control within a MuJoCo physics simulation using the Vuer framework for VR/mixed reality applications.

## Key Dependencies

```python
from vuer import Vuer, VuerSession
from vuer.schemas import (
    Scene, Fog, Sphere,
    MuJoCo, ContribLoader,
    MotionControllers, MotionControllerActuator,
)
from vuer.events import ClientEvent
```

## Important Requirement

**SSL/HTTPS Required**: The server requires SSL for WebXR functionality. Use ngrok or localtunnel to convert:

- `ws://localhost:8012` → `wss://xxxxx.ngrok.io`
- `http://localhost:8012` → `https://xxxxx.ngrok.io`

See the [SSL Proxy WebXR tutorial](../basics/ssl-proxy-webxr.md) for setup instructions.

## Complete Example: Gripper Control

```python
import asyncio
from vuer import Vuer, VuerSession
from vuer.schemas import (
    Scene, Fog, Sphere,
    MuJoCo, ContribLoader,
    MotionControllers, MotionControllerActuator,
)
from vuer.events import ClientEvent

app = Vuer()

# Define all assets for the simulation
GRIPPER_ASSETS = [
    "/static/mujoco/gripper/scene.xml",
    "/static/mujoco/gripper/gripper.xml",
    "/static/mujoco/gripper/bin.xml",
    "/static/mujoco/gripper/table.xml",
    "/static/mujoco/gripper/base.obj",
    "/static/mujoco/gripper/finger.obj",
    "/static/mujoco/gripper/bin.obj",
    "/static/mujoco/gripper/table.obj",
]

# Event handler for physics updates
@app.add_handler("ON_MUJOCO_FRAME")
async def on_mujoco_frame(event: ClientEvent, sess: VuerSession):
    """Respond to each simulation frame"""
    frame_data = event.value

    # Access simulation state
    # qpos = frame_data.get("qpos")  # Joint positions
    # qvel = frame_data.get("qvel")  # Joint velocities
    # time = frame_data.get("time")  # Simulation time

    # Apply control inputs
    # Update visualization

@app.spawn(start=True)
async def main(session: VuerSession):
    # Step 1: Load MuJoCo library
    session.upsert @ ContribLoader(
        library="@vuer-ai/mujoco-ts",
        version="0.0.24",
        entry="dist/index.umd.js",
        key="mujoco-loader",
    )

    # Wait for library to load
    await asyncio.sleep(2.0)

    # Step 2: Configure scene with VR controls
    session.set @ Scene(
        # Add fog effect (mimics MuJoCo's default styling)
        Fog(
            color=0x2C3F57,
            near=10,
            far=20,
        ),

        # Add background sphere
        Sphere(
            args=[50, 10, 10],
            materialType="basic",
            material=dict(color=0x2C3F57, side=1),  # BackSide
            key="background",
        ),

        # Add VR motion controllers
        MotionControllers(),

        # Add motion controller actuator for VR input
        MotionControllerActuator(
            key="controller-actuator",
        ),

        # Initialize MuJoCo simulation
        MuJoCo(
            src="/static/mujoco/gripper/scene.xml",
            assets=GRIPPER_ASSETS,
            scale=0.1,
            timeout=100,
            key="gripper-sim",
        ),
    )

    # Keep session alive
    while True:
        await asyncio.sleep(1.0)

app.run()
```

## Key Components

### MotionControllers

Captures VR controller input:

```python
MotionControllers()
```

This enables tracking of VR controller positions, orientations, and button presses.

### MotionControllerActuator

Bridges VR input to the MuJoCo simulation:

```python
MotionControllerActuator(
    key="controller-actuator",
)
```

### MuJoCo Component with Scale

```python
MuJoCo(
    src="/static/scene.xml",
    assets=ASSETS,
    scale=0.1,    # Scale simulation (10% of original size)
    timeout=100,  # Timeout in milliseconds
    key="sim",
)
```

## Event Handling: ON_MUJOCO_FRAME

This event fires on every physics update:

```python
@app.add_handler("ON_MUJOCO_FRAME")
async def on_mujoco_frame(event: ClientEvent, sess: VuerSession):
    frame_data = event.value

    # Simulation state
    qpos = frame_data.get("qpos")  # Joint positions
    qvel = frame_data.get("qvel")  # Joint velocities
    time = frame_data.get("time")  # Simulation time
    ctrl = frame_data.get("ctrl")  # Control inputs

    # Apply control logic
    new_ctrl = calculate_control(qpos, qvel)

    # Update simulation
    sess.upsert @ MuJoCo(
        ctrl=new_ctrl,
        key="gripper-sim",
    )
```
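
The `calculate_control` call above is left abstract. One minimal realization is a PD law that drives the joints toward a target configuration while damping velocity; the gains and target here are placeholders, not values from the tutorial:

```python
import numpy as np

def calculate_control(qpos, qvel, target=None, kp=10.0, kd=1.0):
    """PD control: proportional pull toward `target`, derivative damping on velocity."""
    q = np.asarray(qpos, dtype=float)
    v = np.asarray(qvel, dtype=float)
    goal = np.zeros_like(q) if target is None else np.asarray(target, dtype=float)
    return (kp * (goal - q) - kd * v).tolist()

print(calculate_control([0.5, -0.2], [0.0, 0.0]))  # [-5.0, 2.0]
```

Tune `kp` and `kd` for your model; clamp the output to the actuator ranges declared in your MuJoCo XML before sending it as `ctrl`.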

## Scene Setup Pattern

### 1. Configure MuJoCo Styling

```python
Fog(
    color=0x2C3F57,  # MuJoCo default gray-blue
    near=10,
    far=20,
)

Sphere(
    args=[50, 10, 10],
    materialType="basic",
    material=dict(color=0x2C3F57, side=1),
)
```

### 2. Add VR Input

```python
MotionControllers()
MotionControllerActuator()
```

### 3. Initialize Physics

```python
MuJoCo(
    src="/static/scene.xml",
    assets=ASSETS,
    scale=0.1,
)
```

## Asset Organization

Organize your gripper assets:

```
static/mujoco/gripper/
├── scene.xml      # Main scene
├── gripper.xml    # Gripper model
├── bin.xml        # Bin configuration
├── table.xml      # Table configuration
├── base.obj       # 3D meshes
├── finger.obj
├── bin.obj
└── table.obj
```

## VR Access

1. Start the server:
   ```bash
   python your_script.py
   ```

2. Set up ngrok:
   ```bash
   ngrok http 8012
   ```

3. Access via VR headset:
   ```
   https://vuer.ai?ws=wss://xxxxx.ngrok.io
   ```

## Controlling the Simulation

### Method 1: Direct Control Values

```python
session.upsert @ MuJoCo(
    ctrl=[0.5, -0.3, 0.0],  # Control values for actuators
    key="gripper-sim",
)
```

### Method 2: VR Controller Input

The `MotionControllerActuator` automatically maps VR controller movements to simulation controls.

### Method 3: Event-Based Control

```python
@app.add_handler("ON_MUJOCO_FRAME")
async def on_frame(event, sess):
    # Read current state
    qpos = event.value.get("qpos")

    # Calculate control
    ctrl = your_control_algorithm(qpos)

    # Apply control
    sess.upsert @ MuJoCo(ctrl=ctrl, key="gripper-sim")
```

## Best Practices

1. **Use SSL** - Required for WebXR functionality
2. **Add delays** - Wait for the library to load before initializing the simulation
3. **Handle events** - Use ON_MUJOCO_FRAME for responsive control
4. **Scale appropriately** - Adjust the simulation scale for VR comfort
5. **Declare all assets** - Include every file in the assets list

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/physics/mocap_control.html
225
docs/tutorials/physics/mujoco-wasm.md
Normal file
@@ -0,0 +1,225 @@
# MuJoCo WASM Integration

## Overview

The MuJoCo component enables running physics simulations directly in the browser using WebAssembly technology. This allows for real-time physics simulation without requiring server-side computation.

## Key Components

### Required Libraries

- **Library**: `@vuer-ai/mujoco-ts`
- **Version**: `0.0.24`
- **Entry point**: `dist/index.umd.js`

### Asset Management

You need to supply a list of paths to the relevant files via the `assets` attribute. This includes:

- XML configuration files
- 3D meshes (OBJ format)
- Textures (PNG format)

## Complete Example: Cassie Robot

```python
import asyncio
from vuer import Vuer
from vuer.schemas import Scene, Fog, MuJoCo, ContribLoader

app = Vuer()

# Define all assets needed for the simulation
CASSIE_ASSETS = [
    "/static/mujoco/cassie/scene.xml",
    "/static/mujoco/cassie/cassie.xml",
    "/static/mujoco/cassie/pelvis.obj",
    "/static/mujoco/cassie/left-hip.obj",
    "/static/mujoco/cassie/left-thigh.obj",
    "/static/mujoco/cassie/left-shin.obj",
    "/static/mujoco/cassie/left-foot.obj",
    "/static/mujoco/cassie/right-hip.obj",
    "/static/mujoco/cassie/right-thigh.obj",
    "/static/mujoco/cassie/right-shin.obj",
    "/static/mujoco/cassie/right-foot.obj",
    "/static/mujoco/cassie/texture.png",
]

@app.spawn(start=True)
async def main(session):
    # Load the MuJoCo library
    session.upsert @ ContribLoader(
        library="@vuer-ai/mujoco-ts",
        version="0.0.24",
        entry="dist/index.umd.js",
        key="mujoco-loader",
    )

    # Wait for library to load
    await asyncio.sleep(2.0)

    # Set up the scene with MuJoCo's default styling
    session.set @ Scene(
        Fog(
            color=0x2C3F57,  # MuJoCo default background
            near=10,
            far=20,
        ),

        # Initialize MuJoCo simulation
        MuJoCo(
            src="/static/mujoco/cassie/scene.xml",
            assets=CASSIE_ASSETS,
            key="cassie-sim",
        ),
    )

    # Keep session alive
    while True:
        await asyncio.sleep(1.0)

app.run()
```

## Implementation Workflow

### 1. Load the Contrib Library

```python
session.upsert @ ContribLoader(
    library="@vuer-ai/mujoco-ts",
    version="0.0.24",
    entry="dist/index.umd.js",
    key="mujoco-loader",
)
```

### 2. Configure the Scene

Set up fog effects and background styling to match MuJoCo's default appearance:

```python
Fog(
    color=0x2C3F57,  # MuJoCo's default gray-blue
    near=10,
    far=20,
)
```

### 3. Provide Asset Paths

Supply URLs to all necessary model files:

```python
assets = [
    "/static/scene.xml",    # Main scene file
    "/static/robot.xml",    # Robot description
    "/static/mesh1.obj",    # 3D meshes
    "/static/mesh2.obj",
    "/static/texture.png",  # Textures
]
```
|
||||
|
||||
### 4. Initialize MuJoCo Component
|
||||
|
||||
```python
|
||||
MuJoCo(
|
||||
src="/static/scene.xml", # Main XML file
|
||||
assets=assets, # All required assets
|
||||
key="mujoco-sim",
|
||||
)
|
||||
```

## Event Handling

Listen for simulation updates:

```python
@app.add_handler("ON_MUJOCO_FRAME")
async def on_mujoco_frame(event, session):
    """Handle physics updates."""
    print("MuJoCo frame:", event.value)
    # Access simulation state
    # Apply control inputs
    # Update visualization
```

## Timing Considerations

### Option 1: Sleep Delay
```python
session.upsert @ ContribLoader(...)
await asyncio.sleep(2.0)  # Wait for the library to load
session.set @ MuJoCo(...)
```

### Option 2: Event Listener
```python
@app.add_handler("ON_CONTRIB_LOAD")
async def on_contrib_load(event, session):
    """Initialize MuJoCo once the library has loaded."""
    session.set @ MuJoCo(
        src="/static/scene.xml",
        assets=ASSETS,
    )
```

## Asset Organization

Organize your assets directory:

```
static/mujoco/
├── cassie/
│   ├── scene.xml        # Main scene file
│   ├── cassie.xml       # Robot configuration
│   ├── pelvis.obj       # Body meshes
│   ├── left-hip.obj
│   ├── left-thigh.obj
│   ├── ...
│   └── texture.png      # Textures
└── gripper/
    ├── scene.xml
    ├── ...
```
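
Hand-maintaining the assets list becomes error-prone once a model has many meshes. Below is a minimal sketch of a helper that builds the list by scanning the directory tree; `collect_assets` is a hypothetical name, and the extension set and `/static` prefix are assumptions that must match your `static_root` setup:

```python
from pathlib import Path


def collect_assets(root, url_prefix="/static"):
    """Collect simulation assets (XML, OBJ, PNG) under `root` and
    return them as URL paths relative to the served static root."""
    base = Path(root)
    exts = {".xml", ".obj", ".png"}  # assumed asset types
    return sorted(
        f"{url_prefix}/{p.relative_to(base).as_posix()}"
        for p in base.rglob("*")
        if p.suffix.lower() in exts
    )
```

With the layout above, `collect_assets("static/mujoco", "/static/mujoco")` would reproduce the Cassie list without manual bookkeeping.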

## Serving Assets

Configure Vuer to serve your assets:

```python
app = Vuer(static_root="assets")
```

Then reference assets with the `/static/` prefix:

```python
src="/static/mujoco/cassie/scene.xml"
```

## Best Practices

1. **Load the library first** - Always load ContribLoader before the MuJoCo component
2. **List all assets** - Include every file referenced in the XML
3. **Use relative paths** - XML files should reference meshes with relative paths
4. **Match MuJoCo styling** - Use fog and background colors for consistency
5. **Handle loading time** - Wait for the library to load before initialization

## Troubleshooting

### Simulation not appearing
- Verify all assets are accessible
- Check that ContribLoader has loaded (wait or use the event)
- Ensure the XML file is valid MuJoCo format

### Missing textures/meshes
- Confirm all assets are in the assets list
- Check file paths in the XML files
- Verify the static_root configuration
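
A quick preflight check catches missing files before the browser ever requests them. Here is a sketch, assuming asset URLs use the `/static/` prefix and `static_root` points at `assets/`; `find_missing_assets` is a hypothetical helper, not part of Vuer:

```python
from pathlib import Path


def find_missing_assets(asset_urls, static_root="assets", prefix="/static/"):
    """Return the asset URLs that have no corresponding file on disk."""
    missing = []
    for url in asset_urls:
        # Map "/static/foo/bar.obj" back to "<static_root>/foo/bar.obj"
        rel = url[len(prefix):] if url.startswith(prefix) else url.lstrip("/")
        if not (Path(static_root) / rel).is_file():
            missing.append(url)
    return missing
```

Calling `find_missing_assets(CASSIE_ASSETS)` before `app.run()` and raising on a non-empty result turns a silent blank canvas into an actionable error.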

### Performance issues
- Consider simplifying the model
- Reduce mesh polygon counts
- Optimize texture sizes

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/physics/mujoco_wasm.html

200
docs/tutorials/robotics/camera-frustums.md
Normal file
@@ -0,0 +1,200 @@

# Camera Frustums in Robotics Visualization

## Overview

Camera frustums are essential for visualizing camera viewpoints in robotics applications. Vuer lets you programmatically insert camera frustums into the scene to represent camera positions and orientations.

## Basic Frustum

```python
import asyncio

from vuer import Vuer
from vuer.schemas import Scene, Frustum, DefaultScene

app = Vuer()


@app.spawn(start=True)
async def main(session):
    session.set @ Scene(
        DefaultScene(),

        Frustum(
            position=[0, 1, 2],
            rotation=[0, 0, 0],
            scale=[1, 1, 1],
            showImagePlane=True,
            showFrustum=True,
            showFocalPlane=True,
            key="camera-frustum",
        ),
    )

    while True:
        await asyncio.sleep(1.0)


app.run()
```

## Frustum Configuration Options

### showImagePlane
Display the image plane (where the image is captured).

### showFrustum
Show the frustum wireframe (the pyramid representing the camera's field of view).

### showFocalPlane
Display the focal plane (the plane at the focal length).

### Position and Orientation
- **position**: `[x, y, z]` coordinates
- **rotation**: Euler angles `[x, y, z]` in radians
- **scale**: `[x, y, z]` scale factors
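
In robotics pipelines a camera pose often arrives as a rotation matrix (for example, from an extrinsics calibration) rather than as Euler angles. The sketch below shows one common roll-pitch-yaw extraction; `matrix_to_euler` is a hypothetical helper, and you should verify that the resulting angle order matches the renderer's Euler convention before relying on it:

```python
import math


def matrix_to_euler(r):
    """Convert a 3×3 rotation matrix (row-major nested lists) into
    [x, y, z] Euler angles in radians for a Frustum `rotation`."""
    sy = math.sqrt(r[0][0] ** 2 + r[1][0] ** 2)
    if sy > 1e-6:
        x = math.atan2(r[2][1], r[2][2])
        y = math.atan2(-r[2][0], sy)
        z = math.atan2(r[1][0], r[0][0])
    else:
        # Gimbal lock: pitch is at ±90°, roll and yaw become coupled
        x = math.atan2(-r[1][2], r[1][1])
        y = math.atan2(-r[2][0], sy)
        z = 0.0
    return [x, y, z]
```

The result can be passed straight to `Frustum(rotation=matrix_to_euler(R), ...)`.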

## Stress Test Example: 1,728 Frustums

The tutorial demonstrates a stress test with a large grid of frustums:

```python
import asyncio

from vuer import Vuer, VuerSession
from vuer.schemas import Scene, Frustum, DefaultScene

app = Vuer()


@app.spawn(start=True)
async def main(session: VuerSession):
    frustums = []

    # Create a 12×12×12 grid of frustums
    for x in range(12):
        for y in range(12):
            for z in range(12):
                frustums.append(
                    Frustum(
                        position=[x * 2, y * 2, z * 2],
                        scale=[0.5, 0.5, 0.5],
                        showImagePlane=True,
                        showFrustum=True,
                        showFocalPlane=False,
                        key=f"frustum-{x}-{y}-{z}",
                    )
                )

    session.set @ Scene(
        DefaultScene(),
        *frustums,  # Unpack all frustums into the scene
    )

    while True:
        await asyncio.sleep(1.0)


app.run()
```

This creates **1,728 frustum objects** (12³), demonstrating the framework's ability to handle large numbers of camera visualization objects simultaneously.

## Practical Use Case: Multi-Camera Robot

```python
import asyncio

from vuer import Vuer
from vuer.schemas import Scene, Frustum, Urdf, DefaultScene

app = Vuer()


@app.spawn(start=True)
async def main(session):
    session.set @ Scene(
        DefaultScene(),

        # Robot model
        Urdf(
            src="/static/robot.urdf",
            position=[0, 0, 0],
            key="robot",
        ),

        # Front camera
        Frustum(
            position=[0.5, 0.5, 0],
            rotation=[0, 0, 0],
            scale=[0.3, 0.3, 0.3],
            showImagePlane=True,
            showFrustum=True,
            key="front-camera",
        ),

        # Left camera
        Frustum(
            position=[0, 0.5, 0.5],
            rotation=[0, -1.57, 0],
            scale=[0.3, 0.3, 0.3],
            showImagePlane=True,
            showFrustum=True,
            key="left-camera",
        ),

        # Right camera
        Frustum(
            position=[0, 0.5, -0.5],
            rotation=[0, 1.57, 0],
            scale=[0.3, 0.3, 0.3],
            showImagePlane=True,
            showFrustum=True,
            key="right-camera",
        ),
    )

    while True:
        await asyncio.sleep(1.0)


app.run()
```

## Dynamic Frustum Updates

You can update frustum positions in real time:

```python
import asyncio
import math

from vuer import Vuer
from vuer.schemas import Scene, DefaultScene, Frustum

app = Vuer()


@app.spawn(start=True)
async def main(session):
    session.set @ Scene(DefaultScene())

    for i in range(1000):
        # Orbit the frustum around the origin
        x = 3 * math.cos(i * 0.05)
        z = 3 * math.sin(i * 0.05)
        rotation_y = i * 0.05

        session.upsert @ Frustum(
            position=[x, 1, z],
            rotation=[0, rotation_y, 0],
            showImagePlane=True,
            showFrustum=True,
            key="orbiting-camera",
        )

        await asyncio.sleep(0.033)  # ~30 FPS


app.run()
```

## Performance Considerations

The stress test with 1,728 frustums demonstrates that Vuer can handle:
- Large numbers of visualization objects
- Complex geometric primitives
- Real-time rendering of camera representations

This makes it practical for robotics applications requiring:
- Multi-camera system visualization
- SLAM trajectory visualization
- Sensor fusion displays
- Camera calibration tools

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/robotics/frustums.html

181
docs/tutorials/robotics/go1-stairs.md
Normal file
@@ -0,0 +1,181 @@

# Unitree Go1 Robot with Stairs

## Overview
This tutorial demonstrates setting up a 3D scene containing a Unitree Go1 quadruped robot positioned in front of a staircase, along with lighting and atmospheric effects.

## Scene Components

The visualization includes four main elements:

1. **Unitree Go1 Robot** - A quadruped robot model loaded from URDF
2. **Stairway Mesh** - A textured 3D model of stairs
3. **Fog Effect** - Atmospheric fog that darkens distant portions of the scene
4. **Lighting** - An ambient light source plus two movable point lights

## Complete Code Example

```python
import asyncio
import math

from vuer import Vuer
from vuer.schemas import Scene, Urdf, Obj, AmbientLight, PointLight, Movable, Plane, Fog

app = Vuer()


@app.spawn(start=True)
async def main(session):
    # Set up the scene with fog and lighting
    session.set @ Scene(
        # Add a fog effect
        Fog(
            color="#000000",
            near=1,
            far=20,
        ),

        # Ground plane
        Plane(
            args=[100, 100],
            position=[0, -0.01, 0],
            rotation=[-1.57, 0, 0],
            key="ground",
        ),

        # Staircase mesh
        Obj(
            src="/static/stairs/stairs.obj",
            position=[2, 0, 0],
            rotation=[0, 0, 0],
            materialType="standard",
            material={
                "color": "#cccccc",
                "roughness": 0.8,
            },
            key="stairs",
        ),

        # Ambient lighting
        AmbientLight(intensity=1.0),

        # Movable point lights
        Movable(
            PointLight(
                intensity=3.0,
                position=[2, 3, 2],
                key="light-1",
            )
        ),
        Movable(
            PointLight(
                intensity=3.0,
                position=[-2, 3, -2],
                key="light-2",
            )
        ),
    )

    # Animation loop for the robot
    for i in range(10000):
        # Calculate joint angles using sinusoidal functions
        hip_angle = 0.3 * math.sin(i * 0.1)
        thigh_angle = 0.785 - 0.25 * math.sin(i * 0.1)
        calf_angle = -1.5 + 0.5 * math.sin(i * 0.1)

        # Update the robot's position and joints
        session.upsert @ Urdf(
            src="/static/go1/go1.urdf",
            position=[0, 0, 0.33],
            rotation=[0, 0, 0],
            jointValues={
                # Front Left
                "FL_hip_joint": hip_angle,
                "FL_thigh_joint": thigh_angle,
                "FL_calf_joint": calf_angle,

                # Front Right
                "FR_hip_joint": -hip_angle,
                "FR_thigh_joint": thigh_angle,
                "FR_calf_joint": calf_angle,

                # Rear Left
                "RL_hip_joint": -hip_angle,
                "RL_thigh_joint": thigh_angle,
                "RL_calf_joint": calf_angle,

                # Rear Right
                "RR_hip_joint": hip_angle,
                "RR_thigh_joint": thigh_angle,
                "RR_calf_joint": calf_angle,
            },
            key="go1-robot",
        )

        # Update at ~60 FPS
        await asyncio.sleep(0.016)


app.run()
```

## Key Features

### Fog Effect
Creates atmospheric depth:
```python
Fog(
    color="#000000",  # Black fog
    near=1,           # Fog starts at distance 1
    far=20,           # Full fog at distance 20
)
```

### Ground Plane
A large plane rotated to lie horizontally:
```python
Plane(
    args=[100, 100],         # 100×100 units
    position=[0, -0.01, 0],  # Slightly below the origin
    rotation=[-1.57, 0, 0],  # Rotated 90° (π/2)
)
```

### Staircase Mesh
A 3D model with material properties:
```python
Obj(
    src="/static/stairs/stairs.obj",
    materialType="standard",
    material={
        "color": "#cccccc",
        "roughness": 0.8,
    },
)
```

## Robot Animation

The application updates continuously at approximately 60 frames per second, calculating joint values such as:

```python
thigh_angle = 0.785 - 0.25 * math.sin(i * 0.1)
```

This creates a realistic walking motion across the stair scene.

## Leg Coordination

The Go1 uses a specific gait pattern:
- Front Left and Rear Right move together (same hip angle)
- Front Right and Rear Left move together (opposite hip angle)

This creates a natural trotting gait.
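
The diagonal pairing above can be factored into a single sign table, so the animation loop builds the whole `jointValues` dict in one place. Here is a minimal sketch using the same joint names and amplitudes as the example; `trot_joint_values` is a hypothetical helper, not part of Vuer:

```python
import math

# Diagonal pairs share a hip-angle sign: FL with RR, FR with RL
HIP_SIGN = {"FL": 1.0, "FR": -1.0, "RL": -1.0, "RR": 1.0}


def trot_joint_values(i, hip_amp=0.3, thigh_rest=0.785, calf_rest=-1.5):
    """Build the jointValues dict for animation step `i`."""
    phase = math.sin(i * 0.1)
    values = {}
    for leg, sign in HIP_SIGN.items():
        values[f"{leg}_hip_joint"] = sign * hip_amp * phase
        values[f"{leg}_thigh_joint"] = thigh_rest - 0.25 * phase
        values[f"{leg}_calf_joint"] = calf_rest + 0.5 * phase
    return values
```

The loop body then reduces to `session.upsert @ Urdf(..., jointValues=trot_joint_values(i), ...)`.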

## Assets Required

Make sure you have the following assets:
- `go1.urdf` - Robot description file
- `stairs.obj` - Staircase 3D model
- Associated mesh files for the Go1 robot

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/robotics/urdf_go1_stairs.html

148
docs/tutorials/robotics/mini-cheetah.md
Normal file
@@ -0,0 +1,148 @@

# MIT Mini Cheetah URDF Tutorial

## Overview
This tutorial demonstrates serving a URDF (Unified Robot Description Format) file locally to visualize the MIT Mini Cheetah robot in Vuer with animated leg movements.

## Directory Structure

Set up your assets directory:

```
assets/mini_cheetah/
├── meshes/
│   ├── mini_abad.obj
│   ├── mini_body.obj
│   ├── mini_lower_link.obj
│   └── mini_upper_link.obj
└── mini_cheetah.urdf
```

## Download Assets

Use wget to fetch the URDF and mesh files:

```bash
mkdir -p assets/mini_cheetah/meshes
cd assets/mini_cheetah

# Download the URDF file
wget https://raw.githubusercontent.com/vuer-ai/vuer/main/assets/mini_cheetah/mini_cheetah.urdf

# Download the mesh files
cd meshes
wget https://raw.githubusercontent.com/vuer-ai/vuer/main/assets/mini_cheetah/meshes/mini_abad.obj
wget https://raw.githubusercontent.com/vuer-ai/vuer/main/assets/mini_cheetah/meshes/mini_body.obj
wget https://raw.githubusercontent.com/vuer-ai/vuer/main/assets/mini_cheetah/meshes/mini_lower_link.obj
wget https://raw.githubusercontent.com/vuer-ai/vuer/main/assets/mini_cheetah/meshes/mini_upper_link.obj
```

## Complete Code Example

```python
import asyncio
import math

from vuer import Vuer
from vuer.schemas import Scene, Urdf, AmbientLight, Movable, PointLight

# Configure the app to serve static files from the assets directory
app = Vuer(static_root="assets/mini_cheetah")


@app.spawn(start=True)
async def main(session):
    # Set up the scene with lighting
    session.set @ Scene(
        AmbientLight(intensity=0.8),
        Movable(
            PointLight(
                intensity=2.0,
                position=[1, 2, 1],
                key="point-light-1",
            )
        ),
        Movable(
            PointLight(
                intensity=2.0,
                position=[-1, 2, -1],
                key="point-light-2",
            )
        ),
    )

    # Animation loop
    for i in range(1000):
        # Calculate joint angles using sine waves
        hip_angle = 0.5 * math.sin(i * 0.1)
        thigh_angle = 0.785 - 0.25 * math.sin(i * 0.1)
        calf_angle = -1.5 + 0.5 * math.sin(i * 0.1)

        # Update the robot with animated joints
        session.upsert @ Urdf(
            src="/static/mini_cheetah.urdf",
            position=[0, 0, 0],
            rotation=[0, 0, 0],
            jointValues={
                # Front Left leg
                "FL_hip_joint": hip_angle,
                "FL_thigh_joint": thigh_angle,
                "FL_calf_joint": calf_angle,

                # Front Right leg
                "FR_hip_joint": -hip_angle,
                "FR_thigh_joint": thigh_angle,
                "FR_calf_joint": calf_angle,

                # Rear Left leg
                "RL_hip_joint": hip_angle,
                "RL_thigh_joint": thigh_angle,
                "RL_calf_joint": calf_angle,

                # Rear Right leg
                "RR_hip_joint": -hip_angle,
                "RR_thigh_joint": thigh_angle,
                "RR_calf_joint": calf_angle,
            },
            key="mini-cheetah",
        )

        # Update at ~60 FPS
        await asyncio.sleep(0.016)


app.run()
```

## Joint Control

The Mini Cheetah has 12 joints (3 per leg):

- **Hip joint**: Abduction/adduction (side-to-side movement)
- **Thigh joint**: Hip flexion/extension (forward/backward)
- **Calf joint**: Knee flexion/extension
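
These three joints repeat across the four legs, so the 12 entries of `jointValues` can be generated rather than typed out. A small sketch using the rest angles from the animation above (0.785 and -1.5); `neutral_pose` is a hypothetical helper for illustration:

```python
LEGS = ("FL", "FR", "RL", "RR")
JOINTS = ("hip", "thigh", "calf")


def neutral_pose(thigh=0.785, calf=-1.5):
    """jointValues for a neutral standing pose: hips centered,
    thighs and calves at the animation's rest angles."""
    pose = {}
    for leg in LEGS:
        pose[f"{leg}_hip_joint"] = 0.0
        pose[f"{leg}_thigh_joint"] = thigh
        pose[f"{leg}_calf_joint"] = calf
    return pose
```

Passing `jointValues=neutral_pose()` renders the robot standing still, which is useful for checking mesh paths before adding animation.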

## Animation Details

The example creates a walking motion using sinusoidal functions:

```python
hip_angle = 0.5 * math.sin(i * 0.1)
thigh_angle = 0.785 - 0.25 * math.sin(i * 0.1)
calf_angle = -1.5 + 0.5 * math.sin(i * 0.1)
```

## Expected Result

When executed correctly, the tutorial produces a 3D visualization of the Mini Cheetah robot with animated leg movements, displayed in a web interface at `http://localhost:8012`.

## Troubleshooting

### Robot not appearing
- Verify all mesh files are downloaded
- Check that `static_root` points to the correct directory
- Ensure the URDF file references the correct mesh paths

### Animation not smooth
- Adjust the sleep interval (currently 0.016 s for ~60 FPS)
- Reduce the animation speed by changing the multiplier in `i * 0.1`

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/robotics/urdf_mini_cheetah.html

134
docs/tutorials/robotics/using-urdf.md
Normal file
@@ -0,0 +1,134 @@

# Using URDF Files in Vuer

## Overview

Vuer can load URDF (Unified Robot Description Format) files for robotics visualization. The framework supports mesh files in `.dae`, `.stl`, `.obj`, and `.ply` formats.

## Basic Implementation

The following example loads a URDF model:

```python
from asyncio import sleep

from vuer import Vuer, VuerSession
from vuer.schemas import Urdf

app = Vuer()


@app.spawn(start=True)
async def main(proxy: VuerSession):
    proxy.upsert @ Urdf(
        src="https://docs.vuer.ai/en/latest/_static/perseverance/rover/m2020.urdf",
        jointValues={},
        rotation=[3.14 / 2, 0, 0],
        position=[0, 0, -1.5],
        key="perseverance",
    )

    # Keep the session alive
    while True:
        await sleep(1.0)


app.run()
```

## Key Parameters

### src
URL path to the URDF file. Can be a local path or a remote URL.

### jointValues
Dictionary of joint configurations. Use an empty `{}` for the default joint positions.

Example with joint values:
```python
jointValues={
    "joint_1": 0.5,
    "joint_2": -0.3,
    "knee_joint": 1.2,
}
```

### rotation
Euler angles `[x, y, z]` for the model's orientation, in radians.

### position
3D coordinates `[x, y, z]` for model placement.

### key
Unique identifier for the model. Used for updates and removal.

## Supported Mesh Formats

Vuer supports URDF files with meshes in the following formats:
- `.dae` (COLLADA)
- `.stl` (stereolithography)
- `.obj` (Wavefront OBJ)
- `.ply` (Polygon File Format)
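
Before loading a model, it can be worth screening the mesh files a URDF references against this list. A minimal sketch; `unsupported_meshes` is a hypothetical helper, and the extension set simply mirrors the formats listed above:

```python
from pathlib import Path

SUPPORTED_MESH_EXTS = {".dae", ".stl", ".obj", ".ply"}


def unsupported_meshes(mesh_paths):
    """Return the mesh paths whose extension Vuer cannot load."""
    return [
        p for p in mesh_paths
        if Path(p).suffix.lower() not in SUPPORTED_MESH_EXTS
    ]
```

An empty result means every referenced mesh is in a loadable format; anything else (for example `.fbx`) needs converting before the URDF will display correctly.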

## Complete Example with Multiple Robots

```python
from asyncio import sleep

from vuer import Vuer, VuerSession
from vuer.schemas import Urdf, Scene, AmbientLight

app = Vuer()


@app.spawn(start=True)
async def main(session: VuerSession):
    session.set @ Scene(
        # Add lighting
        AmbientLight(intensity=1.0),

        # Load the first robot
        Urdf(
            src="/static/robots/robot1.urdf",
            position=[0, 0, 0],
            rotation=[0, 0, 0],
            key="robot-1",
        ),

        # Load the second robot
        Urdf(
            src="/static/robots/robot2.urdf",
            position=[2, 0, 0],
            rotation=[0, 0, 0],
            key="robot-2",
        ),
    )

    # Keep the session alive
    while True:
        await sleep(1.0)


app.run()
```

## Serving Local URDF Files

To serve URDF files from your local filesystem:

```python
from asyncio import sleep

from vuer import Vuer
from vuer.schemas import Urdf

# Point to the directory containing your URDF files
app = Vuer(static_root="path/to/urdf/directory")


@app.spawn(start=True)
async def main(session):
    session.upsert @ Urdf(
        src="/static/my_robot.urdf",  # Relative to static_root
        position=[0, 0, 0],
        key="my-robot",
    )

    while True:
        await sleep(1.0)


app.run()
```

## Source

Documentation: https://docs.vuer.ai/en/latest/tutorials/robotics/urdf.html