Initial commit

This commit is contained in:
Zhongwei Li
2025-11-30 09:05:02 +08:00
commit 265175ed82
23 changed files with 3329 additions and 0 deletions

# MuJoCo VR Hand Control
## Overview
This tutorial demonstrates how to control virtual hands in MuJoCo by leveraging mocap (motion capture) points that track user hand poses in VR environments.
## Important Requirement
**SSL/HTTPS Required**: VR hand tracking requires secure connections. Use ngrok or localtunnel to set up SSL.
See the [SSL Proxy WebXR tutorial](../basics/ssl-proxy-webxr.md) for setup instructions.
## Mocap Point API
The implementation uses **XR Hand Naming Conventions** to link mocap bodies with hand joints.
### Naming Format
```
"{joint}-{left | right}"
```
Examples:
- `wrist-right`
- `middle-finger-phalanx-proximal-right`
- `thumb-tip-left`
## Hand Joint Mapping
The system defines **25 distinct hand joints** (indexed 0-24):
### Joint Index Reference
```python
HAND_JOINTS = {
    0: "wrist",
    # Thumb (1-4)
    1: "thumb-metacarpal",
    2: "thumb-phalanx-proximal",
    3: "thumb-phalanx-distal",
    4: "thumb-tip",
    # Index finger (5-9)
    5: "index-finger-metacarpal",
    6: "index-finger-phalanx-proximal",
    7: "index-finger-phalanx-intermediate",
    8: "index-finger-phalanx-distal",
    9: "index-finger-tip",
    # Middle finger (10-14)
    10: "middle-finger-metacarpal",
    11: "middle-finger-phalanx-proximal",
    12: "middle-finger-phalanx-intermediate",
    13: "middle-finger-phalanx-distal",
    14: "middle-finger-tip",
    # Ring finger (15-19)
    15: "ring-finger-metacarpal",
    16: "ring-finger-phalanx-proximal",
    17: "ring-finger-phalanx-intermediate",
    18: "ring-finger-phalanx-distal",
    19: "ring-finger-tip",
    # Pinky finger (20-24)
    20: "pinky-finger-metacarpal",
    21: "pinky-finger-phalanx-proximal",
    22: "pinky-finger-phalanx-intermediate",
    23: "pinky-finger-phalanx-distal",
    24: "pinky-finger-tip",
}
```
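A small helper can turn an index from this table into the mocap body name the XML expects. A minimal sketch (a subset of the `HAND_JOINTS` table is repeated so the snippet runs standalone):

```python
# Subset of the HAND_JOINTS table above, repeated so this runs standalone.
HAND_JOINTS = {0: "wrist", 4: "thumb-tip", 9: "index-finger-tip"}

def mocap_body_name(joint_index: int, side: str) -> str:
    """Build an XR-convention mocap body name, e.g. "thumb-tip-left"."""
    if side not in ("left", "right"):
        raise ValueError("side must be 'left' or 'right'")
    return f"{HAND_JOINTS[joint_index]}-{side}"

print(mocap_body_name(4, "left"))  # thumb-tip-left
```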
## Complete Example
```python
import asyncio

from vuer import Vuer, VuerSession
from vuer.schemas import (
    Scene, Fog, Sphere,
    MuJoCo, ContribLoader,
    Hands,
)
from vuer.events import ClientEvent

app = Vuer()

# Assets for hand simulation
HAND_ASSETS = [
    "/static/mujoco/hands/scene.xml",
    "/static/mujoco/hands/left_hand.xml",
    "/static/mujoco/hands/right_hand.xml",
    "/static/mujoco/hands/palm.obj",
    "/static/mujoco/hands/finger.obj",
]

@app.add_handler("ON_MUJOCO_FRAME")
async def on_mujoco_frame(event: ClientEvent, sess: VuerSession):
    """Handle physics updates."""
    print("ON_MUJOCO_FRAME", event.value)
    # Access mocap data
    mocap_pos = event.value.get("mocap_pos")
    mocap_quat = event.value.get("mocap_quat")
    # Process hand tracking data
    if mocap_pos and mocap_quat:
        # Update hand positions based on tracking
        pass

@app.spawn(start=True)
async def main(session: VuerSession):
    # Load the MuJoCo library
    session.upsert @ ContribLoader(
        library="@vuer-ai/mujoco-ts",
        version="0.0.24",
        entry="dist/index.umd.js",
        key="mujoco-loader",
    )
    await asyncio.sleep(2.0)
    # Set up the scene with hands.
    # Note: positional children must come before keyword arguments.
    session.set @ Scene(
        # Initialize MuJoCo with the hand model
        MuJoCo(
            src="/static/mujoco/hands/scene.xml",
            assets=HAND_ASSETS,
            scale=0.1,
            key="hand-sim",
        ),
        bgChildren=[
            # MuJoCo default styling
            Fog(color=0x2C3F57, near=10, far=20),
            # Add VR hands
            Hands(),
            # Background sphere
            Sphere(
                args=[50, 10, 10],
                materialType="basic",
                material=dict(color=0x2C3F57, side=1),
            ),
        ],
    )
    # Keep the session alive
    while True:
        await asyncio.sleep(1.0)

app.run()
```
## Key Components
### Hands Component
```python
Hands()
```
This enables VR hand tracking, providing position and orientation data for all 25 hand joints.
### Mocap Bodies in MuJoCo XML
In your MuJoCo scene XML, define mocap bodies using the XR hand naming convention:
```xml
<mujoco>
  <worldbody>
    <!-- Right hand mocap bodies -->
    <body name="wrist-right" mocap="true">
      <geom type="sphere" size="0.02" rgba="1 0 0 0.5"/>
    </body>
    <body name="thumb-tip-right" mocap="true">
      <geom type="sphere" size="0.01" rgba="0 1 0 0.5"/>
    </body>
    <body name="index-finger-tip-right" mocap="true">
      <geom type="sphere" size="0.01" rgba="0 0 1 0.5"/>
    </body>
    <!-- Add more joints as needed -->

    <!-- Left hand mocap bodies -->
    <body name="wrist-left" mocap="true">
      <geom type="sphere" size="0.02" rgba="1 0 0 0.5"/>
    </body>
    <!-- Add left hand joints -->
  </worldbody>
</mujoco>
```
## Accessing Hand Data
### Method 1: Event Handler
```python
@app.add_handler("ON_MUJOCO_FRAME")
async def on_frame(event: ClientEvent, sess: VuerSession):
    # Get mocap positions (3D coordinates)
    mocap_pos = event.value.get("mocap_pos")
    # Get mocap quaternions (orientations)
    mocap_quat = event.value.get("mocap_quat")
    if mocap_pos:
        # mocap_pos is a list of [x, y, z] positions.
        # Order matches the mocap body order in the XML.
        wrist_pos = mocap_pos[0]
        thumb_tip_pos = mocap_pos[1]
        # etc.
        print(f"Wrist position: {wrist_pos}")
```
### Method 2: Direct Hand Tracking
```python
from vuer.events import ClientEvent

@app.add_handler("HAND_MOVE")
async def on_hand_move(event: ClientEvent, sess: VuerSession):
    """Handle hand tracking events directly."""
    hand_data = event.value
    # Hand side
    side = hand_data.get("side")  # "left" or "right"
    # Joint positions
    joints = hand_data.get("joints")  # List of 25 joint positions
    print(f"{side} hand moved")
    print(f"Wrist: {joints[0]}")
    print(f"Index tip: {joints[9]}")
```
## Creating Hand-Object Interactions
```xml
<!-- In your MuJoCo XML -->
<mujoco>
  <worldbody>
    <!-- Hand mocap body -->
    <body name="index-finger-tip-right" mocap="true"/>
    <!-- Graspable object -->
    <body name="cube" pos="0 0 0.5">
      <freejoint/>
      <geom type="box" size="0.05 0.05 0.05" rgba="1 1 0 1"/>
    </body>
  </worldbody>
  <!-- Equality constraint to connect finger to object.
       Note: <equality> is a top-level element, not a child of <worldbody>.
       Toggle the weld at runtime (eq_active) to grasp and release. -->
  <equality>
    <weld body1="index-finger-tip-right" body2="cube" active="false"/>
  </equality>
</mujoco>
```
## Example: Pinch Detection
```python
import numpy as np

@app.add_handler("ON_MUJOCO_FRAME")
async def detect_pinch(event: ClientEvent, sess: VuerSession):
    mocap_pos = event.value.get("mocap_pos")
    if mocap_pos and len(mocap_pos) >= 10:
        # Get thumb tip and index finger tip positions
        thumb_tip = np.array(mocap_pos[4])  # Index 4
        index_tip = np.array(mocap_pos[9])  # Index 9
        # Calculate distance
        distance = np.linalg.norm(thumb_tip - index_tip)
        # Detect pinch
        if distance < 0.02:  # 2 cm threshold
            print("Pinch detected!")
            # Trigger grasp action
```
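A single fixed threshold can make the pinch state flicker when the fingertip distance hovers around 2 cm. One common fix is hysteresis: a lower threshold to enter the pinch and a higher one to leave it. A sketch (both thresholds are illustrative assumptions, not values from the tutorial):

```python
import numpy as np

class PinchDetector:
    """Pinch detection with hysteresis to avoid flicker near the threshold."""

    def __init__(self, close_at=0.02, open_at=0.04):
        self.close_at = close_at  # enter pinch below this distance (meters)
        self.open_at = open_at    # leave pinch above this distance (meters)
        self.pinching = False

    def update(self, thumb_tip, index_tip) -> bool:
        d = float(np.linalg.norm(np.asarray(thumb_tip) - np.asarray(index_tip)))
        if self.pinching and d > self.open_at:
            self.pinching = False
        elif not self.pinching and d < self.close_at:
            self.pinching = True
        return self.pinching
```

Inside the frame handler, call `detector.update(mocap_pos[4], mocap_pos[9])` with a detector instance kept across frames.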
## VR Access
1. Start server:
```bash
python your_script.py
```
2. Set up ngrok:
```bash
ngrok http 8012
```
3. Access via VR headset:
```
https://vuer.ai?ws=wss://xxxxx.ngrok.io
```
4. Enable hand tracking in your VR headset settings
## Best Practices
1. **Use XR naming convention** - Follow the exact joint naming format
2. **Define all needed mocap bodies** - Only tracked joints need mocap bodies
3. **Set appropriate scale** - Scale simulation for VR comfort (e.g., 0.1)
4. **Handle both hands** - Create separate mocap bodies for left and right
5. **Test joint mapping** - Verify each joint is tracking correctly
## Troubleshooting
### Hands not tracking
- Verify SSL is properly set up
- Check that hand tracking is enabled in VR headset
- Confirm `Hands()` component is in the scene
### Mocap bodies not moving
- Verify mocap body names match XR convention exactly
- Check that `mocap="true"` is set in XML
- Ensure body names include `-left` or `-right` suffix
### Poor tracking accuracy
- Calibrate VR headset hand tracking
- Ensure good lighting conditions
- Check for hand occlusion issues
## Source
Documentation: https://docs.vuer.ai/en/latest/tutorials/physics/mocap_hand_control.html

# MuJoCo Motion Capture Control
## Overview
This tutorial demonstrates implementing mocap (motion capture) control within a MuJoCo physics simulation using the Vuer framework for VR/mixed reality applications.
## Key Dependencies
```python
from vuer import Vuer, VuerSession
from vuer.schemas import (
    Scene, Fog, Sphere,
    MuJoCo, ContribLoader,
    MotionControllers, MotionControllerActuator,
)
from vuer.events import ClientEvent
```
## Important Requirement
**SSL/HTTPS Required**: The server requires SSL for WebXR functionality. Use ngrok or localtunnel to convert:
- `ws://localhost:8012` → `wss://xxxxx.ngrok.io`
- `http://localhost:8012` → `https://xxxxx.ngrok.io`
See the [SSL Proxy WebXR tutorial](../basics/ssl-proxy-webxr.md) for setup instructions.
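For reference, the client URL is just `https://vuer.ai` with the tunnel's `wss://` address passed as the `ws` query parameter. A trivial helper (the hostname is a placeholder you get from ngrok):

```python
def vuer_client_url(tunnel_host: str) -> str:
    """Build the https://vuer.ai client URL for a tunneled Vuer server."""
    return f"https://vuer.ai?ws=wss://{tunnel_host}"

print(vuer_client_url("example.ngrok.io"))  # https://vuer.ai?ws=wss://example.ngrok.io
```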
## Complete Example: Gripper Control
```python
import asyncio

from vuer import Vuer, VuerSession
from vuer.schemas import (
    Scene, Fog, Sphere,
    MuJoCo, ContribLoader,
    MotionControllers, MotionControllerActuator,
)
from vuer.events import ClientEvent

app = Vuer()

# Define all assets for the simulation
GRIPPER_ASSETS = [
    "/static/mujoco/gripper/scene.xml",
    "/static/mujoco/gripper/gripper.xml",
    "/static/mujoco/gripper/bin.xml",
    "/static/mujoco/gripper/table.xml",
    "/static/mujoco/gripper/base.obj",
    "/static/mujoco/gripper/finger.obj",
    "/static/mujoco/gripper/bin.obj",
    "/static/mujoco/gripper/table.obj",
]

# Event handler for physics updates
@app.add_handler("ON_MUJOCO_FRAME")
async def on_mujoco_frame(event: ClientEvent, sess: VuerSession):
    """Respond to each simulation frame."""
    frame_data = event.value
    # Access simulation state:
    # qpos = frame_data.get("qpos")  # Joint positions
    # qvel = frame_data.get("qvel")  # Joint velocities
    # time = frame_data.get("time")  # Simulation time
    # Apply control inputs, update visualization, etc.

@app.spawn(start=True)
async def main(session: VuerSession):
    # Step 1: Load the MuJoCo library
    session.upsert @ ContribLoader(
        library="@vuer-ai/mujoco-ts",
        version="0.0.24",
        entry="dist/index.umd.js",
        key="mujoco-loader",
    )
    # Wait for the library to load
    await asyncio.sleep(2.0)
    # Step 2: Configure the scene with VR controls
    session.set @ Scene(
        # Fog effect (mimics MuJoCo's default styling)
        Fog(
            color=0x2C3F57,
            near=10,
            far=20,
        ),
        # Background sphere
        Sphere(
            args=[50, 10, 10],
            materialType="basic",
            material=dict(color=0x2C3F57, side=1),  # BackSide
            key="background",
        ),
        # VR motion controllers
        MotionControllers(),
        # Motion controller actuator for VR input
        MotionControllerActuator(
            key="controller-actuator",
        ),
        # Initialize the MuJoCo simulation
        MuJoCo(
            src="/static/mujoco/gripper/scene.xml",
            assets=GRIPPER_ASSETS,
            scale=0.1,
            timeout=100,
            key="gripper-sim",
        ),
    )
    # Keep the session alive
    while True:
        await asyncio.sleep(1.0)

app.run()
```
## Key Components
### MotionControllers
Captures VR controller input:
```python
MotionControllers()
```
This enables tracking of VR controller positions, orientations, and button presses.
### MotionControllerActuator
Bridges VR input to MuJoCo simulation:
```python
MotionControllerActuator(
    key="controller-actuator",
)
```
### MuJoCo Component with Scale
```python
MuJoCo(
    src="/static/scene.xml",
    assets=ASSETS,
    scale=0.1,    # Scale simulation (10% of original size)
    timeout=100,  # Timeout in milliseconds
    key="sim",
)
```
## Event Handling: ON_MUJOCO_FRAME
This event fires on every physics update:
```python
@app.add_handler("ON_MUJOCO_FRAME")
async def on_mujoco_frame(event: ClientEvent, sess: VuerSession):
    frame_data = event.value
    # Simulation state
    qpos = frame_data.get("qpos")  # Joint positions
    qvel = frame_data.get("qvel")  # Joint velocities
    time = frame_data.get("time")  # Simulation time
    ctrl = frame_data.get("ctrl")  # Control inputs
    # Apply your control logic
    new_ctrl = calculate_control(qpos, qvel)
    # Update the simulation
    sess.upsert @ MuJoCo(
        ctrl=new_ctrl,
        key="gripper-sim",
    )
```
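`calculate_control` is left to the user. As a minimal sketch, a PD law on the joint errors could look like this (gains and targets are illustrative assumptions, not values from the tutorial):

```python
import numpy as np

def calculate_control(qpos, qvel, target=None, kp=5.0, kd=0.5):
    """PD control sketch: drive qpos toward target while damping qvel."""
    qpos = np.asarray(qpos, dtype=float)
    qvel = np.asarray(qvel, dtype=float)
    target = np.zeros_like(qpos) if target is None else np.asarray(target, dtype=float)
    # Proportional term pulls toward the target; derivative term damps velocity.
    return (kp * (target - qpos) - kd * qvel).tolist()
```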
## Scene Setup Pattern
### 1. Configure MuJoCo Styling
```python
Fog(
    color=0x2C3F57,  # MuJoCo default gray-blue
    near=10,
    far=20,
)
Sphere(
    args=[50, 10, 10],
    materialType="basic",
    material=dict(color=0x2C3F57, side=1),
)
```
### 2. Add VR Input
```python
MotionControllers()
MotionControllerActuator()
```
### 3. Initialize Physics
```python
MuJoCo(
    src="/static/scene.xml",
    assets=ASSETS,
    scale=0.1,
)
```
## Asset Organization
Organize your gripper assets:
```
static/mujoco/gripper/
├── scene.xml # Main scene
├── gripper.xml # Gripper model
├── bin.xml # Bin configuration
├── table.xml # Table configuration
├── base.obj # 3D meshes
├── finger.obj
├── bin.obj
└── table.obj
```
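Rather than typing the list by hand, the assets list can be built by globbing the directory. A sketch, assuming the directory passed in is served as Vuer's static root (`collect_assets` is a hypothetical helper, not part of Vuer):

```python
from pathlib import Path

def collect_assets(static_root, url_prefix="/static"):
    """Map every .xml/.obj/.png under static_root to its served URL."""
    root = Path(static_root)
    exts = {".xml", ".obj", ".png"}
    return sorted(
        f"{url_prefix}/{p.relative_to(root).as_posix()}"
        for p in root.rglob("*")
        if p.suffix.lower() in exts
    )
```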
## VR Access
1. Start the server:
```bash
python your_script.py
```
2. Set up ngrok:
```bash
ngrok http 8012
```
3. Access via VR headset:
```
https://vuer.ai?ws=wss://xxxxx.ngrok.io
```
## Controlling the Simulation
### Method 1: Direct Control Values
```python
session.upsert @ MuJoCo(
    ctrl=[0.5, -0.3, 0.0],  # Control values for actuators
    key="gripper-sim",
)
```
### Method 2: VR Controller Input
The `MotionControllerActuator` automatically maps VR controller movements to simulation controls.
### Method 3: Event-Based Control
```python
@app.add_handler("ON_MUJOCO_FRAME")
async def on_frame(event, sess):
    # Read current state
    qpos = event.value.get("qpos")
    # Calculate control
    ctrl = your_control_algorithm(qpos)
    # Apply control
    sess.upsert @ MuJoCo(ctrl=ctrl, key="gripper-sim")
```
## Best Practices
1. **Use SSL** - Required for WebXR functionality
2. **Add delays** - Wait for library to load before initializing simulation
3. **Handle events** - Use ON_MUJOCO_FRAME for responsive control
4. **Scale appropriately** - Adjust simulation scale for VR comfort
5. **Declare all assets** - Include every file in the assets list
## Source
Documentation: https://docs.vuer.ai/en/latest/tutorials/physics/mocap_control.html

# MuJoCo WASM Integration
## Overview
The MuJoCo component enables running physics simulations directly in the browser using WebAssembly technology. This allows for real-time physics simulation without requiring server-side computation.
## Key Components
### Required Libraries
- **Library**: `@vuer-ai/mujoco-ts`
- **Version**: `0.0.24`
- **Entry point**: `dist/index.umd.js`
### Asset Management
You need to supply a list of paths to relevant files via the `assets` attribute. This includes:
- XML configuration files
- 3D meshes (OBJ format)
- Textures (PNG format)
## Complete Example: Cassie Robot
```python
import asyncio

from vuer import Vuer
from vuer.schemas import Scene, Fog, MuJoCo, ContribLoader

app = Vuer()

# Define all assets needed for the simulation
CASSIE_ASSETS = [
    "/static/mujoco/cassie/scene.xml",
    "/static/mujoco/cassie/cassie.xml",
    "/static/mujoco/cassie/pelvis.obj",
    "/static/mujoco/cassie/left-hip.obj",
    "/static/mujoco/cassie/left-thigh.obj",
    "/static/mujoco/cassie/left-shin.obj",
    "/static/mujoco/cassie/left-foot.obj",
    "/static/mujoco/cassie/right-hip.obj",
    "/static/mujoco/cassie/right-thigh.obj",
    "/static/mujoco/cassie/right-shin.obj",
    "/static/mujoco/cassie/right-foot.obj",
    "/static/mujoco/cassie/texture.png",
]

@app.spawn(start=True)
async def main(session):
    # Load the MuJoCo library
    session.upsert @ ContribLoader(
        library="@vuer-ai/mujoco-ts",
        version="0.0.24",
        entry="dist/index.umd.js",
        key="mujoco-loader",
    )
    # Wait for the library to load
    await asyncio.sleep(2.0)
    # Set up the scene with MuJoCo's default styling
    session.set @ Scene(
        Fog(
            color=0x2C3F57,  # MuJoCo default background
            near=10,
            far=20,
        ),
        # Initialize the MuJoCo simulation
        MuJoCo(
            src="/static/mujoco/cassie/scene.xml",
            assets=CASSIE_ASSETS,
            key="cassie-sim",
        ),
    )
    # Keep the session alive
    while True:
        await asyncio.sleep(1.0)

app.run()
```
## Implementation Workflow
### 1. Load the Contrib Library
```python
session.upsert @ ContribLoader(
    library="@vuer-ai/mujoco-ts",
    version="0.0.24",
    entry="dist/index.umd.js",
    key="mujoco-loader",
)
```
### 2. Configure the Scene
Set up fog effects and background styling to match MuJoCo's default appearance:
```python
Fog(
    color=0x2C3F57,  # MuJoCo's default gray-blue
    near=10,
    far=20,
)
```
### 3. Provide Asset Paths
Supply URLs to all necessary model files:
```python
assets = [
    "/static/scene.xml",    # Main scene file
    "/static/robot.xml",    # Robot description
    "/static/mesh1.obj",    # 3D meshes
    "/static/mesh2.obj",
    "/static/texture.png",  # Textures
]
```
### 4. Initialize MuJoCo Component
```python
MuJoCo(
    src="/static/scene.xml",  # Main XML file
    assets=assets,            # All required assets
    key="mujoco-sim",
)
```
## Event Handling
Listen for simulation updates:
```python
async def on_mujoco_frame(event, session):
    """Handle physics updates."""
    print("MuJoCo frame:", event.value)
    # Access simulation state, apply control inputs,
    # update visualization, etc.

app.add_handler("ON_MUJOCO_FRAME", on_mujoco_frame)
```
## Timing Considerations
### Option 1: Sleep Delay
```python
session.upsert @ ContribLoader(...)
await asyncio.sleep(2.0) # Wait for library to load
session.set @ MuJoCo(...)
```
### Option 2: Event Listener
```python
async def on_contrib_load(event, session):
    """Initialize MuJoCo after the library loads."""
    session.set @ MuJoCo(
        src="/static/scene.xml",
        assets=ASSETS,
    )

app.add_handler("ON_CONTRIB_LOAD", on_contrib_load)
```
## Asset Organization
Organize your assets directory:
```
static/mujoco/
├── cassie/
│   ├── scene.xml       # Main scene file
│   ├── cassie.xml      # Robot configuration
│   ├── pelvis.obj      # Body meshes
│   ├── left-hip.obj
│   ├── left-thigh.obj
│   ├── ...
│   └── texture.png     # Textures
└── gripper/
    ├── scene.xml
    └── ...
```
## Serving Assets
Configure Vuer to serve your assets:
```python
app = Vuer(static_root="assets")
```
Then reference assets with `/static/` prefix:
```python
src="/static/mujoco/cassie/scene.xml"
```
## Best Practices
1. **Load library first** - Always load ContribLoader before MuJoCo component
2. **List all assets** - Include every file referenced in XML
3. **Use relative paths** - XML files should reference meshes with relative paths
4. **Match MuJoCo styling** - Use fog and background colors for consistency
5. **Handle loading time** - Wait for library to load before initialization
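Practice 2 can be checked mechanically: parse the model XML for `file=` references and compare basenames against the assets list. A simplified sketch (`missing_assets` is a hypothetical helper; it only inspects `<mesh>` and `<texture>` elements, while MuJoCo can also reference files from other elements):

```python
import xml.etree.ElementTree as ET

def missing_assets(xml_text, assets):
    """Return file names referenced by the XML but absent from the assets list."""
    root = ET.fromstring(xml_text)
    referenced = {
        el.get("file")
        for el in root.iter()
        if el.tag in ("mesh", "texture") and el.get("file")
    }
    # Compare by basename, since assets are declared as URL paths.
    declared = {a.rsplit("/", 1)[-1] for a in assets}
    return {f for f in referenced if f.rsplit("/", 1)[-1] not in declared}
```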
## Troubleshooting
### Simulation not appearing
- Verify all assets are accessible
- Check ContribLoader has loaded (wait or use event)
- Ensure XML file is valid MuJoCo format
### Missing textures/meshes
- Confirm all assets are in the assets list
- Check file paths in XML files
- Verify static_root configuration
### Performance issues
- Consider simplifying the model
- Reduce mesh polygon counts
- Optimize texture sizes
## Source
Documentation: https://docs.vuer.ai/en/latest/tutorials/physics/mujoco_wasm.html