Initial commit

This commit is contained in:
Zhongwei Li
2025-11-30 09:05:02 +08:00
commit 265175ed82
23 changed files with 3329 additions and 0 deletions

# Manipulating Camera Pose in Vuer
## Overview
This tutorial demonstrates how to programmatically control virtual camera positions and orientations within the Vuer framework, along with tracking user interactions.
## Key Concepts
### CameraView Component
Virtual cameras in Vuer are controlled through the `CameraView` component, which accepts the following parameters:
- **fov**: Field of view in degrees
- **width, height**: Resolution in pixels
- **position**: Camera position `[x, y, z]`
- **rotation**: Camera rotation `[x, y, z]`
- **matrix**: 4x4 transformation matrix (alternative to position/rotation)
- **stream**: Streaming mode (`"time"`, `"frame"`, or `"ondemand"`)
- **fps**: Frame rate for streaming
- **near, far**: Clipping planes
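The `matrix` parameter and the `position`/`rotation` pair describe the same pose in two ways: a 4x4 homogeneous transform bundles rotation and translation into one array. As a minimal sketch (the `pose_matrix` helper is hypothetical, not part of Vuer, and assumes a row-major layout with the translation in the last column), here is how the two relate for a single-axis rotation:

```python
import math

def pose_matrix(position, yaw):
    """Build a 4x4 row-major homogeneous transform from a position
    [x, y, z] and a rotation about the y axis (yaw, in radians).
    Hypothetical helper for illustration only."""
    c, s = math.cos(yaw), math.sin(yaw)
    x, y, z = position
    return [
        [c,   0.0, s,   x],
        [0.0, 1.0, 0.0, y],
        [-s,  0.0, c,   z],
        [0.0, 0.0, 0.0, 1.0],
    ]

# With zero rotation, the translation sits in the last column.
m = pose_matrix([1.0, 2.0, 3.0], 0.0)
print(m[0][3], m[1][3], m[2][3])  # 1.0 2.0 3.0
```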
## Complete Example
```python
import asyncio
import pickle

from vuer import Vuer, VuerSession
from vuer.events import ClientEvent
from vuer.schemas import Scene, CameraView, DefaultScene, Urdf
from ml_logger import ML_Logger

# Initialize logger
logger = ML_Logger(root=".", prefix="assets")

# Load pre-recorded camera matrices
with open("assets/camera_movement.pkl", "rb") as f:
    data = pickle.load(f)
matrices = [item["matrix"] for item in data]

app = Vuer()

# Event handler to track camera movements
async def track_movement(event: ClientEvent, sess: VuerSession):
    """Capture camera movement events"""
    if event.key != "ego":
        return
    logger.log(**event.value, flush=True, silent=True)
    print(f"Camera moved: {event.value['position']}")

app.add_handler("CAMERA_MOVE", track_movement)

@app.spawn(start=True)
async def main(proxy: VuerSession):
    # Set up the scene
    proxy.set @ Scene(
        DefaultScene(),
        # Add a robot for reference
        Urdf(
            src="/static/robot.urdf",
            position=[0, 0, 0],
            key="robot",
        ),
    )
    # Animate the camera through the recorded poses
    for matrix in matrices:
        proxy.update @ [
            CameraView(
                key="ego",
                fov=50,
                width=320,
                height=240,
                matrix=matrix,
                stream="time",
                fps=30,
                near=0.1,
                far=100,
            ),
        ]
        await asyncio.sleep(0.033)  # ~30 FPS
    # Keep the session alive
    while True:
        await asyncio.sleep(1.0)

app.run()
```
## Dynamic Camera Control Methods
### Method 1: Using Transformation Matrix
```python
session.update @ CameraView(
    key="ego",
    matrix=[
        [1, 0, 0, x],
        [0, 1, 0, y],
        [0, 0, 1, z],
        [0, 0, 0, 1],
    ],
)
```
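The matrix above is a pure translation, so it should map the origin to `[x, y, z]`. That can be checked in plain Python without a running Vuer session (the `apply` helper below is hypothetical and assumes the row-major layout shown above):

```python
def apply(matrix, point):
    """Multiply a 4x4 row-major matrix by a homogeneous point [x, y, z, 1]."""
    return [sum(m * p for m, p in zip(row, point + [1.0])) for row in matrix]

x, y, z = 2.0, 1.0, -3.0
translation = [
    [1, 0, 0, x],
    [0, 1, 0, y],
    [0, 0, 1, z],
    [0, 0, 0, 1],
]

# The origin lands exactly at the translation column.
print(apply(translation, [0.0, 0.0, 0.0]))  # [2.0, 1.0, -3.0, 1.0]
```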
### Method 2: Using Position and Rotation
```python
session.update @ CameraView(
    key="ego",
    position=[x, y, z],
    rotation=[rx, ry, rz],  # Euler angles in radians
)
```
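Since the `rotation` values are expected in radians, angles specified in degrees need converting first; a one-line sketch with the standard library:

```python
import math

# Convert a degree-based Euler rotation to the radians CameraView expects.
rotation_deg = [0.0, 45.0, 0.0]
rotation = [math.radians(a) for a in rotation_deg]
print(rotation[1])  # ≈ pi / 4
```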
### Method 3: Animated Camera Path
```python
import math

# Runs inside an async handler (like `main` above), where `session`
# is the active VuerSession and `await` is legal.
for i in range(360):
    theta = math.radians(i)
    radius = 5
    # Circular orbit around the scene origin
    x = radius * math.cos(theta)
    z = radius * math.sin(theta)
    session.update @ CameraView(
        key="ego",
        position=[x, 2, z],
        rotation=[0, theta, 0],
    )
    await asyncio.sleep(0.033)  # ~30 FPS
```
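The orbit geometry itself can be verified without Vuer: every sampled position should lie on a circle of the chosen radius at the fixed height. A quick standalone check:

```python
import math

radius = 5.0
positions = []
for i in range(0, 360, 90):
    theta = math.radians(i)
    # Same parametrization as the orbit loop above, height fixed at 2.
    positions.append((radius * math.cos(theta), 2.0, radius * math.sin(theta)))

# Each sample's distance from the y axis equals the orbit radius.
for x, y, z in positions:
    assert abs(math.hypot(x, z) - radius) < 1e-9
print(len(positions))  # 4
```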
## Replaying Recorded Movements
Load and replay pre-recorded camera movements:
```python
import pickle

# Load recorded movements
with open("assets/camera_movement.pkl", "rb") as f:
    movements = pickle.load(f)

# Replay movements (inside an async session handler)
for movement in movements:
    session.update @ CameraView(
        key="ego",
        matrix=movement["matrix"],
        fov=50,
        width=320,
        height=240,
    )
    await asyncio.sleep(0.033)  # ~30 FPS
```
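At one update per `sleep` interval, the replay's wall-clock duration is simply the number of recorded frames times the frame interval. A rough estimate (the recording length of 900 frames is a hypothetical value, not from the tutorial):

```python
frame_interval = 1 / 30   # seconds between updates, matching the loop above
n_movements = 900         # hypothetical recording length
duration = n_movements * frame_interval
print(round(duration, 1))  # 30.0
```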
## Event Handling
Track user-initiated camera movements:
```python
async def track_movement(event: ClientEvent, sess: VuerSession):
    """Log user camera movements"""
    if event.key != "ego":
        return
    # Access camera data
    position = event.value.get("position")
    rotation = event.value.get("rotation")
    matrix = event.value.get("matrix")
    print(f"Position: {position}")
    print(f"Rotation: {rotation}")
    # Save to logger
    logger.log(**event.value, flush=True, silent=True)

app.add_handler("CAMERA_MOVE", track_movement)
```
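The key filter at the top of the handler is what keeps it scoped to the `"ego"` camera. That logic can be exercised offline with plain stand-in objects (the `SimpleNamespace` events below are hypothetical substitutes for Vuer's real event class, used only to show the filtering behavior):

```python
import asyncio
from types import SimpleNamespace

seen = []

async def track_movement(event, sess=None):
    """Record positions only for events targeting the "ego" camera."""
    if event.key != "ego":
        return
    seen.append(event.value.get("position"))

events = [
    SimpleNamespace(key="ego", value={"position": [0, 1, 2]}),
    SimpleNamespace(key="other", value={"position": [9, 9, 9]}),
]
for e in events:
    asyncio.run(track_movement(e))

print(seen)  # [[0, 1, 2]]
```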
## Streaming Modes
### "time" Mode
Continuous streaming at specified FPS:
```python
CameraView(stream="time", fps=30)
```
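Continuous streaming has a bandwidth cost proportional to resolution and frame rate. A back-of-the-envelope estimate at the tutorial's 320x240 @ 30 FPS (assuming uncompressed 3-byte RGB frames, which overstates what a compressed stream actually sends):

```python
width, height, fps = 320, 240, 30
bytes_per_frame = width * height * 3        # uncompressed RGB, an assumption
mb_per_second = bytes_per_frame * fps / 1e6
print(round(mb_per_second, 2))  # 6.91
```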
### "frame" Mode
Stream individual frames on demand.
### "ondemand" Mode
Only render when explicitly requested (most efficient):
```python
CameraView(stream="ondemand")
```
## Best Practices
1. **Use matrices for complex movements** - A single 4x4 transform composes cleanly and avoids Euler-angle ambiguities
2. **Track user movements** - Enable interactive camera control
3. **Set appropriate FPS** - Balance smoothness and performance
4. **Use clipping planes** - Optimize rendering with near/far settings
5. **Use ondemand mode** - Save resources when continuous streaming isn't needed
## Source
Documentation: https://docs.vuer.ai/en/latest/tutorials/camera/move_camera.html