Initial commit

This commit is contained in:
Zhongwei Li
2025-11-29 18:26:08 +08:00
commit 8f22ddf339
295 changed files with 59710 additions and 0 deletions


@@ -0,0 +1,732 @@
---
name: Plugin Build
description: Bundle plugin directory into a deployable Claude Code plugin package
---
# plugin.build
## Overview
**plugin.build** is the packaging tool that bundles your Betty Framework plugin into a distributable archive ready for deployment to Claude Code. It validates all entrypoints, gathers necessary files, and creates versioned packages with checksums.
## Purpose
Automates the creation of deployable plugin packages by:
- **Validating** all declared entrypoints and handler files
- **Gathering** all necessary plugin files (skills, utilities, registries)
- **Packaging** into `.tar.gz` or `.zip` archives
- **Generating** checksums for package verification
- **Reporting** validation results and build metrics
This eliminates manual packaging errors and ensures consistent, reproducible plugin distributions.
## What It Does
1. **Loads plugin.yaml**: Reads the plugin configuration
2. **Validates Entrypoints**: Checks that all command handlers exist on disk
3. **Gathers Files**: Collects skills, utilities, registries, and documentation
4. **Creates Package**: Bundles everything into a versioned archive
5. **Calculates Checksums**: Generates MD5 and SHA256 hashes
6. **Generates Manifest**: Creates manifest.json with entrypoint summary and checksums
7. **Creates Preview**: Generates plugin.preview.yaml for review before deployment
8. **Generates Report**: Outputs detailed build metrics as JSON
9. **Reports Issues**: Identifies missing files or validation errors
## Usage
### Basic Usage
```bash
python skills/plugin.build/plugin_build.py
```
Builds with defaults:
- Plugin: `./plugin.yaml`
- Format: `tar.gz`
- Output: `./dist/`
### Via Betty CLI
```bash
/plugin/build
```
### Custom Plugin Path
```bash
python skills/plugin.build/plugin_build.py /path/to/plugin.yaml
```
### Specify Output Format
```bash
python skills/plugin.build/plugin_build.py --format=zip
```
```bash
python skills/plugin.build/plugin_build.py --format=tar.gz
```
### Custom Output Directory
```bash
python skills/plugin.build/plugin_build.py --output-dir=/tmp/packages
```
### Full Options
```bash
python skills/plugin.build/plugin_build.py \
/custom/path/plugin.yaml \
--format=zip \
--output-dir=/var/packages
```
## Command-Line Arguments
| Argument | Type | Default | Description |
|----------|------|---------|-------------|
| `plugin_path` | Positional | `./plugin.yaml` | Path to plugin.yaml file |
| `--format` | Option | `tar.gz` | Package format (tar.gz or zip) |
| `--output-dir` | Option | `./dist` | Output directory for packages |
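The same build can also be driven from Python, since the handler exposes a `build_plugin()` function (see `plugin_build.py` in this commit). A minimal sketch, assuming it is run from the repository root so that `betty/` is importable; the `sys.path` manipulation is an illustrative workaround for the dotted directory name:
```python
import sys

# The skill directory name contains a dot, so it is not importable as a
# package; add it to sys.path and import the module directly (illustrative).
sys.path.insert(0, "skills/plugin.build")

from plugin_build import build_plugin

# Arguments mirror the CLI: plugin_path, output_format ("tar.gz" or "zip"),
# and output_dir all fall back to the documented defaults when omitted.
result = build_plugin(output_format="zip", output_dir="dist")

print(result["status"], result["package_path"])
```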
## Output Files
### Package Archive
**Naming convention**: `{plugin-name}-{version}.{format}`
Examples:
- `betty-framework-1.0.0.tar.gz`
- `betty-framework-1.0.0.zip`
**Location**: `{output-dir}/{plugin-name}-{version}.{format}`
### Manifest File
**Naming convention**: `manifest.json`
**Location**: `{output-dir}/manifest.json`
Contains plugin metadata, entrypoint summary, and package checksums. This is the primary file for plugin distribution and installation.
### Plugin Preview
**Naming convention**: `plugin.preview.yaml`
**Location**: `{output-dir}/plugin.preview.yaml`
Contains the current plugin configuration for review before deployment. Useful for comparing changes or validating the plugin structure.
### Build Report
**Naming convention**: `{plugin-name}-{version}-build-report.json`
Example:
- `betty-framework-1.0.0-build-report.json`
**Location**: `{output-dir}/{plugin-name}-{version}-build-report.json`
## Package Structure
The generated archive contains:
```
betty-framework-1.0.0/
├── plugin.yaml # Plugin manifest
├── requirements.txt # Python dependencies
├── README.md # Documentation (if exists)
├── LICENSE # License file (if exists)
├── CHANGELOG.md # Change log (if exists)
├── betty/ # Core utility package
│ ├── __init__.py
│ ├── config.py
│ ├── validation.py
│ ├── logging_utils.py
│ ├── file_utils.py
│ └── errors.py
├── registry/ # Registry files
│ ├── skills.json
│ ├── commands.json
│ ├── hooks.json
│ └── agents.json
└── skills/ # All active skills
├── api.define/
│ ├── skill.yaml
│ ├── SKILL.md
│ └── api_define.py
├── api.validate/
│ ├── skill.yaml
│ ├── SKILL.md
│ └── api_validate.py
└── ... (all other skills)
```
## Manifest Schema
The `manifest.json` file is the primary metadata file for plugin distribution:
```json
{
"name": "betty-framework",
"version": "1.0.0",
"description": "Betty Framework - Structured AI-assisted engineering",
"author": {
"name": "RiskExec",
"email": "platform@riskexec.com",
"url": "https://github.com/epieczko/betty"
},
"license": "MIT",
"metadata": {
"homepage": "https://github.com/epieczko/betty",
"repository": "https://github.com/epieczko/betty",
"documentation": "https://github.com/epieczko/betty/tree/main/docs",
"tags": ["framework", "api-development", "workflow"],
"generated_at": "2025-10-23T12:34:56.789012+00:00"
},
"requirements": {
"python": ">=3.11",
"packages": ["pyyaml"]
},
"permissions": ["filesystem:read", "filesystem:write", "process:execute"],
"package": {
"filename": "betty-framework-1.0.0.tar.gz",
"size_bytes": 245760,
"checksums": {
"md5": "a1b2c3d4e5f6...",
"sha256": "1234567890abcdef..."
}
},
"entrypoints": [
{
"command": "skill/define",
"handler": "skills/skill.define/skill_define.py",
"runtime": "python"
}
],
"commands_count": 18,
"agents": [
{
"name": "api.designer",
"description": "Design APIs with iterative refinement"
}
]
}
```
## Build Report Schema
```json
{
"build_timestamp": "2025-10-23T12:34:56.789012+00:00",
"plugin": {
"name": "betty-framework",
"version": "1.0.0",
"description": "Betty Framework - Structured AI-assisted engineering"
},
"validation": {
"total_commands": 18,
"valid_entrypoints": 18,
"missing_files": [],
"has_errors": false
},
"package": {
"path": "/home/user/betty/dist/betty-framework-1.0.0.tar.gz",
"size_bytes": 245760,
"size_human": "240.00 KB",
"files_count": 127,
"format": "tar.gz",
"checksums": {
"md5": "a1b2c3d4e5f6...",
"sha256": "1234567890abcdef..."
}
},
"entrypoints": [
{
"command": "skill/define",
"handler": "skills/skill.define/skill_define.py",
"runtime": "python",
"path": "/home/user/betty/skills/skill.define/skill_define.py"
}
]
}
```
## Outputs
### Success Response
```json
{
"ok": true,
"status": "success",
"package_path": "/home/user/betty/dist/betty-framework-1.0.0.tar.gz",
"report_path": "/home/user/betty/dist/betty-framework-1.0.0-build-report.json",
"build_report": { ... }
}
```
### Success with Warnings
```json
{
"ok": false,
"status": "success_with_warnings",
"package_path": "/home/user/betty/dist/betty-framework-1.0.0.tar.gz",
"report_path": "/home/user/betty/dist/betty-framework-1.0.0-build-report.json",
"build_report": {
"validation": {
"missing_files": [
"Command 'api.broken': handler not found at skills/api.broken/api_broken.py"
],
"has_errors": true
}
}
}
```
### Failure Response
```json
{
"ok": false,
"status": "failed",
"error": "plugin.yaml not found: /home/user/betty/plugin.yaml"
}
```
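Because the CLI prints this JSON to stdout and mirrors the outcome in its exit code, a wrapper can branch on the result. A minimal sketch of such a (hypothetical) wrapper, assuming the progress log goes to stderr as is typical for Python logging, so stdout contains only the JSON:
```python
import json
import subprocess
import sys

# Invoke the build and capture the JSON result printed to stdout.
proc = subprocess.run(
    [sys.executable, "skills/plugin.build/plugin_build.py", "--format=tar.gz"],
    capture_output=True,
    text=True,
)
result = json.loads(proc.stdout)

if result["status"] == "failed":
    raise SystemExit(f"Build failed: {result['error']}")
if result["status"] == "success_with_warnings":
    print("Missing handlers:", result["build_report"]["validation"]["missing_files"])
print("Package:", result["package_path"])
```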
## Behavior
### 1. Plugin Loading
Reads and parses `plugin.yaml`:
- Validates YAML syntax
- Extracts plugin name, version, description
- Identifies all command entrypoints
### 2. Entrypoint Validation
For each command in `plugin.yaml`:
- Extracts handler script path
- Checks file existence on disk
- Reports valid and missing handlers
- Logs validation results
**Valid entrypoint**:
```yaml
- name: skill/define
handler:
runtime: python
script: skills/skill.define/skill_define.py
```
**Missing handler** (reports warning):
```yaml
- name: broken/command
handler:
runtime: python
script: skills/broken/missing.py # File doesn't exist
```
### 3. File Gathering
Automatically includes:
**Always included**:
- `plugin.yaml` - Plugin manifest
- `skills/*/` - All skill directories referenced in commands
- `betty/` - Core utility package
- `registry/*.json` - All registry files
**Conditionally included** (if present):
- `requirements.txt` - Python dependencies
- `README.md` - Plugin documentation
- `LICENSE` - License file
- `CHANGELOG.md` - Version history
**Excluded** (see the sketch after this list):
- `__pycache__/` directories
- `.pyc` compiled Python files
- Hidden files (starting with `.`)
- Build artifacts and temporary files
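A minimal sketch of the gathering filter described above; `iter_skill_files` is an illustrative name, while the exclusion logic mirrors what `plugin_build.py` does:
```python
import os
from typing import Iterator, Tuple

def iter_skill_files(skill_dir: str, base_dir: str) -> Iterator[Tuple[str, str]]:
    """Yield (source_path, archive_path) pairs using the exclusion rules above."""
    for root, dirs, files in os.walk(os.path.join(base_dir, skill_dir)):
        # Prune __pycache__ so os.walk never descends into it.
        dirs[:] = [d for d in dirs if d != "__pycache__"]
        for name in files:
            if name.endswith(".pyc") or name.startswith("."):
                continue  # skip compiled artifacts and hidden files
            source = os.path.join(root, name)
            yield source, os.path.relpath(source, base_dir)
```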
### 4. Package Creation
**tar.gz format**:
- GZIP compression
- Preserves file permissions
- Cross-platform compatible
- Standard for Python packages
**zip format**:
- ZIP compression
- Wide compatibility
- Good for Windows environments
- Easy to extract without CLI tools
### 5. Checksum Generation
Calculates two checksums:
**MD5**: Fast, widely supported
```
a1b2c3d4e5f67890a1b2c3d4e5f67890
```
**SHA256**: Cryptographically secure
```
1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef
```
Use checksums to verify package integrity after download or transfer.
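For example, a downloaded archive can be checked against the SHA256 recorded in `manifest.json`; a minimal sketch (`verify_package` is an illustrative helper, not part of the skill):
```python
import hashlib
import json

def verify_package(archive_path: str, manifest_path: str) -> bool:
    """Return True if the archive matches the SHA256 recorded in manifest.json."""
    with open(manifest_path) as f:
        expected = json.load(f)["package"]["checksums"]["sha256"]
    sha256 = hashlib.sha256()
    with open(archive_path, "rb") as f:
        for chunk in iter(lambda: f.read(4096), b""):
            sha256.update(chunk)
    return sha256.hexdigest() == expected

# e.g. verify_package("dist/betty-framework-1.0.0.tar.gz", "dist/manifest.json")
```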
## Examples
### Example 1: Standard Build
**Scenario**: Build Betty Framework plugin for distribution
```bash
# Ensure plugin.yaml is up to date
/plugin/sync
# Build the package
/plugin/build
```
**Output**:
```
INFO: 🏗️ Starting plugin build...
INFO: 📄 Plugin: /home/user/betty/plugin.yaml
INFO: 📦 Format: tar.gz
INFO: 📁 Output: /home/user/betty/dist
INFO: ✅ Loaded plugin.yaml from /home/user/betty/plugin.yaml
INFO: 📝 Validating 18 command entrypoints...
INFO: 📦 Gathering files from 16 skill directories...
INFO: 📦 Adding betty/ utility package...
INFO: 📦 Adding registry/ files...
INFO: 📦 Total files to package: 127
INFO: 🗜️ Creating tar.gz archive: /home/user/betty/dist/betty-framework-1.0.0.tar.gz
INFO: 🔐 Calculating checksums...
INFO: MD5: a1b2c3d4e5f6...
INFO: SHA256: 1234567890ab...
INFO: 📊 Build report: /home/user/betty/dist/betty-framework-1.0.0-build-report.json
INFO: 📋 Manifest: /home/user/betty/dist/manifest.json
INFO: 📋 Preview file created: /home/user/betty/dist/plugin.preview.yaml
INFO:
============================================================
INFO: 🎉 BUILD COMPLETE
============================================================
INFO: 📦 Package: /home/user/betty/dist/betty-framework-1.0.0.tar.gz
INFO: 📊 Report: /home/user/betty/dist/betty-framework-1.0.0-build-report.json
INFO: 📋 Manifest: /home/user/betty/dist/manifest.json
INFO: 👁️ Preview: /home/user/betty/dist/plugin.preview.yaml
INFO: ✅ Commands: 18/18
INFO: 📏 Size: 240.00 KB
INFO: 📝 Files: 127
============================================================
```
### Example 2: Build as ZIP
**Scenario**: Create Windows-friendly package
```bash
python skills/plugin.build/plugin_build.py --format=zip
```
**Result**: `dist/betty-framework-1.0.0.zip`
### Example 3: Build with Custom Output
**Scenario**: Build to specific release directory
```bash
python skills/plugin.build/plugin_build.py \
--format=tar.gz \
--output-dir=releases/v1.0.0
```
**Result**:
- `releases/v1.0.0/betty-framework-1.0.0.tar.gz`
- `releases/v1.0.0/betty-framework-1.0.0-build-report.json`
### Example 4: Detecting Missing Handlers
**Scenario**: Some handlers are missing
```bash
# Remove a handler file
rm skills/api.validate/api_validate.py
# Try to build
/plugin/build
```
**Output**:
```
INFO: 📝 Validating 18 command entrypoints...
WARNING: ❌ skill/api/validate: skills/api.validate/api_validate.py (not found)
WARNING: ⚠️ Found 1 missing file(s):
WARNING:   - Command 'skill/api/validate': handler not found at skills/api.validate/api_validate.py
INFO: 📦 Gathering files from 15 skill directories...
...
INFO: ============================================================
INFO: 🎉 BUILD COMPLETE
INFO: ============================================================
INFO: ✅ Commands: 17/18
INFO: ⚠️ Warnings: 1
```
**Exit code**: 1 (failure due to validation errors)
### Example 5: Build Workflow
**Scenario**: Complete release workflow
```bash
# 1. Update registries
/registry/update
# 2. Sync plugin.yaml
/plugin/sync
# 3. Build package
/plugin/build
# 4. Verify package
tar -tzf dist/betty-framework-1.0.0.tar.gz | head -20
# 5. Check build report
cat dist/betty-framework-1.0.0-build-report.json | jq .
```
## Integration
### With plugin.sync
Always sync before building:
```bash
/plugin/sync && /plugin/build
```
### With Workflows
Include in release workflow:
```yaml
# workflows/plugin_release.yaml
name: plugin_release
version: 1.0.0
description: Release workflow for Betty Framework plugin
steps:
- skill: registry.update
description: Update all registries
- skill: plugin.sync
description: Generate plugin.yaml
- skill: plugin.build
args: ["--format=tar.gz"]
description: Build tar.gz package
- skill: plugin.build
args: ["--format=zip"]
description: Build zip package
```
### With CI/CD
Add to GitHub Actions:
```yaml
# .github/workflows/release.yml
- name: Build Plugin Package
run: |
python skills/plugin.sync/plugin_sync.py
python skills/plugin.build/plugin_build.py --format=tar.gz
python skills/plugin.build/plugin_build.py --format=zip
- name: Upload Packages
uses: actions/upload-artifact@v3
with:
name: plugin-packages
path: dist/betty-framework-*
```
## Validation Rules
### Entrypoint Validation
**Valid entrypoint**:
- Command name is defined
- Handler section exists
- Handler script path is specified
- Handler file exists on disk
- Runtime is specified
**Invalid entrypoint** (triggers warning):
- Missing handler script path
- Handler file doesn't exist
- Empty command name
### File Gathering Rules
**Skill directories included if**:
- Referenced in at least one command handler
- Contains valid handler file
**Files excluded**:
- Python cache files (`__pycache__`, `.pyc`)
- Hidden files (`.git`, `.env`, etc.)
- Build artifacts (`dist/`, `build/`)
- IDE files (`.vscode/`, `.idea/`)
## Common Errors
| Error | Cause | Solution |
|-------|-------|----------|
| "plugin.yaml not found" | Missing plugin.yaml | Run `/plugin/sync` first |
| "Invalid YAML" | Syntax error in plugin.yaml | Fix YAML syntax |
| "Unsupported output format" | Invalid --format value | Use `tar.gz` or `zip` |
| "Handler not found" | Missing handler file | Create handler or fix path |
| "Permission denied" | Cannot write to output dir | Check directory permissions |
## Files Read
- `plugin.yaml` - Plugin manifest
- `skills/*/skill.yaml` - Skill manifests (indirect via handlers)
- `skills/*/*.py` - Handler scripts
- `betty/*.py` - Utility modules
- `registry/*.json` - Registry files
- `requirements.txt` - Dependencies (if present)
- `README.md`, `LICENSE`, `CHANGELOG.md` - Documentation (if present)
## Files Modified
- `{output-dir}/{plugin-name}-{version}.{format}` - Created package archive
- `{output-dir}/{plugin-name}-{version}-build-report.json` - Created build report
- `{output-dir}/manifest.json` - Created plugin manifest with checksums and entrypoints
- `{output-dir}/plugin.preview.yaml` - Created plugin preview for review
## Exit Codes
- **0**: Success (all handlers valid, package created)
- **1**: Failure (missing handlers or build error)
## Logging
Logs build progress with emojis for clarity:
```
INFO: 🏗️ Starting plugin build...
INFO: 📄 Plugin: /home/user/betty/plugin.yaml
INFO: ✅ Loaded plugin.yaml
INFO: 📝 Validating entrypoints...
INFO: 📦 Gathering files...
INFO: 🗜️ Creating tar.gz archive...
INFO: 🔐 Calculating checksums...
INFO: 📊 Build report written
INFO: 🎉 BUILD COMPLETE
```
## Best Practices
1. **Always Sync First**: Run `/plugin/sync` before `/plugin/build`
2. **Validate Before Building**: Ensure all skills are registered and active
3. **Check Build Reports**: Review validation warnings before distribution
4. **Verify Checksums**: Use checksums to verify package integrity
5. **Version Consistently**: Match plugin version with git tags
6. **Test Extraction**: Verify packages extract correctly
7. **Document Changes**: Keep CHANGELOG.md updated
## Troubleshooting
### Missing Files in Package
**Problem**: Expected files not in archive
**Solutions** (see the sketch after this list):
- Ensure files exist in source directory
- Check that skills are referenced in commands
- Verify files aren't excluded (e.g., `.pyc`, `__pycache__`)
- Check file permissions
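Listing the archive contents directly is often the quickest check; a minimal sketch that handles both formats (`list_package` is an illustrative helper):
```python
import tarfile
import zipfile

def list_package(archive_path: str) -> list[str]:
    """Return member paths inside a built .tar.gz or .zip package."""
    if archive_path.endswith(".tar.gz"):
        with tarfile.open(archive_path, "r:gz") as tar:
            return tar.getnames()
    with zipfile.ZipFile(archive_path) as zf:
        return zf.namelist()

# Example: confirm the skills directory made it into the package.
members = list_package("dist/betty-framework-1.0.0.tar.gz")
print([m for m in members if m.startswith("betty-framework-1.0.0/skills/")])
```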
### Handler Validation Failures
**Problem**: Handlers marked as missing but files exist
**Solutions**:
- Verify exact path in plugin.yaml matches file location
- Check case sensitivity in file paths
- Ensure handler files have correct names
- Run `/plugin/sync` to update paths
### Package Size Too Large
**Problem**: Archive file is very large
**Solutions**:
- Remove unused skills from plugin.yaml
- Check for accidentally included large files
- Review skill directories for unnecessary data
- Consider splitting into multiple plugins
### Build Fails with Permission Error
**Problem**: Cannot write to output directory
**Solutions**:
- Create output directory with proper permissions
- Check disk space availability
- Verify write access to output directory
- Try different output directory with `--output-dir`
## Architecture
### Skill Category
**Infrastructure** - plugin.build is part of the distribution layer, preparing plugins for deployment.
### Design Principles
- **Validation First**: Check all handlers before packaging
- **Complete Packages**: Include all necessary dependencies
- **Reproducible**: Same source creates identical packages
- **Verifiable**: Checksums ensure package integrity
- **Transparent**: Detailed reporting of included files
- **Flexible**: Support multiple archive formats
### Package Philosophy
The package includes everything needed to run the plugin:
- **Skills**: All command handlers and manifests
- **Utilities**: Core betty package for shared functionality
- **Registries**: Skill, command, and hook definitions
- **Dependencies**: Python requirements
- **Documentation**: README, LICENSE, CHANGELOG
This creates a self-contained, portable plugin distribution.
## See Also
- **plugin.sync** Generate plugin.yaml ([SKILL.md](../plugin.sync/SKILL.md))
- **skill.define** Validate and register skills ([SKILL.md](../skill.define/SKILL.md))
- **registry.update** Update registries ([SKILL.md](../registry.update/SKILL.md))
- **Betty Distribution** Plugin marketplace guide ([docs/distribution.md](../../docs/distribution.md))
## Dependencies
- **plugin.sync**: Generate plugin.yaml before building
- **betty.config**: Configuration constants and paths
- **betty.logging_utils**: Logging infrastructure
## Status
**Active** - Production-ready infrastructure skill
## Version History
- **0.1.0** (Oct 2025) - Initial implementation with tar.gz/zip support, validation, and checksums


@@ -0,0 +1 @@
# Auto-generated package initializer for skills.


@@ -0,0 +1,648 @@
#!/usr/bin/env python3
"""
plugin_build.py - Implementation of the plugin.build Skill
Bundles plugin directory into a deployable Claude Code plugin package.
"""
import os
import sys
import json
import yaml
import tarfile
import zipfile
import hashlib
import shutil
from typing import Dict, Any, List, Optional, Tuple
from datetime import datetime, timezone
from pathlib import Path
from pydantic import ValidationError as PydanticValidationError
from betty.config import BASE_DIR
from betty.logging_utils import setup_logger
from betty.telemetry_capture import capture_skill_execution
from betty.models import PluginManifest
from utils.telemetry_utils import capture_telemetry
logger = setup_logger(__name__)
def load_plugin_yaml(plugin_path: str) -> Dict[str, Any]:
"""
Load and parse plugin.yaml file.
Args:
plugin_path: Path to plugin.yaml
Returns:
Parsed plugin configuration
Raises:
FileNotFoundError: If plugin.yaml doesn't exist
yaml.YAMLError: If YAML is invalid
ValueError: If schema validation fails
"""
try:
with open(plugin_path) as f:
config = yaml.safe_load(f)
# Validate with Pydantic schema
try:
PluginManifest.model_validate(config)
logger.info(f"✅ Loaded and validated plugin.yaml from {plugin_path}")
except PydanticValidationError as exc:
errors = []
for error in exc.errors():
field = ".".join(str(loc) for loc in error["loc"])
message = error["msg"]
errors.append(f"{field}: {message}")
error_msg = f"Plugin schema validation failed: {'; '.join(errors)}"
logger.error(f"{error_msg}")
raise ValueError(error_msg)
return config
except FileNotFoundError:
logger.error(f"❌ plugin.yaml not found: {plugin_path}")
raise
except yaml.YAMLError as e:
logger.error(f"❌ Invalid YAML in {plugin_path}: {e}")
raise
def validate_entrypoints(config: Dict[str, Any], base_dir: str) -> Tuple[List[Dict], List[str]]:
"""
Validate that all entrypoint handlers exist on disk.
Args:
config: Plugin configuration
base_dir: Base directory for the plugin
Returns:
Tuple of (valid_entrypoints, missing_files)
"""
valid_entrypoints = []
missing_files = []
commands = config.get("commands", [])
logger.info(f"📝 Validating {len(commands)} command entrypoints...")
for command in commands:
name = command.get("name", "unknown")
handler = command.get("handler", {})
script_path = handler.get("script", "")
if not script_path:
missing_files.append(f"Command '{name}': no handler script specified")
continue
full_path = os.path.join(base_dir, script_path)
if os.path.exists(full_path):
valid_entrypoints.append({
"command": name,
"handler": script_path,
"runtime": handler.get("runtime", "python"),
"path": full_path
})
logger.debug(f"{name}: {script_path}")
else:
missing_files.append(f"Command '{name}': handler not found at {script_path}")
logger.warning(f"{name}: {script_path} (not found)")
return valid_entrypoints, missing_files
def gather_package_files(config: Dict[str, Any], base_dir: str) -> List[Tuple[str, str]]:
"""
Gather all files that need to be included in the package.
Args:
config: Plugin configuration
base_dir: Base directory for the plugin
Returns:
List of (source_path, archive_path) tuples
"""
files_to_package = []
# Always include plugin.yaml
plugin_yaml_path = os.path.join(base_dir, "plugin.yaml")
if os.path.exists(plugin_yaml_path):
files_to_package.append((plugin_yaml_path, "plugin.yaml"))
logger.debug(" + plugin.yaml")
# Gather all skill directories from commands
skill_dirs = set()
commands = config.get("commands", [])
for command in commands:
handler = command.get("handler", {})
script_path = handler.get("script", "")
if script_path.startswith("skills/"):
# Extract skill directory (e.g., "skills/api.validate" from "skills/api.validate/api_validate.py")
parts = script_path.split("/")
if len(parts) >= 2:
skill_dir = f"{parts[0]}/{parts[1]}"
skill_dirs.add(skill_dir)
# Add all files from skill directories
logger.info(f"📦 Gathering files from {len(skill_dirs)} skill directories...")
for skill_dir in sorted(skill_dirs):
skill_path = os.path.join(base_dir, skill_dir)
if os.path.isdir(skill_path):
for root, dirs, files in os.walk(skill_path):
# Skip __pycache__ and .pyc files
dirs[:] = [d for d in dirs if d != "__pycache__"]
for file in files:
if file.endswith(".pyc") or file.startswith("."):
continue
source_path = os.path.join(root, file)
rel_path = os.path.relpath(source_path, base_dir)
files_to_package.append((source_path, rel_path))
logger.debug(f" + {rel_path}")
# Add betty/ utility package
betty_dir = os.path.join(base_dir, "betty")
if os.path.isdir(betty_dir):
logger.info("📦 Adding betty/ utility package...")
for root, dirs, files in os.walk(betty_dir):
dirs[:] = [d for d in dirs if d != "__pycache__"]
for file in files:
if file.endswith(".pyc") or file.startswith("."):
continue
source_path = os.path.join(root, file)
rel_path = os.path.relpath(source_path, base_dir)
files_to_package.append((source_path, rel_path))
logger.debug(f" + {rel_path}")
# Add registry/ files
registry_dir = os.path.join(base_dir, "registry")
if os.path.isdir(registry_dir):
logger.info("📦 Adding registry/ files...")
for file in os.listdir(registry_dir):
if file.endswith(".json"):
source_path = os.path.join(registry_dir, file)
rel_path = f"registry/{file}"
files_to_package.append((source_path, rel_path))
logger.debug(f" + {rel_path}")
# Add optional files if they exist
optional_files = [
"requirements.txt",
"README.md",
"LICENSE",
"CHANGELOG.md"
]
for filename in optional_files:
file_path = os.path.join(base_dir, filename)
if os.path.exists(file_path):
files_to_package.append((file_path, filename))
logger.debug(f" + {filename}")
logger.info(f"📦 Total files to package: {len(files_to_package)}")
return files_to_package
def create_tarball(files: List[Tuple[str, str]], output_path: str, plugin_name: str) -> str:
"""
Create a .tar.gz archive.
Args:
files: List of (source_path, archive_path) tuples
output_path: Output directory
plugin_name: Base name for the archive
Returns:
Path to created archive
"""
archive_path = os.path.join(output_path, f"{plugin_name}.tar.gz")
logger.info(f"🗜️ Creating tar.gz archive: {archive_path}")
with tarfile.open(archive_path, "w:gz") as tar:
for source_path, archive_path_in_tar in files:
# Add files with plugin name prefix
arcname = f"{plugin_name}/{archive_path_in_tar}"
tar.add(source_path, arcname=arcname)
logger.debug(f" Added: {arcname}")
return archive_path
def create_zip(files: List[Tuple[str, str]], output_path: str, plugin_name: str) -> str:
"""
Create a .zip archive.
Args:
files: List of (source_path, archive_path) tuples
output_path: Output directory
plugin_name: Base name for the archive
Returns:
Path to created archive
"""
archive_path = os.path.join(output_path, f"{plugin_name}.zip")
logger.info(f"🗜️ Creating zip archive: {archive_path}")
with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
for source_path, archive_path_in_zip in files:
# Add files with plugin name prefix
arcname = f"{plugin_name}/{archive_path_in_zip}"
zf.write(source_path, arcname=arcname)
logger.debug(f" Added: {arcname}")
return archive_path
def calculate_checksum(file_path: str) -> Dict[str, str]:
"""
Calculate checksums for the package file.
Args:
file_path: Path to the package file
Returns:
Dictionary with md5 and sha256 checksums
"""
logger.info("🔐 Calculating checksums...")
md5_hash = hashlib.md5()
sha256_hash = hashlib.sha256()
with open(file_path, "rb") as f:
for chunk in iter(lambda: f.read(4096), b""):
md5_hash.update(chunk)
sha256_hash.update(chunk)
checksums = {
"md5": md5_hash.hexdigest(),
"sha256": sha256_hash.hexdigest()
}
logger.info(f" MD5: {checksums['md5']}")
logger.info(f" SHA256: {checksums['sha256']}")
return checksums
def generate_build_report(
config: Dict[str, Any],
valid_entrypoints: List[Dict],
missing_files: List[str],
package_path: str,
checksums: Dict[str, str],
files_count: int
) -> Dict[str, Any]:
"""
Generate build report JSON.
Args:
config: Plugin configuration
valid_entrypoints: List of validated entrypoints
missing_files: List of missing files
package_path: Path to created package
checksums: Package checksums
files_count: Number of files packaged
Returns:
Build report dictionary
"""
file_size = os.path.getsize(package_path)
report = {
"build_timestamp": datetime.now(timezone.utc).isoformat(),
"plugin": {
"name": config.get("name"),
"version": config.get("version"),
"description": config.get("description")
},
"validation": {
"total_commands": len(config.get("commands", [])),
"valid_entrypoints": len(valid_entrypoints),
"missing_files": missing_files,
"has_errors": len(missing_files) > 0
},
"package": {
"path": package_path,
"size_bytes": file_size,
"size_human": f"{file_size / 1024 / 1024:.2f} MB" if file_size > 1024 * 1024 else f"{file_size / 1024:.2f} KB",
"files_count": files_count,
"format": "tar.gz" if package_path.endswith(".tar.gz") else "zip",
"checksums": checksums
},
"entrypoints": valid_entrypoints
}
return report
def generate_manifest(
config: Dict[str, Any],
valid_entrypoints: List[Dict],
checksums: Dict[str, str],
package_path: str
) -> Dict[str, Any]:
"""
Generate manifest.json for the plugin package.
Args:
config: Plugin configuration
valid_entrypoints: List of validated entrypoints
checksums: Package checksums
package_path: Path to created package
Returns:
Manifest dictionary
"""
file_size = os.path.getsize(package_path)
package_filename = os.path.basename(package_path)
manifest = {
"name": config.get("name"),
"version": config.get("version"),
"description": config.get("description"),
"author": config.get("author", {}),
"license": config.get("license"),
"metadata": {
"homepage": config.get("metadata", {}).get("homepage"),
"repository": config.get("metadata", {}).get("repository"),
"documentation": config.get("metadata", {}).get("documentation"),
"tags": config.get("metadata", {}).get("tags", []),
"generated_at": datetime.now(timezone.utc).isoformat()
},
"requirements": {
"python": config.get("requirements", {}).get("python"),
"packages": config.get("requirements", {}).get("packages", [])
},
"permissions": config.get("permissions", []),
"package": {
"filename": package_filename,
"size_bytes": file_size,
"checksums": checksums
},
"entrypoints": [
{
"command": ep["command"],
"handler": ep["handler"],
"runtime": ep["runtime"]
}
for ep in valid_entrypoints
],
"commands_count": len(valid_entrypoints),
"agents": [
{
"name": agent.get("name"),
"description": agent.get("description")
}
for agent in config.get("agents", [])
]
}
return manifest
def create_plugin_preview(config: Dict[str, Any], output_path: str) -> Optional[str]:
"""
Create plugin.preview.yaml with current plugin configuration.
This allows reviewing changes before overwriting plugin.yaml.
Args:
config: Plugin configuration
output_path: Directory to write preview file
Returns:
Path to preview file or None if creation failed
"""
try:
preview_path = os.path.join(output_path, "plugin.preview.yaml")
# Add preview metadata
preview_config = config.copy()
if "metadata" not in preview_config:
preview_config["metadata"] = {}
preview_config["metadata"]["preview_generated_at"] = datetime.now(timezone.utc).isoformat()
preview_config["metadata"]["preview_note"] = "Review before applying to plugin.yaml"
with open(preview_path, "w") as f:
yaml.dump(preview_config, f, default_flow_style=False, sort_keys=False)
logger.info(f"📋 Preview file created: {preview_path}")
return preview_path
except Exception as e:
logger.warning(f"⚠️ Failed to create preview file: {e}")
return None
def build_plugin(
    plugin_path: Optional[str] = None,
    output_format: str = "tar.gz",
    output_dir: Optional[str] = None
) -> Dict[str, Any]:
"""
Main build function.
Args:
plugin_path: Path to plugin.yaml (defaults to ./plugin.yaml)
output_format: Package format (tar.gz or zip)
output_dir: Output directory (defaults to ./dist)
Returns:
Build result dictionary
"""
# Track execution time for telemetry
start_time = datetime.now(timezone.utc)
# Set defaults
if plugin_path is None:
plugin_path = os.path.join(BASE_DIR, "plugin.yaml")
if output_dir is None:
output_dir = os.path.join(BASE_DIR, "dist")
# Normalize format
output_format = output_format.lower()
if output_format not in ["tar.gz", "zip"]:
raise ValueError(f"Unsupported output format: {output_format}. Use 'tar.gz' or 'zip'")
logger.info("🏗️ Starting plugin build...")
logger.info(f"📄 Plugin: {plugin_path}")
logger.info(f"📦 Format: {output_format}")
logger.info(f"📁 Output: {output_dir}")
# Load plugin.yaml
config = load_plugin_yaml(plugin_path)
# Get base directory (parent of plugin.yaml)
base_dir = os.path.dirname(os.path.abspath(plugin_path))
# Validate entrypoints
valid_entrypoints, missing_files = validate_entrypoints(config, base_dir)
if missing_files:
logger.warning(f"⚠️ Found {len(missing_files)} missing files:")
for missing in missing_files:
logger.warning(f" - {missing}")
# Gather files to package
files_to_package = gather_package_files(config, base_dir)
# Create output directory
os.makedirs(output_dir, exist_ok=True)
# Generate package name
plugin_name = config.get("name", "plugin")
plugin_version = config.get("version", "1.0.0")
package_basename = f"{plugin_name}-{plugin_version}"
# Create package
if output_format == "tar.gz":
package_path = create_tarball(files_to_package, output_dir, package_basename)
else:
package_path = create_zip(files_to_package, output_dir, package_basename)
# Calculate checksums
checksums = calculate_checksum(package_path)
# Generate build report
build_report = generate_build_report(
config,
valid_entrypoints,
missing_files,
package_path,
checksums,
len(files_to_package)
)
# Write build report JSON
report_path = os.path.join(output_dir, f"{package_basename}-build-report.json")
with open(report_path, "w") as f:
json.dump(build_report, f, indent=2)
logger.info(f"📊 Build report: {report_path}")
# Generate and write manifest.json
manifest = generate_manifest(config, valid_entrypoints, checksums, package_path)
manifest_path = os.path.join(output_dir, "manifest.json")
with open(manifest_path, "w") as f:
json.dump(manifest, f, indent=2)
logger.info(f"📋 Manifest: {manifest_path}")
# Create plugin preview (optional)
preview_path = create_plugin_preview(config, output_dir)
# Build result
result = {
"ok": not build_report["validation"]["has_errors"],
"status": "success" if not missing_files else "success_with_warnings",
"package_path": package_path,
"report_path": report_path,
"manifest_path": manifest_path,
"preview_path": preview_path,
"build_report": build_report,
"manifest": manifest
}
# Calculate execution duration and capture telemetry
end_time = datetime.now(timezone.utc)
duration_ms = int((end_time - start_time).total_seconds() * 1000)
capture_skill_execution(
skill_name="plugin.build",
inputs={
"plugin_path": plugin_path,
"output_format": output_format,
"output_dir": output_dir
},
status="success" if result["ok"] else "success_with_warnings",
duration_ms=duration_ms,
caller="cli",
plugin_name=config.get("name"),
plugin_version=config.get("version"),
files_count=len(files_to_package),
package_size_bytes=os.path.getsize(package_path),
warnings_count=len(missing_files)
)
return result
@capture_telemetry(skill_name="plugin.build", caller="cli")
def main():
"""Main CLI entry point."""
import argparse
parser = argparse.ArgumentParser(
description="Build deployable Claude Code plugin package"
)
parser.add_argument(
"plugin_path",
nargs="?",
default=None,
help="Path to plugin.yaml (defaults to ./plugin.yaml)"
)
parser.add_argument(
"--format",
choices=["tar.gz", "zip"],
default="tar.gz",
help="Package format (default: tar.gz)"
)
parser.add_argument(
"--output-dir",
default=None,
help="Output directory (default: ./dist)"
)
args = parser.parse_args()
try:
result = build_plugin(
plugin_path=args.plugin_path,
output_format=args.format,
output_dir=args.output_dir
)
# Print summary
report = result["build_report"]
logger.info("\n" + "=" * 60)
logger.info("🎉 BUILD COMPLETE")
logger.info("=" * 60)
logger.info(f"📦 Package: {result['package_path']}")
logger.info(f"📊 Report: {result['report_path']}")
logger.info(f"📋 Manifest: {result['manifest_path']}")
if result.get('preview_path'):
logger.info(f"👁️ Preview: {result['preview_path']}")
logger.info(f"✅ Commands: {report['validation']['valid_entrypoints']}/{report['validation']['total_commands']}")
logger.info(f"📏 Size: {report['package']['size_human']}")
logger.info(f"📝 Files: {report['package']['files_count']}")
if report["validation"]["missing_files"]:
logger.warning(f"⚠️ Warnings: {len(report['validation']['missing_files'])}")
logger.info("=" * 60)
print(json.dumps(result, indent=2))
sys.exit(0 if result["ok"] else 1)
except Exception as e:
logger.error(f"❌ Build failed: {e}")
result = {
"ok": False,
"status": "failed",
"error": str(e)
}
print(json.dumps(result, indent=2))
sys.exit(1)
if __name__ == "__main__":
main()


@@ -0,0 +1,58 @@
name: plugin.build
version: 0.1.0
description: >
Automatically bundle a plugin directory (or the whole repo) into a deployable Claude Code plugin package.
Gathers all declared entrypoints, validates handler files exist, and packages everything into .tar.gz or .zip under /dist.
inputs:
- name: plugin_path
type: string
required: false
description: Path to plugin.yaml (defaults to ./plugin.yaml)
- name: output_format
type: string
required: false
description: Package format (tar.gz or zip, defaults to tar.gz)
- name: output_dir
type: string
required: false
description: Output directory (defaults to ./dist)
outputs:
- name: plugin_package
type: file
description: Packaged plugin archive (.tar.gz or .zip)
- name: build_report
type: object
description: JSON report with validated entrypoints, missing files, and package checksum
dependencies:
- plugin.sync
status: active
entrypoints:
- command: /plugin/build
handler: plugin_build.py
runtime: python
description: >
Bundle plugin directory into deployable package. Validates all entrypoints and handlers before packaging.
parameters:
- name: plugin_path
type: string
required: false
description: Path to plugin.yaml (defaults to ./plugin.yaml)
- name: output_format
type: string
required: false
description: Package format (tar.gz or zip, defaults to tar.gz)
- name: output_dir
type: string
required: false
description: Output directory (defaults to ./dist)
permissions:
- filesystem:read
- filesystem:write
tags:
- plugin
- packaging
- build
- deployment
- distribution