Initial commit

Zhongwei Li
2025-11-30 08:19:24 +08:00
commit 74075be734
22 changed files with 2851 additions and 0 deletions


@@ -0,0 +1,191 @@
---
name: fairdb-backup-manager
description: |
Automatically manages PostgreSQL backups with pgBackRest and Wasabi S3 storage when working with FairDB databases. Activates when you request "fairdb backup manager" functionality.
allowed-tools: Read, Write, Edit, Grep, Glob, Bash
version: 1.0.0
---
# FairDB Backup Manager
## Purpose
I automatically handle all backup-related operations for FairDB PostgreSQL databases, including scheduling, verification, restoration, and monitoring of pgBackRest backups with Wasabi S3 storage.
## Activation Triggers
I activate when you:
- Mention "backup", "restore", "pgbackrest", or "recovery" in the context of FairDB
- Work with PostgreSQL backup configurations
- Need to verify backup integrity
- Discuss disaster recovery or data protection
- Experience data loss or corruption issues
## Core Capabilities
### Backup Operations
- Configure pgBackRest with Wasabi S3
- Execute full, differential, and incremental backups (command sketch after this list)
- Manage backup schedules and retention policies
- Compress and encrypt backup data
- Monitor backup health and success rates
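
As a rough sketch, the pgBackRest invocations behind these operations typically look like the following; the stanza name `main` is an assumption, not something defined by FairDB itself.

```bash
# Full, differential, and incremental backups against an assumed stanza named "main"
pgbackrest --stanza=main --type=full backup
pgbackrest --stanza=main --type=diff backup
pgbackrest --stanza=main --type=incr backup

# Summarize backup history and repository state
pgbackrest --stanza=main info
```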
### Restore Operations
- Perform point-in-time recovery (PITR)
- Restore specific databases or tables
- Test restore procedures without impacting production (sketched after this list)
- Validate restored data integrity
- Document recovery time objectives (RTO)
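
For example, a restore test that leaves production untouched could be sketched like this; the scratch directory and stanza name are assumptions to adapt.

```bash
# Restore the most recent backup into a scratch directory, not the live data directory
sudo -u postgres mkdir -p /var/lib/postgresql/16/restore_test
sudo -u postgres pgbackrest --stanza=main \
  --pg1-path=/var/lib/postgresql/16/restore_test restore

# A scratch instance can then be started against this directory (spare port,
# archiving disabled) to run validation queries before discarding it
```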
### Monitoring & Verification
- Check backup completion status (see the check sketched after this list)
- Verify backup integrity with test restores
- Monitor backup size and growth trends
- Alert on backup failures or delays
- Generate backup compliance reports
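
One way I can script the completion check, assuming `jq` is installed and the stanza is named `main`:

```bash
# "status": {"code": 0, "message": "ok"} in the JSON output means the stanza is healthy
pgbackrest --stanza=main --output=json info | jq '.[0].status'

# Deeper integrity check of repository files (available in pgBackRest 2.31 and later)
pgbackrest --stanza=main verify
```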
## Automated Workflows
When activated, I will:
1. **Assess Current State** (sketched after this list)
- Check existing backup configuration
- Review backup history and success rate
- Identify any failed or missing backups
- Analyze storage usage and costs
2. **Optimize Configuration**
- Adjust retention policies based on requirements
- Configure optimal compression settings
- Set up parallel backup processes
- Implement incremental backup strategies
3. **Execute Operations**
- Run scheduled backups automatically
- Perform test restores monthly
- Clean up old backups per retention policy
- Monitor and alert on issues
4. **Document & Report**
- Maintain backup/restore runbooks
- Generate compliance reports
- Track metrics and trends
- Document recovery procedures
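
A minimal sketch of the assessment step, assuming default pgBackRest log and Debian-style PostgreSQL data locations:

```bash
# Last backups per stanza and overall repository state
pgbackrest info

# Disk headroom on the database host
df -h /var/lib/postgresql

# Recent errors from pgBackRest logs (default log path; adjust if log-path is customized)
grep -i error /var/log/pgbackrest/*.log | tail -n 20
```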
## Integration with FairDB Commands
I work seamlessly with these FairDB commands:
- `/fairdb-setup-backup` - Initial configuration
- `/fairdb-onboard-customer` - Customer-specific backups
- `/fairdb-emergency-response` - Disaster recovery
- `/fairdb-health-check` - Backup health monitoring
## Best Practices I Enforce
### Backup Strategy
- Full backups weekly (Sunday 2 AM; crontab sketch after this list)
- Differential backups daily
- Incremental backups hourly during business hours
- WAL archiving for point-in-time recovery
- Geographical redundancy with Wasabi regions
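
One possible crontab layout for that schedule, assuming backups run as the `postgres` user against a stanza named `main`; the WAL-archiving settings belong in `postgresql.conf`:

```bash
# Crontab entries (install with `crontab -e` as postgres); times are illustrative
0 2 * * 0      pgbackrest --stanza=main --type=full backup   # full, Sunday 02:00
0 2 * * 1-6    pgbackrest --stanza=main --type=diff backup   # differential, Mon-Sat 02:00
0 9-17 * * 1-5 pgbackrest --stanza=main --type=incr backup   # incremental, business hours

# postgresql.conf settings that enable WAL archiving for point-in-time recovery:
#   archive_mode = on
#   archive_command = 'pgbackrest --stanza=main archive-push %p'
```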
### Security
- AES-256 encryption for all backups (configuration sketch after this list)
- Secure key management
- Access control and audit logging
- Encrypted transport to S3
- Immutable backup storage
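
A hedged sketch of a `pgbackrest.conf` repository section for encrypted backups to Wasabi; the endpoint, bucket, region, and paths are placeholders, and real credentials should come from a secrets manager rather than be written into any committed file.

```bash
# Write an illustrative /etc/pgbackrest/pgbackrest.conf (all values are placeholders)
sudo mkdir -p /etc/pgbackrest
sudo tee /etc/pgbackrest/pgbackrest.conf > /dev/null <<'EOF'
[global]
repo1-type=s3
repo1-s3-endpoint=s3.wasabisys.com
repo1-s3-bucket=fairdb-backups
repo1-s3-region=us-east-1
repo1-s3-key=<access-key>
repo1-s3-key-secret=<secret-key>
repo1-path=/pgbackrest
repo1-cipher-type=aes-256-cbc
repo1-cipher-pass=<strong-passphrase>

[main]
pg1-path=/var/lib/postgresql/16/main
EOF

# Restrict the file to the backup user since it holds credentials
sudo chown postgres:postgres /etc/pgbackrest/pgbackrest.conf
sudo chmod 600 /etc/pgbackrest/pgbackrest.conf
```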
### Testing
- Monthly restore tests
- Quarterly disaster recovery drills
- Automated integrity verification
- Performance benchmarking
- Documentation updates
## Proactive Monitoring
I continuously monitor for:
- Backup failures or delays
- Storage capacity issues
- Unusual backup sizes
- Performance degradation
- Compliance violations
## Emergency Response
During data loss incidents, I:
1. Assess the extent of data loss
2. Identify the best recovery point
3. Execute restore procedures
4. Verify data integrity
5. Document incident and recovery
## Reporting
I generate these reports automatically:
- Daily backup status summary
- Weekly storage utilization report
- Monthly compliance audit
- Quarterly DR test results
- Annual backup strategy review
## Cost Optimization
I help reduce backup costs by:
- Implementing intelligent retention policies (settings sketch after this list)
- Using compression effectively
- Cleaning up unnecessary backups
- Optimizing transfer methods
- Selecting appropriate storage tiers
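
These levers map onto a handful of pgBackRest settings; the values below are illustrative, not tuned recommendations.

```bash
# pgbackrest.conf [global] entries (illustrative values):
#   repo1-retention-full=4    # keep four full backups plus their dependent diff/incr sets
#   repo1-retention-diff=7    # with daily differentials, roughly a week of diffs
#   compress-type=zst         # zstd compression (pgBackRest 2.26+)
#   compress-level=3
#   process-max=4             # parallel compression/transfer processes

# Apply retention immediately and remove expired backups from the repository
pgbackrest --stanza=main expire
```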
## Example Scenarios
### Scenario: New Customer Onboarding
```
User: "Setting up database for new customer Acme Corp"
Me: I'll configure automated backups for Acme Corp:
- Creating dedicated pgBackRest stanza
- Setting up daily full backups with 30-day retention
- Configuring WAL archiving for PITR
- Scheduling test restores monthly
- Adding monitoring for backup success
```
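
Behind a response like that, stanza setup for the new customer might look like this sketch; the stanza name `acme_corp` is hypothetical.

```bash
# Create and sanity-check a dedicated stanza for the new customer
sudo -u postgres pgbackrest --stanza=acme_corp stanza-create
sudo -u postgres pgbackrest --stanza=acme_corp check

# Seed the repository with an initial full backup
sudo -u postgres pgbackrest --stanza=acme_corp --type=full backup
```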
### Scenario: Backup Failure Alert
```
System: "Backup failed for customer_db"
Me: I'm investigating the backup failure:
- Checking disk space: OK
- Verifying S3 connectivity: Failed
- Issue: Wasabi credentials expired
- Action: Rotating credentials and retrying backup
- Result: Backup completed successfully
```
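
The connectivity check in that investigation can be reproduced by hand; `customer_db` is the stanza name from the scenario.

```bash
# Exercises WAL archiving and repository (Wasabi S3) access for the stanza;
# expired or rotated credentials typically surface here as S3 authentication errors
sudo -u postgres pgbackrest --stanza=customer_db check

# Retry the failed backup once credentials have been updated in pgbackrest.conf
sudo -u postgres pgbackrest --stanza=customer_db --type=incr backup
```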
### Scenario: Recovery Request
```
User: "Need to restore CustomerX database to yesterday 3 PM"
Me: Initiating point-in-time recovery:
- Target time: 2024-03-14 15:00:00
- Latest backup: Full backup from 2024-03-14 02:00:00
- Applying WAL logs to target time
- Restoration complete in 12 minutes
- Data verified and customer notified
```
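
A sketch of the commands behind that recovery, assuming the customer's cluster uses a stanza named `customerx`; the service name and timestamp offset are assumptions.

```bash
sudo systemctl stop postgresql

# Point-in-time restore to the requested timestamp, promoting once the target is reached
sudo -u postgres pgbackrest --stanza=customerx --delta \
  --type=time --target="2024-03-14 15:00:00+00" \
  --target-action=promote restore

sudo systemctl start postgresql

# Confirm the cluster has left recovery before notifying the customer
sudo -u postgres psql -c "SELECT pg_is_in_recovery();"
```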
## Success Metrics
I track and optimize:
- Backup success rate (target: >99.9%)
- Recovery time objective (target: <1 hour)
- Recovery point objective (target: <5 minutes)
- Storage efficiency (compression ratio >3:1)
- Cost per GB backed up
## Continuous Improvement
I learn from each operation to:
- Refine backup schedules
- Improve recovery procedures
- Optimize resource usage
- Enhance monitoring alerts
- Update documentation


@@ -0,0 +1,26 @@
# Skill Assets
This directory contains static assets used by this skill.
## Purpose
Assets can include:
- Configuration files (JSON, YAML)
- Data files
- Templates
- Schemas
- Test fixtures
## Guidelines
- Keep assets small and focused
- Document asset purpose and format
- Use standard file formats
- Include schema validation where applicable
## Common Asset Types
- **config.json** - Configuration templates
- **schema.json** - JSON schemas
- **template.yaml** - YAML templates
- **test-data.json** - Test fixtures


@@ -0,0 +1,26 @@
# Skill References
This directory contains reference materials that enhance this skill's capabilities.
## Purpose
References can include:
- Code examples
- Style guides
- Best practices documentation
- Template files
- Configuration examples
## Guidelines
- Keep references concise and actionable
- Use markdown for documentation
- Include clear examples
- Link to external resources when appropriate
## Types of References
- **examples.md** - Usage examples
- **style-guide.md** - Coding standards
- **templates/** - Reusable templates
- **patterns.md** - Design patterns


@@ -0,0 +1,24 @@
# Skill Scripts
This directory contains optional helper scripts that support this skill's functionality.
## Purpose
Scripts here can be:
- Referenced by the skill for automation
- Used as examples for users
- Executed during skill activation
## Guidelines
- All scripts should be well-documented
- Include usage examples in comments
- Make scripts executable (`chmod +x`)
- Use `#!/bin/bash` or `#!/usr/bin/env python3` shebangs
## Adding Scripts
1. Create script file (e.g., `analyze.sh`, `process.py`)
2. Add documentation header
3. Make executable: `chmod +x script-name.sh`
4. Test thoroughly before committing


@@ -0,0 +1,7 @@
# Assets
Bundled resources for fairdb-operations-kit skill
- [ ] `customer_onboarding_template.md`: Template for customer onboarding documentation.
- [ ] `health_check_report_template.json`: Template for health check reports.
- [ ] `incident_response_checklist.md`: Checklist for incident response procedures.


@@ -0,0 +1,32 @@
{
  "skill": {
    "name": "skill-name",
    "version": "1.0.0",
    "enabled": true,
    "settings": {
      "verbose": false,
      "autoActivate": true,
      "toolRestrictions": true
    }
  },
  "triggers": {
    "keywords": [
      "example-trigger-1",
      "example-trigger-2"
    ],
    "patterns": []
  },
  "tools": {
    "allowed": [
      "Read",
      "Grep",
      "Bash"
    ],
    "restricted": []
  },
  "metadata": {
    "author": "Plugin Author",
    "category": "general",
    "tags": []
  }
}


@@ -0,0 +1,28 @@
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "Claude Skill Configuration",
  "type": "object",
  "required": ["name", "description"],
  "properties": {
    "name": {
      "type": "string",
      "pattern": "^[a-z0-9-]+$",
      "maxLength": 64,
      "description": "Skill identifier (lowercase, hyphens only)"
    },
    "description": {
      "type": "string",
      "maxLength": 1024,
      "description": "What the skill does and when to use it"
    },
    "allowed-tools": {
      "type": "string",
      "description": "Comma-separated list of allowed tools"
    },
    "version": {
      "type": "string",
      "pattern": "^\\d+\\.\\d+\\.\\d+$",
      "description": "Semantic version (x.y.z)"
    }
  }
}


@@ -0,0 +1,27 @@
{
  "testCases": [
    {
      "name": "Basic activation test",
      "input": "trigger phrase example",
      "expected": {
        "activated": true,
        "toolsUsed": ["Read", "Grep"],
        "success": true
      }
    },
    {
      "name": "Complex workflow test",
      "input": "multi-step trigger example",
      "expected": {
        "activated": true,
        "steps": 3,
        "toolsUsed": ["Read", "Write", "Bash"],
        "success": true
      }
    }
  ],
  "fixtures": {
    "sampleInput": "example data",
    "expectedOutput": "processed result"
  }
}


@@ -0,0 +1,11 @@
# References
Bundled resources for fairdb-operations-kit skill
- [ ] `contabo_api_reference.md`: Contabo API documentation for VPS provisioning.
- [ ] `postgres_configuration.md`: PostgreSQL 16 configuration best practices.
- [ ] `pgbackrest_configuration.md`: pgBackRest configuration guide.
- [ ] `wasabi_s3_configuration.md`: Wasabi S3 storage setup for backups.
- [ ] `sop_001.md`: Standard Operating Procedure for VPS provisioning.
- [ ] `sop_002.md`: Standard Operating Procedure for PostgreSQL installation.
- [ ] `sop_003.md`: Standard Operating Procedure for backup configuration.


@@ -0,0 +1,69 @@
# Skill Best Practices
Guidelines for optimal skill usage and development.
## For Users
### Activation Best Practices
1. **Use Clear Trigger Phrases**
- Match phrases from skill description
- Be specific about intent
- Provide necessary context
2. **Provide Sufficient Context**
- Include relevant file paths
- Specify scope of analysis
- Mention any constraints
3. **Understand Tool Permissions**
- Check allowed-tools in frontmatter
- Know what the skill can/cannot do
- Request appropriate actions
### Workflow Optimization
- Start with simple requests
- Build up to complex workflows
- Verify each step before proceeding
- Use skill consistently for related tasks
## For Developers
### Skill Development Guidelines
1. **Clear Descriptions**
- Include explicit trigger phrases
- Document all capabilities
- Specify limitations
2. **Proper Tool Permissions**
- Use minimal necessary tools
- Document security implications
- Test with restricted tools
3. **Comprehensive Documentation**
- Provide usage examples
- Document common pitfalls
- Include troubleshooting guide
### Maintenance
- Keep version updated
- Test after tool updates
- Monitor user feedback
- Iterate on descriptions
## Performance Tips
- Scope skills to specific domains
- Avoid overlapping trigger phrases
- Keep descriptions under 1024 chars
- Test activation reliability
## Security Considerations
- Never include secrets in skill files
- Validate all inputs
- Use read-only tools when possible
- Document security requirements


@@ -0,0 +1,70 @@
# Skill Usage Examples
This document provides practical examples of how to use this skill effectively.
## Basic Usage
### Example 1: Simple Activation
**User Request:**
```
[Describe trigger phrase here]
```
**Skill Response:**
1. Analyzes the request
2. Performs the required action
3. Returns results
### Example 2: Complex Workflow
**User Request:**
```
[Describe complex scenario]
```
**Workflow:**
1. Step 1: Initial analysis
2. Step 2: Data processing
3. Step 3: Result generation
4. Step 4: Validation
## Advanced Patterns
### Pattern 1: Chaining Operations
Combine this skill with other tools:
```
Step 1: Use this skill for [purpose]
Step 2: Chain with [other tool]
Step 3: Finalize with [action]
```
### Pattern 2: Error Handling
If issues occur:
- Check trigger phrase matches
- Verify context is available
- Review allowed-tools permissions
## Tips & Best Practices
- ✅ Be specific with trigger phrases
- ✅ Provide necessary context
- ✅ Check tool permissions match needs
- ❌ Avoid vague requests
- ❌ Don't mix unrelated tasks
## Common Issues
**Issue:** Skill doesn't activate
**Solution:** Use exact trigger phrases from description
**Issue:** Unexpected results
**Solution:** Check input format and context
## See Also
- Main SKILL.md for full documentation
- scripts/ for automation helpers
- assets/ for configuration examples


@@ -0,0 +1,10 @@
# Scripts
Bundled resources for fairdb-operations-kit skill
- [ ] `fairdb_provision_vps.sh`: Automates VPS provisioning using the Contabo API.
- [ ] `fairdb_install_postgres.sh`: Installs and configures PostgreSQL 16.
- [ ] `fairdb_setup_backup.sh`: Configures pgBackRest with Wasabi S3 storage.
- [ ] `fairdb_onboard_customer.sh`: Automates the customer onboarding process.
- [ ] `fairdb_health_check.sh`: Performs comprehensive system health verification.
- [ ] `fairdb_emergency_response.sh`: Guides incident response procedures.


@@ -0,0 +1,42 @@
#!/bin/bash
# Helper script template for skill automation
# Customize this for your skill's specific needs

set -e

function show_usage() {
    echo "Usage: $0 [options]"
    echo ""
    echo "Options:"
    echo "  -h, --help     Show this help message"
    echo "  -v, --verbose  Enable verbose output"
    echo ""
}

# Parse arguments
VERBOSE=false
while [[ $# -gt 0 ]]; do
    case $1 in
        -h|--help)
            show_usage
            exit 0
            ;;
        -v|--verbose)
            VERBOSE=true
            shift
            ;;
        *)
            echo "Unknown option: $1"
            show_usage
            exit 1
            ;;
    esac
done

# Your skill logic here
if [ "$VERBOSE" = true ]; then
    echo "Running skill automation..."
fi

echo "✅ Complete"


@@ -0,0 +1,32 @@
#!/bin/bash
# Skill validation helper
# Validates skill activation and functionality

set -e

echo "🔍 Validating skill..."

# Check if SKILL.md exists
if [ ! -f "../SKILL.md" ]; then
    echo "❌ Error: SKILL.md not found"
    exit 1
fi

# Validate frontmatter
if ! grep -q "^---$" "../SKILL.md"; then
    echo "❌ Error: No frontmatter found"
    exit 1
fi

# Check required fields
if ! grep -q "^name:" "../SKILL.md"; then
    echo "❌ Error: Missing 'name' field"
    exit 1
fi

if ! grep -q "^description:" "../SKILL.md"; then
    echo "❌ Error: Missing 'description' field"
    exit 1
fi

echo "✅ Skill validation passed"