Initial commit
This commit is contained in:
24 .claude-plugin/plugin.json Normal file
@@ -0,0 +1,24 @@
{
  "name": "asyncpg-to-sqlalchemy-converter",
  "description": "Convert asyncpg code in FastAPI projects to SQLAlchemy with asyncpg engine, supporting Supabase integration and lazy loading",
  "version": "1.0.0",
  "author": {
    "name": "Claude Code Plugin Developer",
    "email": "dev@example.com"
  },
  "skills": [
    "./skills"
  ],
  "agents": [
    "./agents"
  ],
  "commands": [
    "./commands"
  ],
  "hooks": [
    "./hooks"
  ],
  "mcp": [
    "./.mcp.json"
  ]
}
13 .mcp.json Normal file
@@ -0,0 +1,13 @@
{
  "mcpServers": {
    "supabase-db": {
      "command": "python3",
      "args": [
        "/Users/kevinhill/.claude/plugins/marketplaces/claude-code-plugins/plugins/plugin-dev/scripts/supabase-mcp-server.py"
      ],
      "env": {
        "PYTHONPATH": "/Users/kevinhill/.claude/plugins/marketplaces/claude-code-plugins/plugins/plugin-dev"
      }
    }
  }
}
3 README.md Normal file
@@ -0,0 +1,3 @@
# asyncpg-to-sqlalchemy-converter

Convert asyncpg code in FastAPI projects to SQLAlchemy with asyncpg engine, supporting Supabase integration and lazy loading
196 agents/conversion-analyzer.md Normal file
@@ -0,0 +1,196 @@
# Conversion Analyzer Agent

A specialized AI agent that analyzes asyncpg code patterns and determines optimal SQLAlchemy conversion strategies. This agent handles complex conversion scenarios and edge cases, and provides detailed migration planning.

## Capabilities

### Code Pattern Analysis
- Detects complex asyncpg usage patterns beyond simple detection
- Analyzes query performance implications
- Identifies conversion complexity and potential issues
- Evaluates dependency chains and import relationships

### Conversion Strategy Planning
- Creates detailed conversion plans with priorities
- Identifies files that require manual intervention
- Suggests optimal SQLAlchemy patterns for specific use cases
- Plans testing and validation strategies

### Risk Assessment
- Evaluates potential breaking changes
- Identifies performance bottlenecks introduced by conversion
- Assesses data-loss risks during migration
- Provides rollback strategies

### Optimization Recommendations
- Suggests performance improvements during conversion
- Identifies opportunities for better async patterns
- Recommends Supabase-specific optimizations
- Evaluates connection pooling strategies
## Usage Patterns

### Complex Conversion Analysis
When the detection phase identifies complex asyncpg patterns that require careful analysis:

```bash
# Analyze specific complex files
/agent:conversion-analyzer analyze --file ./src/database.py --complexity high

# Analyze entire project with detailed reporting
/agent:conversion-analyzer analyze --path ./src --detailed-report
```

### Risk Assessment
Before performing large-scale conversions:

```bash
# Assess conversion risks
/agent:conversion-analyzer risk-assessment --path ./src --report-format json

# Generate rollback plan
/agent:conversion-analyzer rollback-plan --backup-path ./backup
```

### Performance Impact Analysis
For performance-critical applications:

```bash
# Analyze performance impact of conversion
/agent:conversion-analyzer performance-analysis --baseline ./current_code

# Generate optimization recommendations
/agent:conversion-analyzer optimize-recommendations --target-profile production
```
## Analysis Features

### Deep Code Analysis
- Understands asyncpg transaction patterns
- Identifies custom connection pooling logic
- Detects manual query building and optimization
- Analyzes error handling and retry logic

### Dependency Mapping
- Maps asyncpg dependencies across modules
- Identifies shared database connection patterns
- Analyzes middleware and dependency injection
- Evaluates testing code dependencies

### Conversion Complexity Scoring
- **Low Complexity**: Simple queries with standard patterns
- **Medium Complexity**: Custom queries with moderate complexity
- **High Complexity**: Advanced patterns, custom connection handling
- **Critical**: Complex transaction logic, performance-critical code

### Manual Intervention Requirements
- Complex query optimization patterns
- Custom asyncpg extensions or wrappers
- Performance-critical database operations
- Business logic embedded in database operations
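The kind of scoring described above can be approximated with a small static scan. The sketch below is illustrative only (the function name, signal weights, and thresholds are assumptions, not the agent's actual implementation); it walks a module's AST and weights connection and transaction handling more heavily than plain queries:

```python
import ast

def score_asyncpg_complexity(source: str) -> str:
    """Rough complexity score for a module's asyncpg usage (illustrative heuristic)."""
    tree = ast.parse(source)
    signals = 0
    for node in ast.walk(tree):
        # Direct asyncpg imports are the baseline signal
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            names = [alias.name for alias in node.names]
            module = getattr(node, "module", "") or ""
            if "asyncpg" in names or module.startswith("asyncpg"):
                signals += 1
        # Method calls like conn.fetch(...) or pool.acquire()
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Attribute):
            if node.func.attr in {"fetch", "fetchrow", "fetchval", "execute"}:
                signals += 1
            if node.func.attr in {"transaction", "acquire", "create_pool"}:
                signals += 2  # connection/transaction handling is harder to convert
    if signals == 0:
        return "none"
    if signals <= 3:
        return "low"
    if signals <= 8:
        return "medium"
    return "high"
```

A real scorer would also track custom wrappers and cross-module dependencies, but the structure, count weighted syntactic signals and bucket them, is the same.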
## Output Reports

### Conversion Plan Report
```json
{
  "conversion_plan": {
    "total_files": 45,
    "complexity_breakdown": {
      "low": 32,
      "medium": 10,
      "high": 2,
      "critical": 1
    },
    "recommended_approach": "incremental",
    "estimated_time": "4-6 hours",
    "manual_intervention_files": ["src/database.py", "src/complex_queries.py"]
  }
}
```

### Risk Assessment Report
```json
{
  "risk_assessment": {
    "overall_risk": "medium",
    "breaking_changes": 3,
    "performance_impact": "minimal",
    "data_loss_risk": "low",
    "rollback_feasibility": "high"
  }
}
```

### Performance Impact Report
```json
{
  "performance_analysis": {
    "query_performance": "maintained_or_improved",
    "connection_efficiency": "improved",
    "memory_usage": "reduced",
    "recommendations": [
      "Implement connection pooling",
      "Add query result caching",
      "Optimize batch operations"
    ]
  }
}
```
## Integration with Other Components

### Works with Detection Skill
- Takes detection results as input for deeper analysis
- Provides detailed conversion strategies for detected patterns
- Prioritizes conversion order based on complexity and dependencies

### Supports Conversion Skill
- Provides detailed conversion guidance
- Suggests optimal SQLAlchemy patterns
- Identifies edge cases that require special handling

### Enhances Validation Skill
- Provides validation criteria for converted code
- Identifies test scenarios based on original patterns
- Suggests performance benchmarks

## Advanced Features

### Machine Learning Pattern Recognition
- Learns from conversion patterns across multiple projects
- Improves complexity scoring over time
- Identifies common pitfalls and optimization opportunities
- Provides pattern-based conversion recommendations

### Multi-Project Analysis
- Can analyze dependencies across multiple services
- Coordinates conversions for microservices architectures
- Manages database schema changes across services
- Coordinates testing across service boundaries

### Custom Rule Engine
- Supports custom conversion rules for specific projects
- Allows organization-specific patterns and conventions
- Integrates with existing code quality tools
- Supports compliance and security requirements

## Best Practices

### When to Use
- Large codebases with complex asyncpg usage
- Performance-critical applications requiring careful conversion
- Projects with custom database logic and optimizations
- Organizations with strict compliance requirements

### Integration Workflow
1. Run the detection phase first to identify patterns
2. Use the conversion analyzer for complex patterns
3. Follow the recommended conversion plan
4. Use validation to ensure a successful conversion

### Customization
- Can be configured with project-specific rules
- Supports custom complexity scoring criteria
- Integrates with existing development workflows
- Provides an API for integration with CI/CD pipelines
244 agents/schema-reflector.md Normal file
@@ -0,0 +1,244 @@
# Schema Reflector Agent

A specialized AI agent that performs comprehensive database schema reflection, analyzes existing database structures, and generates optimized SQLAlchemy model definitions with proper relationships, constraints, and performance optimizations.

## Capabilities

### Database Schema Analysis
- Connects to PostgreSQL/Supabase databases and reflects the complete schema
- Analyzes tables, columns, constraints, indexes, and relationships
- Handles complex schemas, including inheritance, partitions, and extensions
- Supports multiple schemas and custom types

### Intelligent Model Generation
- Generates SQLAlchemy models with proper type mappings and constraints
- Creates bidirectional relationships with optimal loading strategies
- Handles Supabase-specific features (UUIDs, JSONB, RLS policies)
- Optimizes for performance with lazy loading and efficient querying

### Schema Documentation
- Creates comprehensive documentation of the database structure
- Documents business logic embedded in schema constraints
- Identifies potential issues and optimization opportunities
- Generates visual schema diagrams and relationship maps

### Performance Optimization
- Analyzes query patterns and suggests optimal indexing
- Identifies N+1 query problems and suggests solutions
- Recommends connection pooling configurations
- Suggests denormalization opportunities for performance
## Usage Patterns

### Complete Schema Reflection
For generating models from existing databases:

```bash
# Reflect entire database
/agent:schema-reflector reflect --connection-string $DATABASE_URL --output ./models/

# Reflect specific schema
/agent:schema-reflector reflect --schema public --output ./models/base.py

# Reflect with Supabase optimizations
/agent:schema-reflector reflect --supabase --rls-aware --output ./models/supabase.py
```

### Incremental Schema Updates
For updating existing models when the schema changes:

```bash
# Update existing models
/agent:schema-reflector update --existing-models ./models/ --connection-string $DATABASE_URL

# Generate migration scripts
/agent:schema-reflector generate-migration --from-schema ./current_schema.json --to-schema ./new_schema.json
```

### Schema Analysis and Optimization
For performance tuning and optimization:

```bash
# Analyze performance issues
/agent:schema-reflector analyze-performance --connection-string $DATABASE_URL --report

# Suggest optimizations
/agent:schema-reflector optimize --connection-string $DATABASE_URL --recommendations

# Generate indexing strategy
/agent:schema-reflector indexing-strategy --query-log ./slow_queries.log
```
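Under the hood, this kind of reflection rests on SQLAlchemy's inspection API. The sketch below is a minimal illustration, not the agent's actual implementation; it uses an in-memory SQLite database as a stand-in so it runs anywhere, whereas the agent would be handed a Postgres/Supabase URL such as `postgresql://user:pass@host:5432/db`:

```python
from sqlalchemy import create_engine, inspect, text

def describe_schema(engine) -> dict[str, list[str]]:
    """Return {table_name: [column names]} for every table the engine can see."""
    inspector = inspect(engine)
    return {
        table: [col["name"] for col in inspector.get_columns(table)]
        for table in inspector.get_table_names()
    }

# In-memory SQLite stand-in for a real Postgres/Supabase connection
engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT NOT NULL)"))
```

From this raw table/column map, a reflector would then layer on foreign keys (`inspector.get_foreign_keys`), indexes, and constraints before emitting model classes.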
## Advanced Features

### Multi-Schema Support
- Handles complex databases with multiple schemas
- Maintains schema separation in generated models
- Supports cross-schema relationships
- Handles schema-specific configurations and permissions

### Custom Type Handling
- Maps PostgreSQL custom types to SQLAlchemy types
- Handles enum types and domain constraints
- Supports array types and JSONB operations
- Creates custom type definitions when needed

### Supabase Integration
- Handles Supabase-specific table types and extensions
- Integrates with Supabase auth tables
- Understands Supabase RLS policy implications
- Optimizes for Supabase connection pooling

### Performance-Aware Generation
- Generates models optimized for common query patterns
- Implements efficient relationship loading strategies
- Suggests optimal indexing strategies
- Identifies potential performance bottlenecks
## Output Formats

### SQLAlchemy Models
```python
# Generated model with relationships
class User(Base):
    __tablename__ = "users"

    id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    email = Column(String(255), unique=True, nullable=False, index=True)
    created_at = Column(DateTime(timezone=True), server_default=func.now())

    # Optimized relationships
    profiles = relationship("Profile", back_populates="user", lazy="selectin")
    posts = relationship("Post", back_populates="author", lazy="dynamic")
```

### Schema Documentation
```markdown
## Database Schema Documentation

### Users Table
- **Purpose**: User authentication and profile management
- **Primary Key**: UUID (auto-generated)
- **Indexes**: Unique index on email; created_at for sorting
- **Relationships**: One-to-many with profiles and posts
- **Constraints**: Email must be a valid email format
- **Business Logic**: Users can have multiple profiles for different contexts
```

### Performance Analysis Report
```json
{
  "performance_analysis": {
    "query_patterns": {
      "frequent_queries": [
        "SELECT * FROM users WHERE email = ?",
        "SELECT users.*, profiles.* FROM users JOIN profiles ON users.id = profiles.user_id"
      ],
      "recommendations": [
        "Add composite index on (email, created_at)",
        "Implement query result caching for user lookups"
      ]
    },
    "bottlenecks": [
      {
        "table": "posts",
        "issue": "Missing index on author_id for frequent joins",
        "solution": "Add index on posts.author_id"
      }
    ]
  }
}
```

### Migration Scripts
```python
# Alembic migration script
def upgrade():
    # Add new column
    op.add_column('users', sa.Column('last_login', sa.DateTime(timezone=True), nullable=True))

    # Create index for performance
    op.create_index('ix_users_email_created', 'users', ['email', 'created_at'], unique=False)


def downgrade():
    op.drop_index('ix_users_email_created', table_name='users')
    op.drop_column('users', 'last_login')
```
## Integration with Other Components

### Works with Model Generation Command
- Provides core reflection functionality for model generation
- Handles complex schema scenarios beyond basic reflection
- Generates optimized models with performance considerations

### Supports Validation Agent
- Provides schema validation capabilities
- Identifies inconsistencies between models and the database
- Validates relationships and constraints

### Enhances Supabase Integration
- Understands Supabase-specific schema patterns
- Optimizes for Supabase performance characteristics
- Handles Supabase auth and storage integration

## Advanced Configuration

### Custom Type Mappings
```python
# Custom type mapping configuration
TYPE_MAPPINGS = {
    "custom_enum": "sqlalchemy.Enum",
    "vector": "pgvector.Vector",
    "tsvector": "sqlalchemy.dialects.postgresql.TSVECTOR"
}
```

### Relationship Loading Strategies
```python
# Map each loading strategy to the scenario it suits best
RELATIONSHIP_CONFIG = {
    "selectin": "small_result_sets",
    "joined": "always_needed",
    "subquery": "large_result_sets",
    "dynamic": "large_collections"
}
```

### Performance Optimization Rules
```python
# Custom optimization rules
OPTIMIZATION_RULES = {
    "index_foreign_keys": True,
    "add_composite_indexes": True,
    "optimize_date_queries": True,
    "cache_frequent_lookups": True
}
```

## Best Practices

### When to Use
- New projects starting from existing databases
- Migrating projects with complex schemas
- Performance optimization of existing SQLAlchemy models
- Documentation and analysis of legacy databases

### Integration Workflow
1. Connect to the database and analyze the schema structure
2. Generate initial models with basic relationships
3. Analyze query patterns and optimize models
4. Create migration scripts for schema changes
5. Validate generated models against the database

### Performance Considerations
- Use lazy loading strategies appropriate to data sizes
- Implement proper indexing based on query patterns
- Consider connection pooling for high-traffic applications
- Monitor performance after deployment and optimize as needed

### Schema Evolution
- Handle schema changes gracefully with migrations
- Maintain backward compatibility when possible
- Test migrations thoroughly before deployment
- Document schema changes and their implications
64 commands/convert-project.md Normal file
@@ -0,0 +1,64 @@
Convert asyncpg FastAPI project to SQLAlchemy async patterns

This command analyzes a FastAPI project, detects all asyncpg usage patterns, and systematically converts them to SQLAlchemy 2.0+ with async support while maintaining full functionality.

## Usage

```bash
/convert-asyncpg-to-sqlalchemy [options]
```

## Options

- `--path <directory>`: Project directory to analyze (default: current directory)
- `--backup <directory>`: Backup location before conversion (default: ./backup_asyncpg)
- `--supabase`: Enable Supabase-specific optimizations and integrations
- `--models-only`: Only convert models, skip utility functions
- `--dry-run`: Preview changes without modifying files
- `--interactive`: Prompt for confirmation on major changes
## Process

### Phase 1: Detection & Analysis
1. Scan all Python files for asyncpg imports and usage patterns
2. Analyze connection methods, query patterns, and transaction handling
3. Generate a detailed conversion report with a complexity assessment

### Phase 2: Backup Creation
1. Create a complete backup of the original code
2. Generate a conversion log for rollback capabilities
3. Document all detected patterns and planned changes

### Phase 3: Systematic Conversion
1. Update imports from asyncpg to SQLAlchemy
2. Convert connection patterns to async session management
3. Transform query syntax (fetch → execute, parameter binding)
4. Update transaction handling patterns
5. Convert error handling to SQLAlchemy exceptions
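The parameter-binding step in Phase 3 (asyncpg's positional `$1` placeholders become SQLAlchemy named binds) can be pictured as a string transform. This is a deliberately naive sketch, not the command's actual rewriter; the `:pN` naming convention is an assumption, and the regex does not guard against `$N` appearing inside string literals:

```python
import re

def convert_placeholders(query: str) -> str:
    """Rewrite asyncpg-style $N placeholders as SQLAlchemy named binds (:pN).

    Naive: would also rewrite a $N occurring inside a quoted literal.
    """
    return re.sub(r"\$(\d+)", r":p\1", query)

# Before (asyncpg):   await conn.fetch("SELECT * FROM users WHERE id = $1", user_id)
# After (SQLAlchemy): await session.execute(text("SELECT * FROM users WHERE id = :p1"),
#                                           {"p1": user_id})
```

The surrounding call also changes shape: `fetch`/`fetchrow` become `session.execute(...)` followed by `result.all()` or `result.first()`, which is why the command rewrites the call site together with the query string.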
### Phase 4: Validation
1. Syntax validation of converted code
2. Import verification and dependency checking
3. Basic functionality testing of converted patterns

### Phase 5: Documentation
1. Generate a conversion summary report
2. Create a migration guide with before/after examples
3. Document any manual intervention requirements
## Examples

Convert the current directory with Supabase support:
```bash
/convert-asyncpg-to-sqlalchemy --supabase
```

Dry run to preview changes:
```bash
/convert-asyncpg-to-sqlalchemy --dry-run --path ./my-fastapi-app
```

Interactive conversion with a custom backup location:
```bash
/convert-asyncpg-to-sqlalchemy --path ./src --backup ./original_code --interactive
```
92 commands/create-session.md Normal file
@@ -0,0 +1,92 @@
Create SQLAlchemy async session management setup

This command generates a complete async session management configuration for FastAPI projects, including dependency injection, connection pooling, error handling, and Supabase integration patterns.

## Usage

```bash
/create-async-session [options]
```

## Options

- `--output <directory>`: Output directory for session files (default: ./database)
- `--supabase`: Include Supabase-specific configurations
- `--pool-size <number>`: Connection pool size (default: 10)
- `--max-overflow <number>`: Maximum overflow connections (default: 0)
- `--testing`: Include testing configuration and fixtures
- `--migrations`: Include Alembic migration setup
- `--docker`: Generate Docker Compose configuration

## Generated Components

### Core Session Management
- Async engine configuration with proper connection pooling
- Async session factory setup
- FastAPI dependency injection patterns
- Connection lifecycle management
### Database Configuration
- Environment-based configuration management
- Secure connection string handling
- Pool optimization for different deployment targets
- Serverless environment optimizations

### Error Handling & Monitoring
- Database error handling patterns
- Connection retry logic with exponential backoff
- Health check endpoints for database connectivity
- Logging and monitoring setup

### Testing Support
- In-memory database configuration for testing
- Test fixtures and utilities
- Transaction rollback testing patterns
- Mock session providers

### Supabase Integration (optional)
- Supabase auth integration with RLS
- Service key management
- Row Level Security context handling
- Supabase-specific connection optimizations
## Examples

Create a basic session setup:
```bash
/create-async-session --output ./src/database
```

Create Supabase-enabled session management:
```bash
/create-async-session --supabase --pool-size 20 --testing
```

Complete setup with migrations and Docker:
```bash
/create-async-session --testing --migrations --docker --supabase
```

## Generated Files

### Core Files
- `database.py` - Main database configuration and session factory
- `dependencies.py` - FastAPI dependency injection patterns
- `config.py` - Environment-based configuration management
- `exceptions.py` - Custom database exception handlers

### Optional Files
- `testing.py` - Testing configuration and fixtures
- `migrations/` - Alembic migration setup
- `docker-compose.yml` - Database container configuration
- `supabase_integration.py` - Supabase-specific integration patterns

### Features
- Async session management with proper cleanup
- Connection pooling optimized for different environments
- Error handling with retry mechanisms
- Testing utilities with in-memory database support
- Supabase auth and RLS integration
- Health check endpoints and monitoring
- Docker development environment setup
- Comprehensive logging and debugging support
78 commands/generate-models.md Normal file
@@ -0,0 +1,78 @@
Generate SQLAlchemy models from database schema

This command connects to your database (PostgreSQL/Supabase), reflects the schema structure, and generates complete SQLAlchemy model definitions with proper relationships, constraints, and type mappings.

## Usage

```bash
/generate-sqlalchemy-models [options]
```

## Options

- `--url <connection_string>`: Database connection string (falls back to the SUPABASE_URL env var)
- `--schema <name>`: Schema to reflect (default: public)
- `--output <file>`: Output file for generated models (default: models.py)
- `--base-class <name>`: Base class for all models (default: Base)
- `--lazy-load`: Enable lazy loading for large schemas
- `--include-extensions`: Include table relationships from database extensions
- `--supabase-optimize`: Optimize for Supabase-specific features (RLS, UUIDs, etc.)
## Schema Reflection Features

### Automatic Type Detection
- Maps PostgreSQL types to SQLAlchemy types
- Handles Supabase-specific types and defaults (uuid_generate_v4(), jsonb, timestamptz)
- Detects auto-incrementing primary keys and sequences

### Relationship Generation
- Automatically detects foreign key constraints
- Creates bidirectional relationships with proper back_populates
- Handles many-to-many relationships through junction tables

### Constraint Mapping
- Primary key constraints (composite keys supported)
- Unique constraints and indexes
- Check constraints and default values
- NOT NULL constraints and nullable columns

### Supabase Integration
- Row Level Security (RLS) policy hints
- Supabase auth user table relationships
- Storage bucket integration patterns
- Webhook table handling
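The type-detection step above can be pictured as a lookup from PostgreSQL type names to SQLAlchemy type expressions. The table below is an illustrative subset (the dict name, entries, and Text fallback are assumptions; the real command handles many more types and modifiers):

```python
# Illustrative subset of a PostgreSQL -> SQLAlchemy type map
PG_TO_SQLALCHEMY = {
    "uuid": "UUID(as_uuid=True)",
    "text": "Text",
    "varchar": "String",
    "integer": "Integer",
    "bigint": "BigInteger",
    "boolean": "Boolean",
    "jsonb": "JSONB",
    "timestamptz": "DateTime(timezone=True)",
}

def map_column_type(pg_type: str) -> str:
    """Return a SQLAlchemy type expression for a PostgreSQL type name."""
    base = pg_type.split("(")[0].strip().lower()  # drop modifiers like varchar(255)
    return PG_TO_SQLALCHEMY.get(base, "Text")  # conservative fallback for unknown types
```

A generator would call this per reflected column and splice the result into the `Column(...)` definition it emits.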
## Examples

Generate models from Supabase:
```bash
/generate-sqlalchemy-models --url "postgresql+asyncpg://user:pass@host:5432/db" --supabase-optimize
```

Generate for a specific schema with lazy loading:
```bash
/generate-sqlalchemy-models --schema analytics --output analytics_models.py --lazy-load
```

Reflect all schemas with extensions:
```bash
/generate-sqlalchemy-models --include-extensions --base-class CustomBase
```

## Output Format

The command generates:
- SQLAlchemy model classes with proper type hints
- Column definitions with constraints and defaults
- Relationship definitions with cascade options
- Import statements and the base class definition
- An optional migration script for existing code

## Generated Features

- Type hints for all columns and relationships
- Proper `__repr__` methods for debugging
- Validation methods for common use cases
- Supabase-specific optimizations
- Lazy loading support for large schemas
- JSON serialization methods for API responses
140 commands/validate-conversion.md Normal file
@@ -0,0 +1,140 @@
Validate SQLAlchemy conversion and test functionality

This command validates that the asyncpg to SQLAlchemy conversion was successful by running comprehensive tests, checking syntax validity, verifying database connectivity, and ensuring all functionality works as expected.

## Usage

```bash
/validate-sqlalchemy-conversion [options]
```

## Options

- `--path <directory>`: Project directory to validate (default: current directory)
- `--connection-string <url>`: Database connection for testing (required)
- `--test-data`: Run tests with sample data
- `--performance`: Include performance benchmarks
- `--supabase`: Include Supabase-specific validations
- `--detailed`: Provide a detailed validation report
- `--fix-issues`: Attempt to automatically fix detected issues
## Validation Categories

### Syntax & Import Validation
- Check all Python files for syntax errors
- Verify SQLAlchemy imports are correct
- Validate async/await usage patterns
- Check for proper type hints and annotations
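The syntax check in this category amounts to parsing every file and recording the failures. A minimal sketch (the function name is illustrative; the real command layers import and async-pattern checks on top of this):

```python
import ast
from pathlib import Path

def find_syntax_errors(root: str) -> list[tuple[str, str]]:
    """Parse every .py file under root; return (path, error) pairs for failures."""
    errors = []
    for path in Path(root).rglob("*.py"):
        try:
            ast.parse(path.read_text(encoding="utf-8"), filename=str(path))
        except SyntaxError as exc:
            errors.append((str(path), str(exc)))
    return errors
```

An empty return list corresponds to a PASS for this category; any entry downgrades the overall status and feeds the detailed issues report.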
### Database Connectivity
- Test database connection establishment
- Verify async session creation and cleanup
- Test connection pooling functionality
- Validate connection string parsing

### Query Functionality Tests
- Test basic CRUD operations (Create, Read, Update, Delete)
- Validate parameter binding and escaping
- Test complex queries with joins and aggregations
- Verify transaction handling and rollback scenarios

### Performance Benchmarks
- Compare query performance between original and converted code
- Test connection pooling efficiency
- Analyze memory usage during database operations
- Validate concurrent request handling

### Supabase Integration Tests (optional)
- Row Level Security (RLS) functionality
- JWT token validation with database sessions
- Supabase auth integration testing
- Storage integration with database operations
## Validation Process

### Phase 1: Static Analysis
1. Syntax validation of all Python files
2. Import verification and dependency checking
3. Async pattern validation and coroutine checking
4. Type hint verification for better IDE support

### Phase 2: Database Testing
1. Connection establishment tests
2. Session lifecycle validation
3. Basic CRUD operation testing
4. Error handling and recovery testing

### Phase 3: Integration Testing
1. FastAPI endpoint testing with database operations
2. Dependency injection validation
3. Concurrent request handling
4. Memory leak detection

### Phase 4: Performance Analysis
1. Query execution time comparison
2. Connection pool efficiency testing
3. Memory usage profiling
4. Scalability assessment
## Examples
|
||||
|
||||
Basic validation:
|
||||
```bash
|
||||
/validate-sqlalchemy-conversion --connection-string "postgresql+asyncpg://user:pass@host:5432/db"
|
||||
```
|
||||
|
||||
Comprehensive validation with Supabase support:
|
||||
```bash
|
||||
/validate-sqlalchemy-conversion --supabase --performance --test-data --detailed
|
||||
```
|
||||
|
||||
Validate specific directory with auto-fix:
|
||||
```bash
|
||||
/validate-sqlalchemy-conversion --path ./src/api --connection-string $DATABASE_URL --fix-issues
|
||||
```
|
||||
|
||||
## Output Reports

### Summary Report
- Overall validation status (PASS/FAIL/WARNING)
- Number of issues found and fixed
- Performance metrics comparison
- Recommendations for improvements

### Detailed Issues Report
- File-by-file validation results
- Specific syntax errors and fixes applied
- Missing imports or incorrect patterns
- Performance bottlenecks identified

### Performance Analysis
- Query execution time comparisons
- Connection pool efficiency metrics
- Memory usage patterns
- Scalability test results

### Recommendations
- Code improvement suggestions
- Performance optimization opportunities
- Security considerations
- Best practice recommendations
## Auto-Fix Capabilities

When `--fix-issues` is enabled, the command can automatically:

- Fix common import errors and missing dependencies
- Correct async/await usage patterns
- Update type hints for better IDE support
- Fix basic syntax errors
- Optimize connection pooling configurations
- Update error handling patterns
- Fix parameter binding issues
- Correct transaction handling patterns
## Exit Codes

- `0`: Validation successful - all tests passed
- `1`: Validation failed - critical issues found
- `2`: Validation passed with warnings - non-critical issues present
- `3`: Validation error - unable to complete validation due to environment issues
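In CI, these exit codes can drive pass/fail handling; a minimal sketch of the mapping in Python (the helper name and status labels are illustrative, not part of the plugin):

```python
# Map the validator's exit codes to CI outcomes. The code-to-status
# mapping mirrors the table above; the helper name and labels are
# illustrative, not part of the plugin.
EXIT_STATUS = {
    0: "pass",   # all tests passed
    1: "fail",   # critical issues found
    2: "warn",   # non-critical issues present
    3: "error",  # environment problem, validation incomplete
}

def ci_outcome(exit_code: int) -> str:
    """Return a CI outcome label for a validator exit code."""
    return EXIT_STATUS.get(exit_code, "error")
```

Unknown codes fall back to `"error"`, so an unexpected crash never reads as a pass.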
11
hooks/hooks.json
Normal file
@@ -0,0 +1,11 @@
{
  "PreToolUse": [{
    "matcher": "Write|Edit",
    "hooks": [{
      "type": "command",
      "command": "bash /Users/kevinhill/.claude/plugins/marketplaces/claude-code-plugins/plugins/plugin-dev/scripts/validate-sqlalchemy.sh",
      "timeout": 30,
      "description": "Validate SQLAlchemy code patterns and syntax"
    }]
  }]
}
85
plugin.lock.json
Normal file
@@ -0,0 +1,85 @@
{
  "$schema": "internal://schemas/plugin.lock.v1.json",
  "pluginId": "gh:kivo360/claude-toolbelt:asyncpg-to-sqlalchemy-converter",
  "normalized": {
    "repo": null,
    "ref": "refs/tags/v20251128.0",
    "commit": "5b72096cc91547f06f18d3d54dbaf40f7a1f2b81",
    "treeHash": "b8bc19d9109bfe7d039cae06fa344c37cc9c5e7e3903b514530563becbbaeb4e",
    "generatedAt": "2025-11-28T10:19:53.545631Z",
    "toolVersion": "publish_plugins.py@0.2.0"
  },
  "origin": {
    "remote": "git@github.com:zhongweili/42plugin-data.git",
    "branch": "master",
    "commit": "aa1497ed0949fd50e99e70d6324a29c5b34f9390",
    "repoRoot": "/Users/zhongweili/projects/openmind/42plugin-data"
  },
  "manifest": {
    "name": "asyncpg-to-sqlalchemy-converter",
    "description": "Convert asyncpg code in FastAPI projects to SQLAlchemy with asyncpg engine, supporting Supabase integration and lazy loading",
    "version": "1.0.0"
  },
  "content": {
    "files": [
      {
        "path": ".mcp.json",
        "sha256": "482f809da2c81bc7f7565edb29eb59af99a5310b19f1129f540d3d103b498af2"
      },
      {
        "path": "README.md",
        "sha256": "7a4f9a103e14b03d9829e89dbad210b7c71173f00c4882c537aa2f85d455bb52"
      },
      {
        "path": "agents/schema-reflector.md",
        "sha256": "c827d0d8dc425dd44dfe388773ebb6efb9b479f9a73a717435f50843af500ae4"
      },
      {
        "path": "agents/conversion-analyzer.md",
        "sha256": "9af724ba4b9a483fe4486ecc793a1ebcd52d80ea58fe0ba9aebb3c9d95f732fc"
      },
      {
        "path": "hooks/hooks.json",
        "sha256": "28103880178905fcc32867fa281847967f858c3c081196b9ba8641f7dd704741"
      },
      {
        "path": ".claude-plugin/plugin.json",
        "sha256": "25f58cb7d5c76b5a345325e1a386a97df778c987d7f236bea102c5ff875a3ed8"
      },
      {
        "path": "commands/convert-project.md",
        "sha256": "b63442ade17a24c5e0f3d9fd7828efe8f01888a3f1c247738271327772db30e1"
      },
      {
        "path": "commands/generate-models.md",
        "sha256": "ef37f6f2c82101513a8ed05c1fe360aae55e7f423e555d9592ec38f34de2a147"
      },
      {
        "path": "commands/create-session.md",
        "sha256": "0cb8acab704428f1d1fc1cba51e0727de54d6f7826c50891e4bd905b8901dec6"
      },
      {
        "path": "commands/validate-conversion.md",
        "sha256": "ee0d244c4e6d9bc61407ad80f54939e7a16162d8d662aae6095e109ac57ee6e6"
      },
      {
        "path": "skills/supabase-integration/SKILL.md",
        "sha256": "f921e03ab17fc5efff5e57ffc413f2ab4fcda5b16f879930f0fcd5a77cea9aff"
      },
      {
        "path": "skills/sqlalchemy-conversion/SKILL.md",
        "sha256": "af9f45ba8237bc15d2bf11a5621e5c22478d1ca22a5e4c35c3abf274a98915a3"
      },
      {
        "path": "skills/asyncpg-detection/SKILL.md",
        "sha256": "f9f9c672ad8d84f1dc086b3b790385260821d35d90719f4cd17cdc98b76c822d"
      }
    ],
    "dirSha256": "b8bc19d9109bfe7d039cae06fa344c37cc9c5e7e3903b514530563becbbaeb4e"
  },
  "security": {
    "scannedAt": null,
    "scannerVersion": null,
    "flags": []
  }
}
65
skills/asyncpg-detection/SKILL.md
Normal file
@@ -0,0 +1,65 @@
---
name: asyncpg-detection
description: This skill should be used when the user asks to "detect asyncpg usage", "find asyncpg patterns", "scan for asyncpg imports", or "identify asyncpg database code in FastAPI projects". It automatically scans Python files to identify asyncpg imports, connection patterns, and query execution methods that need conversion to SQLAlchemy.
version: 1.0.0
---

# AsyncPG Detection for FastAPI Projects

This skill provides comprehensive detection of asyncpg usage patterns in FastAPI applications, identifying all code that needs to be converted to SQLAlchemy with asyncpg engine support.

## Detection Overview

Scan FastAPI projects for asyncpg patterns including imports, connection management, queries, transactions, and error handling. Generate detailed reports with line numbers and conversion recommendations.
## Core Detection Patterns

### Import Detection
Look for these import statements:
- `import asyncpg`
- `from asyncpg import`
- `import asyncpg as pg`
- `from asyncpg import Connection, Pool`

### Connection Patterns
Identify these asyncpg connection approaches:
- `asyncpg.connect()` calls
- `asyncpg.create_pool()` usage
- Manual connection string parsing
- Environment-based connection configuration

### Query Patterns
Detect these asyncpg execution methods:
- `connection.fetch()` for SELECT queries
- `connection.execute()` for INSERT/UPDATE/DELETE
- `connection.fetchval()` for single values
- `connection.fetchrow()` for single rows
- `connection.cursor()` for result iteration
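The import and call patterns above map directly onto a line-by-line regex scan; a minimal sketch of such a scanner (the function name and report shape are illustrative, not the plugin's actual implementation):

```python
import re

# Patterns mirroring the detection lists above; the scanner itself is
# an illustrative sketch, not the plugin's actual implementation.
ASYNCPG_PATTERNS = {
    "import": re.compile(r"^\s*(?:import asyncpg|from asyncpg import)"),
    "connect": re.compile(r"asyncpg\.(?:connect|create_pool)\("),
    "query": re.compile(r"\.(?:fetch|fetchrow|fetchval|execute|cursor)\("),
}

def scan_source(source: str) -> dict[str, list[int]]:
    """Return a mapping of pattern name -> 1-based line numbers of matches."""
    hits: dict[str, list[int]] = {name: [] for name in ASYNCPG_PATTERNS}
    for lineno, line in enumerate(source.splitlines(), start=1):
        for name, pattern in ASYNCPG_PATTERNS.items():
            if pattern.search(line):
                hits[name].append(lineno)
    return hits
```

The line numbers feed straight into the report format described below.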
## Usage Instructions

To detect asyncpg usage in your FastAPI project:

1. **Run comprehensive scan**: Use the `/convert-asyncpg-to-sqlalchemy` command to scan all Python files in the project
2. **Analyze detection results**: Review the generated report for files containing asyncpg code
3. **Prioritize conversion**: Focus on files with the most asyncpg usage first
4. **Check for complex patterns**: Look for nested connections, transactions, and error handling that may require special attention

## Reporting Format

The detection generates reports with:
- **File list**: All files containing asyncpg imports
- **Pattern analysis**: Specific asyncpg methods found
- **Complexity assessment**: Files requiring manual intervention
- **Conversion recommendations**: Suggested SQLAlchemy equivalents

## Additional Resources

### Reference Files
- **`references/patterns-mapping.md`** - Complete asyncpg to SQLAlchemy pattern mapping
- **`references/complex-cases.md`** - Handling of complex asyncpg scenarios
- **`references/supabase-specific.md`** - Supabase-specific asyncpg patterns

### Examples
- **`examples/detection-report.md`** - Sample detection output
- **`examples/fastapi-project-structure.md`** - Example FastAPI project with asyncpg usage
135
skills/sqlalchemy-conversion/SKILL.md
Normal file
@@ -0,0 +1,135 @@
---
name: sqlalchemy-conversion
description: This skill should be used when the user asks to "convert asyncpg to SQLAlchemy", "convert database queries", "migrate asyncpg code", "transform asyncpg patterns to SQLAlchemy", or "update FastAPI database layer". It provides systematic conversion of asyncpg code to SQLAlchemy async patterns with proper error handling and transaction management.
version: 1.0.0
---

# SQLAlchemy Conversion for AsyncPG Migration

This skill provides systematic conversion of asyncpg database code to SQLAlchemy 2.0+ with async support, maintaining async performance while providing ORM benefits.

## Conversion Strategy

Convert asyncpg procedural code to SQLAlchemy declarative patterns while preserving async functionality and improving maintainability.
## Core Conversion Patterns

### Import Replacement
Replace asyncpg imports with SQLAlchemy:
- `import asyncpg` → `from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine`
- `from asyncpg import Connection` → `from sqlalchemy.ext.asyncio import create_async_engine, async_sessionmaker`

### Engine Configuration
Convert connection setup:
```python
# Before (asyncpg)
pool = await asyncpg.create_pool(dsn)

# After (SQLAlchemy)
from sqlalchemy.pool import NullPool

engine = create_async_engine(
    DATABASE_URL,
    echo=True,
    poolclass=NullPool  # disable SQLAlchemy pooling, e.g. for serverless setups
)
```

### Session Management
Replace connection objects with async sessions:
```python
# Before (asyncpg)
async def get_user(db, user_id):
    async with db.acquire() as conn:
        result = await conn.fetchrow("SELECT * FROM users WHERE id = $1", user_id)
        return dict(result)

# After (SQLAlchemy)
async def get_user(session: AsyncSession, user_id: int):
    result = await session.execute(
        select(User).where(User.id == user_id)
    )
    # scalar_one_or_none() mirrors fetchrow(), which returns None for no match
    return result.scalar_one_or_none()
```
## Query Conversion Guidelines

### SELECT Queries
Transform fetch operations to SQLAlchemy Core/ORM:
- `fetch()` → `execute()` followed by `.scalars().all()`
- `fetchrow()` → `execute()` followed by `.scalar_one_or_none()` or `.first()`
- `fetchval()` → `execute()` followed by `.scalar()`
- `cursor()` → `session.stream()` with async iteration

### INSERT Operations
Convert execute patterns:
```python
# Before (asyncpg)
await conn.execute(
    "INSERT INTO users (name, email) VALUES ($1, $2)",
    name, email
)

# After (SQLAlchemy ORM)
session.add(User(name=name, email=email))
await session.commit()
```

### Transaction Handling
Update transaction patterns:
```python
# Before (asyncpg)
async with conn.transaction():
    await conn.execute("UPDATE users SET status = $1", status)

# After (SQLAlchemy)
async with session.begin():
    await session.execute(
        update(User).where(User.id == user_id).values(status=status)
    )
```
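Not every query maps cleanly onto the ORM; raw SQL can be kept via `text()`, which requires renaming asyncpg's positional `$1, $2, ...` placeholders to named parameters. A minimal converter sketch (the helper name is illustrative):

```python
import re

def rename_placeholders(sql: str, names: list[str]) -> str:
    """Rewrite asyncpg-style $1, $2, ... placeholders as :name parameters.

    names[0] replaces $1, names[1] replaces $2, and so on; this is an
    illustrative helper, not part of the plugin.
    """
    def repl(match: re.Match) -> str:
        index = int(match.group(1)) - 1
        return f":{names[index]}"
    return re.sub(r"\$(\d+)", repl, sql)
```

The rewritten string can then be passed as `session.execute(text(sql), {"user_id": ..., "status": ...})` with a parameter dict instead of positional arguments.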
## Usage Instructions

To convert asyncpg code:

1. **Analyze detected patterns**: Use detection results to understand current codebase structure
2. **Apply systematic conversion**: Follow the pattern mapping for each identified asyncpg usage
3. **Handle edge cases**: Refer to complex cases documentation for advanced scenarios
4. **Validate conversions**: Test converted code to ensure functionality is preserved

## Error Handling Conversion

### Exception Types
Update exception handling:
- `asyncpg.PostgresError` → `sqlalchemy.exc.DBAPIError` (SQLAlchemy wraps driver errors)
- `asyncpg.InterfaceError` → `sqlalchemy.exc.InterfaceError`
- `asyncpg.exceptions` → Use SQLAlchemy's built-in exceptions

### Connection Errors
Implement robust error handling:
```python
# Before
try:
    conn = await asyncpg.connect(dsn)
except asyncpg.PostgresError as e:
    logger.error(f"Database connection failed: {e}")

# After
from sqlalchemy.exc import SQLAlchemyError

try:
    engine = create_async_engine(DATABASE_URL)
    async with engine.begin() as conn:
        pass  # the connection is established lazily, verified here
except SQLAlchemyError as e:
    logger.error(f"Database setup failed: {e}")
```

## Additional Resources

### Reference Files
- **`references/pattern-mapping.md`** - Comprehensive asyncpg to SQLAlchemy conversion mapping
- **`references/async-patterns.md`** - Async SQLAlchemy best practices
- **`references/error-handling.md`** - SQLAlchemy exception handling patterns

### Examples
- **`examples/conversion-comparison.md`** - Side-by-side asyncpg vs SQLAlchemy examples
- **`examples/migration-scripts.py`** - Automated conversion utilities
- **`examples/test-validation.py`** - Testing converted code patterns
459
skills/supabase-integration/SKILL.md
Normal file
@@ -0,0 +1,459 @@
---
name: supabase-integration
description: This skill should be used when the user asks to "configure Supabase with SQLAlchemy", "set up Supabase async engine", "create Supabase models", "handle Supabase authentication with SQLAlchemy", or "integrate Supabase pooling with SQLAlchemy async patterns". It provides complete Supabase integration patterns for SQLAlchemy with async support, authentication, and connection pooling optimizations.
version: 1.0.0
---

# Supabase Integration for SQLAlchemy Async Projects

This skill provides comprehensive integration patterns for using SQLAlchemy with Supabase, including async engine configuration, authentication setup, connection pooling, and performance optimizations.

## Integration Overview

Configure SQLAlchemy to work seamlessly with Supabase PostgreSQL databases while maintaining async performance, proper authentication, and connection management optimizations for serverless environments.
## Supabase Engine Configuration

### Async Engine Setup
Configure the SQLAlchemy async engine for Supabase:
```python
import os

from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine

# Supabase pooler connection string, built from environment configuration
SUPABASE_PROJECT_ID = os.environ["SUPABASE_PROJECT_ID"]
SUPABASE_PASSWORD = os.environ["SUPABASE_PASSWORD"]
SUPABASE_REGION = os.environ["SUPABASE_REGION"]

SUPABASE_URL = (
    f"postgresql+asyncpg://postgres.{SUPABASE_PROJECT_ID}:{SUPABASE_PASSWORD}"
    f"@aws-0-{SUPABASE_REGION}.pooler.supabase.com:6543/postgres"
)

# Async engine optimized for Supabase
engine = create_async_engine(
    SUPABASE_URL,
    echo=True,
    pool_size=20,
    max_overflow=0,
    pool_pre_ping=True,
    pool_recycle=300,
    connect_args={
        "server_settings": {
            "application_name": "fastapi_supabase_app",
            "search_path": "public, extensions"
        }
    }
)

# Async session factory
AsyncSessionFactory = async_sessionmaker(
    engine,
    class_=AsyncSession,
    expire_on_commit=False
)
```
### Environment-Based Configuration
Set up flexible configuration for different environments:
```python
# config/database.py
from collections.abc import AsyncGenerator
from typing import Optional

from pydantic_settings import BaseSettings

class DatabaseSettings(BaseSettings):
    supabase_url: str
    supabase_key: str
    supabase_service_key: Optional[str] = None
    pool_size: int = 10
    max_overflow: int = 0

    class Config:
        env_prefix = "DB_"
        case_sensitive = False

    @property
    def async_url(self) -> str:
        return self.supabase_url.replace("postgresql://", "postgresql+asyncpg://")

# Dependency injection for FastAPI
async def get_db_session() -> AsyncGenerator[AsyncSession, None]:
    async with AsyncSessionFactory() as session:
        try:
            yield session
            await session.commit()
        except Exception:
            await session.rollback()
            raise
        # the async context manager closes the session on exit
```
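The `async_url` rewrite above is a plain string substitution and can be exercised without pydantic; a standalone sketch (helper name illustrative):

```python
def to_asyncpg_url(url: str) -> str:
    """Switch a plain PostgreSQL URL to the asyncpg driver scheme,
    as the async_url property above does. Only the first occurrence
    is replaced, so an already-converted URL passes through unchanged."""
    return url.replace("postgresql://", "postgresql+asyncpg://", 1)
```

This keeps the raw `postgresql://` form in the environment while SQLAlchemy receives the driver-qualified URL it needs.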
## Authentication Integration

### Row Level Security (RLS) Integration
Handle Supabase RLS with SQLAlchemy:
```python
from fastapi import HTTPException, Request
from sqlalchemy import text
import jwt

async def get_supabase_user(request: Request) -> dict:
    """Extract and validate the Supabase JWT token"""
    authorization = request.headers.get("Authorization")
    if not authorization or not authorization.startswith("Bearer "):
        raise HTTPException(status_code=401, detail="Missing or invalid token")

    token = authorization.split(" ")[1]
    try:
        # Decode Supabase JWT
        payload = jwt.decode(
            token,
            SUPABASE_JWT_SECRET,
            algorithms=["HS256"],
            options={"verify_aud": False}
        )
        return payload
    except jwt.ExpiredSignatureError:
        raise HTTPException(status_code=401, detail="Token expired")
    except jwt.InvalidTokenError:
        raise HTTPException(status_code=401, detail="Invalid token")

async def get_db_with_auth(request: Request) -> AsyncSession:
    """Get a database session with RLS context"""
    session = AsyncSessionFactory()

    # Set the RLS user context; SET does not accept bind parameters,
    # so set_config() is used instead
    user = await get_supabase_user(request)
    await session.execute(
        text("SELECT set_config('request.jwt.claims.user_id', :user_id, true)"),
        {"user_id": user.get("sub")}
    )

    await session.execute(
        text("SELECT set_config('request.jwt.claims.role', :role, true)"),
        {"role": user.get("role", "authenticated")}
    )

    return session
```
### Service Key Integration
Use the Supabase service key for admin operations:
```python
from supabase import Client, create_client

class SupabaseAdminClient:
    def __init__(self, supabase_url: str, service_key: str):
        self.supabase: Client = create_client(supabase_url, service_key)

    async def upload_file(self, bucket: str, path: str, file_content: bytes) -> dict:
        """Upload a file to Supabase Storage (the underlying client call is synchronous)"""
        return self.supabase.storage.from_(bucket).upload(path, file_content)

    async def sign_url(self, bucket: str, path: str, expires_in: int = 3600) -> str:
        """Generate a signed URL for file access"""
        return self.supabase.storage.from_(bucket).create_signed_url(path, expires_in)

# FastAPI dependency
async def get_supabase_admin() -> SupabaseAdminClient:
    return SupabaseAdminClient(SUPABASE_URL, SUPABASE_SERVICE_KEY)
```
## Performance Optimization

### Connection Pooling for Serverless
Optimize for Supabase connection limits:
```python
# config/pooling.py
import asyncio

from sqlalchemy import text
from sqlalchemy.ext.asyncio import create_async_engine
from sqlalchemy.pool import AsyncAdaptedQueuePool

class SupabaseEngineManager:
    def __init__(self, supabase_url: str, max_connections: int = 20):
        self.engine = create_async_engine(
            supabase_url,
            # async engines require the async-adapted queue pool,
            # not the synchronous QueuePool
            poolclass=AsyncAdaptedQueuePool,
            pool_size=max_connections - 5,  # leave room for admin connections
            max_overflow=5,
            pool_pre_ping=True,
            pool_recycle=300,  # 5 minutes
            pool_timeout=30,
            connect_args={
                "command_timeout": 10,
                "server_settings": {
                    "application_name": "fastapi_supabase_app",
                    "jit": "off"  # disable JIT for serverless
                }
            }
        )
        self._background_heartbeater = None

    async def start_heartbeat(self):
        """Keep connections alive in serverless environments"""
        async def heartbeat():
            while True:
                await asyncio.sleep(240)  # 4 minutes
                async with self.engine.connect() as conn:
                    await conn.execute(text("SELECT 1"))

        self._background_heartbeater = asyncio.create_task(heartbeat())

    async def stop_heartbeat(self):
        if self._background_heartbeater:
            self._background_heartbeater.cancel()
            try:
                await self._background_heartbeater
            except asyncio.CancelledError:
                pass
```
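The start/stop heartbeat pair above hinges on cancelling the background task cleanly at shutdown; the pattern can be exercised without a database (a self-contained sketch with timings shortened for the demo):

```python
import asyncio

async def run_and_cancel() -> str:
    """Start a periodic task and cancel it cleanly, mirroring the
    start_heartbeat/stop_heartbeat pair above."""
    async def heartbeat():
        while True:
            await asyncio.sleep(0.01)  # stands in for the 4-minute interval

    task = asyncio.create_task(heartbeat())
    await asyncio.sleep(0.05)  # let a few beats run
    task.cancel()
    try:
        # awaiting a cancelled task raises CancelledError; swallowing it
        # here is what makes the shutdown clean
        await task
    except asyncio.CancelledError:
        return "cancelled"
    return "finished"
```

Running `asyncio.run(run_and_cancel())` confirms the task ends via cancellation rather than leaking into the event loop.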
### Lazy Loading Implementation
Implement efficient lazy loading for large schemas:
```python
from typing import Generic, Type, TypeVar

from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession

T = TypeVar('T')

class LazyLoader(Generic[T]):
    def __init__(self, model: Type[T], session: AsyncSession):
        self.model = model
        self.session = session
        self._loaded = None
        self._query = None

    def where(self, *criteria):
        """Add where conditions to the query"""
        self._query = select(self.model).where(*criteria)
        return self

    async def load(self) -> list[T]:
        """Execute the query and cache results"""
        if self._loaded is None:
            if self._query is None:
                self._query = select(self.model)
            result = await self.session.execute(self._query)
            self._loaded = result.scalars().all()
        return self._loaded

    async def first(self) -> T | None:
        """Load the first result only"""
        if self._query is None:
            self._query = select(self.model)
        result = await self.session.execute(self._query.limit(1))
        return result.scalar_one_or_none()

# Usage in FastAPI endpoints
@app.get("/users/{user_id}")
async def get_user(user_id: int, session: AsyncSession = Depends(get_db_session)):
    lazy_users = LazyLoader(User, session)
    user = await lazy_users.where(User.id == user_id).first()

    if not user:
        raise HTTPException(status_code=404, detail="User not found")

    return user
```
## Model Generation

### Supabase Schema Reflection
Generate SQLAlchemy models from the Supabase schema:
```python
from typing import Dict

from sqlalchemy import text
from sqlalchemy.ext.asyncio import AsyncEngine
from sqlalchemy.orm import DeclarativeBase

async def reflect_supabase_schema(engine: AsyncEngine, schema: str = "public") -> Dict[str, dict]:
    """Reflect the Supabase database schema"""
    async with engine.connect() as conn:
        # Get table information
        tables_query = text("""
            SELECT table_name, column_name, data_type, is_nullable, column_default
            FROM information_schema.columns
            WHERE table_schema = :schema
            ORDER BY table_name, ordinal_position
        """)

        result = await conn.execute(tables_query, {"schema": schema})
        columns = result.fetchall()

        # Get foreign key constraints
        fk_query = text("""
            SELECT
                tc.table_name,
                kcu.column_name,
                ccu.table_name AS foreign_table_name,
                ccu.column_name AS foreign_column_name
            FROM information_schema.table_constraints tc
            JOIN information_schema.key_column_usage kcu
                ON tc.constraint_name = kcu.constraint_name
            JOIN information_schema.constraint_column_usage ccu
                ON ccu.constraint_name = tc.constraint_name
            WHERE tc.constraint_type = 'FOREIGN KEY'
                AND tc.table_schema = :schema
        """)

        fk_result = await conn.execute(fk_query, {"schema": schema})
        foreign_keys = fk_result.fetchall()

    # Process and return schema information
    schema_info = {}
    for table_name, column_name, data_type, is_nullable, column_default in columns:
        if table_name not in schema_info:
            schema_info[table_name] = {
                "columns": {},
                "foreign_keys": []
            }

        schema_info[table_name]["columns"][column_name] = {
            "type": data_type,
            "nullable": is_nullable == "YES",
            "default": column_default
        }

    # Add foreign key information
    for table_name, column_name, fk_table, fk_column in foreign_keys:
        schema_info[table_name]["foreign_keys"].append({
            "column": column_name,
            "references": f"{fk_table}.{fk_column}"
        })

    return schema_info

# Model generation
def generate_sqlalchemy_models(schema_info: Dict[str, dict], base_class: type[DeclarativeBase]) -> str:
    """Generate SQLAlchemy model classes from schema info"""
    model_code = []

    for table_name, table_info in schema_info.items():
        class_name = "".join(word.capitalize() for word in table_name.split("_"))

        # Column definitions
        columns = []
        primary_key_columns = []

        for column_name, column_info in table_info["columns"].items():
            col_def = _generate_column_definition(column_name, column_info)
            columns.append(col_def)

            # Detect primary keys (common patterns in Supabase);
            # the stored default may be None, so guard before startswith()
            default = column_info.get("default") or ""
            if column_name in ("id", f"{table_name}_id") or default.startswith("nextval"):
                primary_key_columns.append(column_name)

        # Foreign key relationships
        relationships = []
        for fk in table_info["foreign_keys"]:
            fk_table, fk_column = fk["references"].split(".")
            fk_class_name = "".join(word.capitalize() for word in fk_table.split("_"))

            fk_col = fk["column"]
            if fk_col.endswith("_id"):
                stem = fk_col[:-3]
                relationship_name = stem if stem.endswith("s") else f"{stem}s"
            else:
                relationship_name = fk_table if fk_table.endswith("s") else f"{fk_table}s"

            relationships.append(
                f'    {relationship_name} = relationship("{fk_class_name}", back_populates="{table_name}")'
            )

        # Generate the complete class
        model_class = f"""
class {class_name}({base_class.__name__}):
    __tablename__ = "{table_name}"

{chr(10).join(columns)}
"""

        if primary_key_columns:
            pk_cols = ", ".join(f'"{c}"' for c in primary_key_columns)
            model_class += f"    __table_args__ = (PrimaryKeyConstraint({pk_cols}),)\n"

        if relationships:
            model_class += "\n" + "\n".join(relationships) + "\n"

        model_code.append(model_class)

    return "\n".join(model_code)

def _generate_column_definition(name: str, info: dict) -> str:
    """Generate a SQLAlchemy column definition"""
    type_mapping = {
        "text": "Text",
        "varchar": "String",
        "character varying": "String",
        "integer": "Integer",
        "bigint": "BigInteger",
        "decimal": "Numeric",
        "numeric": "Numeric",
        "real": "Float",
        "double precision": "Float",
        "boolean": "Boolean",
        "date": "Date",
        "timestamp": "DateTime",
        "timestamp with time zone": "DateTime(timezone=True)",
        "uuid": "UUID",
        "jsonb": "JSON",
        "json": "JSON"
    }

    sql_type = type_mapping.get(info["type"].lower(), "String")

    nullable_str = "" if info["nullable"] else ", nullable=False"
    default_str = ""

    default = info.get("default") or ""
    if default.startswith("nextval"):
        default_str = ", autoincrement=True"
    elif "uuid_generate" in default:
        default_str = ", server_default=text('uuid_generate_v4()')"
    elif "now()" in default:
        default_str = ", server_default=text('now()')"

    return f'    {name} = Column({sql_type}{nullable_str}{default_str})'
```
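The table-to-class naming used by the generator above is plain snake_case → CamelCase capitalization; extracted as a standalone helper for clarity (the function name is illustrative):

```python
def class_name_for_table(table_name: str) -> str:
    """Turn a snake_case table name into a CamelCase class name,
    matching the generator logic above."""
    return "".join(word.capitalize() for word in table_name.split("_"))
```

Note that `str.capitalize()` lowercases the rest of each word, so a table named `API_keys` would come out as `ApiKeys`.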
## Usage Instructions

To integrate Supabase with SQLAlchemy:

1. **Configure async engine**: Set up the SQLAlchemy async engine with the Supabase connection string
2. **Implement authentication**: Handle JWT tokens and RLS policies
3. **Optimize connection pooling**: Configure for serverless environments
4. **Generate models**: Use schema reflection to create SQLAlchemy models
5. **Test integration**: Validate that queries and authentication work correctly
## Error Handling

### Supabase-Specific Errors
Handle Supabase-specific error scenarios:
```python
import asyncio
import functools

from sqlalchemy.exc import OperationalError, SQLAlchemyError

def handle_supabase_errors(func):
    """Decorator for handling Supabase-specific errors"""
    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        try:
            return await func(*args, **kwargs)
        except OperationalError as e:
            if "connection" in str(e).lower():
                # Retry connection errors once after a short delay
                await asyncio.sleep(1)
                return await func(*args, **kwargs)
            raise
        except SQLAlchemyError as e:
            logger.error(f"Supabase database error: {e}")
            raise
    return wrapper
```
## Additional Resources

### Reference Files
- **`references/supabase-connection.md`** - Supabase connection configuration patterns
- **`references/rls-integration.md`** - Row Level Security with SQLAlchemy
- **`references/performance-optimization.md`** - Performance tuning for Supabase

### Examples
- **`examples/supabase-fastapi-setup.py`** - Complete FastAPI + Supabase + SQLAlchemy setup
- **`examples/async-patterns.py`** - Async patterns for Supabase integration
- **`examples/schema-generation.py`** - Automated model generation from Supabase schema