Initial commit

Zhongwei Li
2025-11-30 08:43:11 +08:00
commit 5cf0559508
28 changed files with 5938 additions and 0 deletions


@@ -0,0 +1,83 @@
---
name: add-neon-docs
description: Use this skill when the user asks to add documentation, add docs, add references, or install documentation about Neon. Adds Neon best practices reference links to project AI documentation (CLAUDE.md, AGENTS.md, or Cursor rules). Does not install packages or modify code.
allowed-tools: ["read_file", "write", "bash", "AskUserQuestion"]
---
# Add Neon Knowledge References to Project
This skill adds reference links to Neon documentation and best practices in your project's AI documentation file, enabling AI assistants to quickly access Neon-specific patterns and guidelines without cluttering your project with large documentation files.
## How It Works
This skill follows a simple workflow:
1. **Load metadata** - Read skill information from `skill-knowledge-map.json`
2. **Detect documentation file** - Find `CLAUDE.md`, `AGENTS.md`, or Cursor rules files
3. **Ask permission** - Show what will be added and where
4. **Add references** - Insert URLs in a "Resources & References" section
5. **Report completion** - Confirm successful installation
For detailed workflow steps, see `install-knowledge.md`.
## Parameters
### SKILL_NAME Parameter
Optional. Specifies which skill documentation to install (e.g., `"neon-drizzle"`). If not provided, you'll be prompted to choose from available skills defined in `skill-knowledge-map.json`.
## Usage Examples
**Called from another skill:**
```markdown
Execute the add-neon-docs skill with SKILL_NAME="neon-drizzle"
```
**Called directly by user:**
- "Add neon drizzle knowledge to my project"
- "Install neon serverless documentation"
- "Set up Neon best practices for my AI assistant"
## What Gets Added
References are added to a "Resources & References" section in your AI documentation file:
```markdown
## Resources & References
- **Neon and Drizzle ORM best practices**: https://raw.githubusercontent.com/neondatabase-labs/ai-rules/main/neon-drizzle.mdc
- **Serverless connection patterns**: https://raw.githubusercontent.com/neondatabase-labs/ai-rules/main/neon-serverless.mdc
```
### Target Files (in priority order):
- `CLAUDE.md` - Most common for Claude Code projects
- `AGENTS.md` - Custom AI documentation files
- `.cursor/README.md` or `.cursor/rules.md` - Cursor IDE projects
- Creates `CLAUDE.md` if none exist
### Behavior:
- Existing "Resources & References" sections: New links are appended
- No existing section: Section is created at end of file
- No documentation file: `CLAUDE.md` is created with references
## Related Skills
- **neon-drizzle** - Sets up Drizzle ORM, then offers this skill
- **neon-serverless** - Sets up connections, then offers this skill
- **neon-toolkit** - Sets up ephemeral databases, then offers this skill
## Workflow Reference
For complete implementation details:
- **Workflow**: `install-knowledge.md` - Step-by-step agent workflow with error handling
- **Metadata**: `skill-knowledge-map.json` - Skill definitions and reference URLs
---
## Workflow Implementation
Now I'll execute the installation workflow for you.
**Parameter received**: SKILL_NAME = ${SKILL_NAME || "not provided - will ask user"}
Execute `install-knowledge.md` with the specified SKILL_NAME.


@@ -0,0 +1,258 @@
# Install Neon Knowledge to Project - Agent Workflow
**When to use**: When the user wants to add Neon's best practices reference links to their project's AI documentation.
**Required parameter**: `SKILL_NAME` (e.g., "neon-drizzle", "neon-serverless", "neon-toolkit")
---
> **IMPORTANT - Working Directory Context**
>
> This skill reads metadata from its own skill directory (`skill-knowledge-map.json`), but **ALL project file operations** (reading/writing `CLAUDE.md`, `AGENTS.md`, etc.) **MUST happen in the current working directory**.
>
> - ✅ Read skill metadata from skill directory (absolute paths provided by system)
> - ✅ Read/write project files using **relative paths only** (e.g., `CLAUDE.md`, `.cursor/rules.md`)
> - ❌ Never construct project file paths using absolute paths or the skill's base directory
---
## Step 1: Load Skill Metadata
Read the skill metadata file to get the reference URLs.
The metadata file is bundled with this skill at:
```
skill-knowledge-map.json
```
Use the Read tool to read the local file, then parse the JSON content.
Extract the metadata for the current `SKILL_NAME` from the JSON.
Store this information - you'll need:
- `displayName`: Human-readable skill name
- `files`: Array of .mdc files (each with `url`, `filename`, `description`)
If the skill is not found in metadata, inform the user and exit.
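For example, the entry for `neon-drizzle` (abridged from `skill-knowledge-map.json`) has this shape:
```json
{
  "displayName": "Neon + Drizzle ORM",
  "files": [
    {
      "url": "https://raw.githubusercontent.com/neondatabase-labs/ai-rules/main/neon-drizzle.mdc",
      "filename": "neon-drizzle.mdc",
      "required": true,
      "description": "Comprehensive guide for using Neon with Drizzle ORM"
    }
  ]
}
```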
---
## Step 2: Detect AI Documentation File
Use your existing tools to detect where to add the reference links in the **current working directory**. **This is a read-only check - no files are created yet.**
Check in this priority order:
### 2.1 Check for CLAUDE.md (most common)
Use the Glob tool to search for `CLAUDE.md` in the current working directory:
```
pattern: "CLAUDE.md"
```
**If found**: Target is `CLAUDE.md` file
### 2.2 Check for AGENTS.md (custom AI docs)
Use the Glob tool to search for `AGENTS.md`:
```
pattern: "AGENTS.md"
```
**If found**: Target is `AGENTS.md` file
### 2.3 Check for Cursor rules file
Use the Glob tool to search for Cursor rules files:
```
pattern: ".cursor/README.md"
pattern: ".cursor/rules.md"
```
**If found**: Target is `.cursor/README.md` or `.cursor/rules.md`
### 2.4 No file found
If none of the above exist, set target as: "Will create `CLAUDE.md`"
**Store the detection result** for use in Step 3.
---
## Step 3: Present Plan and STOP for User Confirmation
**STOP HERE.** Do not proceed to Step 4 until the user explicitly confirms.
Now that you know WHAT to add (from Step 1) and WHERE to add it (from Step 2), present this plan to the user in natural language:
---
I've prepared to add **${displayName}** best practices references to your project.
**Target location:** ${detected_location or "Will create CLAUDE.md"}
**References to add:**
${list each file with a bullet point showing the description and URL}
This helps your AI assistant reference Neon best practices automatically in future conversations without cluttering your project with large documentation files.
Would you like me to proceed with adding these references?
---
**Wait for explicit user confirmation** (e.g., "yes", "go ahead", "proceed") before continuing to Step 4.
If the user declines or asks to skip, thank them and exit the workflow gracefully.
---
## Step 4: Add Reference Links
### 4.1 Build the reference content
For each file in the metadata, create a reference line:
```markdown
- **${description}**: ${url}
```
Combine all references into a section:
```markdown
## Resources & References
- **${file1.description}**: ${file1.url}
- **${file2.description}**: ${file2.url}
```
### 4.2 Check if "Resources & References" section exists
Read the target file and check if it already has a "## Resources & References" section.
**If section exists:**
- Use the Edit tool to append new references to that section
- Add the new links after existing content in that section
- Ensure proper spacing (blank line between entries)
**If section doesn't exist:**
- Append the entire section to the end of the file
- Add two blank lines before the section for proper spacing
**If target file doesn't exist yet:**
- Use the Write tool to create a new file with:
```markdown
# Project AI Documentation
## Resources & References
- **${file.description}**: ${file.url}
```
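For example, if the project's `CLAUDE.md` already lists one Neon reference, appending the Drizzle link in the first case above leaves the section looking like this (hypothetical file):
```markdown
## Resources & References
- **Serverless connection patterns**: https://raw.githubusercontent.com/neondatabase-labs/ai-rules/main/neon-serverless.mdc
- **Neon and Drizzle ORM best practices**: https://raw.githubusercontent.com/neondatabase-labs/ai-rules/main/neon-drizzle.mdc
```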
### 4.3 Perform the edit/write
**IMPORTANT**: Use relative paths only when calling Write/Edit tools.
Examples:
- ✅ Correct: `file_path: "CLAUDE.md"`
- ✅ Correct: `file_path: ".cursor/rules.md"`
- ❌ Wrong: `file_path: "/absolute/path/to/CLAUDE.md"`
Execute the appropriate tool operation based on the above conditions.
### 4.4 Confirm installation
Log: `✓ Added ${displayName} reference links to ${target_location}`
---
## Step 5: Report Completion
Build a completion message:
```markdown
✅ Reference links added successfully!
Location: ${target_location}
References added:
${list each reference with title and URL}
---
Your AI assistant can now reference these Neon best practices in future conversations by following the URLs. The documentation includes:
- Connection patterns and configuration
- Best practices and gotchas
- Code examples and templates
- Common patterns and solutions
${if target is CLAUDE.md:}
Note: I'll automatically reference these resources when you ask about ${SKILL_NAME} topics.
${end if}
${if target is Cursor:}
Note: Cursor can access these resources when working on related code.
${end if}
```
---
## Error Handling
### If metadata file cannot be read
- Log a clear error message
- Suggest verifying that `skill-knowledge-map.json` exists in the skill directory
- Exit workflow
### If write permissions denied
- Inform user about permission issue
- Suggest running with appropriate permissions
- Provide manual instructions for adding links
### If target file is locked or unavailable
- Inform user of the issue
- Suggest closing editors or checking file permissions
- Provide the reference links for manual addition
---
## Testing This Workflow
As Claude, you can test this workflow by:
1. Reading the metadata file
2. Detecting the current project's documentation file
3. Simulating user responses (or asking real questions)
4. Adding references to test locations
Verify:
- [ ] Metadata loads correctly
- [ ] File detection works for multiple project types
- [ ] Permission prompt is clear
- [ ] References are added in the correct format
- [ ] Existing "Resources & References" sections are preserved
- [ ] New sections are created when needed
- [ ] Error messages are clear
---
## For Skill Developers
To use this workflow in your skill, add at the end of your guide or skill:
```markdown
## Add Best Practices References?
Setup is complete! Would you like me to add ${SKILL_NAME} best practices reference links to your project?
This helps your AI assistant (me!) remember where to find Neon patterns for future conversations.
${Execute workflow: skills/add-neon-docs/install-knowledge.md with SKILL_NAME="${skill-name}"}
```
Or explicitly call it:
```markdown
I'll now add reference links to help you in future conversations.
${Read and execute: skills/add-neon-docs/install-knowledge.md}
${Set SKILL_NAME = "neon-drizzle"}
```


@@ -0,0 +1,56 @@
{
"neon-drizzle": {
"displayName": "Neon + Drizzle ORM",
"files": [
{
"url": "https://raw.githubusercontent.com/neondatabase-labs/ai-rules/main/neon-drizzle.mdc",
"filename": "neon-drizzle.mdc",
"required": true,
"description": "Comprehensive guide for using Neon with Drizzle ORM"
},
{
"url": "https://raw.githubusercontent.com/neondatabase-labs/ai-rules/main/neon-serverless.mdc",
"filename": "neon-serverless.mdc",
"required": false,
"description": "Optional: Serverless connection patterns"
}
],
"completionMessage": "Your AI assistant now has Neon + Drizzle best practices!"
},
"neon-serverless": {
"displayName": "Neon Serverless Connections",
"files": [
{
"url": "https://raw.githubusercontent.com/neondatabase-labs/ai-rules/main/neon-serverless.mdc",
"filename": "neon-serverless.mdc",
"required": true,
"description": "Serverless database connection patterns"
}
],
"completionMessage": "Your AI assistant now has Neon serverless best practices!"
},
"neon-toolkit": {
"displayName": "Neon Toolkit (Ephemeral DBs)",
"files": [
{
"url": "https://raw.githubusercontent.com/neondatabase-labs/ai-rules/main/neon-toolkit.mdc",
"filename": "neon-toolkit.mdc",
"required": true,
"description": "Creating ephemeral databases for testing"
}
],
"completionMessage": "Your AI assistant can now help with ephemeral databases!"
},
"neon-auth": {
"displayName": "Neon Auth",
"files": [
{
"url": "https://raw.githubusercontent.com/neondatabase-labs/ai-rules/main/neon-auth.mdc",
"filename": "neon-auth.mdc",
"required": true,
"description": "Authentication with Stack Auth and Neon"
}
],
"completionMessage": "Your AI assistant now understands Neon Auth patterns!"
}
}


@@ -0,0 +1,79 @@
---
name: neon-drizzle
description: Creates a fully functional Drizzle ORM setup with a provisioned Neon database. Installs dependencies, provisions database credentials, configures connections, generates schemas, and runs migrations. Results in working code that can immediately connect to and query the database. Use when creating new projects with Drizzle, adding ORM to existing applications, or modifying database schemas.
allowed-tools: ["bash", "write", "read_file"]
---
# Neon Drizzle Integration
Comprehensive Drizzle ORM setup for Neon databases with guided workflows.
## When to Use This Skill
- Setting up Drizzle in a new project (Next.js, Vite, Express, etc.)
- Integrating Drizzle into an existing application
- Creating or modifying database schemas
- Troubleshooting migration issues
## Code Generation Rules
When generating TypeScript/JavaScript code:
- BEFORE generating import statements, check tsconfig.json for path aliases (compilerOptions.paths)
- If path aliases exist (e.g., "@/*": ["./src/*"]), use them (e.g., import { x } from '@/lib/utils')
- If NO path aliases exist or unsure, ALWAYS use relative imports (e.g., import { x } from '../../../lib/utils')
- Verify imports match the project's configuration
- Default to relative imports - they always work regardless of configuration
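For illustration, a project whose `tsconfig.json` contains the alias below would get alias-style imports (e.g. `import { db } from '@/db'`); without a `paths` entry, the same import should be written relative to the importing file:
```json
{
  "compilerOptions": {
    "paths": {
      "@/*": ["./src/*"]
    }
  }
}
```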
## Available Guides
Each guide is a complete, self-contained walkthrough with numbered phases:
- **`guides/new-project.md`** - Full setup from scratch (see: Table of Contents)
- **`guides/existing-project.md`** - Add Drizzle to running apps (see: Table of Contents)
- **`guides/schema-only.md`** - Schema creation and modification (see: Table of Contents)
- **`guides/troubleshooting.md`** - Debug common issues (organized by error type)
I'll automatically detect your context (package manager, framework, deployment target) and select the appropriate guide based on your request.
## Quick Examples
Tell me what you're building - I'll handle the rest:
- "Setup Drizzle for my Next.js blog on Vercel" → Auto-detects Vercel + Next.js → HTTP adapter
- "Add Drizzle to my Express API" → Auto-detects Node.js server → WebSocket adapter
- "Create a users table with auth fields" → Loads schema guide → Generates schema
## Reference Documentation
For deeper technical details (loaded on-demand):
- `references/adapters.md` - HTTP vs WebSocket decision guide
- `references/migrations.md` - Migration patterns and troubleshooting
- `references/query-patterns.md` - Example queries and best practices
## Templates & Scripts
- `templates/schema-example.ts` - Multi-table schema with relations
- `templates/drizzle-config.ts` - Configuration examples
- `scripts/generate-schema.ts` - Automated migration generation
- `scripts/run-migration.ts` - Programmatic migration runner
## Related Skills
- **neon-serverless** - Connection setup without ORM
- **neon-toolkit** - Ephemeral databases for testing
- **add-neon-docs** - Add Neon best practices to your project (run after setup)
---
## Workflow
I will:
1. Detect your project context automatically
2. Select and load the appropriate guide
3. Follow the guide's phases sequentially
4. Track progress using the guide's workflow checklist
5. Load reference files only when needed
6. Offer to add Neon best practices to your project docs
Ready to get started? Just describe what you're building!


@@ -0,0 +1,398 @@
# Existing Project Integration
> **Complete Walkthrough**: This is a self-contained, step-by-step guide with its own numbered phases (Phase 1-8).
> Follow each phase in order to safely add Drizzle to your existing application.
Guide for adding Drizzle ORM to an existing application with Neon.
### Important:
- Remember to run the neon-plugin:add-neon-docs skill with the parameter SKILL_NAME="neon-drizzle" after completing the guide.
## Table of Contents
- [Workflow Checklist](#workflow-checklist)
- [Phase 1: Pre-Integration Check](#phase-1-pre-integration-check)
- [Phase 2: Incremental Installation](#phase-2-incremental-installation)
- [Phase 3: Configuration](#phase-3-configuration)
- [Phase 4: Schema Strategy](#phase-4-schema-strategy)
- [Phase 5: Migration Handling](#phase-5-migration-handling)
- [Phase 6: Coexistence Patterns](#phase-6-coexistence-patterns)
- [Phase 7: Verification](#phase-7-verification)
- [Phase 8: Add Best Practices References](#phase-8-add-best-practices-references)
---
## Workflow Checklist
When following this guide, I will track these high-level tasks:
- [ ] Pre-integration check (detect existing ORMs, database schema, environment)
- [ ] Install Drizzle dependencies without disrupting existing setup
- [ ] Create isolated Drizzle configuration (separate from existing code)
- [ ] Choose and implement schema strategy (new tables vs mirroring existing)
- [ ] Handle migrations safely based on schema strategy
- [ ] Set up coexistence patterns and gradual migration approach
- [ ] Verify Drizzle integration without breaking existing functionality
- [ ] Add Neon Drizzle best practices to project docs
---
## Phase 1: Pre-Integration Check
Before adding Drizzle, check for conflicts:
### 1.1. Check for Other ORMs
```bash
grep -E '"(prisma|typeorm|sequelize|mongoose)"' package.json
```
**If found:**
- Consider migration strategy (coexistence vs replacement)
- Document which tables use which ORM
- Plan gradual migration if needed
### 1.2. Check Database Schema
Connect to your database and verify existing tables:
```bash
psql $DATABASE_URL -c "\dt"
```
**Important:** Note existing tables - Drizzle should not conflict with them.
### 1.3. Check Environment Setup
```bash
ls .env .env.local .env.production
grep DATABASE_URL .env*
```
**If DATABASE_URL exists:**
- Verify connection string format is compatible with Neon (`postgresql://...`)
- If it's a different database provider, you'll need to migrate or provision a Neon database
**If DATABASE_URL does NOT exist:**
Follow the database provisioning steps from `guides/new-project.md` Phase 3.1:
1. List existing projects using the Neon MCP server
2. Create a new project using the neon MCP Server if needed
3. Get the connection string using the neon MCP Server
4. Write to appropriate environment file (.env.local for Next.js, .env for others)
5. Add environment file to .gitignore
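The environment file written in step 4 only needs the connection string, in the same format used in `guides/new-project.md`:
```bash
DATABASE_URL=postgresql://user:password@host/database?sslmode=require
```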
## Phase 2: Incremental Installation
Add Drizzle without disrupting existing setup:
### 2.1. Install Dependencies
**For Vercel/Edge:**
```bash
[package-manager] add drizzle-orm @neondatabase/serverless
[package-manager] add -D drizzle-kit dotenv
```
**For Node.js:**
```bash
[package-manager] add drizzle-orm @neondatabase/serverless ws
[package-manager] add -D drizzle-kit dotenv @types/ws
```
### 2.2. Create Isolated Drizzle Directory
Keep Drizzle separate from existing code:
```bash
mkdir -p src/drizzle
```
Structure:
```
src/drizzle/
├── index.ts # Connection
├── schema.ts # New schemas only
└── migrations/ # Drizzle migrations
```
## Phase 3: Configuration
### 3.1. Create Drizzle Config
Create `drizzle.config.ts` with explicit environment loading:
**CRITICAL:** The `config({ path: '...' })` must match your environment file name.
**For Next.js (using .env.local):**
```typescript
import { defineConfig } from 'drizzle-kit';
import { config } from 'dotenv';
// Load .env.local explicitly
config({ path: '.env.local' });
export default defineConfig({
schema: './src/drizzle/schema.ts',
out: './src/drizzle/migrations',
dialect: 'postgresql',
dbCredentials: {
url: process.env.DATABASE_URL!,
},
});
```
**For other projects (using .env):**
```typescript
import { defineConfig } from 'drizzle-kit';
import { config } from 'dotenv';
// Load .env explicitly
config({ path: '.env' });
export default defineConfig({
schema: './src/drizzle/schema.ts',
out: './src/drizzle/migrations',
dialect: 'postgresql',
dbCredentials: {
url: process.env.DATABASE_URL!,
},
});
```
**Notes:**
- Point schema and migrations to `src/drizzle/` to avoid conflicts with existing code
- Explicit dotenv path prevents "url: undefined" errors during migrations
### 3.2. Create Connection
`src/drizzle/index.ts` - Choose adapter based on environment (see `references/adapters.md`):
**HTTP (Vercel/Edge):**
```typescript
import { drizzle } from 'drizzle-orm/neon-http';
import { neon } from '@neondatabase/serverless';
const sql = neon(process.env.DATABASE_URL!);
export const drizzleDb = drizzle(sql);
```
**WebSocket (Node.js):**
```typescript
import { drizzle } from 'drizzle-orm/neon-serverless';
import { Pool, neonConfig } from '@neondatabase/serverless';
import ws from 'ws';
neonConfig.webSocketConstructor = ws;
const pool = new Pool({ connectionString: process.env.DATABASE_URL! });
export const drizzleDb = drizzle(pool);
```
**Important:** Name export as `drizzleDb` to avoid conflicts with existing `db` exports.
## Phase 4: Schema Strategy
Choose integration approach:
### 4.1. Option A: New Tables Only
Create schemas for new features only, leave existing tables alone:
`src/drizzle/schema.ts`:
```typescript
import { pgTable, serial, text, timestamp } from 'drizzle-orm/pg-core';
export const newFeatureTable = pgTable('new_feature', {
id: serial('id').primaryKey(),
data: text('data').notNull(),
createdAt: timestamp('created_at').defaultNow(),
});
```
**Pros:**
- No migration of existing data
- Zero risk to current functionality
- Gradual adoption
**Cons:**
- Mixed query patterns (Drizzle + existing ORM)
- Two connection patterns in codebase
### 4.2. Option B: Mirror Existing Tables
Define schemas for existing tables to gradually migrate queries:
```typescript
import { pgTable, serial, varchar, timestamp } from 'drizzle-orm/pg-core';
export const existingUsers = pgTable('users', {
id: serial('id').primaryKey(),
email: varchar('email', { length: 255 }).notNull(),
name: varchar('name', { length: 255 }),
createdAt: timestamp('created_at'),
});
```
**Pros:**
- Can query existing data with Drizzle
- Gradually replace old ORM queries
- Type-safe access to existing tables
**Cons:**
- Must match existing schema exactly
- Requires careful migration strategy
### 4.3. Recommended: Hybrid Approach
1. Start with Option A (new tables only)
2. Once comfortable, add schemas for frequently-queried existing tables (Option B)
3. Gradually migrate queries from old ORM to Drizzle
4. Eventually remove old ORM
## Phase 5: Migration Handling
### 5.1. For New Tables
Generate and run migrations normally:
```bash
[package-manager] drizzle-kit generate
export DATABASE_URL="$(grep DATABASE_URL .env.local | cut -d '=' -f2-)" && \
[package-manager] drizzle-kit migrate
```
### 5.2. For Existing Tables
**Do NOT run migrations** - tables already exist!
Instead, use Drizzle schemas for querying only:
```typescript
import { drizzleDb } from './drizzle';
import { existingUsers } from './drizzle/schema';
const users = await drizzleDb.select().from(existingUsers);
```
### 5.3. Mixed Scenario
If you have both new and existing tables:
1. Define all schemas in `schema.ts`
2. Run `drizzle-kit generate`
3. **Manually edit** generated migration to remove SQL for existing tables (see the sketch below)
4. Apply migration
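As a sketch of step 3 (using the `new_feature` and `users` tables from Phase 4; the generated SQL will not match this exactly), the manual edit looks like:
```sql
-- Existing table: delete this statement before applying
-- CREATE TABLE "users" (
--   "id" serial PRIMARY KEY NOT NULL,
--   "email" varchar(255) NOT NULL
-- );

-- New table: keep
CREATE TABLE "new_feature" (
  "id" serial PRIMARY KEY NOT NULL,
  "data" text NOT NULL,
  "created_at" timestamp DEFAULT now()
);
```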
See `references/migrations.md` for advanced patterns.
### 5.4. Add Migration Scripts
Add these convenience scripts to your `package.json`:
```json
{
"scripts": {
"db:generate": "drizzle-kit generate",
"db:migrate": "drizzle-kit migrate",
"db:push": "drizzle-kit push",
"db:studio": "drizzle-kit studio"
}
}
```
**Usage:**
```bash
npm run db:generate # Generate migrations from schema changes
npm run db:migrate # Apply pending migrations
npm run db:push # Push schema directly (dev only)
npm run db:studio # Open Drizzle Studio
```
**Note:** Replace `npm run` with your package manager's equivalent (`pnpm`, `yarn`, `bun`).
## Phase 6: Coexistence Patterns
### 6.1. Naming Conventions
Keep clear separation:
```typescript
import { db as prismaDb } from './lib/prisma';
import { drizzleDb } from './drizzle';
const prismaUsers = await prismaDb.user.findMany();
const drizzleFeatures = await drizzleDb.select().from(newFeatureTable);
```
### 6.2. Gradual Migration
**Step 1:** New features use Drizzle
```typescript
async function createFeature(data: NewFeatureInput) {
return drizzleDb.insert(newFeatureTable).values(data).returning();
}
```
**Step 2:** Migrate read queries (safe, no data changes)
```typescript
async function getUsers() {
return drizzleDb.select().from(existingUsers);
}
```
**Step 3:** Migrate write queries (after thorough testing)
```typescript
async function updateUser(id: number, data: UserUpdate) {
return drizzleDb.update(existingUsers)
.set(data)
.where(eq(existingUsers.id, id));
}
```
**Step 4:** Remove old ORM once all queries migrated
## Phase 7: Verification
Test integration without breaking existing functionality:
### 7.1. Test New Tables
```typescript
import { drizzleDb } from './drizzle';
import { newFeatureTable } from './drizzle/schema';
const result = await drizzleDb.insert(newFeatureTable)
.values({ data: 'test' })
.returning();
console.log('New table works:', result);
```
### 7.2. Test Existing Tables (if mirrored)
```typescript
import { drizzleDb } from './drizzle';
import { existingUsers } from './drizzle/schema';
const users = await drizzleDb.select().from(existingUsers);
console.log('Existing table accessible:', users);
```
### 7.3. Verify Old ORM Still Works
```typescript
import { db as oldDb } from './lib/your-orm';
const oldQuery = await oldDb.users.findMany();
console.log('Old ORM still works:', oldQuery);
```
## Phase 8: Add Best Practices References
Before executing the add-neon-docs skill, provide a summary of everything that has been done:
"✅ ... Drizzle integration is complete! Now adding documentation references..."
Then execute the neon-plugin:add-neon-docs skill with the parameter SKILL_NAME="neon-drizzle"
This will add reference links to Neon + Drizzle best practices documentation in your project's AI documentation file, helping AI assistants provide better guidance in future conversations.
---
## ✅ Integration Complete!
Your Drizzle integration with the existing project is ready to use.


@@ -0,0 +1,312 @@
# New Project Setup
> **Complete Walkthrough**: This is a self-contained, step-by-step guide with its own numbered phases (Phase 1-6).
> Follow each phase in order for a full Drizzle + Neon setup from scratch.
Complete guide for setting up Drizzle ORM with Neon from scratch.
### Important:
- Remember to run the neon-plugin:add-neon-docs skill with the parameter SKILL_NAME="neon-drizzle" after completing the guide.
## Table of Contents
- [New Project Setup](#new-project-setup)
- [Important:](#important)
- [Table of Contents](#table-of-contents)
- [Workflow Checklist](#workflow-checklist)
- [Phase 1: Context Detection](#phase-1-context-detection)
- [Phase 2: Installation](#phase-2-installation)
- [Phase 3: Configuration](#phase-3-configuration)
- [3.1. Neon Database Provisioning \& Environment File](#31-neon-database-provisioning--environment-file)
- [3.2. Drizzle Config](#32-drizzle-config)
- [3.3. Database Connection](#33-database-connection)
- [Phase 4: Schema Generation](#phase-4-schema-generation)
- [4.1. Common Patterns](#41-common-patterns)
- [Phase 5: Migrations](#phase-5-migrations)
- [5.1. Generate Migration](#51-generate-migration)
- [5.2. Apply Migration](#52-apply-migration)
- [5.3. Add Migration Scripts](#53-add-migration-scripts)
- [5.4. If Migration Fails](#54-if-migration-fails)
- [Phase 6: Add Best Practices References](#phase-6-add-best-practices-references)
- [✅ Setup Complete!](#-setup-complete)
---
## Workflow Checklist
When following this guide, I will track these high-level tasks:
- [ ] Detect project context (package manager, framework, existing setup)
- [ ] Install Drizzle dependencies based on deployment target
- [ ] Provision Neon database (list projects, create if needed, get connection string)
- [ ] Write connection string to environment file and verify
- [ ] Create Drizzle configuration files (drizzle.config.ts, db connection)
- [ ] Generate schema based on app type
- [ ] Run and verify migrations
- [ ] Add Neon Drizzle best practices to project docs
---
## Phase 1: Context Detection
Auto-detect project context:
**Check Package Manager:**
```bash
ls package-lock.json # → npm
ls bun.lockb # → bun
ls pnpm-lock.yaml # → pnpm
ls yarn.lock # → yarn
```
**Check Framework:**
```bash
grep '"next"' package.json # → Next.js
grep '"express"' package.json # → Express
grep '"vite"' package.json # → Vite
```
**Check Existing Setup:**
```bash
ls drizzle.config.ts # Already configured?
ls src/db/schema.ts # Schema exists?
```
**Check Environment Files:**
```bash
ls .env .env.local .env.production
```
## Phase 2: Installation
Based on detection, install dependencies:
**For Vercel/Edge Environments (Next.js, Vite on Vercel):**
```bash
[package-manager] add drizzle-orm @neondatabase/serverless
[package-manager] add -D drizzle-kit dotenv @vercel/node
```
**For Node.js Servers (Express, Fastify, standard Node):**
```bash
[package-manager] add drizzle-orm @neondatabase/serverless ws
[package-manager] add -D drizzle-kit dotenv @types/ws
```
## Phase 3: Configuration
Create configuration files in dependency order:
### 3.1. Neon Database Provisioning & Environment File
**Outcome**: A working `.env` or `.env.local` file with a real Neon connection string that the application can use immediately.
Use MCP tools to list or create a Neon project and get its connection string. Write the actual credentials to the environment file (`.env.local` for Next.js, `.env` for other projects). Add the file to `.gitignore`.
**Environment file format:**
```bash
DATABASE_URL=postgresql://user:password@host/database?sslmode=require
```
### 3.2. Drizzle Config
Create `drizzle.config.ts` with explicit environment loading:
**CRITICAL:** The `config({ path: '...' })` line must match the environment file from Step 3.1.
**For Next.js (using .env.local):**
```typescript
import { defineConfig } from 'drizzle-kit';
import { config } from 'dotenv';
// Load .env.local explicitly
config({ path: '.env.local' });
export default defineConfig({
schema: './src/db/schema.ts',
out: './src/db/migrations',
dialect: 'postgresql',
dbCredentials: {
url: process.env.DATABASE_URL!,
},
});
```
**For other projects (using .env):**
```typescript
import { defineConfig } from 'drizzle-kit';
import { config } from 'dotenv';
// Load .env explicitly
config({ path: '.env' });
export default defineConfig({
schema: './src/db/schema.ts',
out: './src/db/migrations',
dialect: 'postgresql',
dbCredentials: {
url: process.env.DATABASE_URL!,
},
});
```
**Why this matters:**
- Without explicit `config({ path: '...' })`, drizzle-kit may not load environment variables
- This prevents "url: undefined" errors during migrations
- The path must match your environment file name from Phase 3.1
### 3.3. Database Connection
Create `src/db/index.ts` with appropriate adapter (see `references/adapters.md` for decision guide):
**For Vercel/Edge:**
```typescript
import { drizzle } from 'drizzle-orm/neon-http';
import { neon } from '@neondatabase/serverless';
const sql = neon(process.env.DATABASE_URL!);
export const db = drizzle(sql);
```
**For Node.js:**
```typescript
import { drizzle } from 'drizzle-orm/neon-serverless';
import { Pool, neonConfig } from '@neondatabase/serverless';
import ws from 'ws';
neonConfig.webSocketConstructor = ws;
const pool = new Pool({ connectionString: process.env.DATABASE_URL! });
export const db = drizzle(pool);
```
See `templates/db-http.ts` and `templates/db-websocket.ts` for complete examples.
## Phase 4: Schema Generation
Based on app type, create appropriate schema:
### 4.1. Common Patterns
**Todo App:**
```typescript
import { pgTable, serial, integer, text, boolean, timestamp, varchar } from 'drizzle-orm/pg-core';
export const users = pgTable('users', {
id: serial('id').primaryKey(),
email: varchar('email', { length: 255 }).notNull().unique(),
name: varchar('name', { length: 255 }).notNull(),
createdAt: timestamp('created_at').defaultNow(),
});
export const todos = pgTable('todos', {
id: serial('id').primaryKey(),
userId: integer('user_id').notNull().references(() => users.id),
title: text('title').notNull(),
completed: boolean('completed').default(false),
createdAt: timestamp('created_at').defaultNow(),
});
```
**Blog App:**
```typescript
import { pgTable, serial, integer, text, timestamp, varchar, index } from 'drizzle-orm/pg-core';
import { relations } from 'drizzle-orm';
export const users = pgTable('users', {
id: serial('id').primaryKey(),
email: varchar('email', { length: 255 }).notNull().unique(),
name: varchar('name', { length: 255 }).notNull(),
createdAt: timestamp('created_at').defaultNow(),
});
export const posts = pgTable('posts', {
id: serial('id').primaryKey(),
userId: integer('user_id').notNull().references(() => users.id),
title: text('title').notNull(),
content: text('content').notNull(),
createdAt: timestamp('created_at').defaultNow(),
}, (table) => ({
userIdIdx: index('posts_user_id_idx').on(table.userId),
}));
export const usersRelations = relations(users, ({ many }) => ({
posts: many(posts),
}));
export const postsRelations = relations(posts, ({ one }) => ({
author: one(users, {
fields: [posts.userId],
references: [users.id],
}),
}));
```
See `templates/schema-example.ts` for more complex examples.
## Phase 5: Migrations
Run migrations with proper error handling:
### 5.1. Generate Migration
```bash
[package-manager] drizzle-kit generate
```
This creates SQL files in `src/db/migrations/`.
### 5.2. Apply Migration
**Recommended approach (explicit env loading):**
```bash
export DATABASE_URL="$(grep DATABASE_URL .env.local | cut -d '=' -f2-)" && \
[package-manager] drizzle-kit migrate
```
**Why this works:** Ensures `DATABASE_URL` is available, preventing "url: undefined" errors.
### 5.3. Add Migration Scripts
Add these convenience scripts to your `package.json`:
```json
{
"scripts": {
"db:generate": "drizzle-kit generate",
"db:migrate": "drizzle-kit migrate",
"db:push": "drizzle-kit push",
"db:studio": "drizzle-kit studio"
}
}
```
**Usage:**
```bash
npm run db:generate # Generate migrations from schema changes
npm run db:migrate # Apply pending migrations
npm run db:push # Push schema directly (dev only)
npm run db:studio # Open Drizzle Studio
```
**Note:** Replace `npm run` with your package manager's equivalent (`pnpm`, `yarn`, `bun`).
### 5.4. If Migration Fails
See `guides/troubleshooting.md` for common issues and fixes.
Also reference `references/migrations.md` for deep dive on migration patterns.
## Phase 6: Add Best Practices References
Before executing the add-neon-docs skill, provide a summary of everything that has been done:
"✅ ... Drizzle integration is complete! Now adding documentation references..."
Then execute the neon-plugin:add-neon-docs skill with the parameter SKILL_NAME="neon-drizzle"
This will add reference links to Neon + Drizzle best practices documentation in your project's AI documentation file, helping AI assistants provide better guidance in future conversations.
## ✅ Setup Complete!
Your Drizzle + Neon integration is ready to use.


@@ -0,0 +1,415 @@
# Schema Creation and Modification
> **Complete Walkthrough**: This is a self-contained, step-by-step guide with its own numbered phases (Phase 1-6).
> Follow each phase in order for schema design, modification, and migration workflows.
Guide for creating or modifying database schemas with Drizzle.
## Table of Contents
- [Workflow Checklist](#workflow-checklist)
- [Phase 1: Schema Design Patterns](#phase-1-schema-design-patterns)
- [Phase 2: Common Schema Patterns](#phase-2-common-schema-patterns)
- [Phase 3: Schema Modifications](#phase-3-schema-modifications)
- [Phase 4: Indexes and Constraints](#phase-4-indexes-and-constraints)
- [Phase 5: Generate and Apply Changes](#phase-5-generate-and-apply-changes)
- [Phase 6: Advanced Patterns](#phase-6-advanced-patterns)
- [Common Issues](#common-issues)
- [Next Steps](#next-steps)
---
## Workflow Checklist
When following this guide, I will track these high-level tasks:
- [ ] Design schema using appropriate patterns (tables, relationships, types)
- [ ] Apply common schema patterns (auth, soft deletes, enums, JSON)
- [ ] Implement schema modifications (add/rename/drop columns, change types)
- [ ] Add indexes and constraints for performance and data integrity
- [ ] Generate and apply migrations
- [ ] Verify changes and test with queries
---
## Phase 1: Schema Design Patterns
### 1.1. Basic Table Structure
```typescript
import { pgTable, serial, text, varchar, timestamp, boolean } from 'drizzle-orm/pg-core';
export const tableName = pgTable('table_name', {
id: serial('id').primaryKey(),
name: varchar('name', { length: 255 }).notNull(),
description: text('description'),
isActive: boolean('is_active').default(true),
createdAt: timestamp('created_at').defaultNow(),
updatedAt: timestamp('updated_at').defaultNow(),
});
```
**Key conventions:**
- Use `serial` for auto-incrementing IDs
- Use `varchar` for short strings (with length limit)
- Use `text` for long strings
- Use `timestamp` for dates/times
- Always add `createdAt` for audit trails
### 1.2. Relationships
**One-to-Many:**
```typescript
import { pgTable, serial, integer, text, timestamp, index } from 'drizzle-orm/pg-core';
export const authors = pgTable('authors', {
id: serial('id').primaryKey(),
name: text('name').notNull(),
});
export const posts = pgTable('posts', {
id: serial('id').primaryKey(),
authorId: integer('author_id')
.notNull()
.references(() => authors.id),
title: text('title').notNull(),
content: text('content').notNull(),
}, (table) => ({
authorIdIdx: index('posts_author_id_idx').on(table.authorId),
}));
```
**Important:** Always add index on foreign keys for query performance.
**Many-to-Many:**
```typescript
import { pgTable, serial, integer, text, primaryKey } from 'drizzle-orm/pg-core';
export const posts = pgTable('posts', {
id: serial('id').primaryKey(),
title: text('title').notNull(),
});
export const tags = pgTable('tags', {
id: serial('id').primaryKey(),
name: text('name').notNull(),
});
export const postsTags = pgTable('posts_tags', {
postId: integer('post_id')
.notNull()
.references(() => posts.id),
tagId: integer('tag_id')
.notNull()
.references(() => tags.id),
}, (table) => ({
pk: primaryKey({ columns: [table.postId, table.tagId] }),
}));
```
### 1.3. Type-Safe Relations
Enable relational queries:
```typescript
import { relations } from 'drizzle-orm';
export const authorsRelations = relations(authors, ({ many }) => ({
posts: many(posts),
}));
export const postsRelations = relations(posts, ({ one }) => ({
author: one(authors, {
fields: [posts.authorId],
references: [authors.id],
}),
}));
```
**Benefits:**
- Type-safe joins
- Automatic loading of related data
- No manual JOIN queries needed
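Note that the relational query API (`db.query`) requires passing the schema when creating the client (e.g. `drizzle(sql, { schema })`); with that in place, related rows come back in a single call:
```typescript
// Fetch each author together with their posts in one relational query
const authorsWithPosts = await db.query.authors.findMany({
  with: { posts: true },
});
```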
## Phase 2: Common Schema Patterns
### 2.1. User Authentication
```typescript
import { pgTable, serial, varchar, timestamp, boolean } from 'drizzle-orm/pg-core';
export const users = pgTable('users', {
id: serial('id').primaryKey(),
email: varchar('email', { length: 255 }).notNull().unique(),
passwordHash: varchar('password_hash', { length: 255 }),
name: varchar('name', { length: 255 }).notNull(),
emailVerified: boolean('email_verified').default(false),
createdAt: timestamp('created_at').defaultNow(),
lastLoginAt: timestamp('last_login_at'),
});
```
### 2.2. Soft Deletes
```typescript
export const posts = pgTable('posts', {
id: serial('id').primaryKey(),
title: text('title').notNull(),
content: text('content').notNull(),
deletedAt: timestamp('deleted_at'),
createdAt: timestamp('created_at').defaultNow(),
});
```
Query with soft deletes:
```typescript
import { isNull } from 'drizzle-orm';
const activePosts = await db
.select()
.from(posts)
.where(isNull(posts.deletedAt));
```
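Performing the soft delete itself is just an update that stamps `deletedAt` (a sketch, assuming the `posts` table above and a `postId` variable in scope):
```typescript
import { eq } from 'drizzle-orm';

// "Delete" by stamping deletedAt instead of removing the row
await db
  .update(posts)
  .set({ deletedAt: new Date() })
  .where(eq(posts.id, postId)); // postId: id of the post to soft-delete
```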
### 2.3. Enums
```typescript
import { pgEnum, pgTable, serial, text } from 'drizzle-orm/pg-core';
export const statusEnum = pgEnum('status', ['draft', 'published', 'archived']);
export const posts = pgTable('posts', {
id: serial('id').primaryKey(),
title: text('title').notNull(),
status: statusEnum('status').default('draft'),
});
```
### 2.4. JSON Fields
```typescript
import { pgTable, serial, text, jsonb } from 'drizzle-orm/pg-core';
export const products = pgTable('products', {
id: serial('id').primaryKey(),
name: text('name').notNull(),
metadata: jsonb('metadata').$type<{
color?: string;
size?: string;
tags?: string[];
}>(),
});
```
## Phase 3: Schema Modifications
### 3.1. Adding Columns
**Step 1:** Update schema:
```typescript
export const users = pgTable('users', {
id: serial('id').primaryKey(),
email: varchar('email', { length: 255 }).notNull(),
phoneNumber: varchar('phone_number', { length: 20 }), // NEW
});
```
**Step 2:** Generate migration:
```bash
[package-manager] drizzle-kit generate
```
**Step 3:** Apply migration:
```bash
export DATABASE_URL="$(grep DATABASE_URL .env.local | cut -d '=' -f2-)" && \
[package-manager] drizzle-kit migrate
```
### 3.2. Renaming Columns
**Important:** Drizzle sees renames as drop + add. Manual migration required.
**Step 1:** Update schema:
```typescript
export const users = pgTable('users', {
id: serial('id').primaryKey(),
fullName: varchar('full_name', { length: 255 }), // was 'name'
});
```
**Step 2:** Generate migration (will create drop + add):
```bash
[package-manager] drizzle-kit generate
```
**Step 3:** Edit migration file manually:
```sql
-- Change from:
-- ALTER TABLE users DROP COLUMN name;
-- ALTER TABLE users ADD COLUMN full_name VARCHAR(255);
-- To:
ALTER TABLE users RENAME COLUMN name TO full_name;
```
**Step 4:** Apply migration:
```bash
[package-manager] drizzle-kit migrate
```
### 3.3. Dropping Columns
**Step 1:** Remove from schema:
```typescript
export const users = pgTable('users', {
id: serial('id').primaryKey(),
email: varchar('email', { length: 255 }).notNull(),
// removed: phoneNumber
});
```
**Step 2:** Generate and apply:
```bash
[package-manager] drizzle-kit generate
[package-manager] drizzle-kit migrate
```
**Warning:** This permanently deletes data. Back up first!
### 3.4. Changing Column Types
**Step 1:** Update schema:
```typescript
export const posts = pgTable('posts', {
id: serial('id').primaryKey(),
views: bigint('views', { mode: 'number' }), // was: integer
});
```
**Step 2:** Generate migration:
```bash
[package-manager] drizzle-kit generate
```
**Step 3:** Review generated SQL - may need data migration if incompatible types.
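If the old and new types are not directly compatible (for example text to integer), the generated `ALTER TABLE` typically needs a manual `USING` clause to convert existing data; a hypothetical example:
```sql
-- Cast existing text values explicitly while changing the column type
ALTER TABLE posts
  ALTER COLUMN views TYPE integer USING views::integer;
```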
## Phase 4: Indexes and Constraints
### 4.1. Add Indexes
**Single column:**
```typescript
import { pgTable, serial, integer, text, index } from 'drizzle-orm/pg-core';
export const posts = pgTable('posts', {
id: serial('id').primaryKey(),
title: text('title').notNull(),
authorId: integer('author_id').notNull(),
}, (table) => ({
titleIdx: index('posts_title_idx').on(table.title),
authorIdIdx: index('posts_author_id_idx').on(table.authorId),
}));
```
**Composite index:**
```typescript
export const posts = pgTable('posts', {
id: serial('id').primaryKey(),
authorId: integer('author_id').notNull(),
status: text('status').notNull(),
}, (table) => ({
authorStatusIdx: index('posts_author_status_idx').on(table.authorId, table.status),
}));
```
### 4.2. Unique Constraints
**Single column:**
```typescript
export const users = pgTable('users', {
id: serial('id').primaryKey(),
email: varchar('email', { length: 255 }).notNull().unique(),
});
```
**Multiple columns:**
```typescript
import { pgTable, integer, unique } from 'drizzle-orm/pg-core';
export const postsTags = pgTable('posts_tags', {
postId: integer('post_id').notNull(),
tagId: integer('tag_id').notNull(),
}, (table) => ({
unq: unique('posts_tags_unique').on(table.postId, table.tagId),
}));
```
### 4.3. Check Constraints
```typescript
import { sql } from 'drizzle-orm';
import { pgTable, serial, integer, check } from 'drizzle-orm/pg-core';
export const products = pgTable('products', {
id: serial('id').primaryKey(),
price: integer('price').notNull(),
discountedPrice: integer('discounted_price'),
}, (table) => ({
priceCheck: check('price_check', sql`${table.price} >= 0`),
discountCheck: check('discount_check', sql`${table.discountedPrice} < ${table.price}`),
}));
```
## Phase 5: Generate and Apply Changes
### 5.1. Generate Migration
After any schema changes:
```bash
[package-manager] drizzle-kit generate
```
Review generated SQL in `src/db/migrations/`.
### 5.2. Apply Migration
With proper environment loading:
```bash
export DATABASE_URL="$(grep DATABASE_URL .env.local | cut -d '=' -f2-)" && \
[package-manager] drizzle-kit migrate
```
Or use the migration script:
```bash
[package-manager] tsx scripts/run-migration.ts
```
### 5.3. Verify Changes
**Check in database:**
```bash
psql $DATABASE_URL -c "\d table_name"
```
**Test with queries:**
```typescript
import { db } from './src/db';
import { tableName } from './src/db/schema';
const result = await db.select().from(tableName);
console.log('Schema works:', result);
```
## Phase 6: Advanced Patterns
For complex schemas, see:
- `templates/schema-example.ts` - Multi-table examples with relations
- `references/migrations.md` - Advanced migration patterns
## Common Issues
- **Migration conflicts:** See `guides/troubleshooting.md`
- **Relationship errors:** Ensure foreign keys reference correct columns
- **Type mismatches:** Match TypeScript types with SQL types carefully
## Next Steps
After schema creation:
1. Run migrations (see above)
2. Create queries (see `references/query-patterns.md`)
3. Add validation (use Zod or similar; see the sketch below)
4. Test thoroughly before production
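For item 3, a minimal sketch using plain Zod (assumes `zod` is installed and the `users` table and `db` connection from earlier in this guide):
```typescript
import { z } from 'zod';

// Shape mirrors the email/name columns of the users table in Phase 2.1
const newUserSchema = z.object({
  email: z.string().email().max(255),
  name: z.string().min(1).max(255),
});

// Throws if the input is invalid; otherwise insert the validated values
const input = newUserSchema.parse({ email: 'test@example.com', name: 'Test User' });
await db.insert(users).values(input);
```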


@@ -0,0 +1,539 @@
# Troubleshooting Guide
> **Reference Guide**: This is organized by error type and solution, not sequential phases.
> Jump directly to the error you're experiencing for quick resolution.
Common issues and solutions for Drizzle ORM with Neon.
## Table of Contents
- [Migration Errors](#migration-errors)
- [Connection Errors](#connection-errors)
- [Adapter Issues](#adapter-issues)
- [Type Errors](#type-errors)
- [Query Errors](#query-errors)
- [Performance Issues](#performance-issues)
- [Environment Issues](#environment-issues)
- [Getting More Help](#getting-more-help)
- [Prevention Checklist](#prevention-checklist)
---
## Migration Errors
### Error: "url: undefined"
**Symptom:**
```
Error: url is undefined in dbCredentials
```
**Cause:** Environment variables not loaded during migration.
**Solutions:**
**Option 1: Explicit env loading**
```bash
export DATABASE_URL="$(grep DATABASE_URL .env.local | cut -d '=' -f2-)" && \
[package-manager] drizzle-kit migrate
```
**Option 2: Update drizzle.config.ts**
```typescript
import { defineConfig } from 'drizzle-kit';
import { config } from 'dotenv';
config({ path: '.env.local' });
export default defineConfig({
schema: './src/db/schema.ts',
out: './src/db/migrations',
dialect: 'postgresql',
dbCredentials: {
url: process.env.DATABASE_URL!,
},
});
```
**Option 3: Use programmatic migration**
```typescript
import { config } from 'dotenv';
// Load env vars before the db module reads process.env.DATABASE_URL
config({ path: '.env.local' });
// Use dynamic imports so dotenv runs first (static imports are hoisted)
const { migrate } = await import('drizzle-orm/neon-http/migrator');
const { db } = await import('./src/db');
await migrate(db, { migrationsFolder: './src/db/migrations' });
```
### Error: "Cannot find migrations folder"
**Symptom:**
```
Error: ENOENT: no such file or directory, scandir './src/db/migrations'
```
**Cause:** Migrations folder doesn't exist yet.
**Solution:**
```bash
mkdir -p src/db/migrations
[package-manager] drizzle-kit generate
```
### Error: "Column already exists"
**Symptom:**
```
Error: column "name" of relation "users" already exists
```
**Cause:** Trying to add a column that already exists in the database.
**Solutions:**
**Option 1: Skip migration (dev only)**
```bash
rm src/db/migrations/[latest-migration-file].sql
[package-manager] drizzle-kit generate
```
**Option 2: Drop and recreate table (dev only, DATA LOSS)**
```bash
psql $DATABASE_URL -c "DROP TABLE users CASCADE;"
[package-manager] drizzle-kit migrate
```
**Option 3: Manual migration (production)**
Edit the migration file to check if column exists:
```sql
ALTER TABLE users
ADD COLUMN IF NOT EXISTS name VARCHAR(255);
```
### Error: "Migration already applied"
**Symptom:**
```
Error: migration has already been applied
```
**Cause:** Drizzle tracks applied migrations. Trying to reapply.
**Solution:**
Check migration journal:
```bash
cat src/db/migrations/meta/_journal.json
```
Remove duplicate entry or regenerate:
```bash
rm -rf src/db/migrations
mkdir src/db/migrations
[package-manager] drizzle-kit generate
```
**Warning:** Only do this in development!
## Connection Errors
### Error: "Connection refused"
**Symptom:**
```
Error: connect ECONNREFUSED
```
**Causes and Solutions:**
**1. Wrong DATABASE_URL format**
Check format:
```bash
echo $DATABASE_URL
```
Should be:
```
postgresql://user:password@host.neon.tech/dbname?sslmode=require
```
**2. Missing sslmode**
Add to DATABASE_URL:
```
?sslmode=require
```
**3. Firewall/network issue**
Test connectivity:
```bash
psql $DATABASE_URL -c "SELECT 1"
```
### Error: "WebSocket connection failed"
**Symptom:**
```
Error: WebSocket connection to 'wss://...' failed
```
**Cause:** Missing WebSocket constructor in Node.js.
**Solution:**
Add to your connection file:
```typescript
import { neonConfig } from '@neondatabase/serverless';
import ws from 'ws';
neonConfig.webSocketConstructor = ws;
```
Install ws if missing:
```bash
[package-manager] add ws
[package-manager] add -D @types/ws
```
### Error: "Too many connections"
**Symptom:**
```
Error: sorry, too many clients already
```
**Cause:** Connection pool exhausted.
**Solutions:**
**For HTTP adapter:** This shouldn't happen (stateless).
**For WebSocket adapter:** Implement connection pooling:
```typescript
import { Pool } from '@neondatabase/serverless';
const pool = new Pool({
connectionString: process.env.DATABASE_URL!,
max: 10, // Limit connections
});
export const db = drizzle(pool);
```
**Close connections properly:**
```typescript
process.on('SIGTERM', async () => {
await pool.end();
process.exit(0);
});
```
## Adapter Issues
### Wrong Adapter for Environment
**Symptom:** App works locally but fails in production (or vice versa).
**Cause:** Using wrong adapter for environment.
**Solutions:**
See `references/adapters.md` for decision guide.
**Quick reference:**
- Vercel/Cloudflare/Edge → HTTP adapter
- Node.js/Express/Long-lived → WebSocket adapter
**HTTP adapter:**
```typescript
import { drizzle } from 'drizzle-orm/neon-http';
import { neon } from '@neondatabase/serverless';
const sql = neon(process.env.DATABASE_URL!);
export const db = drizzle(sql);
```
**WebSocket adapter:**
```typescript
import { drizzle } from 'drizzle-orm/neon-serverless';
import { Pool, neonConfig } from '@neondatabase/serverless';
import ws from 'ws';
neonConfig.webSocketConstructor = ws;
const pool = new Pool({ connectionString: process.env.DATABASE_URL! });
export const db = drizzle(pool);
```
## Type Errors
### Error: "Type 'number' is not assignable to type 'string'"
**Symptom:**
```typescript
const user = await db.insert(users).values({
id: 1, // Error here
email: 'test@example.com',
});
```
**Cause:** Trying to manually set auto-increment ID.
**Solution:**
Remove `id` from insert (it's auto-generated):
```typescript
const user = await db.insert(users).values({
email: 'test@example.com',
});
```
### Error: "Property 'xyz' does not exist"
**Symptom:**
```typescript
const user = await db.select().from(users);
console.log(user[0].nonExistentField); // Error
```
**Cause:** Column not defined in schema.
**Solution:**
Add column to schema:
```typescript
export const users = pgTable('users', {
id: serial('id').primaryKey(),
nonExistentField: text('non_existent_field'),
});
```
Then regenerate and apply migration.
## Query Errors
### Error: "relation does not exist"
**Symptom:**
```
Error: relation "users" does not exist
```
**Cause:** Table not created in database yet.
**Solution:**
Run migrations:
```bash
[package-manager] drizzle-kit generate
export DATABASE_URL="$(grep DATABASE_URL .env.local | cut -d '=' -f2-)" && \
[package-manager] drizzle-kit migrate
```
### Error: "column does not exist"
**Symptom:**
```
Error: column "email" does not exist
```
**Causes:**
**1. Schema out of sync with database**
Regenerate and apply migrations:
```bash
[package-manager] drizzle-kit generate
[package-manager] drizzle-kit migrate
```
**2. Wrong table name in query**
Check schema definition vs query.
**3. Case sensitivity**
PostgreSQL folds unquoted identifiers to lowercase, while quoted identifiers are case-sensitive. Ensure column names in the schema match the database exactly.
### Error: "Cannot perform transactions with HTTP adapter"
**Symptom:**
```typescript
await db.transaction(async (tx) => {
// Error: transactions not supported
});
```
**Cause:** HTTP adapter doesn't support transactions.
**Solutions:**
**Option 1: Switch to WebSocket adapter** (if environment allows)
See `references/adapters.md`.
**Option 2: Use batch operations**
```typescript
await db.batch([
db.insert(users).values({ email: 'test1@example.com' }),
db.insert(posts).values({ title: 'Test' }),
]);
```
**Option 3: Implement application-level rollback**
Not ideal, but possible for simple cases.
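A minimal compensating-action sketch (assumes `users` and `posts` tables like those used elsewhere in this guide, with a `userId` column on `posts`):
```typescript
import { eq } from 'drizzle-orm';

// Insert the parent row first, then attempt the dependent insert
const [user] = await db.insert(users).values({ email: 'test@example.com' }).returning();
try {
  await db.insert(posts).values({ title: 'Test', userId: user.id });
} catch (err) {
  // The HTTP adapter has no transactions, so undo the first insert by hand
  await db.delete(users).where(eq(users.id, user.id));
  throw err;
}
```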
## Performance Issues
### Slow Queries
**Symptoms:** Queries taking seconds instead of milliseconds.
**Diagnose:**
**1. Missing indexes**
Check if foreign keys have indexes:
```typescript
export const posts = pgTable('posts', {
id: serial('id').primaryKey(),
authorId: integer('author_id').notNull(),
}, (table) => ({
authorIdIdx: index('posts_author_id_idx').on(table.authorId), // ADD THIS
}));
```
**2. N+1 queries**
Use relations instead of multiple queries:
```typescript
const postsWithAuthors = await db.query.posts.findMany({
with: {
author: true,
},
});
```
**3. Selecting too much data**
Select only needed columns:
```typescript
const userEmails = await db.select({
id: users.id,
email: users.email,
}).from(users);
```
### Connection Timeout
**Symptom:** Queries timeout in production.
**Solutions:**
**1. For Vercel:** Ensure using HTTP adapter (see `references/adapters.md`)
**2. For Node.js:** Implement connection pooling with retry:
```typescript
import { Pool } from '@neondatabase/serverless';
const pool = new Pool({
connectionString: process.env.DATABASE_URL!,
max: 10,
connectionTimeoutMillis: 5000,
idleTimeoutMillis: 30000,
});
```
**3. Add query timeout:**
```typescript
const result = await Promise.race([
db.select().from(users),
new Promise((_, reject) =>
setTimeout(() => reject(new Error('Query timeout')), 5000)
),
]);
```
## Environment Issues
### Error: "DATABASE_URL is undefined"
**Symptom:** App can't find DATABASE_URL.
**Solutions:**
**1. Check env file exists:**
```bash
ls .env .env.local
```
**2. Verify var is set:**
```bash
grep DATABASE_URL .env.local
```
**3. Load env vars:**
```typescript
import { config } from 'dotenv';
config({ path: '.env.local' });
```
**4. For Next.js:** Do NOT use the `NEXT_PUBLIC_` prefix for `DATABASE_URL`; that would expose the connection string to the browser. Keep it server-only:
```
# Don't do this - security risk
NEXT_PUBLIC_DATABASE_URL="..."
# Do this - server-only
DATABASE_URL="..."
```
### Error: "Invalid connection string"
**Symptom:**
```
Error: invalid connection string
```
**Cause:** Malformed DATABASE_URL.
**Check format:**
```
postgresql://USER:PASSWORD@HOST:PORT/DATABASE?sslmode=require
```
**Common mistakes:**
- Missing `postgresql://` prefix
- Special characters in password not URL-encoded
- Missing `?sslmode=require`
**Fix special characters:**
```bash
# If password is "p@ss&word!"
# Encode to: p%40ss%26word%21
```
## Getting More Help
If your issue isn't listed here:
1. **Check adapter configuration:** `references/adapters.md`
2. **Review migration patterns:** `references/migrations.md`
3. **Check query syntax:** `references/query-patterns.md`
4. **Search Drizzle docs:** https://orm.drizzle.team/docs
5. **Check Neon docs:** https://neon.com/docs
## Prevention Checklist
Before deploying:
- [ ] Environment variables properly loaded
- [ ] Correct adapter for environment
- [ ] Migrations applied successfully
- [ ] Indexes on foreign keys
- [ ] Connection pooling configured (if Node.js)
- [ ] Error handling for database operations
- [ ] .env files in .gitignore
- [ ] Test queries work in production environment


@@ -0,0 +1,478 @@
# Adapter Reference Guide
Complete guide for choosing between HTTP and WebSocket adapters.
## Table of Contents
- [Quick Decision Matrix](#quick-decision-matrix)
- [HTTP Adapter](#http-adapter-neondatabaseserverless-with-neon-http)
- [WebSocket Adapter](#websocket-adapter-neondatabaseserverless-with-neon-serverless)
- [Framework-Specific Recommendations](#framework-specific-recommendations)
- [Mixed Environments](#mixed-environments)
- [Feature Comparison Table](#feature-comparison-table)
- [Performance Considerations](#performance-considerations)
- [Troubleshooting](#troubleshooting)
- [Migration Between Adapters](#migration-between-adapters)
- [Choosing the Right Adapter](#choosing-the-right-adapter)
- [Related Resources](#related-resources)
---
## Quick Decision Matrix
| Environment | Adapter | Reason |
|-------------|---------|--------|
| Vercel | HTTP | Edge functions, stateless |
| Cloudflare Workers | HTTP | Edge runtime, no WebSocket |
| AWS Lambda | HTTP | Stateless, cold starts |
| Next.js (Vercel) | HTTP | App Router, Edge Runtime |
| Express/Fastify | WebSocket | Long-lived connections |
| Node.js server | WebSocket | Connection pooling |
| Bun server | WebSocket | Persistent runtime |
## HTTP Adapter (@neondatabase/serverless with neon-http)
### When to Use
**Serverless/Edge environments:**
- Vercel Edge Functions
- Cloudflare Workers
- AWS Lambda
- Deno Deploy
- Next.js App Router (default)
**Characteristics:**
- Stateless requests
- Cold starts
- Short execution time
- No persistent connections
### Setup
**Installation:**
```bash
npm add drizzle-orm @neondatabase/serverless
npm add -D drizzle-kit
```
**Connection:**
```typescript
import { drizzle } from 'drizzle-orm/neon-http';
import { neon } from '@neondatabase/serverless';
const sql = neon(process.env.DATABASE_URL!);
export const db = drizzle(sql);
```
**Complete example:** See `templates/db-http.ts`
### Pros
**Perfect for serverless:**
- No connection management needed
- Works in edge environments
- Fast cold starts
- Auto-scales
**Simple:**
- Minimal configuration
- No connection pooling complexity
- Stateless = predictable
### Cons
**Limited features:**
- No transactions
- No prepared statements
- No streaming
- Higher latency per query
**Not ideal for:**
- Large bulk writes
- Complex transactions
- High-frequency queries from same process
### Best Practices
**1. Use batch for multiple operations:**
```typescript
await db.batch([
db.insert(users).values({ email: 'test@example.com' }),
db.insert(posts).values({ title: 'Test' }),
]);
```
**2. Cache query results:**
```typescript
import { unstable_cache } from 'next/cache';
const getUsers = unstable_cache(
async () => db.select().from(users),
['users'],
{ revalidate: 60 }
);
```
**3. Minimize round trips:**
```typescript
const usersWithPosts = await db.query.users.findMany({
with: { posts: true },
});
```
## WebSocket Adapter (@neondatabase/serverless with neon-serverless)
### When to Use
**Long-lived processes:**
- Express/Fastify servers
- Standard Node.js applications
- Background workers
- WebSocket servers
- Bun applications
**Characteristics:**
- Persistent connections
- Long execution time
- Connection pooling
- Complex transactions
### Setup
**Installation:**
```bash
npm add drizzle-orm @neondatabase/serverless ws
npm add -D drizzle-kit @types/ws
```
**Connection:**
```typescript
import { drizzle } from 'drizzle-orm/neon-serverless';
import { Pool, neonConfig } from '@neondatabase/serverless';
import ws from 'ws';
neonConfig.webSocketConstructor = ws;
const pool = new Pool({
connectionString: process.env.DATABASE_URL!,
max: 10,
idleTimeoutMillis: 30000,
connectionTimeoutMillis: 5000,
});
export const db = drizzle(pool);
```
**Complete example:** See `templates/db-websocket.ts`
### Pros
**Full features:**
- Transactions
- Prepared statements
- Streaming
- Lower latency (persistent connection)
**Better for:**
- Multiple queries per request
- Complex business logic
- High-frequency operations
### Cons
**More complex:**
- Connection pool management
- Need to handle connection errors
- Not available in edge environments
**Resource considerations:**
- Connection limits
- Memory usage
- Cold start overhead
### Best Practices
**1. Configure connection pool:**
```typescript
const pool = new Pool({
connectionString: process.env.DATABASE_URL!,
max: 10, // Max connections
idleTimeoutMillis: 30000, // Close idle after 30s
connectionTimeoutMillis: 5000, // Timeout after 5s
});
```
**2. Graceful shutdown:**
```typescript
process.on('SIGTERM', async () => {
await pool.end();
process.exit(0);
});
process.on('SIGINT', async () => {
await pool.end();
process.exit(0);
});
```
**3. Use transactions:**
```typescript
await db.transaction(async (tx) => {
const user = await tx.insert(users)
.values({ email: 'test@example.com' })
.returning();
await tx.insert(posts)
.values({ userId: user[0].id, title: 'First post' });
});
```
**4. Handle connection errors:**
```typescript
pool.on('error', (err) => {
console.error('Unexpected pool error:', err);
});
pool.on('connect', () => {
console.log('Pool connection established');
});
```
## Framework-Specific Recommendations
### Next.js
**App Router (default):**
- Use HTTP adapter (Edge Runtime)
- Server Actions → HTTP
- Route Handlers → HTTP
**Pages Router:**
- API Routes → Either adapter works
- Recommend HTTP for consistency (a Pages Router sketch follows the example below)
**Example:**
```typescript
// app/actions/users.ts
'use server';
import { db } from '@/db'; // HTTP adapter
import { users } from '@/db/schema';
export async function createUser(email: string) {
return db.insert(users).values({ email }).returning();
}
```
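For the Pages Router, a minimal API route sketch using the same HTTP-adapter `db`; the file path, alias, and schema names are assumptions:
```typescript
// pages/api/users.ts
import type { NextApiRequest, NextApiResponse } from 'next';
import { db } from '@/db';          // HTTP adapter
import { users } from '@/db/schema';

export default async function handler(
  _req: NextApiRequest,
  res: NextApiResponse
) {
  try {
    const allUsers = await db.select().from(users);
    res.status(200).json(allUsers);
  } catch (err) {
    res.status(500).json({ error: (err as Error).message });
  }
}
```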
### Express
**Standard setup:**
- Use WebSocket adapter
- Configure connection pool
- Implement health checks
**Example:**
```typescript
import express from 'express';
import { db } from './db'; // WebSocket adapter
import { users } from './db/schema';
const app = express();
app.get('/health', async (req, res) => {
try {
await db.select().from(users).limit(1);
res.json({ status: 'healthy' });
} catch (err) {
    res.status(500).json({ status: 'unhealthy', error: (err as Error).message });
}
});
app.listen(3000);
```
### Vite/React (SPA)
**Deployment matters:**
**If deploying to Vercel:**
- API routes → HTTP adapter
- Static files → No backend needed
**If deploying to Node.js server:**
- Backend API → WebSocket adapter
- Frontend → Fetch from API (see the sketch below)
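A minimal sketch of the Node.js-server case: the backend owns the WebSocket-adapter connection and the SPA only talks to the API. Route paths and import paths are assumptions:
```typescript
// server/routes/users.ts - backend owns the database connection
import { Router } from 'express';
import { db } from '../db';          // WebSocket adapter
import { users } from '../db/schema';

export const usersRouter = Router().get('/api/users', async (_req, res) => {
  res.json(await db.select().from(users));
});

// src/api.ts - the SPA never talks to Neon directly
export async function fetchUsers() {
  const res = await fetch('/api/users');
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  return res.json();
}
```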
### Bun
**Recommendation:**
- Use WebSocket adapter
- Bun has built-in WebSocket support
- No need for `ws` package
**Setup:**
```typescript
import { drizzle } from 'drizzle-orm/neon-serverless';
import { Pool } from '@neondatabase/serverless';
const pool = new Pool({ connectionString: process.env.DATABASE_URL! });
export const db = drizzle(pool);
```
## Mixed Environments
### Using Both Adapters
If you have both serverless and long-lived components:
**Structure:**
```
src/
├── db/
│ ├── http.ts # HTTP adapter for serverless
│ ├── ws.ts # WebSocket for servers
│ └── schema.ts # Shared schema
```
**HTTP adapter:**
```typescript
// src/db/http.ts
import { drizzle } from 'drizzle-orm/neon-http';
import { neon } from '@neondatabase/serverless';
const sql = neon(process.env.DATABASE_URL!);
export const httpDb = drizzle(sql);
```
**WebSocket adapter:**
```typescript
// src/db/ws.ts
import { drizzle } from 'drizzle-orm/neon-serverless';
import { Pool, neonConfig } from '@neondatabase/serverless';
import ws from 'ws';
neonConfig.webSocketConstructor = ws;
const pool = new Pool({ connectionString: process.env.DATABASE_URL! });
export const wsDb = drizzle(pool);
```
**Usage:**
```typescript
// Vercel Edge Function
import { httpDb as db } from '@/db/http';
// Express route
import { wsDb as db } from '@/db/ws';
```
## Feature Comparison Table
| Feature | HTTP Adapter | WebSocket Adapter |
|---------|-------------|-------------------|
| Transactions | ❌ No | ✅ Yes |
| Prepared statements | ❌ No | ✅ Yes |
| Streaming results | ❌ No | ✅ Yes |
| Connection pooling | N/A (stateless) | ✅ Yes |
| Edge runtime | ✅ Yes | ❌ No |
| Cold start speed | ✅ Fast | ⚠️ Slower |
| Latency per query | ⚠️ Higher | ✅ Lower |
| Batch operations | ✅ Yes | ✅ Yes |
| Max connection limit | N/A | ⚠️ Applies |
## Performance Considerations
### HTTP Adapter Performance
**Optimize by:**
- Minimizing round trips
- Using batch operations
- Caching query results
- Pre-fetching related data
**Typical latency:**
- Single query: 50-200ms
- Batch operation: 100-300ms
### WebSocket Adapter Performance
**Optimize by:**
- Configuring pool size correctly
- Using transactions for related operations
- Implementing query caching
- Monitoring connection usage (see the sketch below)
**Typical latency:**
- First query (connection): 50-100ms
- Subsequent queries: 10-50ms
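One way to monitor connection usage, assuming the `pool` from the setup above; `totalCount`, `idleCount`, and `waitingCount` come from the pg-compatible `Pool` API, so treat this as a sketch:
```typescript
// Log pool pressure periodically; waiting clients suggest `max` is too low.
setInterval(() => {
  console.log(
    `pool: total=${pool.totalCount} idle=${pool.idleCount} waiting=${pool.waitingCount}`
  );
  if (pool.waitingCount > 0) {
    console.warn('Clients are waiting for a free connection');
  }
}, 30_000);
```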
## Troubleshooting
### HTTP Adapter Issues
**Problem:** "fetch is not defined"
- **Solution:** Ensure running in environment with fetch API (Node 18+, edge runtime)
**Problem:** Slow queries
- **Solution:** Use batch operations, reduce round trips
### WebSocket Adapter Issues
**Problem:** "WebSocket is not defined"
- **Solution:** Add `neonConfig.webSocketConstructor = ws`
**Problem:** "Too many connections"
- **Solution:** Reduce pool `max` size, ensure connections are closed
**Problem:** Connection timeouts
- **Solution:** Increase `connectionTimeoutMillis`, implement retry logic (see the sketch below)
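A minimal retry sketch for transient timeouts; the attempt count and backoff are arbitrary, so tune them for your workload:
```typescript
// Retry a query a few times with a short backoff before giving up.
async function withRetry<T>(run: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await run();
    } catch (err) {
      lastError = err;
      await new Promise((resolve) => setTimeout(resolve, 250 * (i + 1)));
    }
  }
  throw lastError;
}

// Usage:
// const rows = await withRetry(() => db.select().from(users));
```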
## Migration Between Adapters
### HTTP → WebSocket
**When:** Moving from serverless to dedicated server.
**Steps:**
1. Install ws: `npm add ws @types/ws`
2. Update connection file to WebSocket adapter
3. Update drizzle.config.ts if needed
4. Test transactions (now available)
### WebSocket → HTTP
**When:** Moving to serverless/edge deployment.
**Steps:**
1. Update connection file to HTTP adapter
2. Remove ws dependency
3. **Important:** Replace transactions with batch operations (see the sketch below)
4. Test thoroughly (feature differences)
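A sketch of step 3, rewriting the earlier transaction as a batch. The batch is sent as one request, but later statements cannot read earlier results, so any value that was returned mid-transaction (such as a new id) must be known up front or fetched in a separate request:
```typescript
// Before (WebSocket): db.transaction(async (tx) => { ... });
// After (HTTP): independent statements sent together.
await db.batch([
  db.insert(users).values({ email: 'test@example.com' }),
  db.insert(posts).values({ userId: 1, title: 'First post' }), // id assumed known
]);
```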
## Choosing the Right Adapter
**Ask yourself:**
1. **Where am I deploying?**
- Edge/Serverless → HTTP
- Node.js server → WebSocket
2. **Do I need transactions?**
- Yes → WebSocket
- No → Either works
3. **What's my request pattern?**
- Short, infrequent → HTTP
- Long, frequent → WebSocket
4. **Am I optimizing for?**
- Cold starts → HTTP
- Latency → WebSocket
**When in doubt:** Start with HTTP (simpler), migrate to WebSocket if needed.
## Related Resources
- `guides/new-project.md` - Setup guides for both adapters
- `guides/troubleshooting.md` - Connection error solutions
- `templates/db-http.ts` - HTTP adapter template
- `templates/db-websocket.ts` - WebSocket adapter template

View File

@@ -0,0 +1,652 @@
# Migration Reference Guide
Complete guide for database migrations with Drizzle and Neon.
## Table of Contents
- [Migration Lifecycle](#migration-lifecycle)
- [Environment Loading Deep-Dive](#environment-loading-deep-dive)
- [Migration Patterns](#migration-patterns)
- [Advanced Patterns](#advanced-patterns)
- [Migration in CI/CD](#migration-in-cicd)
- [Common Migration Errors](#common-migration-errors)
- [Best Practices](#best-practices)
- [Related Resources](#related-resources)
---
## Migration Lifecycle
### 1. Schema Change
Update your schema file:
```typescript
// src/db/schema.ts
export const users = pgTable('users', {
id: serial('id').primaryKey(),
email: varchar('email', { length: 255 }).notNull(),
phoneNumber: varchar('phone_number', { length: 20 }), // NEW
});
```
### 2. Generate Migration
Run drizzle-kit to generate SQL:
```bash
npx drizzle-kit generate
```
**What this does:**
- Compares `schema.ts` with the previous schema snapshot in the `meta/` folder
- Generates SQL in migrations folder
- Creates migration metadata
**Output:**
```
src/db/migrations/
├── 0000_initial.sql
├── 0001_add_phone_number.sql
└── meta/
├── _journal.json
└── 0001_snapshot.json
```
### 3. Review Migration
**Always review** generated SQL before applying:
```sql
-- 0001_add_phone_number.sql
ALTER TABLE users ADD COLUMN phone_number VARCHAR(20);
```
### 4. Apply Migration
Execute migration against database:
```bash
npx drizzle-kit migrate
```
**Or with explicit env loading:**
```bash
export DATABASE_URL="$(grep DATABASE_URL .env.local | cut -d '=' -f2)" && \
npm run drizzle-kit migrate
```
## Environment Loading Deep-Dive
### Why Environment Loading Matters
**Problem:** drizzle-kit runs as a separate process and does not load `.env` files on its own, so `DATABASE_URL` can be undefined.
**Symptom:**
```
Error: url is undefined in dbCredentials
```
### Solution 1: Config File Loading (Recommended)
**drizzle.config.ts:**
```typescript
import { defineConfig } from 'drizzle-kit';
import { config } from 'dotenv';
config({ path: '.env.local' });
export default defineConfig({
schema: './src/db/schema.ts',
out: './src/db/migrations',
dialect: 'postgresql',
dbCredentials: {
url: process.env.DATABASE_URL!,
},
});
```
**Key:** `config({ path: '.env.local' })` loads before exporting config.
### Solution 2: Shell Export
**Bash/Zsh:**
```bash
export DATABASE_URL="$(grep DATABASE_URL .env.local | cut -d '=' -f2)" && \
npm run drizzle-kit migrate
```
**Fish:**
```fish
set -x DATABASE_URL (grep '^DATABASE_URL=' .env.local | cut -d '=' -f2-)
npx drizzle-kit migrate
```
**PowerShell:**
```powershell
$env:DATABASE_URL = ((Select-String -Path .env.local -Pattern "^DATABASE_URL=").Line -split '=', 2)[1]
npx drizzle-kit migrate
```
### Solution 3: NPM Scripts
**package.json:**
```json
{
"scripts": {
"db:generate": "drizzle-kit generate",
"db:migrate": "dotenv -e .env.local -- drizzle-kit migrate",
"db:push": "dotenv -e .env.local -- drizzle-kit push"
}
}
```
**Install dotenv-cli:**
```bash
npm add -D dotenv-cli
```
### Solution 4: Programmatic Migration
**scripts/migrate.ts:**
```typescript
import { drizzle } from 'drizzle-orm/neon-http';
import { neon } from '@neondatabase/serverless';
import { migrate } from 'drizzle-orm/neon-http/migrator';
import { config } from 'dotenv';
config({ path: '.env.local' });
const sql = neon(process.env.DATABASE_URL!);
const db = drizzle(sql);
await migrate(db, { migrationsFolder: './src/db/migrations' });
console.log('Migrations complete');
```
**Run:**
```bash
tsx scripts/migrate.ts
```
## Migration Patterns
### Initial Setup
**First migration creates all tables:**
```sql
-- 0000_initial.sql
CREATE TABLE users (
id SERIAL PRIMARY KEY,
email VARCHAR(255) NOT NULL UNIQUE,
name VARCHAR(255) NOT NULL,
created_at TIMESTAMP DEFAULT NOW()
);
CREATE TABLE posts (
id SERIAL PRIMARY KEY,
user_id INTEGER NOT NULL REFERENCES users(id),
title TEXT NOT NULL,
content TEXT NOT NULL,
created_at TIMESTAMP DEFAULT NOW()
);
CREATE INDEX posts_user_id_idx ON posts(user_id);
```
### Adding Columns
**Schema:**
```typescript
export const users = pgTable('users', {
id: serial('id').primaryKey(),
email: varchar('email', { length: 255 }).notNull(),
phoneNumber: varchar('phone_number', { length: 20 }), // NEW
});
```
**Generated:**
```sql
ALTER TABLE users ADD COLUMN phone_number VARCHAR(20);
```
### Dropping Columns
**Schema:**
```typescript
export const users = pgTable('users', {
id: serial('id').primaryKey(),
email: varchar('email', { length: 255 }).notNull(),
// removed: phoneNumber
});
```
**Generated:**
```sql
ALTER TABLE users DROP COLUMN phone_number;
```
**Warning:** Data loss! Back up first.
### Renaming Columns
**Problem:** Drizzle sees rename as drop + add (data loss).
**Schema:**
```typescript
export const users = pgTable('users', {
id: serial('id').primaryKey(),
fullName: varchar('full_name', { length: 255 }), // was 'name'
});
```
**Generated (WRONG):**
```sql
ALTER TABLE users DROP COLUMN name;
ALTER TABLE users ADD COLUMN full_name VARCHAR(255);
```
**Solution:** Manually edit migration:
```sql
-- Change to:
ALTER TABLE users RENAME COLUMN name TO full_name;
```
### Changing Column Types
**Schema:**
```typescript
export const posts = pgTable('posts', {
id: serial('id').primaryKey(),
views: bigint('views', { mode: 'number' }), // was integer
});
```
**Generated:**
```sql
ALTER TABLE posts ALTER COLUMN views TYPE BIGINT;
```
**Caution:** May require data migration if types incompatible.
### Adding Indexes
**Schema:**
```typescript
export const posts = pgTable('posts', {
id: serial('id').primaryKey(),
title: text('title').notNull(),
}, (table) => ({
titleIdx: index('posts_title_idx').on(table.title), // NEW
}));
```
**Generated:**
```sql
CREATE INDEX posts_title_idx ON posts(title);
```
### Adding Foreign Keys
**Schema:**
```typescript
export const comments = pgTable('comments', {
id: serial('id').primaryKey(),
  postId: integer('post_id')
.notNull()
.references(() => posts.id), // NEW
content: text('content').notNull(),
});
```
**Generated:**
```sql
ALTER TABLE comments
ADD CONSTRAINT comments_post_id_fkey
FOREIGN KEY (post_id) REFERENCES posts(id);
```
### Adding Constraints
**Unique:**
```typescript
export const users = pgTable('users', {
id: serial('id').primaryKey(),
email: varchar('email', { length: 255 }).notNull().unique(),
});
```
**Generated:**
```sql
ALTER TABLE users ADD CONSTRAINT users_email_unique UNIQUE (email);
```
**Check:**
```typescript
export const products = pgTable('products', {
id: serial('id').primaryKey(),
price: integer('price').notNull(),
}, (table) => ({
  priceCheck: check('price_check', sql`price >= 0`), // requires: import { sql } from 'drizzle-orm'
}));
```
**Generated:**
```sql
ALTER TABLE products ADD CONSTRAINT price_check CHECK (price >= 0);
```
## Advanced Patterns
### Data Migrations
**Scenario:** Add column with computed value from existing data.
**Step 1:** Generate migration:
```bash
npx drizzle-kit generate
```
**Step 2:** Edit migration to add data transformation:
```sql
-- Add column
ALTER TABLE users ADD COLUMN full_name VARCHAR(255);
-- Populate with data
UPDATE users SET full_name = first_name || ' ' || last_name;
-- Make not null after population
ALTER TABLE users ALTER COLUMN full_name SET NOT NULL;
```
### Conditional Migrations
**Add IF NOT EXISTS for idempotency:**
```sql
ALTER TABLE users
ADD COLUMN IF NOT EXISTS phone_number VARCHAR(20);
CREATE INDEX IF NOT EXISTS posts_title_idx ON posts(title);
```
**Useful for:**
- Re-running migrations
- Partial deployments
- Development environments
### Multi-Step Migrations
**Scenario:** Rename with zero downtime.
**Migration 1 (Deploy this first):**
```sql
-- Add new column
ALTER TABLE users ADD COLUMN full_name VARCHAR(255);
-- Copy data
UPDATE users SET full_name = name;
```
**Application update:** Write to both `name` and `full_name`.
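A sketch of the dual-write step in application code, assuming the transitional schema maps both columns and the Drizzle `db`/`eq` helpers from the query examples:
```typescript
// Keep the old and new columns in sync until Migration 2 drops `name`.
async function updateUserName(id: number, newName: string) {
  await db.update(users)
    .set({
      name: newName,     // old column, still read by not-yet-updated code
      fullName: newName, // new column
    })
    .where(eq(users.id, id));
}
```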
**Migration 2 (Deploy after apps updated):**
```sql
-- Make new column not null
ALTER TABLE users ALTER COLUMN full_name SET NOT NULL;
-- Drop old column
ALTER TABLE users DROP COLUMN name;
```
### Rollback Strategies
**Option 1: Down migrations (manual)**
Create reverse migration:
```sql
-- up.sql
ALTER TABLE users ADD COLUMN phone_number VARCHAR(20);
-- down.sql (create manually)
ALTER TABLE users DROP COLUMN phone_number;
```
**Option 2: Snapshot and restore**
**Before migration:**
```bash
pg_dump $DATABASE_URL > backup.sql
```
**If problems:**
```bash
psql $DATABASE_URL < backup.sql
```
**Option 3: Drizzle push (dev only)**
Reset to schema state:
```bash
npx drizzle-kit push --force
```
**Warning:** Data loss in dev!
## Migration in CI/CD
### GitHub Actions Example
```yaml
name: Deploy
on:
push:
branches: [main]
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Install dependencies
run: npm ci
- name: Run migrations
env:
DATABASE_URL: ${{ secrets.DATABASE_URL }}
run: npm run db:migrate
- name: Deploy application
run: npm run deploy
```
### Vercel Example
**vercel.json:**
```json
{
"buildCommand": "npm run build && npm run db:migrate",
"env": {
"DATABASE_URL": "@database_url"
}
}
```
**package.json:**
```json
{
"scripts": {
"build": "next build",
"db:migrate": "drizzle-kit migrate"
}
}
```
### Safety Checks
**Pre-migration script:**
```typescript
// scripts/pre-migrate.ts
import { drizzle } from 'drizzle-orm/neon-http';
import { neon } from '@neondatabase/serverless';
const sql = neon(process.env.DATABASE_URL!);
const db = drizzle(sql);
async function preMigrationChecks() {
try {
await sql`SELECT 1`;
console.log('✅ Database connection successful');
const tables = await sql`
SELECT tablename FROM pg_tables
WHERE schemaname = 'public'
`;
console.log(`✅ Found ${tables.length} tables`);
return true;
} catch (err) {
console.error('❌ Pre-migration check failed:', err);
process.exit(1);
}
}
preMigrationChecks();
```
## Common Migration Errors
### Error: "migration already applied"
**Cause:** Journal shows migration as applied.
**Solution:**
```bash
# Check journal
cat src/db/migrations/meta/_journal.json
# Remove entry if needed (dev only!)
# Or regenerate migrations
rm -rf src/db/migrations/*
npx drizzle-kit generate
```
### Error: "column already exists"
**Cause:** Schema out of sync with database.
**Solutions:**
**Option 1:** Edit migration to use IF NOT EXISTS:
```sql
ALTER TABLE users
ADD COLUMN IF NOT EXISTS phone_number VARCHAR(20);
```
**Option 2:** Reset migrations (dev only):
```bash
npx drizzle-kit drop      # Removes selected migration files (not database tables)
npx drizzle-kit generate  # Regenerate from the current schema
npx drizzle-kit migrate
```
### Error: "violates foreign key constraint"
**Cause:** Trying to drop table referenced by foreign keys.
**Solution:** Drop in reverse dependency order:
```sql
DROP TABLE comments; -- First (depends on posts)
DROP TABLE posts; -- Then (depends on users)
DROP TABLE users; -- Finally
```
Or use CASCADE (data loss!):
```sql
DROP TABLE users CASCADE;
```
### Error: "cannot drop column"
**Cause:** Column referenced by views, functions, or constraints.
**Solution:**
```sql
-- Find dependencies
SELECT * FROM information_schema.view_column_usage
WHERE column_name = 'your_column';
-- Drop views first
DROP VIEW view_name;
-- Then drop column
ALTER TABLE users DROP COLUMN your_column;
```
## Best Practices
### 1. Always Review Generated SQL
Don't blindly apply migrations:
```bash
# Generate
npx drizzle-kit generate
# Review
cat src/db/migrations/0001_*.sql
# Apply only after review
npx drizzle-kit migrate
```
### 2. Test Migrations in Development
**Before production:**
```bash
# On dev database
export DATABASE_URL=$DEV_DATABASE_URL
npm run db:migrate
# Test application
npm run test
# Only then deploy to production
```
### 3. Back Up Before Major Migrations
```bash
pg_dump $DATABASE_URL > backup_$(date +%Y%m%d).sql
```
### 4. Use Transactions (when possible)
Wrap multiple operations:
```sql
BEGIN;
ALTER TABLE users ADD COLUMN phone_number VARCHAR(20);
UPDATE users SET phone_number = '000-000-0000' WHERE phone_number IS NULL;
ALTER TABLE users ALTER COLUMN phone_number SET NOT NULL;
COMMIT;
```
### 5. Document Breaking Changes
Add comments in migration files:
```sql
-- Breaking change: Removing deprecated 'username' column
-- Applications must use 'email' instead
-- Migration date: 2024-01-15
ALTER TABLE users DROP COLUMN username;
```
### 6. Keep Migrations Small
One logical change per migration:
- ✅ Good: "Add phone number column"
- ❌ Bad: "Add phone number, refactor users table, update indexes"
## Related Resources
- `guides/troubleshooting.md` - Migration error solutions
- `guides/schema-only.md` - Schema change patterns
- `references/adapters.md` - Connection configuration
- Scripts: `scripts/run-migration.ts`

View File

@@ -0,0 +1,761 @@
# Query Patterns Reference Guide
Complete reference for querying with Drizzle ORM.
## Table of Contents
- [Basic CRUD Operations](#basic-crud-operations)
- [Advanced Filtering](#advanced-filtering)
- [Joins and Relations](#joins-and-relations)
- [Aggregations](#aggregations)
- [Subqueries](#subqueries)
- [Transactions](#transactions)
- [Batch Operations](#batch-operations)
- [Raw SQL](#raw-sql)
- [Performance Optimization](#performance-optimization)
- [Type Safety](#type-safety)
- [Common Patterns](#common-patterns)
- [Related Resources](#related-resources)
---
## Basic CRUD Operations
### Create (Insert)
**Single record:**
```typescript
import { db } from './db';
import { users } from './db/schema';
const newUser = await db.insert(users)
.values({
email: 'user@example.com',
name: 'John Doe',
})
.returning();
console.log(newUser[0]); // { id: 1, email: '...', name: '...' }
```
**Multiple records:**
```typescript
const newUsers = await db.insert(users)
.values([
{ email: 'user1@example.com', name: 'User 1' },
{ email: 'user2@example.com', name: 'User 2' },
{ email: 'user3@example.com', name: 'User 3' },
])
.returning();
```
**With onConflictDoNothing:**
```typescript
await db.insert(users)
.values({ email: 'user@example.com', name: 'John' })
.onConflictDoNothing();
```
**With onConflictDoUpdate (upsert):**
```typescript
await db.insert(users)
.values({ email: 'user@example.com', name: 'John' })
.onConflictDoUpdate({
target: users.email,
set: { name: 'John Updated' },
});
```
### Read (Select)
**All records:**
```typescript
const allUsers = await db.select().from(users);
```
**Specific columns:**
```typescript
const userEmails = await db.select({
id: users.id,
email: users.email,
}).from(users);
```
**With WHERE clause:**
```typescript
import { eq, gt, lt, like, and, or } from 'drizzle-orm';
const user = await db.select()
.from(users)
.where(eq(users.email, 'user@example.com'));
const activeUsers = await db.select()
.from(users)
.where(eq(users.isActive, true));
```
**Multiple conditions:**
```typescript
const filteredUsers = await db.select()
.from(users)
.where(
and(
eq(users.isActive, true),
gt(users.createdAt, new Date('2024-01-01'))
)
);
```
**With LIMIT and OFFSET:**
```typescript
const paginatedUsers = await db.select()
.from(users)
.limit(10)
.offset(20); // Page 3
```
**With ORDER BY:**
```typescript
const sortedUsers = await db.select()
.from(users)
.orderBy(users.createdAt); // ASC by default
import { desc } from 'drizzle-orm';
const recentUsers = await db.select()
.from(users)
.orderBy(desc(users.createdAt));
```
### Update
**Single record:**
```typescript
await db.update(users)
.set({ name: 'Jane Doe' })
.where(eq(users.id, 1));
```
**Multiple records:**
```typescript
await db.update(users)
.set({ isActive: false })
  .where(isNull(users.deletedAt)); // use isNull, not eq(column, null)
```
**With returning:**
```typescript
const updated = await db.update(users)
.set({ name: 'Updated Name' })
.where(eq(users.id, 1))
.returning();
```
**Partial updates:**
```typescript
const updates: Partial<typeof users.$inferSelect> = {
name: 'New Name',
};
await db.update(users)
.set(updates)
.where(eq(users.id, 1));
```
### Delete
**Single record:**
```typescript
await db.delete(users)
.where(eq(users.id, 1));
```
**Multiple records:**
```typescript
await db.delete(users)
.where(eq(users.isActive, false));
```
**With returning:**
```typescript
const deleted = await db.delete(users)
.where(eq(users.id, 1))
.returning();
```
**Soft delete (recommended):**
```typescript
await db.update(users)
.set({ deletedAt: new Date() })
.where(eq(users.id, 1));
```
## Advanced Filtering
### Comparison Operators
```typescript
import { eq, ne, gt, gte, lt, lte } from 'drizzle-orm';
const adults = await db.select()
.from(users)
.where(gte(users.age, 18));
const recentPosts = await db.select()
.from(posts)
.where(gt(posts.createdAt, new Date('2024-01-01')));
const excludeAdmin = await db.select()
.from(users)
.where(ne(users.role, 'admin'));
```
### Pattern Matching
```typescript
import { like, ilike } from 'drizzle-orm';
const gmailUsers = await db.select()
.from(users)
.where(like(users.email, '%@gmail.com'));
const searchByName = await db.select()
.from(users)
.where(ilike(users.name, '%john%')); // Case-insensitive
```
### NULL Checks
```typescript
import { isNull, isNotNull } from 'drizzle-orm';
const usersWithPhone = await db.select()
.from(users)
.where(isNotNull(users.phoneNumber));
const unverifiedUsers = await db.select()
.from(users)
.where(isNull(users.emailVerifiedAt));
```
### IN Operator
```typescript
import { inArray } from 'drizzle-orm';
const specificUsers = await db.select()
.from(users)
.where(inArray(users.id, [1, 2, 3, 4, 5]));
```
### BETWEEN
```typescript
import { between } from 'drizzle-orm';
const postsThisMonth = await db.select()
.from(posts)
.where(
between(
posts.createdAt,
new Date('2024-01-01'),
new Date('2024-01-31')
)
);
```
### Complex Conditions
```typescript
import { and, or, not } from 'drizzle-orm';
const complexQuery = await db.select()
.from(users)
.where(
or(
and(
eq(users.isActive, true),
gte(users.age, 18)
),
eq(users.role, 'admin')
)
);
```
## Joins and Relations
### Manual Joins
**Inner join:**
```typescript
const postsWithAuthors = await db.select({
postId: posts.id,
postTitle: posts.title,
authorName: users.name,
authorEmail: users.email,
})
.from(posts)
.innerJoin(users, eq(posts.authorId, users.id));
```
**Left join:**
```typescript
const allPostsWithOptionalAuthors = await db.select()
.from(posts)
.leftJoin(users, eq(posts.authorId, users.id));
```
### Relational Queries (Recommended)
**Define relations first:**
```typescript
import { relations } from 'drizzle-orm';
export const usersRelations = relations(users, ({ many }) => ({
posts: many(posts),
}));
export const postsRelations = relations(posts, ({ one }) => ({
author: one(users, {
fields: [posts.authorId],
references: [users.id],
}),
}));
```
**Query with relations:**
```typescript
const usersWithPosts = await db.query.users.findMany({
with: {
posts: true,
},
});
console.log(usersWithPosts[0].posts); // Array of posts
```
**Nested relations:**
```typescript
const postsWithAuthorsAndComments = await db.query.posts.findMany({
with: {
author: true,
comments: {
with: {
author: true,
},
},
},
});
```
**Filtered relations:**
```typescript
const usersWithRecentPosts = await db.query.users.findMany({
with: {
posts: {
where: gt(posts.createdAt, new Date('2024-01-01')),
orderBy: desc(posts.createdAt),
limit: 5,
},
},
});
```
**Partial selection:**
```typescript
const usersWithPostTitles = await db.query.users.findMany({
columns: {
id: true,
name: true,
},
with: {
posts: {
columns: {
id: true,
title: true,
},
},
},
});
```
## Aggregations
### Count
```typescript
import { count } from 'drizzle-orm';
const userCount = await db.select({
count: count(),
}).from(users);
console.log(userCount[0].count); // Total users
```
**Count with grouping:**
```typescript
const postsByAuthor = await db.select({
authorId: posts.authorId,
postCount: count(),
})
.from(posts)
.groupBy(posts.authorId);
```
### Sum, Avg, Min, Max
```typescript
import { sum, avg, min, max } from 'drizzle-orm';
const stats = await db.select({
totalViews: sum(posts.views),
avgViews: avg(posts.views),
minViews: min(posts.views),
maxViews: max(posts.views),
}).from(posts);
```
### Having
```typescript
const activeAuthors = await db.select({
authorId: posts.authorId,
postCount: count(),
})
.from(posts)
.groupBy(posts.authorId)
.having(gt(count(), 5)); // Authors with > 5 posts
```
## Subqueries
### In WHERE clause
```typescript
const activeUserIds = db.select({ id: users.id })
.from(users)
.where(eq(users.isActive, true));
const postsFromActiveUsers = await db.select()
.from(posts)
.where(inArray(posts.authorId, activeUserIds));
```
### As derived table
```typescript
const recentPosts = db.select()
.from(posts)
.where(gt(posts.createdAt, new Date('2024-01-01')))
.as('recentPosts');
const authorsOfRecentPosts = await db.select()
.from(users)
.innerJoin(recentPosts, eq(users.id, recentPosts.authorId));
```
## Transactions
**Only available with WebSocket adapter!**
```typescript
await db.transaction(async (tx) => {
const user = await tx.insert(users)
.values({ email: 'user@example.com', name: 'John' })
.returning();
await tx.insert(posts)
.values({
authorId: user[0].id,
title: 'First post',
content: 'Hello world',
});
});
```
**With error handling:**
```typescript
try {
await db.transaction(async (tx) => {
await tx.insert(users).values({ email: 'user@example.com' });
await tx.insert(posts).values({ title: 'Post' });
throw new Error('Rollback!'); // Transaction rolls back
});
} catch (err) {
console.error('Transaction failed:', err);
}
```
**Nested transactions:**
```typescript
await db.transaction(async (tx) => {
await tx.insert(users).values({ email: 'user1@example.com' });
await tx.transaction(async (tx2) => {
await tx2.insert(posts).values({ title: 'Post 1' });
});
});
```
## Batch Operations
**HTTP adapter alternative to transactions:**
```typescript
await db.batch([
db.insert(users).values({ email: 'user1@example.com' }),
db.insert(users).values({ email: 'user2@example.com' }),
db.insert(posts).values({ title: 'Post 1' }),
]);
```
**Note:** The batch goes out as a single request and runs as one non-interactive transaction, but you get no interactive control: later statements can't read earlier results and you can't branch mid-batch. Use WebSocket transactions when you need that.
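Results come back as an array in the same order as the statements, so you can still read inserted rows afterwards (a sketch):
```typescript
const [insertedUsers, , insertedPosts] = await db.batch([
  db.insert(users).values({ email: 'user1@example.com' }).returning(),
  db.insert(users).values({ email: 'user2@example.com' }),
  db.insert(posts).values({ title: 'Post 1' }).returning(),
]);
console.log(insertedUsers[0].id, insertedPosts[0].id);
```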
## Raw SQL
### Execute raw query
```typescript
import { sql } from 'drizzle-orm';
const result = await db.execute(sql`
SELECT * FROM users
WHERE email LIKE ${'%@gmail.com'}
`);
```
### SQL in WHERE clause
```typescript
const gmailUsers = await db.select()
.from(users)
.where(sql`${users.email} LIKE '%@gmail.com'`);
```
### SQL expressions
```typescript
const postExcerpts = await db.select({
id: posts.id,
title: posts.title,
excerpt: sql<string>`LEFT(${posts.content}, 100)`,
}).from(posts);
```
### Custom functions
```typescript
const searchResults = await db.select()
.from(posts)
.where(
sql`to_tsvector('english', ${posts.content}) @@ to_tsquery('english', ${'search query'})`
);
```
## Performance Optimization
### Select only needed columns
**Bad:**
```typescript
const allUsers = await db.select().from(users); // All columns
```
**Good:**
```typescript
const userEmails = await db.select({
id: users.id,
email: users.email,
}).from(users);
```
### Use indexes
**Ensure indexed columns in WHERE:**
```typescript
// Assuming index on users.email
const user = await db.select()
.from(users)
.where(eq(users.email, 'user@example.com')); // Fast
```
### Avoid N+1 queries
**Bad:**
```typescript
const allPosts = await db.select().from(posts);
for (const post of allPosts) {
const author = await db.select()
.from(users)
.where(eq(users.id, post.authorId)); // N queries!
}
```
**Good:**
```typescript
const posts = await db.query.posts.findMany({
with: {
author: true, // Single query with join
},
});
```
### Use pagination
```typescript
async function getPaginatedUsers(page: number, pageSize: number = 10) {
return db.select()
.from(users)
.limit(pageSize)
.offset((page - 1) * pageSize);
}
```
### Batch inserts
**Bad:**
```typescript
for (const user of newUsers) { // newUsers: array of rows to insert
await db.insert(users).values(user); // N queries
}
```
**Good:**
```typescript
await db.insert(users).values(newUsers); // Single query
```
## Type Safety
### Infer types from schema
```typescript
type User = typeof users.$inferSelect;
type NewUser = typeof users.$inferInsert;
const user: User = {
id: 1,
email: 'user@example.com',
name: 'John',
createdAt: new Date(),
};
const newUser: NewUser = {
email: 'user@example.com',
name: 'John',
};
```
### Type-safe WHERE conditions
```typescript
function getUsersByStatus(status: User['status']) {
return db.select()
.from(users)
.where(eq(users.status, status));
}
```
### Type-safe updates
```typescript
function updateUser(id: number, data: Partial<NewUser>) {
return db.update(users)
.set(data)
.where(eq(users.id, id))
.returning();
}
```
## Common Patterns
### Soft deletes
**Schema:**
```typescript
export const posts = pgTable('posts', {
id: serial('id').primaryKey(),
title: text('title').notNull(),
deletedAt: timestamp('deleted_at'),
});
```
**Queries:**
```typescript
const activePosts = await db.select()
.from(posts)
.where(isNull(posts.deletedAt));
const deletedPosts = await db.select()
.from(posts)
.where(isNotNull(posts.deletedAt));
```
### Timestamps
**Auto-update:**
```typescript
async function updatePost(id: number, data: Partial<NewPost>) {
return db.update(posts)
.set({
...data,
updatedAt: new Date(),
})
.where(eq(posts.id, id))
.returning();
}
```
### Search
**Simple search:**
```typescript
const searchUsers = await db.select()
.from(users)
.where(
or(
ilike(users.name, `%${query}%`),
ilike(users.email, `%${query}%`)
)
);
```
**Full-text search:**
```typescript
const searchPosts = await db.select()
.from(posts)
.where(
sql`to_tsvector('english', ${posts.title} || ' ' || ${posts.content}) @@ plainto_tsquery('english', ${query})`
);
```
### Unique constraints
**Handle duplicates:**
```typescript
try {
await db.insert(users).values({ email: 'user@example.com' });
} catch (err) {
if (err.code === '23505') { // Unique violation
console.error('Email already exists');
}
}
```
**Or use upsert:**
```typescript
await db.insert(users)
.values({ email: 'user@example.com', name: 'John' })
.onConflictDoUpdate({
target: users.email,
set: { name: 'John Updated' },
});
```
## Related Resources
- `guides/schema-only.md` - Schema design patterns
- `references/adapters.md` - Transaction availability by adapter
- `guides/troubleshooting.md` - Query error solutions
- `templates/schema-example.ts` - Complete schema with relations

View File

@@ -0,0 +1,77 @@
/**
* Generate Schema Script
*
* Generates Drizzle migration files based on schema changes.
 * Run this file with tsx or ts-node; it shells out to `npx drizzle-kit generate`.
*
* This creates SQL migration files in the migrations directory
* based on differences between your schema.ts and the database.
*/
import { exec } from 'child_process';
import { promisify } from 'util';
const execAsync = promisify(exec);
async function generateSchema() {
console.log('🔄 Generating Drizzle migrations...\n');
try {
const { stdout, stderr } = await execAsync('npx drizzle-kit generate');
if (stdout) {
console.log('📝 Generated migrations:');
console.log(stdout);
}
if (stderr) {
console.warn('⚠️ Warnings:');
console.warn(stderr);
}
console.log('\n✅ Migration generation complete!');
console.log('\n📋 Next steps:');
console.log(' 1. Review the generated migration files in ./src/db/migrations');
console.log(' 2. Run: npx drizzle-kit migrate');
console.log(' 3. Test your application\n');
return true;
} catch (error) {
console.error('❌ Migration generation failed');
console.error((error as any).message);
console.log('\n💡 Troubleshooting:');
console.log(' • Ensure drizzle.config.ts is in your project root');
console.log(' • Check that DATABASE_URL is set correctly');
console.log(' • Verify your schema.ts file exists at the configured path');
console.log(' • Review guides/troubleshooting.md for common issues');
console.log(' • See references/migrations.md for migration patterns');
const errorMessage = (error as any).message.toLowerCase();
if (errorMessage.includes('url') || errorMessage.includes('undefined')) {
console.log('\n⚠ Environment variable issue detected:');
console.log(' • Ensure DATABASE_URL is loaded in drizzle.config.ts');
console.log(' • Add: import { config } from "dotenv"; config({ path: ".env.local" });');
console.log(' • See guides/troubleshooting.md section: "Error: url: undefined"');
}
if (errorMessage.includes('schema') || errorMessage.includes('not found')) {
console.log('\n⚠ Schema file issue detected:');
console.log(' • Verify schema path in drizzle.config.ts matches actual file location');
console.log(' • Default: ./src/db/schema.ts');
}
if (errorMessage.includes('enoent')) {
console.log('\n⚠ File/directory missing:');
console.log(' • Create migrations folder: mkdir -p src/db/migrations');
console.log(' • Ensure schema file exists: src/db/schema.ts');
}
return false;
}
}
generateSchema().then((success) => {
process.exit(success ? 0 : 1);
});

View File

@@ -0,0 +1,133 @@
/**
* Run Migration Script
*
* Applies pending Drizzle migrations to your Neon database.
* Run with: npx ts-node run-migration.ts
*
* This script will:
* 1. Connect to your Neon database
* 2. Apply all pending migrations
* 3. Report success or failure
*/
import { drizzle } from 'drizzle-orm/neon-http';
import { migrate } from 'drizzle-orm/neon-http/migrator';
import { neon } from '@neondatabase/serverless';
const DATABASE_URL = process.env.DATABASE_URL;
if (!DATABASE_URL) {
console.error('❌ DATABASE_URL environment variable is not set');
process.exit(1);
}
async function runMigrations() {
console.log('🔄 Running Drizzle migrations...\n');
try {
// Create SQL client
const sql = neon(DATABASE_URL);
// Create Drizzle instance
const db = drizzle(sql);
// Run migrations
console.log('⏳ Applying migrations...');
await migrate(db, {
migrationsFolder: './src/db/migrations',
});
console.log('✅ All migrations applied successfully!\n');
// Show migration status
console.log('📊 Migration Summary:');
console.log(' Database: ' + new URL(DATABASE_URL).pathname.slice(1));
console.log(' Migrations folder: ./src/db/migrations');
console.log(' Status: Up to date\n');
return true;
} catch (error) {
console.error('❌ Migration failed');
console.error((error as any).message);
console.log('\n💡 Troubleshooting:');
console.log(' • Ensure ./src/db/migrations directory exists');
console.log(' • Verify DATABASE_URL is correct');
console.log(' • Check that migrations are properly formatted SQL files');
console.log(' • Try running: npx drizzle-kit generate first');
console.log(' • Review guides/troubleshooting.md for common migration errors');
console.log(' • See references/migrations.md for detailed migration guide');
const errorMessage = (error as any).message.toLowerCase();
if (errorMessage.includes('connect') || errorMessage.includes('connection')) {
console.log('\n⚠ Connection issue detected:');
console.log(' • Verify DATABASE_URL format: postgresql://user:pass@host/db?sslmode=require');
console.log(' • Ensure database is accessible');
console.log(' • Check firewall/network settings');
console.log(' • See guides/troubleshooting.md section: "Connection Errors"');
}
if (errorMessage.includes('already exists') || errorMessage.includes('duplicate')) {
console.log('\n⚠ Migration conflict detected:');
console.log(' • Migration may have been partially applied');
console.log(' • Check database state: psql $DATABASE_URL -c "\\dt"');
console.log(' • See references/migrations.md for handling conflicts');
}
if (errorMessage.includes('not found') || errorMessage.includes('enoent')) {
console.log('\n⚠ Migrations folder missing:');
console.log(' • Run: npx drizzle-kit generate');
console.log(' • Ensure migrations folder path matches drizzle.config.ts');
}
if (errorMessage.includes('syntax')) {
console.log('\n⚠ SQL syntax error:');
console.log(' • Review generated migration files in ./src/db/migrations');
console.log(' • Check for manually edited migrations');
console.log(' • See references/migrations.md for safe editing practices');
}
console.log('');
return false;
}
}
/**
* Alternative: Run migrations with WebSocket (for Node.js)
* Uncomment below if using WebSocket connections
*/
/*
import { drizzle } from 'drizzle-orm/neon-serverless';
import { migrate } from 'drizzle-orm/neon-serverless/migrator';
import { Pool } from '@neondatabase/serverless';
async function runMigrationsWebSocket() {
console.log('🔄 Running Drizzle migrations (WebSocket)...\n');
const pool = new Pool({ connectionString: DATABASE_URL });
try {
const db = drizzle(pool);
console.log('⏳ Applying migrations...');
await migrate(db, {
migrationsFolder: './src/db/migrations',
});
console.log('✅ All migrations applied successfully!\n');
await pool.end();
return true;
} catch (error) {
console.error('❌ Migration failed:', (error as any).message);
await pool.end();
return false;
}
}
*/
// Run migrations
runMigrations().then((success) => {
process.exit(success ? 0 : 1);
});

View File

@@ -0,0 +1,6 @@
import { drizzle } from 'drizzle-orm/neon-http';
import { neon } from '@neondatabase/serverless';
const sql = neon(process.env.DATABASE_URL!);
export const db = drizzle(sql);

View File

@@ -0,0 +1,24 @@
import { drizzle } from 'drizzle-orm/neon-serverless';
import { Pool, neonConfig } from '@neondatabase/serverless';
import ws from 'ws';
neonConfig.webSocketConstructor = ws;
const pool = new Pool({
connectionString: process.env.DATABASE_URL!,
max: 10,
idleTimeoutMillis: 30000,
connectionTimeoutMillis: 5000,
});
export const db = drizzle(pool);
process.on('SIGTERM', async () => {
await pool.end();
process.exit(0);
});
process.on('SIGINT', async () => {
await pool.end();
process.exit(0);
});

View File

@@ -0,0 +1,96 @@
/**
* Drizzle Configuration
*
* This file configures Drizzle ORM for use with Neon.
* Place this in your project root or src/ directory.
*
* Usage: Reference this in your drizzle.config.ts
*/
import { config } from 'dotenv';
import type { Config } from 'drizzle-kit';
config({ path: '.env.local' });
/**
* Drizzle Configuration for Neon Postgres
*
 * The same drizzle-kit config works for both the HTTP and WebSocket
 * adapters; the adapter choice lives in your runtime connection file.
*/
const dbUrl = process.env.DATABASE_URL;
if (!dbUrl) {
throw new Error('DATABASE_URL environment variable is not set');
}
export default {
  schema: './src/db/schema.ts', // Path to your schema file
  out: './src/db/migrations', // Output directory for migrations
  dialect: 'postgresql', // Required: tells drizzle-kit this is a Postgres (Neon) database
// Database connection
dbCredentials: {
url: dbUrl,
},
// Migration options
migrations: {
prefix: 'timestamp', // or 'none'
},
// Verbose logging for debugging
verbose: process.env.DEBUG === 'true',
// Strict mode ensures all migrations are applied
strict: true,
} satisfies Config;
/**
 * HTTP vs WebSocket
 *
 * drizzle-kit does not care which adapter your application uses:
 * the config above is the same for both. Pick neon-http or
 * neon-serverless in your runtime connection file
 * (see the db-http.ts and db-websocket.ts templates).
 */
/**
* Migration Commands
*
* # Generate migration files from schema changes
* npx drizzle-kit generate
*
* # Apply migrations to database
* npx drizzle-kit migrate
*
 * # Delete previously generated migration files (does not touch the database)
* npx drizzle-kit drop
*
* # Introspect existing database
* npx drizzle-kit introspect
*
* # Push schema changes directly (development only)
* npx drizzle-kit push
*/

View File

@@ -0,0 +1,231 @@
/**
* Drizzle Schema Example
*
* This file demonstrates how to define database tables and relationships
* using Drizzle ORM with Neon Postgres.
*
* Usage: Import these tables in your application code for type-safe queries
*/
import {
pgTable,
serial,
text,
varchar,
integer,
timestamp,
boolean,
decimal,
json,
index,
unique,
  foreignKey,
  primaryKey,
} from 'drizzle-orm/pg-core';
import { relations } from 'drizzle-orm';
/**
* Users Table
*
* Stores basic user information. Can be extended with additional fields
* as needed by your application.
*/
export const users = pgTable(
'users',
{
id: serial('id').primaryKey(),
email: varchar('email', { length: 255 }).notNull().unique(),
name: varchar('name', { length: 255 }).notNull(),
password: text('password'), // If not using external auth
avatar: text('avatar'), // URL to avatar image
isActive: boolean('is_active').default(true),
createdAt: timestamp('created_at').defaultNow(),
updatedAt: timestamp('updated_at').defaultNow(),
},
(table) => ({
emailIdx: index('users_email_idx').on(table.email),
createdAtIdx: index('users_created_at_idx').on(table.createdAt),
})
);
/**
* Profiles Table
*
* Extended user information. Uses a foreign key to link with users.
*/
export const profiles = pgTable('profiles', {
id: serial('id').primaryKey(),
userId: integer('user_id')
.notNull()
.references(() => users.id, { onDelete: 'cascade' }),
bio: text('bio'),
location: varchar('location', { length: 255 }),
website: varchar('website', { length: 255 }),
phone: varchar('phone', { length: 20 }),
createdAt: timestamp('created_at').defaultNow(),
updatedAt: timestamp('updated_at').defaultNow(),
});
/**
* Posts Table
*
* Blog posts created by users.
*/
export const posts = pgTable(
'posts',
{
id: serial('id').primaryKey(),
userId: integer('user_id')
.notNull()
.references(() => users.id, { onDelete: 'cascade' }),
title: varchar('title', { length: 255 }).notNull(),
slug: varchar('slug', { length: 255 }).notNull().unique(),
content: text('content').notNull(),
excerpt: text('excerpt'),
published: boolean('published').default(false),
publishedAt: timestamp('published_at'),
createdAt: timestamp('created_at').defaultNow(),
updatedAt: timestamp('updated_at').defaultNow(),
},
(table) => ({
userIdIdx: index('posts_user_id_idx').on(table.userId),
publishedIdx: index('posts_published_idx').on(table.published),
slugIdx: index('posts_slug_idx').on(table.slug),
})
);
/**
* Comments Table
*
* Comments on blog posts. Supports nested comments via parent_id.
*/
export const comments = pgTable(
'comments',
{
id: serial('id').primaryKey(),
postId: integer('post_id')
.notNull()
.references(() => posts.id, { onDelete: 'cascade' }),
userId: integer('user_id')
.notNull()
.references(() => users.id, { onDelete: 'cascade' }),
parentId: integer('parent_id').references(() => comments.id, {
onDelete: 'cascade',
}),
content: text('content').notNull(),
approved: boolean('approved').default(false),
createdAt: timestamp('created_at').defaultNow(),
updatedAt: timestamp('updated_at').defaultNow(),
},
(table) => ({
postIdIdx: index('comments_post_id_idx').on(table.postId),
userIdIdx: index('comments_user_id_idx').on(table.userId),
parentIdIdx: index('comments_parent_id_idx').on(table.parentId),
})
);
/**
* Tags Table
*
* Tags for categorizing posts.
*/
export const tags = pgTable('tags', {
id: serial('id').primaryKey(),
name: varchar('name', { length: 100 }).notNull().unique(),
slug: varchar('slug', { length: 100 }).notNull().unique(),
createdAt: timestamp('created_at').defaultNow(),
});
/**
* PostTags Junction Table
*
* Many-to-many relationship between posts and tags.
*/
export const postTags = pgTable(
'post_tags',
{
postId: integer('post_id')
.notNull()
.references(() => posts.id, { onDelete: 'cascade' }),
tagId: integer('tag_id')
.notNull()
.references(() => tags.id, { onDelete: 'cascade' }),
},
(table) => ({
    pk: primaryKey({ name: 'post_tags_pk', columns: [table.postId, table.tagId] }),
postIdIdx: index('post_tags_post_id_idx').on(table.postId),
tagIdIdx: index('post_tags_tag_id_idx').on(table.tagId),
})
);
/**
* Settings Table
*
* Application-wide or user-specific settings stored as JSON.
*/
export const settings = pgTable('settings', {
id: serial('id').primaryKey(),
userId: integer('user_id').references(() => users.id, {
onDelete: 'cascade',
}), // null = global settings
key: varchar('key', { length: 255 }).notNull(),
value: json('value'),
createdAt: timestamp('created_at').defaultNow(),
updatedAt: timestamp('updated_at').defaultNow(),
});
// ============================================================================
// Relations (optional but recommended for better type safety)
// ============================================================================
export const usersRelations = relations(users, ({ many, one }) => ({
profile: one(profiles),
posts: many(posts),
comments: many(comments),
}));
export const profilesRelations = relations(profiles, ({ one }) => ({
user: one(users, {
fields: [profiles.userId],
references: [users.id],
}),
}));
export const postsRelations = relations(posts, ({ one, many }) => ({
author: one(users, {
fields: [posts.userId],
references: [users.id],
}),
comments: many(comments),
tags: many(postTags),
}));
export const commentsRelations = relations(comments, ({ one, many }) => ({
post: one(posts, {
fields: [comments.postId],
references: [posts.id],
}),
author: one(users, {
fields: [comments.userId],
references: [users.id],
}),
parent: one(comments, {
fields: [comments.parentId],
references: [comments.id],
}),
replies: many(comments),
}));
export const tagsRelations = relations(tags, ({ many }) => ({
posts: many(postTags),
}));
export const postTagsRelations = relations(postTags, ({ one }) => ({
post: one(posts, {
fields: [postTags.postId],
references: [posts.id],
}),
tag: one(tags, {
fields: [postTags.tagId],
references: [tags.id],
}),
}));

View File

@@ -0,0 +1,76 @@
---
name: neon-serverless
description: Configures Neon Serverless Driver for Next.js, Vercel Edge Functions, AWS Lambda, and other serverless environments. Installs @neondatabase/serverless, sets up environment variables, and creates working API route examples with TypeScript types. Use when users need to connect their application to Neon, fetch or query data from a Neon database, integrate Neon with Next.js or serverless frameworks, or set up database access in edge/serverless environments where traditional PostgreSQL clients don't work.
allowed-tools: ["bash"]
---
# Neon Serverless Skill
Configures the Neon Serverless Driver for optimal performance in serverless and edge computing environments.
## When to Use
- Setting up connections for edge functions (Vercel Edge, Cloudflare Workers)
- Configuring serverless APIs (AWS Lambda, Google Cloud Functions)
- Optimizing for low-latency database access
- Implementing connection pooling for high-throughput apps
**Not recommended for:** Complex multi-statement transactions (use WebSocket Pool), persistent servers (use native PostgreSQL drivers), or offline-first applications.
## Code Generation Rules
When generating TypeScript/JavaScript code:
- BEFORE generating import statements, check tsconfig.json for path aliases (compilerOptions.paths)
- If path aliases exist (e.g., "@/*": ["./src/*"]), use them (e.g., import { x } from '@/lib/utils')
- If NO path aliases exist or unsure, ALWAYS use relative imports (e.g., import { x } from '../../../lib/utils')
- Verify imports match the project's configuration
- Default to relative imports - they always work regardless of configuration
## Reference Documentation
**Primary Resource:** See [neon-serverless.mdc](https://raw.githubusercontent.com/neondatabase-labs/ai-rules/main/neon-serverless.mdc) for comprehensive guidelines, including:
- Installation and compatibility requirements
- HTTP vs WebSocket adapter selection
- Connection pooling strategies
- Query optimization patterns
- Error handling and troubleshooting
## Quick Setup
### Installation
```bash
npm install @neondatabase/serverless
```
### Connection Patterns
**HTTP Client** (recommended for edge/serverless):
```typescript
import { neon } from '@neondatabase/serverless';
const sql = neon(process.env.DATABASE_URL!);
const rows = await sql`SELECT * FROM users WHERE id = ${userId}`;
```
**WebSocket Pool** (for Node.js long-lived connections):
```typescript
import { Pool } from '@neondatabase/serverless';
const pool = new Pool({ connectionString: process.env.DATABASE_URL! });
const result = await pool.query('SELECT * FROM users WHERE id = $1', [userId]);
```
See `templates/` for complete examples:
- `templates/http-connection.ts` - HTTP client setup
- `templates/websocket-pool.ts` - WebSocket pool configuration
## Validation
Use `scripts/validate-connection.ts` to test your database connection before deployment.
## Related Skills
- **neon-drizzle** - For ORM with serverless connections
- **neon-toolkit** - For ephemeral database testing
---
**Want best practices in your project?** Run `neon-plugin:add-neon-docs` with parameter `SKILL_NAME="neon-serverless"` to add reference links.

View File

@@ -0,0 +1,170 @@
/**
* Connection Validator Script
*
* This script tests your Neon database connection and provides diagnostic information.
* Run with: npx ts-node validate-connection.ts
*
* Environment variables:
* - DATABASE_URL: Your Neon connection string
* - CONNECTION_TYPE: 'http' or 'websocket' (default: 'http')
*/
import { neon } from '@neondatabase/serverless';
import { Pool } from '@neondatabase/serverless';
const DATABASE_URL = process.env.DATABASE_URL;
const CONNECTION_TYPE = process.env.CONNECTION_TYPE || 'http';
if (!DATABASE_URL) {
console.error('❌ DATABASE_URL environment variable is not set');
process.exit(1);
}
async function validateHttpConnection() {
console.log('\n🔍 Testing HTTP Connection...');
try {
const sql = neon(DATABASE_URL);
// Test 1: Simple query
console.log(' • Testing basic query...');
const result = await sql`SELECT NOW() as current_time, version() as version`;
console.log(' ✅ Query successful');
// Test 2: Get database info
console.log(' • Fetching database info...');
const dbInfo = await sql`
SELECT
current_database() as database,
current_user as user,
version() as postgresql_version,
(SELECT count(*) FROM information_schema.tables WHERE table_schema = 'public') as table_count
`;
console.log('\n📊 Database Information:');
const info = dbInfo[0];
console.log(` • Database: ${info.database}`);
console.log(` • User: ${info.user}`);
console.log(` • PostgreSQL Version: ${info.postgresql_version.split(',')[0]}`);
console.log(` • Public Tables: ${info.table_count}`);
// Test 3: Connection string validation
console.log('\n🔐 Connection Details:');
const url = new URL(DATABASE_URL);
console.log(` • Host: ${url.hostname}`);
console.log(` • Port: ${url.port || 5432}`);
console.log(` • Database: ${url.pathname.slice(1)}`);
console.log(` • SSL Mode: ${url.searchParams.get('sslmode') || 'require'}`);
return true;
} catch (error) {
console.error(' ❌ Connection failed');
console.error(` Error: ${(error as any).message}`);
return false;
}
}
async function validateWebSocketConnection() {
console.log('\n🔍 Testing WebSocket Connection...');
try {
const pool = new Pool({
connectionString: DATABASE_URL,
max: 1,
});
// Test 1: Get connection
console.log(' • Acquiring connection...');
const client = await pool.connect();
console.log(' ✅ Connection acquired');
try {
// Test 2: Simple query
console.log(' • Testing basic query...');
const result = await client.query('SELECT NOW() as current_time, version() as version');
console.log(' ✅ Query successful');
// Test 3: Get database info
console.log(' • Fetching database info...');
const dbInfoResult = await client.query(`
SELECT
current_database() as database,
current_user as user,
version() as postgresql_version,
(SELECT count(*) FROM information_schema.tables WHERE table_schema = 'public') as table_count
`);
console.log('\n📊 Database Information:');
const info = dbInfoResult.rows[0];
console.log(` • Database: ${info.database}`);
console.log(` • User: ${info.user}`);
console.log(` • PostgreSQL Version: ${info.postgresql_version.split(',')[0]}`);
console.log(` • Public Tables: ${info.table_count}`);
// Test 4: List tables
console.log('\n📋 Public Tables:');
const tablesResult = await client.query(`
SELECT table_name FROM information_schema.tables WHERE table_schema = 'public'
`);
if (tablesResult.rows.length > 0) {
tablesResult.rows.forEach((row) => {
        console.log(`   • ${row.table_name}`);
});
} else {
console.log(' (no tables found)');
}
} finally {
client.release();
}
// Test 5: Connection string validation
console.log('\n🔐 Connection Details:');
const url = new URL(DATABASE_URL);
console.log(` • Host: ${url.hostname}`);
console.log(` • Port: ${url.port || 5432}`);
console.log(` • Database: ${url.pathname.slice(1)}`);
console.log(` • SSL Mode: ${url.searchParams.get('sslmode') || 'require'}`);
await pool.end();
return true;
} catch (error) {
console.error(' ❌ Connection failed');
console.error(` Error: ${(error as any).message}`);
return false;
}
}
async function main() {
console.log('═══════════════════════════════════════════════════════');
console.log(' Neon Connection Validator');
console.log('═══════════════════════════════════════════════════════');
console.log(`\n🚀 Testing ${CONNECTION_TYPE.toUpperCase()} connection...`);
console.log(` Database URL: ${DATABASE_URL.split('@')[1] || '...'}`);
let success = false;
if (CONNECTION_TYPE === 'websocket') {
success = await validateWebSocketConnection();
} else {
success = await validateHttpConnection();
}
console.log('\n═══════════════════════════════════════════════════════');
if (success) {
console.log('✅ Connection validated successfully!');
process.exit(0);
} else {
console.log('❌ Connection validation failed');
console.log('\n💡 Troubleshooting tips:');
console.log(' • Verify DATABASE_URL is correctly set');
console.log(' • Check your Neon console for connection details');
console.log(' • Ensure your firewall allows outbound connections');
console.log(' • Check if SSL mode is correctly configured');
process.exit(1);
}
}
main().catch((error) => {
console.error('Unexpected error:', error);
process.exit(1);
});

View File

@@ -0,0 +1,189 @@
/**
* HTTP Connection Template for Neon Serverless
*
* This template demonstrates the HTTP connection pattern,
* ideal for edge functions and stateless serverless environments.
*
* Usage: Best for Vercel Edge Functions, AWS Lambda, Cloudflare Workers, etc.
*/
import { neon } from '@neondatabase/serverless';
// Initialize the HTTP client
// This should be done once per request or in a module-level scope
const sql = neon(process.env.DATABASE_URL!);
/**
* Example: Query a single row
*/
export async function getUserById(userId: string) {
try {
const user = await sql`SELECT * FROM users WHERE id = ${userId}`;
return user[0] || null;
} catch (error) {
console.error('Failed to fetch user:', error);
throw error;
}
}
/**
* Example: Query multiple rows
*/
export async function getAllUsers() {
try {
const users = await sql`SELECT * FROM users ORDER BY created_at DESC`;
return users;
} catch (error) {
console.error('Failed to fetch users:', error);
throw error;
}
}
/**
* Example: Insert data
*/
export async function createUser(email: string, name: string) {
try {
const result = await sql`
INSERT INTO users (email, name, created_at)
VALUES (${email}, ${name}, NOW())
RETURNING id, email, name, created_at
`;
return result[0];
} catch (error) {
console.error('Failed to create user:', error);
throw error;
}
}
/**
* Example: Update data
*/
export async function updateUser(userId: string, updates: Record<string, any>) {
  try {
    // Column names cannot be parameterized; restrict `updates` keys to a known
    // allowlist in real code. Values are passed as parameters via sql.query(),
    // the driver's conventional parameterized call (with default options it
    // resolves to the row array, like the tagged template).
    const keys = Object.keys(updates);
    const values = Object.values(updates);
    const setClauses = keys.map((key, i) => `${key} = $${i + 1}`).join(', ');
    const result = await sql.query(
      `UPDATE users
       SET ${setClauses}, updated_at = NOW()
       WHERE id = $${keys.length + 1}
       RETURNING *`,
      [...values, userId]
    );
    return result[0];
  } catch (error) {
    console.error('Failed to update user:', error);
    throw error;
  }
}
/**
* Example: Delete data
*/
export async function deleteUser(userId: string) {
try {
const result = await sql`
DELETE FROM users WHERE id = ${userId}
RETURNING id
`;
return result.length > 0;
} catch (error) {
console.error('Failed to delete user:', error);
throw error;
}
}
/**
 * Example: Sequenced queries without an interactive transaction
 * Note: the HTTP driver has no interactive (multi-round-trip) transactions, so
 * these two inserts are not atomic. For independent queries that must succeed
 * or fail together, see the sql.transaction() sketch after this function.
 */
export async function createUserWithProfile(
email: string,
name: string,
bio: string
) {
try {
// Step 1: Create user
const userResult = await sql`
INSERT INTO users (email, name)
VALUES (${email}, ${name})
RETURNING id
`;
const userId = userResult[0].id;
// Step 2: Create profile
const profileResult = await sql`
INSERT INTO profiles (user_id, bio)
VALUES (${userId}, ${bio})
RETURNING *
`;
return { userId, profile: profileResult[0] };
} catch (error) {
console.error('Failed to create user with profile:', error);
throw error;
}
}
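/**
 * Example: Atomic batch of independent queries with sql.transaction()
 * Sketch only: the HTTP driver can run an array of queries in a single
 * non-interactive transaction, but the queries cannot use each other's results.
 * The `active` column and `audit_log` table below are hypothetical.
 */
export async function deactivateUserWithAudit(userId: string, reason: string) {
  try {
    const [updated, logged] = await sql.transaction([
      sql`UPDATE users SET active = false WHERE id = ${userId} RETURNING id`,
      sql`INSERT INTO audit_log (user_id, action, reason)
          VALUES (${userId}, 'deactivate', ${reason}) RETURNING id`,
    ]);
    return { updatedId: updated[0]?.id ?? null, auditId: logged[0]?.id ?? null };
  } catch (error) {
    console.error('Failed to deactivate user atomically:', error);
    throw error;
  }
}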
/**
* Example: Query with filtering and pagination
*/
export async function searchUsers(
query: string,
limit: number = 10,
offset: number = 0
) {
try {
const results = await sql`
SELECT * FROM users
WHERE name ILIKE ${'%' + query + '%'}
OR email ILIKE ${'%' + query + '%'}
ORDER BY created_at DESC
LIMIT ${limit}
OFFSET ${offset}
`;
return results;
} catch (error) {
console.error('Failed to search users:', error);
throw error;
}
}
/**
* Example: Aggregate query
*/
export async function getUserStats() {
try {
const stats = await sql`
SELECT
COUNT(*) as total_users,
COUNT(CASE WHEN created_at > NOW() - INTERVAL '30 days' THEN 1 END) as new_users_30d,
MIN(created_at) as oldest_user,
MAX(created_at) as newest_user
FROM users
`;
return stats[0];
} catch (error) {
console.error('Failed to fetch user stats:', error);
throw error;
}
}
/**
* Example: Join query
*/
export async function getUserWithProfile(userId: string) {
try {
const result = await sql`
SELECT u.*, p.bio, p.avatar_url
FROM users u
LEFT JOIN profiles p ON u.id = p.user_id
WHERE u.id = ${userId}
`;
return result[0] || null;
} catch (error) {
console.error('Failed to fetch user with profile:', error);
throw error;
}
}

View File

@@ -0,0 +1,245 @@
/**
* WebSocket Pool Template for Neon Serverless
*
* This template demonstrates the WebSocket connection pattern,
* ideal for Node.js servers and applications needing persistent connections.
*
* Usage: Best for Next.js API routes, Express servers, and long-lived applications
*/
import { Pool, PoolClient } from '@neondatabase/serverless';
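// Note: the Pool uses a WebSocket connection. In Node.js versions without a
// global WebSocket (pre-22), configure a constructor before creating the pool:
//   import ws from 'ws';
//   import { neonConfig } from '@neondatabase/serverless';
//   neonConfig.webSocketConstructor = ws;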
// Create a global pool instance (reused across requests)
const pool = new Pool({
connectionString: process.env.DATABASE_URL,
max: 20, // Maximum number of connections in the pool
idleTimeoutMillis: 30000,
connectionTimeoutMillis: 2000,
});
// Optional: Log pool events
pool.on('error', (err) => {
console.error('Unexpected error on idle client', err);
});
/**
* Helper: Get a connection from the pool
*/
async function withConnection<T>(
callback: (client: PoolClient) => Promise<T>
): Promise<T> {
const client = await pool.connect();
try {
return await callback(client);
} finally {
client.release();
}
}
/**
* Example: Query a single row
*/
export async function getUserById(userId: string) {
return withConnection(async (client) => {
const result = await client.query('SELECT * FROM users WHERE id = $1', [
userId,
]);
return result.rows[0] || null;
});
}
/**
* Example: Query multiple rows
*/
export async function getAllUsers() {
return withConnection(async (client) => {
const result = await client.query('SELECT * FROM users ORDER BY created_at DESC');
return result.rows;
});
}
/**
* Example: Insert data
*/
export async function createUser(email: string, name: string) {
return withConnection(async (client) => {
const result = await client.query(
`INSERT INTO users (email, name, created_at)
VALUES ($1, $2, NOW())
RETURNING id, email, name, created_at`,
[email, name]
);
return result.rows[0];
});
}
/**
* Example: Update data
*/
export async function updateUser(
userId: string,
updates: Record<string, any>
) {
return withConnection(async (client) => {
const keys = Object.keys(updates);
const values = Object.values(updates);
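    // Note: column names cannot be parameterized; restrict `updates` keys to a
    // known allowlist in real code to avoid SQL injection via keys.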
const setClauses = keys
.map((key, i) => `${key} = $${i + 1}`)
.join(', ');
const result = await client.query(
`UPDATE users SET ${setClauses}, updated_at = NOW()
WHERE id = $${keys.length + 1}
RETURNING *`,
[...values, userId]
);
return result.rows[0];
});
}
/**
* Example: Delete data
*/
export async function deleteUser(userId: string) {
return withConnection(async (client) => {
const result = await client.query('DELETE FROM users WHERE id = $1', [
userId,
]);
    return (result.rowCount ?? 0) > 0; // rowCount can be null in the pg typings
});
}
/**
* Example: Transaction support (unique to WebSocket connections)
* Transactions allow multiple queries to be atomic
*/
export async function createUserWithProfileTx(
email: string,
name: string,
bio: string
) {
const client = await pool.connect();
try {
// Start transaction
await client.query('BEGIN');
// Step 1: Create user
const userResult = await client.query(
'INSERT INTO users (email, name) VALUES ($1, $2) RETURNING id',
[email, name]
);
const userId = userResult.rows[0].id;
// Step 2: Create profile
const profileResult = await client.query(
'INSERT INTO profiles (user_id, bio) VALUES ($1, $2) RETURNING *',
[userId, bio]
);
// Commit transaction
await client.query('COMMIT');
return { userId, profile: profileResult.rows[0] };
} catch (error) {
// Rollback on error
await client.query('ROLLBACK');
console.error('Transaction failed:', error);
throw error;
} finally {
client.release();
}
}
/**
* Example: Query with filtering and pagination
*/
export async function searchUsers(
query: string,
limit: number = 10,
offset: number = 0
) {
return withConnection(async (client) => {
const result = await client.query(
`SELECT * FROM users
WHERE name ILIKE $1 OR email ILIKE $2
ORDER BY created_at DESC
LIMIT $3 OFFSET $4`,
[`%${query}%`, `%${query}%`, limit, offset]
);
return result.rows;
});
}
/**
* Example: Aggregate query
*/
export async function getUserStats() {
return withConnection(async (client) => {
const result = await client.query(`
SELECT
COUNT(*) as total_users,
COUNT(CASE WHEN created_at > NOW() - INTERVAL '30 days' THEN 1 END) as new_users_30d,
MIN(created_at) as oldest_user,
MAX(created_at) as newest_user
FROM users
`);
return result.rows[0];
});
}
/**
* Example: Join query
*/
export async function getUserWithProfile(userId: string) {
return withConnection(async (client) => {
const result = await client.query(
`SELECT u.*, p.bio, p.avatar_url
FROM users u
LEFT JOIN profiles p ON u.id = p.user_id
WHERE u.id = $1`,
[userId]
);
return result.rows[0] || null;
});
}
/**
* Example: Batch operations
*/
export async function createMultipleUsers(
users: Array<{ email: string; name: string }>
) {
const client = await pool.connect();
try {
await client.query('BEGIN');
const results = [];
for (const user of users) {
const result = await client.query(
`INSERT INTO users (email, name, created_at)
VALUES ($1, $2, NOW())
RETURNING id, email, name`,
[user.email, user.name]
);
results.push(result.rows[0]);
}
await client.query('COMMIT');
return results;
} catch (error) {
await client.query('ROLLBACK');
throw error;
} finally {
client.release();
}
}
/**
* Cleanup: Drain the pool when shutting down
*/
export async function closePool() {
await pool.end();
console.log('Connection pool closed');
}

View File

@@ -0,0 +1,89 @@
---
name: neon-toolkit
description: Creates and manages ephemeral Neon databases for testing, CI/CD pipelines, and isolated development environments. Use when building temporary databases for automated tests or rapid prototyping.
allowed-tools: ["bash"]
---
# Neon Toolkit Skill
Automates creation, management, and cleanup of temporary Neon databases using the Neon Toolkit.
## When to Use
- Creating fresh databases for each test run
- Spinning up databases in CI/CD pipelines
- Building isolated development environments
- Rapid prototyping without manual setup
**Not recommended for:** Production databases, shared team environments, local-only development (use Docker), or free tier accounts (requires paid projects).
## Code Generation Rules
When generating TypeScript/JavaScript code:
- BEFORE generating import statements, check tsconfig.json for path aliases (compilerOptions.paths)
- If path aliases exist (e.g., "@/*": ["./src/*"]), use them (e.g., import { x } from '@/lib/utils')
- If NO path aliases exist or unsure, ALWAYS use relative imports (e.g., import { x } from '../../../lib/utils')
- Verify imports match the project's configuration
- Default to relative imports - they always work regardless of configuration (see the example below)
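A quick illustration (module paths here are placeholders):
```typescript
// With a "@/*" path alias configured under compilerOptions.paths in tsconfig.json:
import { createUser } from '@/lib/db';

// Without aliases, or when unsure, fall back to relative imports:
// import { createUser } from '../../lib/db';
```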
## Reference Documentation
**Primary Resource:** See [neon-toolkit.mdc](https://raw.githubusercontent.com/neondatabase-labs/ai-rules/main/neon-toolkit.mdc) for comprehensive guidelines including:
- Core concepts (Organization, Project, Branch, Endpoint)
- Installation and authentication setup
- Database lifecycle management patterns
- API client usage examples
- Error handling strategies
## Quick Setup
### Installation
```bash
npm install @neondatabase/toolkit
```
### Basic Usage
```typescript
import { NeonToolkit } from '@neondatabase/toolkit';
const neon = new NeonToolkit({ apiKey: process.env.NEON_API_KEY! });
// Create ephemeral database
const db = await neon.createEphemeralDatabase();
console.log(`Database URL: ${db.url}`);
// Use the database...
// Cleanup
await db.delete();
```
## Templates & Scripts
- `templates/toolkit-workflow.ts` - Complete ephemeral database workflow
- `scripts/create-ephemeral-db.ts` - Create a temporary database
- `scripts/destroy-ephemeral-db.ts` - Clean up ephemeral database
## Common Use Cases
### Testing
```typescript
const db = await neon.createEphemeralDatabase();
// Run tests with fresh database
await db.delete();
```
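To guarantee cleanup even when a test throws, wrap the run in try/finally (a sketch following the usage above):
```typescript
const db = await neon.createEphemeralDatabase();
try {
  // run your test suite against db.url
} finally {
  await db.delete(); // runs even if the tests fail
}
```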
### CI/CD Integration
```bash
export NEON_API_KEY=${{ secrets.NEON_API_KEY }}  # e.g., injected from GitHub Actions secrets
npm test # Uses ephemeral database
```
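In a TypeScript test runner, one way to wire this up is a global setup/teardown pair (a sketch following the toolkit usage shown above; the hook names depend on your runner):
```typescript
import { NeonToolkit } from '@neondatabase/toolkit';

let db: { url: string; delete: () => Promise<void> };

// Runs once before the suite: create the ephemeral database and expose its URL.
export async function setup() {
  const neon = new NeonToolkit({ apiKey: process.env.NEON_API_KEY! });
  db = await neon.createEphemeralDatabase();
  process.env.DATABASE_URL = db.url;
}

// Runs once after the suite (including on failure): clean up the database.
export async function teardown() {
  await db.delete();
}
```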
## Related Skills
- **neon-serverless** - For connecting to databases
- **neon-drizzle** - For schema and migrations
---
**Want best practices in your project?** Run `neon-plugin:add-neon-docs` with parameter `SKILL_NAME="neon-toolkit"` to add reference links.

View File

@@ -0,0 +1,94 @@
/**
* Create Ephemeral Database Script
*
* Creates a temporary Neon database for testing or development.
* Run with: NEON_API_KEY=your_key npx ts-node create-ephemeral-db.ts
*
* Outputs the database connection string and saves it to a .env file.
*/
import { NeonToolkit } from '@neondatabase/toolkit';
import * as fs from 'fs';
import * as path from 'path';
const API_KEY = process.env.NEON_API_KEY;
if (!API_KEY) {
console.error('❌ NEON_API_KEY environment variable is not set');
console.error('\nSet it with:');
console.error(' export NEON_API_KEY=your_api_key');
process.exit(1);
}
async function createEphemeralDatabase() {
console.log('═══════════════════════════════════════════════════════');
console.log(' Neon Ephemeral Database Creator');
console.log('═══════════════════════════════════════════════════════\n');
try {
console.log('🔑 Initializing Neon Toolkit...');
const neon = new NeonToolkit({ apiKey: API_KEY });
console.log('📦 Creating ephemeral database...');
const db = await neon.createEphemeralDatabase();
console.log('\n✅ Ephemeral database created successfully!\n');
// Display database info
console.log('📊 Database Information:');
console.log('═══════════════════════════════════════════════════════');
console.log(`Connection String: ${db.url}`);
console.log(`Database Name: ${new URL(db.url).pathname.slice(1)}`);
console.log(`Host: ${new URL(db.url).hostname}`);
console.log('\n');
// Save to .env.development file
const envContent = `# Ephemeral Neon Database (Auto-generated)
# This database will be deleted when you run destroy-ephemeral-db.ts
DATABASE_URL="${db.url}"
`;
const envPath = path.join(process.cwd(), '.env.development');
fs.writeFileSync(envPath, envContent);
console.log(`📝 Saved to: ${envPath}`);
console.log('\n💡 Usage:');
console.log(' 1. Load environment: source .env.development');
console.log(' 2. Run your tests: npm test');
console.log(' 3. Cleanup: npx ts-node destroy-ephemeral-db.ts\n');
// Also print to console for CI/CD usage
console.log('🔗 For CI/CD, use this connection string:');
console.log(db.url);
console.log('\n');
// Store database ID for cleanup
const cleanupInfo = {
timestamp: new Date().toISOString(),
connectionUrl: db.url,
deleteCommand: 'npx ts-node destroy-ephemeral-db.ts',
};
const infoPath = path.join(process.cwd(), '.ephemeral-db-info.json');
fs.writeFileSync(infoPath, JSON.stringify(cleanupInfo, null, 2));
console.log(`📋 Database info saved to: ${infoPath}`);
console.log('═══════════════════════════════════════════════════════');
console.log('✅ Ready to use!\n');
return db.url;
} catch (error) {
console.error('❌ Failed to create ephemeral database');
console.error(`Error: ${(error as any).message}`);
console.log('\n💡 Troubleshooting:');
console.log(' • Check your NEON_API_KEY is valid');
console.log(' • Verify API key permissions in Neon console');
console.log(' • Check network connectivity');
console.log(' • Review Neon API status at https://status.neon.tech\n');
process.exit(1);
}
}
createEphemeralDatabase();

View File

@@ -0,0 +1,83 @@
/**
* Destroy Ephemeral Database Script
*
* Cleans up a temporary Neon database created with create-ephemeral-db.ts.
* Run with: NEON_API_KEY=your_key npx ts-node destroy-ephemeral-db.ts
*
* Removes the database and cleans up related files.
*/
import * as fs from 'fs';
import * as path from 'path';
const API_KEY = process.env.NEON_API_KEY;
if (!API_KEY) {
console.error('❌ NEON_API_KEY environment variable is not set');
console.error('\nSet it with:');
console.error(' export NEON_API_KEY=your_api_key');
process.exit(1);
}
async function destroyEphemeralDatabase() {
console.log('═══════════════════════════════════════════════════════');
console.log(' Neon Ephemeral Database Destroyer');
console.log('═══════════════════════════════════════════════════════\n');
try {
// Check for database info file
const infoPath = path.join(process.cwd(), '.ephemeral-db-info.json');
if (!fs.existsSync(infoPath)) {
console.warn('⚠️ No database info file found at: ' + infoPath);
console.log(' Run create-ephemeral-db.ts first to create a database.\n');
process.exit(1);
}
const dbInfo = JSON.parse(fs.readFileSync(infoPath, 'utf-8'));
console.log('🔍 Found database created at: ' + dbInfo.timestamp);
console.log(` Connection: ${new URL(dbInfo.connectionUrl).hostname}`);
console.log(' Database: ' + new URL(dbInfo.connectionUrl).pathname.slice(1));
console.log('');
console.log('🧹 Cleaning up...');
// Remove .env file if it exists
const envPath = path.join(process.cwd(), '.env.development');
if (fs.existsSync(envPath)) {
fs.unlinkSync(envPath);
console.log(' ✅ Removed .env.development');
}
// Remove info file
fs.unlinkSync(infoPath);
console.log(' ✅ Removed database info file');
console.log('\n✅ Cleanup complete!');
console.log(' (Database itself is ephemeral and auto-deletes)');
console.log('\n');
// Show next steps
console.log('💡 Next steps:');
console.log(' • To create a new database: npx ts-node create-ephemeral-db.ts');
console.log(' • To persist a database: Use Neon Console directly\n');
console.log('═══════════════════════════════════════════════════════');
} catch (error) {
console.error('❌ Error during cleanup');
console.error(`Error: ${(error as any).message}\n`);
console.log('💡 Manual cleanup:');
console.log(' 1. Remove .env.development');
console.log(' 2. Remove .ephemeral-db-info.json');
console.log(' 3. Ephemeral database auto-deletes\n');
process.exit(1);
}
}
// Note: In a real implementation, you might also delete the database via API:
// import { NeonToolkit } from '@neondatabase/toolkit';
// const neon = new NeonToolkit({ apiKey: API_KEY });
// await neon.deleteBranch(dbInfo.branchId);
destroyEphemeralDatabase();

View File

@@ -0,0 +1,237 @@
/**
* Neon Toolkit Workflow Example
*
* This demonstrates a complete workflow for creating, using, and cleaning up
* an ephemeral Neon database. Perfect for testing, CI/CD, and prototyping.
*/
import { NeonToolkit } from '@neondatabase/toolkit';
/**
* Main workflow function
*/
export async function ephemeralDatabaseWorkflow() {
const apiKey = process.env.NEON_API_KEY;
if (!apiKey) {
throw new Error('NEON_API_KEY environment variable is required');
}
// Initialize Neon Toolkit
const neon = new NeonToolkit({ apiKey });
console.log('🚀 Starting ephemeral database workflow...\n');
try {
// Step 1: Create ephemeral database
console.log('📦 Creating ephemeral database...');
const db = await neon.createEphemeralDatabase();
console.log(`✅ Database created!`);
console.log(` Connection string: ${db.url}\n`);
// Step 2: Setup schema
console.log('📝 Setting up schema...');
await db.query(`
CREATE TABLE IF NOT EXISTS users (
id SERIAL PRIMARY KEY,
email VARCHAR(255) UNIQUE NOT NULL,
name VARCHAR(255) NOT NULL,
created_at TIMESTAMP DEFAULT NOW()
)
`);
console.log('✅ Schema created\n');
// Step 3: Insert sample data
console.log('📤 Inserting sample data...');
const insertResult = await db.query(
`INSERT INTO users (email, name) VALUES
($1, $2), ($3, $4), ($5, $6)
RETURNING *`,
[
'alice@example.com',
'Alice',
'bob@example.com',
'Bob',
'charlie@example.com',
'Charlie',
]
);
console.log(`✅ Inserted ${insertResult.rows?.length || 0} users\n`);
// Step 4: Query data
console.log('🔍 Querying data...');
const selectResult = await db.query('SELECT * FROM users ORDER BY created_at');
console.log('✅ Users in database:');
selectResult.rows?.forEach((row: any) => {
      console.log(`   • ${row.name} (${row.email})`);
});
console.log('');
// Step 5: Run tests (example)
console.log('🧪 Running tests...');
const testResults = await runTests(db);
console.log(`${testResults.passed} tests passed, ${testResults.failed} failed\n`);
// Step 6: Cleanup
console.log('🧹 Cleaning up...');
await db.delete();
console.log('✅ Ephemeral database destroyed\n');
console.log('🎉 Workflow completed successfully!');
return true;
} catch (error) {
console.error('❌ Error during workflow:', error);
throw error;
}
}
/**
* Example test suite using ephemeral database
*/
async function runTests(db: any) {
const tests = [
{
name: 'User count should be 3',
async run() {
const result = await db.query('SELECT COUNT(*) as count FROM users');
        return Number(result.rows?.[0]?.count) === 3; // COUNT() is typically returned as a string (bigint)
},
},
{
name: 'Should find user by email',
async run() {
const result = await db.query(
"SELECT * FROM users WHERE email = $1",
['alice@example.com']
);
return result.rows?.[0]?.name === 'Alice';
},
},
{
name: 'Should insert new user',
async run() {
await db.query(
'INSERT INTO users (email, name) VALUES ($1, $2)',
['david@example.com', 'David']
);
const result = await db.query('SELECT COUNT(*) as count FROM users');
        return Number(result.rows?.[0]?.count) === 4;
},
},
{
name: 'Should update user',
async run() {
await db.query(
"UPDATE users SET name = $1 WHERE email = $2",
['Alice Updated', 'alice@example.com']
);
const result = await db.query(
"SELECT name FROM users WHERE email = $1",
['alice@example.com']
);
return result.rows?.[0]?.name === 'Alice Updated';
},
},
{
name: 'Should delete user',
async run() {
await db.query("DELETE FROM users WHERE email = $1", ['bob@example.com']);
const result = await db.query('SELECT COUNT(*) as count FROM users');
        return Number(result.rows?.[0]?.count) >= 3;
},
},
];
let passed = 0;
let failed = 0;
for (const test of tests) {
try {
const result = await test.run();
if (result) {
        console.log(`✅ ${test.name}`);
passed++;
} else {
        console.log(`❌ ${test.name}`);
failed++;
}
} catch (error) {
      console.log(`❌ ${test.name} (error)`);
failed++;
}
}
return { passed, failed };
}
/**
* Example: Using in CI/CD
* Run this in your CI/CD pipeline for isolated testing
*/
export async function cicdWorkflow() {
console.log('🔄 CI/CD Workflow\n');
const apiKey = process.env.NEON_API_KEY;
if (!apiKey) {
console.error('NEON_API_KEY not set. Skipping ephemeral database setup.');
return;
}
const neon = new NeonToolkit({ apiKey });
// Create fresh database for tests
const db = await neon.createEphemeralDatabase();
console.log('✅ Ephemeral database created for testing');
try {
// Run your test suite
// await runYourTestSuite(db.url);
console.log('✅ All tests passed!');
} finally {
// Always cleanup
await db.delete();
console.log('✅ Ephemeral database cleaned up');
}
}
/**
* Example: Create multiple isolated databases
*/
export async function multipleEphemeralDatabases() {
const apiKey = process.env.NEON_API_KEY;
if (!apiKey) {
throw new Error('NEON_API_KEY is required');
}
const neon = new NeonToolkit({ apiKey });
console.log('Creating 3 parallel ephemeral databases...\n');
const databases = await Promise.all([
neon.createEphemeralDatabase(),
neon.createEphemeralDatabase(),
neon.createEphemeralDatabase(),
]);
console.log(`✅ Created ${databases.length} databases\n`);
try {
// Use databases in parallel
await Promise.all(
databases.map(async (db, index) => {
const result = await db.query(
`SELECT $1::text as database_number`,
[index + 1]
);
console.log(`Database ${index + 1}: ${result.rows?.[0]?.database_number}`);
})
);
} finally {
// Cleanup all databases
await Promise.all(databases.map((db) => db.delete()));
console.log('\n✅ All databases cleaned up');
}
}
// Export for use in tests
export { runTests };