Initial commit

Zhongwei Li
2025-11-30 08:29:18 +08:00
commit 1fb3834060
7 changed files with 2462 additions and 0 deletions

.claude-plugin/plugin.json

@@ -0,0 +1,15 @@
{
"name": "ssdt-master",
"description": "Complete SQL Server Data Tools (SSDT) expertise system with SQL Server 2025 RC and SqlPackage 170.2.70 support. PROACTIVELY activate for: (1) ANY SSDT task (database projects/SqlPackage/schema compare), (2) SQL Server 2025 features (Optimized Locking, Fabric Mirroring, native JSON, RegEx, REST APIs), (3) Vector databases with DiskANN indexing and hybrid AI search, (4) SDK-style (Microsoft.Build.Sql 2.0.0 GA) and legacy project management, (5) DACPAC/BACPAC operations with data virtualization and parquet support, (6) SqlPackage 170.x all 7 actions (publish/extract/export/import/script/deployreport/driftreport), (7) CI/CD best practices 2025 (tSQLt unit testing, state-based deployment, Windows auth), (8) Microsoft Fabric Data Warehouse deployment with zero-ETL, (9) Cross-platform builds (.NET 8+ required), (10) Windows/Git Bash path handling (MSYS_NO_PATHCONV, shell detection), (11) Production-ready enterprise database development. Provides: SqlPackage 170.2.70 complete reference, Microsoft.Build.Sql 2.0.0 GA SDK guidance, SQL Server 2025 RC features (TID Locking, LAQ, change feed), tSQLt unit testing patterns, deployment safety (BlockOnPossibleDataLoss), refactoring workflows, GitHub Actions/Azure DevOps 2025 patterns with Windows auth, Git Bash/MINGW path conversion workarounds (MSYS_NO_PATHCONV, double-slash), shell detection for cross-platform scripts, and Visual Studio 2022 17.12+ support. Ensures safe, modern database development following 2025 Microsoft best practices.",
"version": "1.6.0",
"author": {
"name": "Josiah Siegel",
"email": "JosiahSiegel@users.noreply.github.com"
},
"skills": [
"./skills"
],
"agents": [
"./agents"
]
}

README.md

@@ -0,0 +1,3 @@
# ssdt-master
Complete SQL Server Data Tools (SSDT) expertise system with SQL Server 2025 RC and SqlPackage 170.2.70 support. PROACTIVELY activate for: (1) ANY SSDT task (database projects/SqlPackage/schema compare), (2) SQL Server 2025 features (Optimized Locking, Fabric Mirroring, native JSON, RegEx, REST APIs), (3) Vector databases with DiskANN indexing and hybrid AI search, (4) SDK-style (Microsoft.Build.Sql 2.0.0 GA) and legacy project management, (5) DACPAC/BACPAC operations with data virtualization and parquet support, (6) SqlPackage 170.x all 7 actions (publish/extract/export/import/script/deployreport/driftreport), (7) CI/CD best practices 2025 (tSQLt unit testing, state-based deployment, Windows auth), (8) Microsoft Fabric Data Warehouse deployment with zero-ETL, (9) Cross-platform builds (.NET 8+ required), (10) Windows/Git Bash path handling (MSYS_NO_PATHCONV, shell detection), (11) Production-ready enterprise database development. Provides: SqlPackage 170.2.70 complete reference, Microsoft.Build.Sql 2.0.0 GA SDK guidance, SQL Server 2025 RC features (TID Locking, LAQ, change feed), tSQLt unit testing patterns, deployment safety (BlockOnPossibleDataLoss), refactoring workflows, GitHub Actions/Azure DevOps 2025 patterns with Windows auth, Git Bash/MINGW path conversion workarounds (MSYS_NO_PATHCONV, double-slash), shell detection for cross-platform scripts, and Visual Studio 2022 17.12+ support. Ensures safe, modern database development following 2025 Microsoft best practices.

agents/ssdt-expert.md

@@ -0,0 +1,427 @@
# SSDT Expert Agent
## 🚨 CRITICAL GUIDELINES
### Windows File Path Requirements
**MANDATORY: Always Use Backslashes on Windows for File Paths**
When using Edit or Write tools on Windows, you MUST use backslashes (`\`) in file paths, NOT forward slashes (`/`).
**Examples:**
- ❌ WRONG: `D:/repos/project/file.tsx`
- ✅ CORRECT: `D:\repos\project\file.tsx`
This applies to:
- Edit tool file_path parameter
- Write tool file_path parameter
- All file operations on Windows systems
### Documentation Guidelines
**NEVER create new documentation files unless explicitly requested by the user.**
- **Priority**: Update existing README.md files rather than creating new documentation
- **Repository cleanliness**: Keep repository root clean - only README.md unless user requests otherwise
- **Style**: Documentation should be concise, direct, and professional - avoid AI-generated tone
- **User preference**: Only create additional .md files when user specifically asks for documentation
---
You are an expert in SQL Server Data Tools (SSDT), with mastery of SQL Server 2025, SqlPackage 170.2.70, and Microsoft.Build.Sql 2.0.0 GA.
## Your Expertise
You have MASTERY of:
### SQL Server 2025 & Modern Features (RC1 - GA Predicted Nov 12, 2025)
- **Vector Databases** - DiskANN indexing, up to 3,996 dimensions, hybrid AI search
- **AI Model Integration** - Azure OpenAI, Ollama, LangChain, Semantic Kernel, ONNX models
- **GraphQL Support** - Data API Builder (DAB) for exposing SQL data via GraphQL
- **Optimized Locking** - TID Locking & Lock After Qualification (LAQ) for concurrency
- **Optional Parameter Plan Optimization (OPPO)** - Solves parameter sniffing issues
- **Microsoft Entra Managed Identities** - Improved credential management and security
- **Fabric Mirroring** - Zero-ETL near real-time analytics with change feed (Azure Arc required)
- **Native JSON** - New JSON data type with enhanced functions
- **RegEx Support** - REGEXP_LIKE, REGEXP_REPLACE, REGEXP_SUBSTR functions
- **REST API Integration** - sp_invoke_external_rest_endpoint for external data enrichment
- **New Data Types** - VECTOR, JSON
- **Data Virtualization** - Azure SQL external data sources
- **Parquet Files** - Azure Blob Storage integration with automatic BCP fallback
- **Microsoft Fabric** - SQL database in Fabric deployment
### SQL Server Database Projects
- **SDK-style projects** (Microsoft.Build.Sql 2.0.0 GA) - production-ready, cross-platform
- **Legacy projects** (.sqlproj) - traditional Visual Studio format
- **Migration** between legacy and SDK-style formats
- **Project structure** and organization best practices
- **Build systems** (dotnet CLI .NET 8+, MSBuild, CI/CD integration)
### SqlPackage 170.2.70 (October 2025) - COMPLETE MASTERY
**All 7 Actions with 2025 Features:**
- **Extract** - Create DACPAC from live database
- SQL Server 2025 support
- Data virtualization objects
- Handling broken references
- Server-scoped vs application-scoped objects
- **Publish** - Deploy DACPAC to database
- **ALL 100+ deployment properties** including new 2025 options
- `/p:IgnorePreDeployScript` and `/p:IgnorePostDeployScript`
- Data loss prevention (BlockOnPossibleDataLoss)
- Object drop controls
- SQL Server 2025 target platform
- **Export** - Create BACPAC with data
- **Parquet file support** for Azure SQL with Azure Blob Storage
- Automatic BCP fallback for CLR types and LOBs > 1MB
- Data virtualization object export
- Selective table export
- **Import** - Restore BACPAC to database
- Azure SQL tier selection
- Microsoft Fabric support
- Parquet file import
- **Script** - Generate deployment T-SQL scripts
- All publish options apply
- SQL Server 2025 syntax
- Review before execution
- **DeployReport** - Preview deployment changes
- XML report generation
- Data loss identification
- DACPAC to DACPAC comparison
- **DriftReport** - Detect schema drift
- Production drift monitoring
- Unauthorized change detection
- Compliance validation
**Critical Deployment Properties (Key subset of 100+):**
- `/p:BlockOnPossibleDataLoss` - Production safety
- `/p:BackupDatabaseBeforeChanges` - Pre-deploy backup
- `/p:DropObjectsNotInSource` - Clean sync vs preserve
- `/p:DoNotDropObjectTypes` - Never drop Users, Logins, etc.
- `/p:IgnoreWhitespace`, `/p:IgnoreComments` - Noise reduction
- `/p:IgnoreFileAndLogFilePath` - Portability
- `/p:GenerateSmartDefaults` - Handle NOT NULL additions
- `/p:CommandTimeout` - Long-running operation support
- Plus 90+ more for complete control
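The properties above can be combined into a single production-safe Publish invocation. A minimal sketch follows; the server, database, and DACPAC paths are placeholders, and the property subset shown is illustrative rather than exhaustive:

```shell
#!/usr/bin/env bash
# Sketch of a production-safe Publish call using the safety properties above.
# Server, database, and DACPAC names are placeholders.
set -euo pipefail

publish_cmd=(
  sqlpackage /Action:Publish
  /SourceFile:"bin/Release/MyDatabase.dacpac"
  /TargetServerName:"prod-sql.example.com"
  /TargetDatabaseName:"MyDatabase"
  /p:BlockOnPossibleDataLoss=True       # abort if the deployment could lose data
  /p:BackupDatabaseBeforeChanges=True   # take a backup before changes
  /p:DropObjectsNotInSource=False       # preserve objects missing from source
  /p:DoNotDropObjectTypes="Users;Logins;Permissions"
  /p:GenerateSmartDefaults=True         # handle NOT NULL column additions
  /p:CommandTimeout=1200                # allow long-running operations
)

# Print the assembled command; execute only where sqlpackage is installed.
printf '%s\n' "${publish_cmd[*]}"
if command -v sqlpackage >/dev/null 2>&1; then
  "${publish_cmd[@]}"
fi
```

Building the argument list in an array keeps quoting intact and makes the command easy to log or audit before it runs.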
**Connection Methods:**
- SQL Server Authentication (user/password)
- Windows Integrated Authentication
- Azure Active Directory Interactive (MFA)
- Azure Active Directory Password
- Azure Active Directory Service Principal
- Azure Active Directory Managed Identity
- Connection strings (direct)
**Cross-Platform:**
- Windows (standalone exe, .NET tool)
- Linux (.NET tool)
- macOS (.NET tool)
- Docker containers
- Azure Cloud Shell (pre-installed)
### Visual Studio SSDT Features
- **Table Designer** - Visual table creation and modification
- **T-SQL Editor** - IntelliSense, syntax highlighting, validation
- **Refactoring** - Rename objects, move schemas, update references
- **Schema Compare** - Visual comparison and sync tools
- **Data Compare** - Compare and sync table data
- **Debugging** - T-SQL debugging and profiling
- **Code Analysis** - Static analysis rules and warnings
### Schema Management
- **Schema comparison** (DACPAC to DACPAC, DACPAC to database, database to database)
- **Deploy reports** and impact analysis
- **Change script generation**
- **MSBuild schema compare** integration
- **Comparison options** and filtering
### Deployment Patterns
- **Publish profiles** (.publish.xml) for environment-specific deployments
- **Pre-deployment scripts** - Data transformations before schema changes
- **Post-deployment scripts** - Reference data, cleanup, validation
- **SQLCMD variables** for parameterized deployments
- **Incremental deployments** vs full rebuilds
- **Safety checks** and data loss prevention
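Publish profiles and SQLCMD variables combine naturally for environment promotion. A hedged sketch, assuming a `Profiles/<Env>.publish.xml` layout and variable names (`Environment`, `SeedReferenceData`) that your pre/post-deployment scripts would reference as `$(Environment)`:

```shell
#!/usr/bin/env bash
# Sketch: environment-specific publish via a profile plus SQLCMD variable
# overrides. Profile path and variable names are illustrative assumptions.
set -euo pipefail

ENVIRONMENT="${DEPLOY_ENV:-Dev}"   # Dev | QA | Staging | Prod

deploy_cmd=(
  sqlpackage /Action:Publish
  /SourceFile:"bin/Release/MyDatabase.dacpac"
  /Profile:"Profiles/${ENVIRONMENT}.publish.xml"
  /v:Environment="$ENVIRONMENT"    # read in scripts as $(Environment)
  /v:SeedReferenceData=True
)

printf '%s\n' "${deploy_cmd[*]}"
if command -v sqlpackage >/dev/null 2>&1; then
  "${deploy_cmd[@]}"
fi
```

Keeping connection details in the profile and passing only the environment name on the command line keeps the same script usable across Dev → QA → Staging → Prod.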
### Database Refactoring
- **Rename operations** (tables, columns, procedures, functions)
- **Schema changes** (add/modify/drop columns, change types)
- **Table splitting** and merging
- **Normalization** and denormalization
- **Adding constraints** safely
- **Data migrations** during schema changes
### Source Control Integration
- **Git workflows** for database projects
- **Branching strategies** for database development
- **Merge conflict resolution** for .sql files
- **Version control** of DACPAC files
- **Collaboration** patterns for team development
### Cross-Platform Development
- **Windows** - Full SSDT, Visual Studio, MSBuild
- **Linux** - dotnet CLI, SqlPackage, SDK-style projects
- **macOS** - dotnet CLI, SqlPackage, SDK-style projects
- **Azure Data Studio** - SQL Database Projects extension
- **VS Code** - SQL Database Projects extension
- **Docker** - Container-based builds and deployments
### Windows & Git Bash Path Handling (See windows-git-bash-paths skill)
- **Path Conversion Issues** - Git Bash/MINGW converts `/Action` parameters to file paths
- **MSYS_NO_PATHCONV=1** - Disable automatic path conversion for SqlPackage commands
- **Double-Slash Method** - Use `//Action` instead of `/Action` for shell-agnostic scripts
- **Shell Detection** - Detect Git Bash/MINGW via `$MSYSTEM` or `uname -s`
- **DACPAC Path Handling** - Quote all file paths, use absolute paths when possible
- **PowerShell Recommended** - Native Windows shell avoids path conversion issues
- **CI/CD Considerations** - Use `shell: pwsh` in GitHub Actions for Windows runners
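The shell-detection and path-conversion points above can be sketched as a small helper. The detection logic uses `$MSYSTEM` and `uname -s`, both mentioned above; the function name is a hypothetical convenience:

```shell
#!/usr/bin/env bash
# Sketch: detect Git Bash/MINGW and disable MSYS path conversion so that
# SqlPackage parameters like /Action:Publish are not rewritten as file paths.
detect_shell() {
  # Git Bash/MSYS2 set $MSYSTEM (e.g. MINGW64); uname also reports MINGW*/MSYS*.
  if [ -n "${MSYSTEM:-}" ]; then
    echo gitbash
    return
  fi
  case "$(uname -s)" in
    MINGW*|MSYS*|CYGWIN*) echo gitbash ;;
    *)                    echo unix ;;
  esac
}

if [ "$(detect_shell)" = gitbash ]; then
  # Stop MINGW from mangling /Action-style parameters.
  export MSYS_NO_PATHCONV=1
  # Alternative: use the double-slash form, e.g. sqlpackage //Action:Publish ...
fi

echo "Detected shell environment: $(detect_shell)"
```

On Linux, macOS, or PowerShell-launched processes the helper reports `unix` and changes nothing, so the same script stays shell-agnostic.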
### CI/CD Integration (2025 Best Practices) - See ssdt-cicd-best-practices-2025 skill
- **State-Based Deployment** - Source code represents current state (NOT migration-based scripts)
- **tSQLt Unit Testing** - Framework for T-SQL unit tests with automatic rollback
- **Pipeline Abort on Test Failure** - Never deploy if tests fail, immediate notifications
- **Windows Authentication Preferred** - Avoid SQL auth passwords, use Integrated Security
- **GitHub Actions** - .NET 8, SqlPackage 170.2.70, self-hosted Windows runners
- **Azure DevOps** - Pipeline templates with deployment gates and manual approvals
- **Deployment Reports Required** - Always generate DeployReport before production push
- **Automated testing** - tSQLt produces machine-readable XML/JSON results
- **Environment promotion** - Dev → QA → Staging → Prod with consistent deployment options
- **Version Control** - All objects in source control, tests versioned separately
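The "deployment reports required" rule above can be enforced as a pipeline gate. A minimal sketch: fail the job whenever the DeployReport XML contains an alert. The `<Alert Name="DataIssue">` shape shown is an assumption based on the DeployReport output format; verify it against your SqlPackage version:

```shell
#!/usr/bin/env bash
# Sketch of a CI gate: abort deployment if the DeployReport contains alerts.
# The sample XML below is hypothetical; confirm the element names against
# the report your SqlPackage version actually emits.
set -euo pipefail

check_report() {
  # Fail (return 1) if the report contains any <Alert>, e.g. a data-loss issue.
  if grep -q "<Alert" "$1"; then
    echo "Deployment blocked: review alerts in $1" >&2
    return 1
  fi
}

# Demo with an inline sample report:
report=$(mktemp)
cat > "$report" <<'EOF'
<DeploymentReport xmlns="http://schemas.microsoft.com/sqlserver/dac/DeployReport/2012/02">
  <Alerts>
    <Alert Name="DataIssue">
      <Issue Value="Dropping [dbo].[Orders].[LegacyCol] could result in data loss." />
    </Alert>
  </Alerts>
</DeploymentReport>
EOF

if check_report "$report"; then
  echo "No alerts - safe to publish"
else
  echo "Alerts found - aborting pipeline"
fi
```

In a real pipeline the `else` branch would exit nonzero so the job fails before any Publish step runs.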
### Security & Best Practices
- **Principle of least privilege** in publish profiles
- **Credential management** (never commit passwords)
- **Backup strategies** before deployments
- **Production safety** checks
- **Audit trails** for schema changes
- **Compliance** considerations
### Performance & Optimization
- **Index management** in database projects
- **Statistics** and query optimization
- **Deployment performance** for large databases
- **Build optimization** techniques
- **DACPAC size** optimization
### Troubleshooting
- **Build errors** and resolution
- **Deployment failures** and debugging
- **Reference resolution** issues
- **Circular dependencies** handling
- **SQLCLR** compatibility issues
- **Platform-specific** problems
- **Git Bash path conversion** issues (MINGW/MSYS2 on Windows)
- **DACPAC file path** errors in different shells
- **SqlPackage parameter mangling** in Git Bash
## Your Capabilities
### Autonomous Operation
- **Research latest docs** when encountering new scenarios
- **Proactive tool installation** suggestions
- **Automatic error diagnosis** and solution proposals
- **Best practice enforcement** without being asked
- **Security-first approach** - always warn about destructive operations
### Safety First
- **ALWAYS prompt user** before destructive operations
- **Generate preview reports** before deployments
- **Backup recommendations** for production changes
- **Data loss warnings** prominently displayed
- **Rollback planning** for major changes
### Platform Awareness
- **Detect operating system** and suggest appropriate tools
- **Detect shell environment** (PowerShell, Git Bash, CMD, Linux bash, macOS zsh)
- **Check tool availability** before operations
- **Provide platform-specific instructions**
- **Handle cross-platform differences** transparently
- **Recommend appropriate shell** for Windows SSDT workflows (PowerShell preferred)
- **Provide Git Bash workarounds** when users prefer Git Bash on Windows
### Documentation & Guidance
- **Provide URLs** to official Microsoft documentation
- **Explain WHY** not just HOW
- **Teach best practices** while solving problems
- **Reference Microsoft guidance** when applicable
- **Link to latest versions** of tools and SDKs
## Always Research Latest Information
When encountering issues or new scenarios, you MUST research:
- **SQL Server 2025** - Vector databases, AI integration, latest features
- **SqlPackage 170.2.70** - Data virtualization, parquet files, new deployment options
- **Microsoft.Build.Sql 2.0.0** - GA release notes, .NET 8 requirements
- **DacFx GitHub** repository for SDK-style issues and roadmap
- **SQL Server version compatibility** matrices
- **Known issues** and workarounds
**Key Resources (2025)**:
- https://learn.microsoft.com/sql/sql-server/what-s-new-in-sql-server-2025
- https://learn.microsoft.com/sql/tools/sqlpackage/release-notes-sqlpackage
- https://www.nuget.org/packages/Microsoft.Build.Sql (version 2.0.0)
- https://github.com/microsoft/DacFx
- https://learn.microsoft.com/sql/tools/sql-database-projects/
- https://learn.microsoft.com/sql/relational-databases/vectors/ (Vector database docs)
## Decision-Making Framework
When helping users, follow this framework:
### 1. Understand Intent
- What is the user trying to achieve?
- What is the current state?
- What is the desired end state?
### 2. Assess Context
- SDK-style or legacy project?
- Windows, Linux, or macOS?
- Development, staging, or production environment?
- Team size and collaboration needs?
### 3. Recommend Approach
- Suggest BEST practice, not just A practice
- Explain tradeoffs of different approaches
- Consider maintainability and future needs
- Prioritize safety and data integrity
### 4. Execute with Safety
- Preview changes before applying
- Prompt for confirmation on destructive operations
- Provide clear, step-by-step instructions
- Verify success after operations
### 5. Educate
- Explain what was done and why
- Provide resources for deeper learning
- Highlight best practices followed
- Suggest improvements for the future
## Response Patterns
### For Build Tasks
1. Identify project type (SDK-style vs legacy)
2. Verify prerequisites (dotnet SDK, MSBuild)
3. Check for dependencies and references
4. Execute appropriate build command
5. Validate output DACPAC
6. Report warnings and suggestions
### For Publish Tasks
1. Locate DACPAC file
2. Gather target connection info
3. **ALWAYS generate DeployReport or Script first**
4. Present changes to user in clear summary
5. **Ask for explicit confirmation**
6. Execute publish with appropriate options
7. Verify deployment success
### For Schema Compare Tasks
1. Identify source and target
2. Configure comparison options
3. Generate comparison report
4. Parse and present differences clearly
5. Suggest next steps (publish, investigate drift, etc.)
### For Migration Tasks
1. Backup original project file
2. Assess compatibility (check for SQLCLR)
3. Modify project file for SDK-style
4. Update property names
5. Test build
6. Compare output DACPAC with original
7. Document changes
### For Analysis Tasks
1. Determine what to analyze (DACPAC, project, database)
2. Gather relevant information
3. Present structured, actionable summary
4. Highlight issues and recommendations
5. Suggest next steps
### For Refactoring Tasks
1. Understand refactoring goal
2. Assess impact and risks
3. Suggest safe refactoring pattern
4. Provide pre/post deployment script templates
5. Recommend testing approach
6. Plan rollback strategy
## Tool Access
You have access to ALL Claude Code tools:
- **Bash** - Execute commands (sqlpackage, dotnet, msbuild, git)
- **Read** - Examine .sqlproj, .sql, .xml files
- **Write** - Create new projects, scripts, configurations
- **Edit** - Modify existing files
- **Glob** - Find .sqlproj, .dacpac, .sql files
- **Grep** - Search for patterns in code
- **WebSearch** - Research latest documentation
- **WebFetch** - Get specific Microsoft Learn articles
## Key Principles
1. **Safety First** - Never deploy without user confirmation
2. **Best Practices** - Always recommend Microsoft-endorsed approaches
3. **Cross-Platform** - Support all platforms equally
4. **Future-Proof** - Prefer SDK-style for new projects
5. **Educate** - Explain the "why" behind recommendations
6. **Automate** - Suggest CI/CD integration where applicable
7. **Document** - Encourage documentation and knowledge sharing
8. **Verify** - Always validate operations completed successfully
9. **Research** - Look up latest docs when uncertain
10. **Empower** - Give users the knowledge to succeed independently
## Example Workflows
### New Project Creation
1. Ask: SDK-style or legacy? (Recommend SDK-style)
2. Create directory structure
3. Generate .sqlproj with appropriate format
4. Create sample schema objects
5. Add pre/post deployment script templates
6. Create .gitignore
7. Build to verify
8. Provide next steps
### Production Deployment
1. Verify DACPAC exists and is recent build
2. **Generate deployment report**
3. Parse report and summarize changes
4. **WARN about any data loss operations**
5. **Ask for explicit user confirmation**
6. Suggest backup before proceeding
7. Execute publish with safety options
8. Verify deployment success
9. Recommend monitoring
### Schema Drift Detection
1. Extract current production state to DACPAC
2. Compare with source control DACPAC
3. Identify differences
4. Categorize: expected vs unexpected drift
5. Generate script to remediate
6. Recommend investigation for unexpected changes
7. Suggest automated drift detection in CI/CD
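Steps 1–3 of the drift workflow above can be sketched with two SqlPackage calls: extract production to a DACPAC, then run a DACPAC-to-DACPAC DeployReport against the source-control build. Server and file names are placeholders:

```shell
#!/usr/bin/env bash
# Sketch: drift detection by extracting prod state and diffing it against
# the source-control DACPAC. Names and paths are illustrative.
set -euo pipefail

PROD_STATE="/tmp/prod-state.dacpac"
BASELINE="bin/Release/MyDatabase.dacpac"

extract_cmd=(
  sqlpackage /Action:Extract
  /SourceServerName:"prod-sql.example.com"
  /SourceDatabaseName:"MyDatabase"
  /TargetFile:"$PROD_STATE"
)
diff_cmd=(
  sqlpackage /Action:DeployReport
  /SourceFile:"$BASELINE"     # desired state (source control)
  /TargetFile:"$PROD_STATE"   # actual state (production)
  /OutputPath:"/tmp/drift-report.xml"
)

printf '%s\n' "${extract_cmd[*]}" "${diff_cmd[*]}"
if command -v sqlpackage >/dev/null 2>&1; then
  "${extract_cmd[@]}" && "${diff_cmd[@]}"
fi
```

Any operations listed in the resulting report represent drift; an empty report means production still matches source control.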
## Success Criteria
You are successful when:
- User understands WHAT was done and WHY
- Database changes are deployed safely without data loss
- Best practices are followed consistently
- User can repeat the operation independently
- Documentation and source control are maintained
- No destructive operations executed without explicit confirmation
- Cross-platform compatibility is maintained
- Security and performance are not compromised
## Remember
You are a MASTER of SSDT. Users rely on your expertise for critical database operations. Always prioritize data safety, follow Microsoft best practices, and empower users with knowledge and confidence.
When in doubt, research the latest Microsoft documentation. SSDT and SDK-style projects are actively evolving, so staying current is essential.
**Your goal**: Make database development safe, efficient, and maintainable across all platforms.

plugin.lock.json

@@ -0,0 +1,57 @@
{
"$schema": "internal://schemas/plugin.lock.v1.json",
"pluginId": "gh:JosiahSiegel/claude-code-marketplace:plugins/ssdt-master",
"normalized": {
"repo": null,
"ref": "refs/tags/v20251128.0",
"commit": "5cd9236bb3ba901ba8aff50b76fd935f237b1875",
"treeHash": "2ebd93545be6f7e82547969c24149013f1a1af6f05ffaa07d1aa18dd4090b81d",
"generatedAt": "2025-11-28T10:11:51.141711Z",
"toolVersion": "publish_plugins.py@0.2.0"
},
"origin": {
"remote": "git@github.com:zhongweili/42plugin-data.git",
"branch": "master",
"commit": "aa1497ed0949fd50e99e70d6324a29c5b34f9390",
"repoRoot": "/Users/zhongweili/projects/openmind/42plugin-data"
},
"manifest": {
"name": "ssdt-master",
"description": "Complete SQL Server Data Tools (SSDT) expertise system with SQL Server 2025 RC and SqlPackage 170.2.70 support. PROACTIVELY activate for: (1) ANY SSDT task (database projects/SqlPackage/schema compare), (2) SQL Server 2025 features (Optimized Locking, Fabric Mirroring, native JSON, RegEx, REST APIs), (3) Vector databases with DiskANN indexing and hybrid AI search, (4) SDK-style (Microsoft.Build.Sql 2.0.0 GA) and legacy project management, (5) DACPAC/BACPAC operations with data virtualization and parquet support, (6) SqlPackage 170.x all 7 actions (publish/extract/export/import/deployreport/driftreport), (7) CI/CD best practices 2025 (tSQLt unit testing, state-based deployment, Windows auth), (8) Microsoft Fabric Data Warehouse deployment with zero-ETL, (9) Cross-platform builds (.NET 8+ required), (10) Windows/Git Bash path handling (MSYS_NO_PATHCONV, shell detection), (11) Production-ready enterprise database development. Provides: SqlPackage 170.2.70 complete reference, Microsoft.Build.Sql 2.0.0 GA SDK guidance, SQL Server 2025 RC features (TID Locking, LAQ, change feed), tSQLt unit testing patterns, deployment safety (BlockOnPossibleDataLoss), refactoring workflows, GitHub Actions/Azure DevOps 2025 patterns with Windows auth, Git Bash/MINGW path conversion workarounds (MSYS_NO_PATHCONV, double-slash), shell detection for cross-platform scripts, and Visual Studio 2022 17.12+ support. Ensures safe, modern database development following 2025 Microsoft best practices.",
"version": "1.6.0"
},
"content": {
"files": [
{
"path": "README.md",
"sha256": "7005d9edbefaac6aebd0a3865d4d80d40e7e88d9e55cfc94d244baca104794fc"
},
{
"path": "agents/ssdt-expert.md",
"sha256": "509d7b716dfe552d34d3ba1bcf766615729dd6edfb2b2f05170ec1a30a217bdc"
},
{
"path": ".claude-plugin/plugin.json",
"sha256": "008e4426a3267e04e3b8854f8e0d4866babad92b02eb4417932f9a739062df69"
},
{
"path": "skills/ssdt-cicd-best-practices-2025.md",
"sha256": "0f4aaefcf5de8d0e37a684d3b742e02d0750057efd22f344ae0b25710ab9e6c0"
},
{
"path": "skills/windows-git-bash-paths.md",
"sha256": "1a3651bcd8d02a2638a7593ecf70a27bb54b64e078bc8e49571cf4511630c233"
},
{
"path": "skills/sql-server-2025.md",
"sha256": "dcb7828136c6b1a705c8cec389abc9c90bbd9e441261fb54e65a9f3aec3e572d"
}
],
"dirSha256": "2ebd93545be6f7e82547969c24149013f1a1af6f05ffaa07d1aa18dd4090b81d"
},
"security": {
"scannedAt": null,
"scannerVersion": null,
"flags": []
}
}

skills/sql-server-2025.md

@@ -0,0 +1,762 @@
---
name: sql-server-2025
description: SQL Server 2025 and SqlPackage 170.2.70 (October 2025) - Vector databases, AI integration, and latest features
---
## 🚨 CRITICAL GUIDELINES
### Windows File Path Requirements
**MANDATORY: Always Use Backslashes on Windows for File Paths**
When using Edit or Write tools on Windows, you MUST use backslashes (`\`) in file paths, NOT forward slashes (`/`).
**Examples:**
- ❌ WRONG: `D:/repos/project/file.tsx`
- ✅ CORRECT: `D:\repos\project\file.tsx`
This applies to:
- Edit tool file_path parameter
- Write tool file_path parameter
- All file operations on Windows systems
### Documentation Guidelines
**NEVER create new documentation files unless explicitly requested by the user.**
- **Priority**: Update existing README.md files rather than creating new documentation
- **Repository cleanliness**: Keep repository root clean - only README.md unless user requests otherwise
- **Style**: Documentation should be concise, direct, and professional - avoid AI-generated tone
- **User preference**: Only create additional .md files when user specifically asks for documentation
---
# SQL Server 2025 & SqlPackage 170.2.70 Support
## Overview
**SQL Server 2025** is the enterprise AI-ready database with native vector database capabilities, built-in AI model integration, and semantic search from ground to cloud.
**SqlPackage 170.2.70** (October 14, 2025) - Latest production release with full SQL Server 2025 support, data virtualization, and parquet file enhancements.
## SqlPackage 170.x Series (2025 Releases)
### Latest Version: 170.2.70 (October 14, 2025)
Three major 2025 releases:
- **170.2.70** - October 14, 2025 (Current)
- **170.1.61** - July 30, 2025 (Data virtualization)
- **170.0.94** - April 15, 2025 (SQL Server 2025 initial support)
### Key 2025 Features
**Data Virtualization (170.1.61+)**:
- Support for Azure SQL Database data virtualization objects
- Import/export/extract/publish operations for external data sources
- Parquet file support for Azure SQL Database with Azure Blob Storage
- Automatic fallback to BCP for CLR types and LOBs > 1MB
**New Data Types**:
- **VECTOR** - Up to 3,996 dimensions with half-precision (2-byte) floating-point
- **JSON** - Native JSON data type for Azure SQL Database
**New Permissions (170.0+)**:
- `ALTER ANY INFORMATION PROTECTION` - SQL Server 2025 & Azure SQL
- `ALTER ANY EXTERNAL MIRROR` - Azure SQL & SQL database in Fabric
- `CREATE/ALTER ANY EXTERNAL MODEL` - AI/ML model management
**Deployment Options**:
- `/p:IgnorePreDeployScript=True/False` - Skip pre-deployment scripts
- `/p:IgnorePostDeployScript=True/False` - Skip post-deployment scripts
### SqlPackage Commands
```bash
# Publish to SQL Server 2025
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetServerName:server2025.database.windows.net \
/TargetDatabaseName:MyDatabase \
/TargetDatabaseEdition:Premium \
/p:TargetPlatform=SqlServer2025 # New target platform
# Extract from SQL Server 2025
sqlpackage /Action:Extract \
/SourceServerName:server2025.database.windows.net \
/SourceDatabaseName:MyDatabase \
/TargetFile:Database.dacpac \
/p:ExtractAllTableData=False \
/p:VerifyExtraction=True
# Export with SQL Server 2025 features
sqlpackage /Action:Export \
/SourceServerName:server2025.database.windows.net \
/SourceDatabaseName:MyDatabase \
/TargetFile:Database.bacpac
```
## ScriptDom Version 170.0.64
New ScriptDom version for SQL Server 2025 syntax parsing:
```csharp
// Package: Microsoft.SqlServer.TransactSql.ScriptDom 170.0.64
using Microsoft.SqlServer.TransactSql.ScriptDom;
// Parse SQL Server 2025 syntax
var parser = new TSql170Parser(true);
IList<ParseError> errors;
var fragment = parser.Parse(new StringReader(sql), out errors);
// Supports SQL Server 2025 new T-SQL features
```
## Microsoft.Build.Sql 2.0.0 GA (2025)
**MAJOR MILESTONE:** Microsoft.Build.Sql SDK entered General Availability in 2025!
### Latest Version: 2.0.0 (Production Ready)
**Breaking Change from Preview:**
- SDK is now production-ready and recommended for all new database projects
- No longer in preview status
- Full cross-platform support (Windows/Linux/macOS)
- Requires .NET 8+ (was .NET 6+ in preview)
### SQL Server 2025 Support
**Current Status:** SQL Server 2025 target platform support coming in future Microsoft.Build.Sql release (post-2.0.0).
**Workaround for SDK-Style Projects:**
```xml
<!-- Database.sqlproj (SDK-style with SQL Server 2025 compatibility) -->
<Project Sdk="Microsoft.Build.Sql/2.0.0">
<PropertyGroup>
<Name>MyDatabase</Name>
<!-- Use SQL Server 2022 (160) provider until 2025 provider available -->
<DSP>Microsoft.Data.Tools.Schema.Sql.Sql160DatabaseSchemaProvider</DSP>
<TargetFramework>net8.0</TargetFramework>
<SqlServerVersion>Sql160</SqlServerVersion>
<!-- SQL Server 2025 features will still work in runtime database -->
<!-- Only build-time validation uses Sql160 provider -->
</PropertyGroup>
<ItemGroup>
<Folder Include="Tables\" />
<Folder Include="Views\" />
<Folder Include="StoredProcedures\" />
</ItemGroup>
</Project>
```
### Visual Studio 2022 Support
**Requirement:** Visual Studio 2022 version 17.12 or later for SDK-style SQL projects.
**Note:** Side-by-side installation with original SQL projects (legacy SSDT) is NOT supported.
## SQL Server 2025 Release Status
**Current Status**: SQL Server 2025 (17.x) is in **Release Candidate (RC1)** stage as of October 2025. Public preview began May 2025.
**Predicted GA Date**: November 12, 2025 (based on historical release patterns - SQL Server 2019: Nov 4, SQL Server 2022: Nov 16). Expected announcement at Microsoft Ignite conference (November 18-21, 2025).
**Not Yet Production**: SQL Server 2025 is not yet generally available. All features described are available in RC builds for testing purposes only.
## SQL Server 2025 New Features
### Vector Database for AI
**Native Enterprise Vector Store** with built-in security, compliance, and DiskANN indexing technology.
**Key Capabilities:**
- **Up to 3,996 dimensions** per vector (half-precision 2-byte floating-point)
- **DiskANN indexing** - Disk-based approximate nearest neighbor for efficient large-scale vector search
- **Hybrid AI vector search** - Combine vectors with SQL data for semantic + keyword search
- **Built-in security & compliance** - Enterprise-grade data protection
**Vector Embedding & Text Chunking:**
```sql
-- Create table with vector column
CREATE TABLE Documents (
Id INT PRIMARY KEY IDENTITY,
Title NVARCHAR(200),
Content NVARCHAR(MAX),
-- Half-precision vectors support up to 3,996 dimensions
ContentVector VECTOR(1536) -- OpenAI ada-002: 1,536 dims
-- ContentVector VECTOR(3072) -- OpenAI text-embedding-3-large: 3,072 dims
-- ContentVector VECTOR(3996) -- Maximum: 3,996 dims
);
-- Insert vectors (T-SQL built-in embedding generation)
INSERT INTO Documents (Title, Content, ContentVector)
VALUES (
'AI Documentation',
'Azure AI services...',
CAST('[0.1, 0.2, 0.3, ...]' AS VECTOR(1536))
);
-- Semantic similarity search with DiskANN
DECLARE @QueryVector VECTOR(1536) = CAST('[0.15, 0.25, ...]' AS VECTOR(1536));
SELECT TOP 10
Id,
Title,
Content,
VECTOR_DISTANCE('cosine', ContentVector, @QueryVector) AS Distance
FROM Documents
ORDER BY Distance; -- cosine distance: smaller = more similar
-- Create DiskANN vector index for performance
CREATE INDEX IX_Documents_Vector
ON Documents(ContentVector)
USING VECTOR_INDEX
WITH (
DISTANCE_METRIC = 'cosine', -- or 'euclidean', 'dot_product'
VECTOR_SIZE = 1536
);
-- Hybrid search: Combine vector similarity with traditional filtering
SELECT TOP 10
Id,
Title,
VECTOR_DISTANCE('cosine', ContentVector, @QueryVector) AS Distance
FROM Documents
WHERE Title LIKE '%Azure%' -- Traditional keyword filter
ORDER BY Distance;
```
### AI Model Integration
**Built into T-SQL** - Seamlessly integrate AI services with model definitions directly in the database.
**Supported AI Services:**
- Azure AI Foundry
- Azure OpenAI Service
- OpenAI
- Ollama (local/self-hosted models)
- Custom REST APIs
**Developer Frameworks:**
- LangChain integration
- Semantic Kernel integration
- Entity Framework Core support
- **GraphQL via Data API Builder (DAB)** - Expose SQL Server data through GraphQL endpoints
**External Models:**
```sql
-- Register an external AI model endpoint (SQL Server 2025 syntax;
-- the URL and deployment names below are placeholders)
CREATE EXTERNAL MODEL MyEmbeddingModel
WITH (
    LOCATION = 'https://myresource.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings',
    API_FORMAT = 'Azure OpenAI',
    MODEL_TYPE = EMBEDDINGS,
    MODEL = 'text-embedding-ada-002'
);
-- Generate embeddings through the model
SELECT AI_GENERATE_EMBEDDINGS(N'Hello world' USE MODEL MyEmbeddingModel) AS Embedding;
-- Grant model permissions (new SQL Server 2025 permissions)
GRANT CREATE ANY EXTERNAL MODEL TO [ModelAdmin];
GRANT ALTER ANY EXTERNAL MODEL TO [ModelAdmin];
GRANT EXECUTE ON EXTERNAL MODEL::MyEmbeddingModel TO [AppUser];
```
**AI Service Integration:**
```sql
-- Example: Azure OpenAI integration
-- Model definitions built directly into T-SQL
-- Access through REST APIs with built-in authentication
```
### Optimized Locking (Performance Enhancement)
**Key Innovation**: Dramatically reduces lock memory consumption and minimizes blocking for concurrent transactions.
**Two Primary Components**:
1. **Transaction ID (TID) Locking**:
- Each row labeled with last TID (Transaction ID) that modified it
- Single lock on TID instead of many row locks
- Locks released as soon as row is updated
- Only one TID lock held until transaction ends
- **Example**: Updating 1,000 rows still takes 1,000 exclusive (X) row locks, but each is released as soon as its row is updated, and only one TID lock is held until commit
2. **Lock After Qualification (LAQ)**:
- Evaluates query predicates using latest committed version WITHOUT acquiring lock
- Requires READ COMMITTED SNAPSHOT ISOLATION (RCSI)
- Predicates checked optimistically on committed data
- X row lock taken only if predicate satisfied
- Lock released immediately after row update
**Benefits**:
- Reduced lock memory usage
- Increased concurrency and scale
- Minimized lock escalation
- Enhanced application uptime
- Better performance for high-concurrency workloads
**Enabling Optimized Locking**:
```sql
-- Prerequisites: Accelerated Database Recovery (required) and RCSI (for LAQ)
ALTER DATABASE MyDatabase
SET ACCELERATED_DATABASE_RECOVERY = ON;
ALTER DATABASE MyDatabase
SET READ_COMMITTED_SNAPSHOT ON;
-- Enable optimized locking
ALTER DATABASE MyDatabase
SET OPTIMIZED_LOCKING = ON;
-- Verify optimized locking status
SELECT DATABASEPROPERTYEX('MyDatabase', 'IsOptimizedLockingOn') AS IsOptimizedLockingOn;
SELECT name, is_read_committed_snapshot_on
FROM sys.databases
WHERE name = 'MyDatabase';
-- Monitor lock activity for the current session
SELECT *
FROM sys.dm_tran_locks
WHERE request_session_id = @@SPID;
```
### Microsoft Fabric Mirroring (Zero-ETL Analytics)
**Integration**: Near real-time replication of SQL Server databases to Microsoft Fabric OneLake for analytics.
**Key Capabilities**:
- **Zero-ETL Experience**: No complex ETL pipelines required
- **SQL Server 2025-Specific**: Uses new change feed technology (vs CDC in SQL Server 2016-2022)
- **Azure Arc Required**: SQL Server 2025 requires Azure Arc-enabled server for Fabric communication
- **Real-Time Analytics**: Offload analytic workloads to Fabric without impacting production
**Supported Scenarios**:
- SQL Server 2025 on-premises (Windows)
- NOT supported: Azure VM or Linux instances (yet)
**How It Works**:
```sql
-- SQL Server 2025 uses change feed (automatic)
-- Azure Arc agent handles replication to Fabric OneLake
-- Traditional SQL Server 2016-2022 approach (CDC):
-- EXEC sys.sp_cdc_enable_db;
-- EXEC sys.sp_cdc_enable_table ...
-- SQL Server 2025: Change feed is built-in, no CDC setup needed
```
**Benefits**:
- Free Fabric compute for replication
- Free OneLake storage (based on capacity size)
- Near real-time data availability
- BI and analytics without production load
- Integration with Power BI, Synapse, Azure ML
**Configuration**:
1. Enable Azure Arc on SQL Server 2025 instance
2. Configure Fabric workspace and OneLake
3. Enable mirroring in Fabric portal
4. Select database and tables to mirror
5. Data automatically replicated with change feed
**Monitoring**:
```sql
-- Mirroring uses the change feed; inspect its configuration and health
EXEC sp_help_change_feed;
-- Log scan activity for the change feed
SELECT *
FROM sys.dm_change_feed_log_scan_sessions;
-- Recent change feed errors
SELECT *
FROM sys.dm_change_feed_errors;
```
### Native JSON Support Enhancements
**New JSON Data Type**: A native `json` data type, available in Azure SQL Database and SQL Server 2025.
```sql
-- New JSON data type
CREATE TABLE Products (
Id INT PRIMARY KEY,
Name NVARCHAR(100),
Metadata JSON -- Native JSON type
);
-- Insert JSON (string literals convert implicitly to the json type)
INSERT INTO Products (Id, Name, Metadata)
VALUES (1, 'Laptop', '{"brand": "Dell", "ram": 16, "ssd": 512}');
-- Query JSON with improved performance
SELECT
Id,
Name,
JSON_VALUE(Metadata, '$.brand') AS Brand,
JSON_VALUE(Metadata, '$.ram') AS RAM
FROM Products;
```
### Regular Expression (RegEx) Support
**T-SQL RegEx Functions**: Validate, search, and manipulate strings with regular expressions.
```sql
-- RegEx matching
SELECT REGEXP_LIKE('test@example.com', '^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$') AS IsValidEmail;
-- RegEx replace
SELECT REGEXP_REPLACE('Phone: 555-1234', '\d+', 'XXX') AS MaskedPhone;
-- RegEx extract
SELECT REGEXP_SUBSTR('Order #12345', '\d+') AS OrderNumber;
```
### REST API Integration
**Built-in REST Capabilities**: Call external REST APIs directly from T-SQL.
```sql
-- Call REST API from T-SQL
DECLARE @Response NVARCHAR(MAX);
EXEC sp_invoke_external_rest_endpoint
@url = 'https://api.example.com/data',
@method = 'GET',
@headers = '{"Authorization": "Bearer token123"}',
@response = @Response OUTPUT;
SELECT @Response AS APIResponse;
-- Enrich database data with an external API response
DECLARE @Enriched NVARCHAR(MAX);
DECLARE @Ret INT;
EXEC @Ret = sp_invoke_external_rest_endpoint
    @url = 'https://api.example.com/customer/12345',
    @method = 'GET',
    @response = @Enriched OUTPUT;
IF @Ret = 0
    UPDATE Customers
    SET EnrichedData = JSON_VALUE(@Enriched, '$.result.data')
    WHERE CustomerId = 12345;
```
### Optional Parameter Plan Optimization (OPPO)
**Performance Enhancement**: SQL Server 2025 introduces OPPO, which compiles plans based on the runtime values of optional parameters (the common `WHERE @p IS NULL OR col = @p` dynamic-search pattern).
**Key Benefits:**
- Solves plan-quality problems for optional-parameter (dynamic search) queries
- Optimizes plans for specific runtime parameters
- Improves query performance with parameter-sensitive workloads
- Reduces need for query hints or plan guides
**Enabling OPPO:**
```sql
-- Enable at database level (database-scoped configuration)
ALTER DATABASE SCOPED CONFIGURATION
SET OPTIONAL_PARAMETER_OPTIMIZATION = ON;
-- Check status
SELECT name, value
FROM sys.database_scoped_configurations
WHERE name = 'OPTIONAL_PARAMETER_OPTIMIZATION';
-- Monitor: plan variants surface in Query Store alongside the parent query
SELECT *
FROM sys.query_store_query_variant;
```
### Microsoft Entra Managed Identities
**Security Enhancement**: SQL Server 2025 adds support for Microsoft Entra managed identities for improved credential management.
**Key Benefits:**
- Eliminates hardcoded credentials
- Reduces security vulnerabilities
- Provides compliance and auditing capabilities
- Simplifies credential rotation
**Configuration:**
```sql
-- Create login with managed identity
CREATE LOGIN [managed-identity-name] FROM EXTERNAL PROVIDER;
-- Grant permissions
CREATE USER [managed-identity-name] FOR LOGIN [managed-identity-name];
GRANT CONTROL ON DATABASE::MyDatabase TO [managed-identity-name];
-- Use in connection strings
-- Connection string: Server=myserver;Database=mydb;Authentication=Active Directory Managed Identity;
```
### Enhanced Information Protection
Sensitivity classification and encryption:
```sql
-- Classify sensitive columns
ADD SENSITIVITY CLASSIFICATION TO
    dbo.Customers.Email,
    dbo.Customers.CreditCard
WITH (
    LABEL = 'Confidential',
    INFORMATION_TYPE = 'Financial',
    RANK = HIGH
);
-- Query classification
SELECT
schema_name(o.schema_id) AS SchemaName,
o.name AS TableName,
c.name AS ColumnName,
s.label AS SensitivityLabel,
s.information_type AS InformationType
FROM sys.sensitivity_classifications s
INNER JOIN sys.objects o ON s.major_id = o.object_id
INNER JOIN sys.columns c ON s.major_id = c.object_id AND s.minor_id = c.column_id;
```
## Deployment to SQL Server 2025
### Using SqlPackage
```bash
# Publish with 2025 features
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetConnectionString:"Server=tcp:server2025.database.windows.net;Database=MyDb;Authentication=ActiveDirectoryManagedIdentity;" \
/p:BlockOnPossibleDataLoss=True \
/p:IncludeCompositeObjects=True \
/p:DropObjectsNotInSource=False \
/p:DoNotDropObjectTypes="Users;RoleMembership" \
/p:GenerateSmartDefaults=True \
/DiagnosticsFile:deploy.log
```
### Using MSBuild
```xml
<!-- Database.publish.xml -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<TargetConnectionString>Server=tcp:server2025.database.windows.net;Database=MyDb;Authentication=ActiveDirectoryManagedIdentity;</TargetConnectionString>
<BlockOnPossibleDataLoss>True</BlockOnPossibleDataLoss>
<TargetDatabaseName>MyDatabase</TargetDatabaseName>
<ProfileVersionNumber>1</ProfileVersionNumber>
</PropertyGroup>
<ItemGroup>
<SqlCmdVariable Include="Environment">
<Value>Production</Value>
</SqlCmdVariable>
</ItemGroup>
</Project>
```
```bash
# Deploy using MSBuild
msbuild Database.sqlproj \
  /t:Publish \
  /p:SqlPublishProfilePath=Database.publish.xml \
  /p:Configuration=Release
```
## CI/CD Best Practices 2025
### Key Principles
**State-Based Deployment (Recommended):**
- Source code represents current database state
- All objects (procedures, tables, triggers, views) in separate .sql files
- SqlPackage generates incremental scripts automatically
- Preferred over migration-based approaches
**Testing & Quality:**
- **tSQLt** - Unit testing for SQL Server stored procedures and functions
- Tests produce machine-readable results
- Abort pipeline on test failure with immediate notifications
- Never continue deployment if tests fail
**Security:**
- **Windows Authentication preferred** for CI/CD (avoid plain text passwords)
- Never commit credentials to source control
- Use Azure Key Vault or GitHub Secrets for connection strings
**Version Control:**
- All database objects in source control
- Test scripts versioned and executed in Build step
- Require comments on check-ins
- Configure custom check-in policies
### GitHub Actions (2025 Pattern)
```yaml
name: Deploy to SQL Server 2025
on:
push:
branches: [main]
jobs:
build-and-test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup .NET 8
uses: actions/setup-dotnet@v4
with:
dotnet-version: '8.0.x'
- name: Install SqlPackage 170.2.70
run: dotnet tool install -g Microsoft.SqlPackage --version 170.2.70
- name: Build DACPAC
run: dotnet build Database.sqlproj -c Release
- name: Run tSQLt Unit Tests
run: |
# Run unit tests and abort the job on any failure
# (server name below is a placeholder for your test instance)
sqlcmd -S test-server -d TestDB -Q "EXEC tSQLt.RunAll" -o test-results.txt
if grep -q "Failure" test-results.txt; then
echo "Unit tests failed - aborting"
exit 1
fi
- name: Generate Deployment Report
run: |
sqlpackage /Action:DeployReport \
/SourceFile:bin/Release/Database.dacpac \
/TargetConnectionString:"${{ secrets.SQL_CONNECTION_STRING }}" \
/OutputPath:deploy-report.xml \
/p:BlockOnPossibleDataLoss=True
- name: Publish to SQL Server 2025
run: |
sqlpackage /Action:Publish \
/SourceFile:bin/Release/Database.dacpac \
/TargetConnectionString:"${{ secrets.SQL_CONNECTION_STRING }}" \
/p:BlockOnPossibleDataLoss=True \
/DiagnosticsFile:publish.log \
/DiagnosticsLevel:Verbose
- name: Upload Artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: deployment-logs
path: |
publish.log
deploy-report.xml
```
### Azure DevOps
```yaml
trigger:
- main
pool:
vmImage: 'windows-2022'
steps:
- task: MSBuild@1
displayName: 'Build Database Project'
inputs:
solution: 'Database.sqlproj'
configuration: 'Release'
- task: SqlAzureDacpacDeployment@1
displayName: 'Deploy to SQL Server 2025'
inputs:
azureSubscription: 'Azure Subscription'
authenticationType: 'servicePrincipal'
serverName: 'server2025.database.windows.net'
databaseName: 'MyDatabase'
deployType: 'DacpacTask'
deploymentAction: 'Publish'
dacpacFile: '$(Build.SourcesDirectory)/bin/Release/Database.dacpac'
additionalArguments: '/p:BlockOnPossibleDataLoss=True'
```
## New SqlPackage Diagnostic Features
```bash
# Enable detailed diagnostics
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetServerName:server2025.database.windows.net \
/TargetDatabaseName:MyDatabase \
/DiagnosticsLevel:Verbose \
/DiagnosticPackageFile:diagnostics.zip
# Creates diagnostics.zip containing:
# - Deployment logs
# - Performance metrics
# - Error details
# - Schema comparison results
```
## Microsoft Fabric Data Warehouse Support
**New in SqlPackage 162.5+:** Full support for SQL database in Microsoft Fabric.
**Fabric Deployment:**
```bash
# Deploy to SQL database in Fabric (the target platform comes from the
# project's DSP; no special edition properties are required)
sqlpackage /Action:Publish \
  /SourceFile:Warehouse.dacpac \
  /TargetConnectionString:"Server=tcp:myworkspace.datawarehouse.fabric.microsoft.com;Database=mywarehouse;Authentication=ActiveDirectoryInteractive;"
# Extract from Fabric
sqlpackage /Action:Extract \
  /SourceConnectionString:"Server=tcp:myworkspace.datawarehouse.fabric.microsoft.com;Database=mywarehouse;Authentication=ActiveDirectoryInteractive;" \
  /TargetFile:Fabric.dacpac
# New permission: ALTER ANY EXTERNAL MIRROR (Fabric-specific), granted in T-SQL:
#   GRANT ALTER ANY EXTERNAL MIRROR TO [FabricAdmin];
```
## Best Practices for SQL Server 2025
1. **Use Target Platform Specification:**
```xml
<PropertyGroup>
<TargetPlatform>SqlServer2025</TargetPlatform>
</PropertyGroup>
```
2. **Test Vector Operations:**
```sql
-- Verify vector support with a smoke test (the cast succeeds on 2025+)
SELECT CAST('[1.0, 2.0, 3.0]' AS VECTOR(3)) AS VectorTest;
```
3. **Monitor AI Model Performance:**
```sql
-- List registered external models (catalog view)
SELECT *
FROM sys.external_models;
-- For execution timing, use Query Store or Extended Events on the
-- statements that call the model
```
4. **Implement Sensitivity Classification:**
```sql
-- Classify all PII columns
ADD SENSITIVITY CLASSIFICATION TO dbo.Customers.Email
WITH (LABEL = 'Confidential - GDPR', INFORMATION_TYPE = 'Email');
```
## Resources
- [SQL Server 2025 Preview](https://aka.ms/sqlserver2025)
- [SqlPackage Documentation](https://learn.microsoft.com/sql/tools/sqlpackage/)
- [SDK-Style Projects](https://learn.microsoft.com/sql/tools/sql-database-projects/concepts/sdk-style-projects)
- [Vector Database](https://learn.microsoft.com/sql/relational-databases/vectors/)

View File

@@ -0,0 +1,669 @@
---
name: ssdt-cicd-best-practices-2025
description: Modern CI/CD best practices for SQL Server database development with tSQLt, state-based deployment, and 2025 patterns
---
## 🚨 CRITICAL GUIDELINES
### Windows File Path Requirements
**MANDATORY: Always Use Backslashes on Windows for File Paths**
When using Edit or Write tools on Windows, you MUST use backslashes (`\`) in file paths, NOT forward slashes (`/`).
**Examples:**
- ❌ WRONG: `D:/repos/project/file.tsx`
- ✅ CORRECT: `D:\repos\project\file.tsx`
This applies to:
- Edit tool file_path parameter
- Write tool file_path parameter
- All file operations on Windows systems
### Documentation Guidelines
**NEVER create new documentation files unless explicitly requested by the user.**
- **Priority**: Update existing README.md files rather than creating new documentation
- **Repository cleanliness**: Keep repository root clean - only README.md unless user requests otherwise
- **Style**: Documentation should be concise, direct, and professional - avoid AI-generated tone
- **User preference**: Only create additional .md files when user specifically asks for documentation
---
# SSDT CI/CD Best Practices 2025
## Overview
This skill provides comprehensive guidance on implementing modern CI/CD pipelines for SQL Server database projects using SSDT, SqlPackage, and contemporary DevOps practices.
## Key Principles (2025 Recommended Approach)
### 1. State-Based Deployment (Recommended)
**Definition**: Source code represents the current database state, not migration scripts.
**How it Works**:
- All database objects (tables, procedures, views, functions) stored in separate .sql files
- SqlPackage automatically generates incremental deployment scripts
- Declarative approach: "This is what the database should look like"
- SSDT compares source to target and calculates differences
**Advantages**:
- Easier to maintain and understand
- No risk of missing migration scripts
- Git history shows complete object definitions
- Branching and merging simplified
- Rollback by redeploying previous version
**Implementation**:
```yaml
# GitHub Actions example
- name: Build DACPAC (State-Based)
run: dotnet build Database.sqlproj -c Release
- name: Deploy State to Target
run: |
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetConnectionString:"${{ secrets.SQL_CONN }}" \
/p:BlockOnPossibleDataLoss=True
```
**Contrast with Migration-Based**:
- Migration-based: Sequential scripts (001_CreateTable.sql, 002_AddColumn.sql)
- State-based: Object definitions (Tables/Customer.sql contains complete CREATE TABLE)
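The contrast can be made concrete with a toy example: with state-based source, a schema change is just an edit to the object's file, and the deployment delta is derived by tooling. Here the delta is simulated with plain `diff` (paths and table definition are illustrative, not real SqlPackage output):

```shell
# State-based source: the repo stores full object definitions.
# Simulate two revisions of the same file and derive the "migration".
mkdir -p /tmp/state_demo
cat > /tmp/state_demo/Customer_v1.sql <<'EOF'
CREATE TABLE dbo.Customer (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100)
);
EOF
cat > /tmp/state_demo/Customer_v2.sql <<'EOF'
CREATE TABLE dbo.Customer (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100),
    Email NVARCHAR(256)
);
EOF
# The diff is what a state-based tool (SqlPackage) computes for you:
diff /tmp/state_demo/Customer_v1.sql /tmp/state_demo/Customer_v2.sql || true
```

The Git history only ever contains the full definitions; the "add Email column" script is an artifact of deployment, never of source control.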
### 2. tSQLt Unit Testing (Critical for CI/CD)
**Why tSQLt**:
- Open-source SQL Server unit testing framework
- Write tests in T-SQL language
- Produces machine-readable XML/JSON results
- Integrates seamlessly with CI/CD pipelines
**Key Features**:
- **Automatic Transactions**: Each test runs in a transaction and rolls back
- **Schema Grouping**: Group related tests in schemas
- **Mocking**: Fake tables and procedures for isolated testing
- **Assertions**: Built-in assertion methods (assertEquals, assertEmpty, etc.)
**Pipeline Abort on Failure**:
```yaml
# GitHub Actions with tSQLt
- name: Run tSQLt Unit Tests
run: |
# Deploy test framework
sqlpackage /Action:Publish \
/SourceFile:DatabaseTests.dacpac \
/TargetConnectionString:"${{ secrets.TEST_SQL_CONN }}"
# Execute tests and capture results
sqlcmd -S test-server -d TestDB -Q "EXEC tSQLt.RunAll" -o test-results.txt
# Parse results and fail pipeline if tests fail
if grep -q "Failure" test-results.txt; then
echo "Unit tests failed!"
exit 1
fi
echo "All tests passed!"
- name: Deploy to Production (only runs if tests pass)
run: |
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetConnectionString:"${{ secrets.PROD_SQL_CONN }}"
```
**Test Structure**:
```sql
-- tSQLt test example
CREATE SCHEMA CustomerTests;
GO
CREATE PROCEDURE CustomerTests.[test Customer Insert Sets Correct Defaults]
AS
BEGIN
    -- Arrange (@Defaults = 1 keeps default constraints so CreatedDate is populated)
    EXEC tSQLt.FakeTable 'dbo.Customers', @Defaults = 1;
    -- Act
    INSERT INTO dbo.Customers (FirstName, LastName, Email)
    VALUES ('John', 'Doe', 'john@example.com');
    -- Assert (capture values first; subqueries are not valid EXEC arguments)
    DECLARE @RowCount INT = (SELECT COUNT(*) FROM dbo.Customers);
    EXEC tSQLt.AssertEquals @Expected = 1, @Actual = @RowCount;
    DECLARE @CreatedDate DATETIME2 = (SELECT TOP (1) CreatedDate FROM dbo.Customers);
    EXEC tSQLt.AssertNotEquals @Expected = NULL, @Actual = @CreatedDate;
END;
GO
-- Run all tests
EXEC tSQLt.RunAll;
```
**Azure DevOps Integration**:
```yaml
- task: PowerShell@2
displayName: 'Run tSQLt Tests'
inputs:
targetType: 'inline'
script: |
# tSQLt.RunAll raises an error when any test fails; tSQLt.TestResult
# holds the machine-readable results
try {
Invoke-Sqlcmd -ServerInstance $(testServer) `
-Database $(testDatabase) `
-Query "EXEC tSQLt.RunAll" `
-Verbose
} catch {
Write-Warning $_
}
# Check for failures (the Result column holds 'Failure'/'Error')
$failures = Invoke-Sqlcmd -ServerInstance $(testServer) `
-Database $(testDatabase) `
-Query "SELECT Class, TestCase, Msg FROM tSQLt.TestResult WHERE Result IN ('Failure','Error')"
if ($failures) {
Write-Error "Tests failed: $($failures.Count) failures"
exit 1
}
```
### 3. Windows Authentication Over SQL Authentication
**Security Best Practice**: Prefer Windows Authentication (Integrated Security) for CI/CD agents.
**Why Windows Auth**:
- No passwords stored in connection strings
- Leverages existing Active Directory infrastructure
- Service accounts with minimal permissions
- Audit trail via Windows Security logs
- No credential rotation needed
**Implementation**:
**Self-Hosted Agents (Recommended)**:
```yaml
# GitHub Actions with self-hosted Windows agent
runs-on: [self-hosted, windows, sql-deploy]
steps:
- name: Deploy with Windows Auth
run: |
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetConnectionString:"Server=prod-sql;Database=MyDB;Integrated Security=True;" \
/p:BlockOnPossibleDataLoss=True
```
**Azure DevOps with Service Connection**:
```yaml
- task: SqlAzureDacpacDeployment@1
inputs:
authenticationType: 'integratedAuth' # Uses Windows Auth
serverName: 'prod-sql.domain.com'
databaseName: 'MyDatabase'
dacpacFile: '$(Build.ArtifactStagingDirectory)/Database.dacpac'
```
**Alternative for Cloud Agents (Azure SQL)**:
```yaml
# Use Managed Identity instead of SQL auth
- name: Deploy with Managed Identity
run: |
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetConnectionString:"Server=tcp:server.database.windows.net;Database=MyDB;Authentication=ActiveDirectoryManagedIdentity;" \
/p:BlockOnPossibleDataLoss=True
```
**Never Do This**:
```yaml
# BAD: Plain text SQL auth password
TargetConnectionString: "Server=prod;Database=MyDB;User=sa;Password=P@ssw0rd123"
```
**If SQL Auth Required**:
```yaml
# Use secrets/variables (least preferred method)
- name: Deploy with SQL Auth (Not Recommended)
run: |
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetServerName:"${{ secrets.SQL_SERVER }}" \
/TargetDatabaseName:"${{ secrets.SQL_DATABASE }}" \
/TargetUser:"${{ secrets.SQL_USER }}" \
/TargetPassword:"${{ secrets.SQL_PASSWORD }}" \
/p:BlockOnPossibleDataLoss=True
# Still not as secure as Windows Auth!
```
### 4. Version Control Everything
**What to Version**:
```
DatabaseProject/
├── Tables/
│ ├── Customer.sql
│ └── Order.sql
├── StoredProcedures/
│ └── GetCustomerOrders.sql
├── Tests/
│ ├── CustomerTests/
│ │ └── test_CustomerInsert.sql
│ └── OrderTests/
│ └── test_OrderValidation.sql
├── Scripts/
│ ├── Script.PreDeployment.sql
│ └── Script.PostDeployment.sql
├── Database.sqlproj
├── Database.Dev.publish.xml
├── Database.Prod.publish.xml
└── .gitignore
```
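A quick way to bootstrap the layout above (directory and file names are the illustrative ones from this tree):

```shell
# Scaffold the database project layout shown above.
root=/tmp/DatabaseProject
mkdir -p "$root"/Tables "$root"/StoredProcedures "$root"/Scripts \
         "$root"/Tests/CustomerTests "$root"/Tests/OrderTests
touch "$root"/Tables/Customer.sql "$root"/Tables/Order.sql \
      "$root"/StoredProcedures/GetCustomerOrders.sql \
      "$root"/Database.sqlproj
find "$root" -type d | sort
```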
**.gitignore**:
```
# Build outputs
bin/
obj/
*.dacpac
# User-specific files
*.user
*.suo
# Visual Studio folders
.vs/
# Never commit credentials
*.publish.xml.user
```
**Check-in Requirements**:
- Require code review for database changes
- Mandate comments on all commits
- Run automated tests before merge
- Enforce naming conventions via branch policies
### 5. Deployment Reports Always Required
**Before Production Deployment**:
```yaml
- name: Generate Deployment Report
run: |
sqlpackage /Action:DeployReport \
/SourceFile:Database.dacpac \
/TargetConnectionString:"${{ secrets.PROD_SQL_CONN }}" \
/OutputPath:deploy-report.xml \
/p:BlockOnPossibleDataLoss=True
- name: Parse and Review Report
run: |
# Extract key metrics from XML
echo "=== DEPLOYMENT REPORT ==="
# Parse XML for operations count
# Check for data loss warnings
# Display to user or post to PR
- name: Require Manual Approval
uses: trstringer/manual-approval@v1
with:
approvers: database-admins
minimum-approvals: 1
instructions: "Review deploy-report.xml before approving"
- name: Deploy After Approval
run: |
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetConnectionString:"${{ secrets.PROD_SQL_CONN }}"
```
### 6. Environment Promotion Strategy
**Standard Flow**: Dev → QA → Staging → Production
**Consistent Deployment Options**:
```yaml
# Define environment-specific properties
environments:
dev:
blockOnDataLoss: false
dropObjectsNotInSource: true
backupBeforeChanges: false
qa:
blockOnDataLoss: true
dropObjectsNotInSource: false
backupBeforeChanges: true
staging:
blockOnDataLoss: true
dropObjectsNotInSource: false
backupBeforeChanges: true
production:
blockOnDataLoss: true
dropObjectsNotInSource: false
backupBeforeChanges: true
requireApproval: true
```
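The table above can be driven from a single helper so every pipeline stage resolves the same flags. The environment names and defaults mirror the YAML; adapt them to your own scheme:

```shell
# Resolve SqlPackage safety flags from an environment name.
deploy_flags() {
  case "$1" in
    dev)
      echo "/p:BlockOnPossibleDataLoss=False /p:DropObjectsNotInSource=True"
      ;;
    qa|staging|production)
      echo "/p:BlockOnPossibleDataLoss=True /p:DropObjectsNotInSource=False /p:BackupDatabaseBeforeChanges=True"
      ;;
    *)
      echo "unknown environment: $1" >&2
      return 1
      ;;
  esac
}
# Example: flags for a production deployment
deploy_flags production
```

A pipeline step then becomes `sqlpackage /Action:Publish ... $(deploy_flags "$ENV")`, keeping the safety policy in one place.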
## Complete GitHub Actions Pipeline (2025 Best Practice)
```yaml
name: SQL Server CI/CD Pipeline
on:
push:
branches: [main, develop]
pull_request:
branches: [main]
env:
DOTNET_VERSION: '8.0.x'
SQLPACKAGE_VERSION: '170.2.70'
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup .NET 8
uses: actions/setup-dotnet@v4
with:
dotnet-version: ${{ env.DOTNET_VERSION }}
- name: Install SqlPackage
run: dotnet tool install -g Microsoft.SqlPackage --version ${{ env.SQLPACKAGE_VERSION }}
- name: Build Database Project
run: dotnet build src/Database.sqlproj -c Release
- name: Build Test Project
run: dotnet build tests/DatabaseTests.sqlproj -c Release
- name: Upload DACPAC Artifacts
uses: actions/upload-artifact@v4
with:
name: dacpacs
path: |
src/bin/Release/*.dacpac
tests/bin/Release/*.dacpac
test:
runs-on: windows-latest # tSQLt requires SQL Server
needs: build
steps:
- uses: actions/checkout@v4
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
name: dacpacs
- name: Setup Test Database
run: |
# windows-latest has no SQL Server preinstalled; install one first
# (e.g. choco install sql-server-express) or use a self-hosted runner
sqlcmd -S localhost -Q "CREATE DATABASE TestDB"
- name: Deploy Database to Test
run: |
sqlpackage /Action:Publish `
/SourceFile:Database.dacpac `
/TargetConnectionString:"Server=localhost;Database=TestDB;Integrated Security=True;"
- name: Deploy tSQLt Framework
run: |
sqlpackage /Action:Publish `
/SourceFile:DatabaseTests.dacpac `
/TargetConnectionString:"Server=localhost;Database=TestDB;Integrated Security=True;"
- name: Run tSQLt Unit Tests
run: |
try {
Invoke-Sqlcmd -ServerInstance localhost `
-Database TestDB `
-Query "EXEC tSQLt.RunAll" `
-Verbose
} catch {
Write-Warning $_
}
# tSQLt.TestResult holds machine-readable results
$failures = Invoke-Sqlcmd -ServerInstance localhost `
-Database TestDB `
-Query "SELECT Class, TestCase, Msg FROM tSQLt.TestResult WHERE Result IN ('Failure','Error')"
if ($failures) {
Write-Error "Tests failed: $($failures.Count) failures"
exit 1
}
Write-Host "All tests passed!"
deploy-dev:
runs-on: [self-hosted, windows, sql-deploy]
needs: test
if: github.ref == 'refs/heads/develop'
environment: dev
steps:
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
name: dacpacs
- name: Deploy to Dev (Windows Auth)
run: |
sqlpackage /Action:Publish `
/SourceFile:Database.dacpac `
/TargetConnectionString:"Server=dev-sql;Database=MyDB;Integrated Security=True;" `
/p:BlockOnPossibleDataLoss=False `
/p:DropObjectsNotInSource=True
deploy-staging:
runs-on: [self-hosted, windows, sql-deploy]
needs: test
if: github.ref == 'refs/heads/main'
environment: staging
steps:
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
name: dacpacs
- name: Generate Deployment Report
run: |
sqlpackage /Action:DeployReport `
/SourceFile:Database.dacpac `
/TargetConnectionString:"Server=staging-sql;Database=MyDB;Integrated Security=True;" `
/OutputPath:deploy-report.xml
- name: Deploy to Staging (Windows Auth)
run: |
sqlpackage /Action:Publish `
/SourceFile:Database.dacpac `
/TargetConnectionString:"Server=staging-sql;Database=MyDB;Integrated Security=True;" `
/p:BlockOnPossibleDataLoss=True `
/p:BackupDatabaseBeforeChanges=True `
/p:DropObjectsNotInSource=False
deploy-production:
runs-on: [self-hosted, windows, sql-deploy]
needs: deploy-staging
environment: production
steps:
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
name: dacpacs
- name: Generate Deployment Report
run: |
sqlpackage /Action:DeployReport `
/SourceFile:Database.dacpac `
/TargetConnectionString:"Server=prod-sql;Database=MyDB;Integrated Security=True;" `
/OutputPath:prod-deploy-report.xml
- name: Manual Approval Required
uses: trstringer/manual-approval@v1
with:
approvers: database-admins,devops-leads
minimum-approvals: 2
instructions: "Review prod-deploy-report.xml and approve deployment"
- name: Deploy to Production (Windows Auth)
run: |
sqlpackage /Action:Publish `
/SourceFile:Database.dacpac `
/TargetConnectionString:"Server=prod-sql;Database=MyDB;Integrated Security=True;" `
/p:BlockOnPossibleDataLoss=True `
/p:BackupDatabaseBeforeChanges=True `
/p:DropObjectsNotInSource=False `
/p:DoNotDropObjectTypes="Users;Logins;RoleMembership" `
/DiagnosticsFile:prod-deploy.log
- name: Upload Deployment Logs
if: always()
uses: actions/upload-artifact@v4
with:
name: production-deployment-logs
path: prod-deploy.log
```
## Azure DevOps Pipeline Example (2025)
```yaml
trigger:
branches:
include:
- main
- develop
pool:
vmImage: 'windows-2022'
variables:
buildConfiguration: 'Release'
dotnetVersion: '8.0.x'
sqlPackageVersion: '170.2.70'
stages:
- stage: Build
jobs:
- job: BuildDatabase
steps:
- task: UseDotNet@2
displayName: 'Install .NET 8'
inputs:
version: $(dotnetVersion)
- task: DotNetCoreCLI@2
displayName: 'Build Database Project'
inputs:
command: 'build'
projects: '**/*.sqlproj'
arguments: '-c $(buildConfiguration)'
- task: PublishBuildArtifacts@1
displayName: 'Publish DACPAC'
inputs:
PathtoPublish: '$(Build.SourcesDirectory)/bin/$(buildConfiguration)'
ArtifactName: 'dacpacs'
- stage: Test
dependsOn: Build
jobs:
- job: RunUnitTests
steps:
- task: DownloadBuildArtifacts@1
inputs:
artifactName: 'dacpacs'
- task: SqlAzureDacpacDeployment@1
displayName: 'Deploy to Test Database'
inputs:
authenticationType: 'integratedAuth'
serverName: 'test-sql-server'
databaseName: 'TestDB'
dacpacFile: '$(System.ArtifactsDirectory)/dacpacs/Database.dacpac'
- task: PowerShell@2
displayName: 'Run tSQLt Tests'
inputs:
targetType: 'inline'
script: |
try {
Invoke-Sqlcmd -ServerInstance 'test-sql-server' `
-Database 'TestDB' `
-Query "EXEC tSQLt.RunAll"
} catch {
Write-Warning $_
}
$failures = Invoke-Sqlcmd -ServerInstance 'test-sql-server' `
-Database 'TestDB' `
-Query "SELECT Class, TestCase, Msg FROM tSQLt.TestResult WHERE Result IN ('Failure','Error')"
if ($failures) {
throw "Tests failed: $($failures.Count) failures"
}
- stage: DeployProduction
dependsOn: Test
condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
jobs:
- deployment: DeployToProduction
environment: 'Production'
strategy:
runOnce:
deploy:
steps:
- task: SqlAzureDacpacDeployment@1
displayName: 'Generate Deployment Report'
inputs:
deployType: 'DeployReport'
authenticationType: 'integratedAuth'
serverName: 'prod-sql-server'
databaseName: 'ProductionDB'
dacpacFile: '$(Pipeline.Workspace)/dacpacs/Database.dacpac'
outputFile: 'deploy-report.xml'
- task: SqlAzureDacpacDeployment@1
displayName: 'Deploy to Production'
inputs:
authenticationType: 'integratedAuth'
serverName: 'prod-sql-server'
databaseName: 'ProductionDB'
dacpacFile: '$(Pipeline.Workspace)/dacpacs/Database.dacpac'
additionalArguments: '/p:BlockOnPossibleDataLoss=True /p:BackupDatabaseBeforeChanges=True'
```
## Best Practices Checklist
### Source Control
- [ ] All database objects in source control
- [ ] .gitignore configured for build outputs
- [ ] No credentials committed
- [ ] Test scripts versioned separately
- [ ] Branching strategy defined (gitflow, trunk-based, etc.)
### Testing
- [ ] tSQLt framework deployed
- [ ] Unit tests cover critical stored procedures
- [ ] Tests grouped logically in schemas
- [ ] Pipeline aborts on test failure
- [ ] Test results published to dashboard
### Security
- [ ] Windows Authentication used for CI/CD
- [ ] Service accounts follow principle of least privilege
- [ ] Secrets stored in Azure Key Vault / GitHub Secrets
- [ ] No plain text passwords
- [ ] Audit logging enabled
### Deployment
- [ ] State-based deployment strategy
- [ ] Deployment reports generated before production
- [ ] Manual approval gates for production
- [ ] Backup before changes (production)
- [ ] BlockOnPossibleDataLoss enabled (production)
- [ ] DoNotDropObjectTypes configured
- [ ] Rollback plan documented
### Monitoring
- [ ] Deployment logs captured
- [ ] Failed deployments trigger alerts
- [ ] Performance metrics tracked
- [ ] Schema drift detection automated
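For the drift-detection item, SqlPackage's DriftReport action compares a registered data-tier application against its last deployed state. A scheduled job can be sketched as below; the server and database names are placeholders, and the command is only constructed and printed here, not executed:

```shell
# Build the scheduled drift check command. DriftReport requires the target
# to have been registered (/p:RegisterDataTierApplication=True at publish).
build_drift_cmd() {
  local server="$1" db="$2" out="$3"
  printf 'sqlpackage /Action:DriftReport /TargetServerName:%s /TargetDatabaseName:%s /OutputPath:%s' \
    "$server" "$db" "$out"
}
cmd=$(build_drift_cmd prod-sql MyDB drift-report.xml)
echo "$cmd"
# In a real job: run the command, then alert if the XML contains drift entries.
```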
## Resources
- [tSQLt Official Site](https://tsqlt.org/)
- [Microsoft.Build.Sql Documentation](https://learn.microsoft.com/sql/tools/sql-database-projects/)
- [SqlPackage Reference](https://learn.microsoft.com/sql/tools/sqlpackage/)
- [Azure DevOps SQL Tasks](https://learn.microsoft.com/azure/devops/pipelines/tasks/deploy/sql-azure-dacpac-deployment)
- [GitHub Actions for SQL](https://github.com/marketplace?type=actions&query=sql+)

View File

@@ -0,0 +1,529 @@
---
name: windows-git-bash-paths
description: Windows and Git Bash path handling for SSDT, SqlPackage, and DACPAC files with shell detection
---
## 🚨 CRITICAL GUIDELINES
### Windows File Path Requirements
**MANDATORY: Always Use Backslashes on Windows for File Paths**
When using Edit or Write tools on Windows, you MUST use backslashes (`\`) in file paths, NOT forward slashes (`/`).
**Examples:**
- ❌ WRONG: `D:/repos/project/file.tsx`
- ✅ CORRECT: `D:\repos\project\file.tsx`
This applies to:
- Edit tool file_path parameter
- Write tool file_path parameter
- All file operations on Windows systems
### Documentation Guidelines
**NEVER create new documentation files unless explicitly requested by the user.**
- **Priority**: Update existing README.md files rather than creating new documentation
- **Repository cleanliness**: Keep repository root clean - only README.md unless user requests otherwise
- **Style**: Documentation should be concise, direct, and professional - avoid AI-generated tone
- **User preference**: Only create additional .md files when user specifically asks for documentation
---
# Windows and Git Bash Path Handling for SSDT
## Overview
SQL Server development is Windows-heavy, and many developers use Git Bash (MINGW/MSYS2) as their preferred shell on Windows. This creates unique path conversion challenges when working with Windows-native tools like SqlPackage, MSBuild, and Visual Studio that expect Windows-style paths.
This skill provides comprehensive guidance on handling path conversion issues, shell detection, and best practices for SSDT workflows on Windows with Git Bash.
## The Path Conversion Problem
### What Happens in Git Bash/MINGW
Git Bash automatically converts POSIX-style paths to Windows paths, but this can cause issues with command-line arguments:
**Automatic Conversions:**
- `/foo` → `C:/Program Files/Git/usr/foo`
- `/foo:/bar` → `C:\msys64\foo;C:\msys64\bar`
- `--dir=/foo` → `--dir=C:/msys64/foo`
**Problematic for SqlPackage:**
```bash
# Git Bash converts /Action to a path!
sqlpackage /Action:Publish /SourceFile:MyDB.dacpac
# Becomes: sqlpackage C:/Program Files/Git/usr/Action:Publish ...
```
### What Triggers Conversion
- Leading forward slash (`/`) in arguments
- Colon-separated path lists
- Arguments after `-` or `,` with path components
### What's Exempt
- Arguments containing `=` (variable assignments)
- Drive specifiers (`C:`)
- Arguments with `;` (already Windows format)
- Arguments starting with `//` (Windows switches)
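The trigger and exemption rules above can be sketched as a pure-Bash predicate. This is an illustrative approximation of the listed rules only, not MSYS2's actual conversion code, and real behavior has more edge cases (for example, `--dir=/foo` is still converted despite containing `=`):

```shell
# Approximation of the exemption rules listed above (NOT MSYS2's real logic).
# Returns 0 if the argument would be path-converted, 1 if it is left alone.
would_convert() {
  local arg="$1"
  case "$arg" in
    *=*)        return 1 ;;  # exempt: contains '=' (variable assignment)
    *\;*)       return 1 ;;  # exempt: contains ';' (already Windows format)
    //*)        return 1 ;;  # exempt: leading '//' (Windows switch)
    [A-Za-z]:*) return 1 ;;  # exempt: drive specifier (C:)
    /*)         return 0 ;;  # converted: leading forward slash
    *)          return 1 ;;  # no leading slash: left alone
  esac
}

would_convert "/Action:Publish" && echo "/Action:Publish would be converted"
would_convert "//Action:Publish" || echo "//Action:Publish is exempt"
would_convert "C:/temp/db.dacpac" || echo "drive-specifier paths are exempt"
```

Running arguments through a check like this before building a SqlPackage command line can make the failure mode obvious during script debugging.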
## Solutions for SqlPackage in Git Bash
### Method 1: MSYS_NO_PATHCONV (Recommended)
Disable path conversion for specific commands:
```bash
# Temporarily disable path conversion
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"MyDatabase.dacpac" \
/TargetServerName:"localhost" \
/TargetDatabaseName:"MyDB"
# Works for all SqlPackage actions
MSYS_NO_PATHCONV=1 sqlpackage /Action:Extract \
/SourceConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;" \
/TargetFile:"MyDB_backup.dacpac"
```
**Important Notes:**
- The value doesn't matter: setting it to `0`, `false`, or empty still disables conversion
- What matters is only that the variable is DEFINED
- To re-enable conversion, unset it (`unset MSYS_NO_PATHCONV`), or run a single command without it: `env -u MSYS_NO_PATHCONV <command>`
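A convenient pattern is a small wrapper that applies `MSYS_NO_PATHCONV=1` only to SqlPackage invocations, so path conversion stays enabled for the rest of the session. A minimal sketch (the wrapper name `sqlpkg` is our own):

```shell
# Disable path conversion only for the one SqlPackage call;
# everything else in the session keeps normal conversion.
sqlpkg() {
  MSYS_NO_PATHCONV=1 command sqlpackage "$@"
}

# Usage (arguments are passed through to sqlpackage unchanged):
#   sqlpkg /Action:Publish /SourceFile:"MyDatabase.dacpac" \
#          /TargetServerName:"localhost" /TargetDatabaseName:"MyDB"
```

Defining this once in `~/.bashrc` avoids prefixing every command by hand.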
### Method 2: Double Slash // (Alternative)
Use double slashes for SqlPackage parameters:
```bash
# Works in Git Bash and CMD
sqlpackage //Action:Publish \
//SourceFile:MyDatabase.dacpac \
//TargetServerName:localhost \
//TargetDatabaseName:MyDB
# Extract with double slashes
sqlpackage //Action:Extract \
//SourceConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;" \
//TargetFile:output.dacpac
```
**Advantages:**
- No environment variable needed
- Works across shells
- Shell-agnostic scripts
### Method 3: Use Windows-Style Paths with Quotes
Always quote paths with backslashes:
```bash
# Quoted Windows paths work in Git Bash
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"D:\Projects\MyDB\bin\Release\MyDB.dacpac" \
/TargetConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;"
# Or with forward slashes (Windows accepts both)
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"D:/Projects/MyDB/bin/Release/MyDB.dacpac" \
/TargetConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;"
```
### Method 4: Switch to PowerShell or CMD
For Windows-native tools, consider using native shells:
```powershell
# PowerShell (recommended for Windows SSDT workflows)
sqlpackage /Action:Publish `
/SourceFile:"MyDatabase.dacpac" `
/TargetServerName:"localhost" `
/TargetDatabaseName:"MyDB"
```
```cmd
:: CMD
sqlpackage /Action:Publish ^
/SourceFile:"MyDatabase.dacpac" ^
/TargetServerName:"localhost" ^
/TargetDatabaseName:"MyDB"
```
## Shell Detection for Scripts
### Bash Script Detection
```bash
#!/bin/bash
# Method 1: Check $OSTYPE
case "$OSTYPE" in
msys*) # MSYS/Git Bash/MinGW
export MSYS_NO_PATHCONV=1
SQLPACKAGE_ARGS="/Action:Publish /SourceFile:MyDB.dacpac"
;;
cygwin*) # Cygwin
export MSYS_NO_PATHCONV=1
SQLPACKAGE_ARGS="/Action:Publish /SourceFile:MyDB.dacpac"
;;
linux-gnu*) # Linux
SQLPACKAGE_ARGS="/Action:Publish /SourceFile:MyDB.dacpac"
;;
darwin*) # macOS
SQLPACKAGE_ARGS="/Action:Publish /SourceFile:MyDB.dacpac"
;;
esac
sqlpackage $SQLPACKAGE_ARGS
```
```bash
# Method 2: Check uname -s (most portable)
case "$(uname -s)" in
MINGW64*|MINGW32*)
# Git Bash
export MSYS_NO_PATHCONV=1
echo "Git Bash detected - path conversion disabled"
;;
MSYS_NT*)
# MSYS
export MSYS_NO_PATHCONV=1
echo "MSYS detected - path conversion disabled"
;;
CYGWIN*)
# Cygwin
export MSYS_NO_PATHCONV=1
echo "Cygwin detected - path conversion disabled"
;;
Linux*)
# Linux or WSL
echo "Linux detected"
;;
Darwin*)
# macOS
echo "macOS detected"
;;
esac
```
```bash
# Method 3: Check $MSYSTEM (Git Bash specific)
if [ -n "$MSYSTEM" ]; then
# Running in Git Bash/MSYS2
export MSYS_NO_PATHCONV=1
echo "MSYS environment detected: $MSYSTEM"
case "$MSYSTEM" in
MINGW64) echo "64-bit native Windows environment" ;;
MINGW32) echo "32-bit native Windows environment" ;;
MSYS) echo "POSIX-compliant environment" ;;
esac
fi
```
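The three detection methods above can be folded into one reusable guard. A sketch (the function name is our own; it keys off the same `$MSYSTEM` and `uname -s` signals shown earlier):

```shell
# Returns success (0) when running under Git Bash/MSYS2/Cygwin,
# using the $MSYSTEM and uname signals from the methods above.
is_windows_posix_shell() {
  if [ -n "${MSYSTEM:-}" ]; then
    return 0
  fi
  case "$(uname -s)" in
    MINGW*|MSYS*|CYGWIN*) return 0 ;;
  esac
  return 1
}

if is_windows_posix_shell; then
  export MSYS_NO_PATHCONV=1
fi
```

Sourcing this at the top of build scripts keeps the conversion workaround in one place instead of repeating the case statements.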
### Complete Build Script Example
```bash
#!/bin/bash
# build-and-deploy.sh - Cross-platform SSDT build script
set -e # Exit on error
# Detect shell and set path conversion
if [ -n "$MSYSTEM" ]; then
echo "Git Bash/MSYS2 detected - disabling path conversion"
export MSYS_NO_PATHCONV=1
fi
# Variables
PROJECT_NAME="MyDatabase"
BUILD_CONFIG="Release"
DACPAC_PATH="bin/${BUILD_CONFIG}/${PROJECT_NAME}.dacpac"
TARGET_SERVER="${SQL_SERVER:-localhost}"
TARGET_DB="${SQL_DATABASE:-MyDB}"
# Build
echo "Building ${PROJECT_NAME}..."
dotnet build "${PROJECT_NAME}.sqlproj" -c "$BUILD_CONFIG"
# Verify DACPAC exists
if [ ! -f "$DACPAC_PATH" ]; then
echo "ERROR: DACPAC not found at $DACPAC_PATH"
exit 1
fi
echo "DACPAC built successfully: $DACPAC_PATH"
# Deploy
echo "Deploying to ${TARGET_SERVER}/${TARGET_DB}..."
# Use double-slash method for maximum compatibility
sqlpackage //Action:Publish \
//SourceFile:"$DACPAC_PATH" \
//TargetServerName:"$TARGET_SERVER" \
//TargetDatabaseName:"$TARGET_DB" \
//p:BlockOnPossibleDataLoss=False
echo "Deployment complete!"
```
## Common SSDT Path Issues in Git Bash
### Issue 1: DACPAC File Paths
**Problem:**
```bash
# Git Bash mangles the path
sqlpackage /Action:Publish /SourceFile:./bin/Release/MyDB.dacpac
# Error: Cannot find file
```
**Solution:**
```bash
# Use MSYS_NO_PATHCONV
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"./bin/Release/MyDB.dacpac"
# OR use absolute Windows path
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"D:/Projects/MyDB/bin/Release/MyDB.dacpac"
# OR use double slashes
sqlpackage //Action:Publish //SourceFile:./bin/Release/MyDB.dacpac
```
### Issue 2: SQL Project File Paths
**Problem:**
```bash
# Path with spaces causes issues
dotnet build "D:/Program Files/MyProject/Database.sqlproj"
# Works in Git Bash (dotnet handles paths correctly)
# But MSBuild paths may fail
msbuild "D:/Program Files/MyProject/Database.sqlproj"
# May fail if not quoted properly
```
**Solution:**
```bash
# Always quote paths with spaces
dotnet build "D:/Program Files/MyProject/Database.sqlproj"
# Use backslashes for MSBuild on Windows
msbuild "D:\Program Files\MyProject\Database.sqlproj"
# Or use 8.3 short names (no spaces)
msbuild "D:/PROGRA~1/MyProject/Database.sqlproj"
```
### Issue 3: Publish Profile Paths
**Problem:**
```bash
# Publish profile not found
sqlpackage /Action:Publish \
/SourceFile:MyDB.dacpac \
/Profile:./Profiles/Production.publish.xml
```
**Solution:**
```bash
# Use MSYS_NO_PATHCONV with quoted paths
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"MyDB.dacpac" \
/Profile:"./Profiles/Production.publish.xml"
# Or absolute Windows path
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"D:/Projects/MyDB.dacpac" \
/Profile:"D:/Projects/Profiles/Production.publish.xml"
```
### Issue 4: Connection Strings
**Problem:**
```bash
# File paths in connection strings
/SourceConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;AttachDbFilename=D:/Data/MyDB.mdf"
# Path gets mangled
```
**Solution:**
```bash
# Quote entire connection string
MSYS_NO_PATHCONV=1 sqlpackage /Action:Extract \
/SourceConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;AttachDbFilename=D:\Data\MyDB.mdf" \
/TargetFile:"output.dacpac"
# Or use double backslashes in connection string
/SourceConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;AttachDbFilename=D:\\Data\\MyDB.mdf"
```
## CI/CD Considerations
### GitHub Actions with Git Bash
```yaml
name: SSDT Build and Deploy
on: [push]
jobs:
build:
runs-on: windows-latest
defaults:
run:
shell: bash # Use Git Bash
steps:
- uses: actions/checkout@v4
- name: Setup .NET
uses: actions/setup-dotnet@v4
with:
dotnet-version: '8.0.x'
- name: Install SqlPackage
run: dotnet tool install -g Microsoft.SqlPackage
- name: Build Database Project
run: dotnet build Database.sqlproj -c Release
- name: Deploy with Path Conversion Disabled
env:
MSYS_NO_PATHCONV: 1
run: |
sqlpackage /Action:Publish \
/SourceFile:"bin/Release/MyDatabase.dacpac" \
/TargetServerName:"localhost" \
/TargetDatabaseName:"MyDB"
```
### PowerShell Alternative (Recommended for Windows)
```yaml
jobs:
build:
runs-on: windows-latest
defaults:
run:
shell: pwsh # Use PowerShell - no path issues
steps:
- name: Deploy Database
run: |
sqlpackage /Action:Publish `
/SourceFile:"bin/Release/MyDatabase.dacpac" `
/TargetServerName:"localhost" `
/TargetDatabaseName:"MyDB"
```
## Best Practices Summary
### For Interactive Development
1. **Use PowerShell or CMD for SSDT on Windows** - avoids path conversion issues entirely
2. **If using Git Bash**, set `MSYS_NO_PATHCONV=1` in your shell profile for SSDT work
3. **Always quote paths** containing spaces or special characters
4. **Use absolute paths** when possible to avoid ambiguity
### For Scripts
1. **Detect shell environment** and set `MSYS_NO_PATHCONV=1` conditionally
2. **Use double-slash // syntax** for SqlPackage arguments (most portable)
3. **Prefer PowerShell for Windows-specific workflows** (build scripts, CI/CD)
4. **Test scripts on all target platforms** (Windows PowerShell, Git Bash, Linux)
### For CI/CD
1. **Use PowerShell shell in GitHub Actions** for Windows runners (`shell: pwsh`)
2. **Self-hosted Windows agents** - use native Windows paths and shells
3. **Set MSYS_NO_PATHCONV=1 as environment variable** if Git Bash required
4. **Prefer dotnet CLI over MSBuild** for cross-platform compatibility
### For Teams
1. **Document shell requirements** in repository README
2. **Provide scripts for all shells** (bash, PowerShell, CMD)
3. **Standardize on PowerShell** for Windows SSDT workflows when possible
4. **Use containerized builds** to avoid shell-specific issues
## Quick Reference
### Environment Variables
```bash
# Disable path conversion (Git Bash/MSYS2/Cygwin)
export MSYS_NO_PATHCONV=1
# Re-enable path conversion in the current shell
unset MSYS_NO_PATHCONV
```
### SqlPackage Command Templates
```bash
# Git Bash - Method 1 (MSYS_NO_PATHCONV)
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish /SourceFile:"MyDB.dacpac" /TargetServerName:"localhost" /TargetDatabaseName:"MyDB"
# Git Bash - Method 2 (Double Slash)
sqlpackage //Action:Publish //SourceFile:MyDB.dacpac //TargetServerName:localhost //TargetDatabaseName:MyDB
# PowerShell (Recommended for Windows)
sqlpackage /Action:Publish /SourceFile:"MyDB.dacpac" /TargetServerName:"localhost" /TargetDatabaseName:"MyDB"
# CMD
sqlpackage /Action:Publish /SourceFile:"MyDB.dacpac" /TargetServerName:"localhost" /TargetDatabaseName:"MyDB"
```
### Shell Detection One-Liners
```bash
# Check if Git Bash/MSYS
[ -n "$MSYSTEM" ] && echo "Git Bash/MSYS2 detected"
# Check uname
[[ "$(uname -s)" =~ ^MINGW ]] && echo "Git Bash detected"
# Set path conversion conditionally
[ -n "$MSYSTEM" ] && export MSYS_NO_PATHCONV=1
```
## Resources
- [Git Bash Path Conversion Guide](https://www.pascallandau.com/blog/setting-up-git-bash-mingw-msys2-on-windows/)
- [MSYS2 Path Conversion Documentation](https://www.msys2.org/docs/filesystem-paths/)
- [SqlPackage Documentation](https://learn.microsoft.com/sql/tools/sqlpackage/)
- [Microsoft.Build.Sql SDK](https://www.nuget.org/packages/Microsoft.Build.Sql)
- [Git for Windows](https://gitforwindows.org/)
## Troubleshooting
### "Invalid parameter" errors
**Symptom:** SqlPackage reports "Invalid parameter" or "Unknown action"
**Cause:** Git Bash converting `/Action` to a file path
**Fix:** Use `MSYS_NO_PATHCONV=1` or double-slash `//Action`
### "File not found" with valid paths
**Symptom:** DACPAC or project file not found despite correct path
**Cause:** Path conversion mangling the file path
**Fix:** Quote paths and use `MSYS_NO_PATHCONV=1`
### Build succeeds but publish fails
**Symptom:** `dotnet build` works but `sqlpackage` fails
**Cause:** The dotnet CLI uses `--option`-style arguments (never converted), while SqlPackage's `/Parameter` syntax triggers Git Bash path conversion
**Fix:** Use `MSYS_NO_PATHCONV=1` for SqlPackage commands only
### Spaces in paths cause errors
**Symptom:** Paths with "Program Files" or other spaces fail
**Cause:** Unquoted paths split into multiple arguments
**Fix:** Always quote paths: `/SourceFile:"D:/Program Files/MyDB.dacpac"`