Initial commit

This commit is contained in:
Zhongwei Li
2025-11-30 08:29:18 +08:00
commit 1fb3834060
7 changed files with 2462 additions and 0 deletions

skills/sql-server-2025.md
---
name: sql-server-2025
description: SQL Server 2025 and SqlPackage 170.2.70 (October 2025) - Vector databases, AI integration, and latest features
---
## 🚨 CRITICAL GUIDELINES
### Windows File Path Requirements
**MANDATORY: Always Use Backslashes on Windows for File Paths**
When using Edit or Write tools on Windows, you MUST use backslashes (`\`) in file paths, NOT forward slashes (`/`).
**Examples:**
- ❌ WRONG: `D:/repos/project/file.tsx`
- ✅ CORRECT: `D:\repos\project\file.tsx`
This applies to:
- Edit tool file_path parameter
- Write tool file_path parameter
- All file operations on Windows systems
### Documentation Guidelines
**NEVER create new documentation files unless explicitly requested by the user.**
- **Priority**: Update existing README.md files rather than creating new documentation
- **Repository cleanliness**: Keep repository root clean - only README.md unless user requests otherwise
- **Style**: Documentation should be concise, direct, and professional - avoid AI-generated tone
- **User preference**: Only create additional .md files when user specifically asks for documentation
---
# SQL Server 2025 & SqlPackage 170.2.70 Support
## Overview
**SQL Server 2025** is the enterprise AI-ready database with native vector database capabilities, built-in AI model integration, and semantic search from ground to cloud.
**SqlPackage 170.2.70** (October 14, 2025) - Latest production release with full SQL Server 2025 support, data virtualization, and parquet file enhancements.
## SqlPackage 170.x Series (2025 Releases)
### Latest Version: 170.2.70 (October 14, 2025)
Three major 2025 releases:
- **170.2.70** - October 14, 2025 (Current)
- **170.1.61** - July 30, 2025 (Data virtualization)
- **170.0.94** - April 15, 2025 (SQL Server 2025 initial support)
### Key 2025 Features
**Data Virtualization (170.1.61+)**:
- Support for Azure SQL Database data virtualization objects
- Import/export/extract/publish operations for external data sources
- Parquet file support for Azure SQL Database with Azure Blob Storage
- Automatic fallback to BCP for CLR types and LOBs > 1MB
**New Data Types**:
- **VECTOR** - Up to 3,996 dimensions with half-precision (2-byte) floating-point
- **JSON** - Native JSON data type for Azure SQL Database
**New Permissions (170.0+)**:
- `ALTER ANY INFORMATION PROTECTION` - SQL Server 2025 & Azure SQL
- `ALTER ANY EXTERNAL MIRROR` - Azure SQL & SQL database in Fabric
- `CREATE/ALTER ANY EXTERNAL MODEL` - AI/ML model management
**Deployment Options**:
- `/p:IgnorePreDeployScript=True/False` - Skip pre-deployment scripts
- `/p:IgnorePostDeployScript=True/False` - Skip post-deployment scripts
### SqlPackage Commands
```bash
# Publish to SQL Server 2025
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetServerName:server2025.database.windows.net \
/TargetDatabaseName:MyDatabase \
/TargetDatabaseEdition:Premium \
/p:TargetPlatform=SqlServer2025 # New target platform
# Extract from SQL Server 2025
sqlpackage /Action:Extract \
/SourceServerName:server2025.database.windows.net \
/SourceDatabaseName:MyDatabase \
/TargetFile:Database.dacpac \
/p:ExtractAllTableData=False \
/p:VerifyExtraction=True
# Export with SQL Server 2025 features
sqlpackage /Action:Export \
/SourceServerName:server2025.database.windows.net \
/SourceDatabaseName:MyDatabase \
/TargetFile:Database.bacpac
```
## ScriptDom Version 170.0.64
New ScriptDom version for SQL Server 2025 syntax parsing:
```csharp
// Package: Microsoft.SqlServer.TransactSql.ScriptDom 170.0.64
using System.Collections.Generic;
using System.IO;
using Microsoft.SqlServer.TransactSql.ScriptDom;
// Parse SQL Server 2025 syntax (true = treat QUOTED_IDENTIFIER as ON)
var parser = new TSql170Parser(initialQuotedIdentifiers: true);
IList<ParseError> errors;
TSqlFragment fragment = parser.Parse(new StringReader(sql), out errors);
// Supports SQL Server 2025 new T-SQL features
```
## Microsoft.Build.Sql 2.0.0 GA (2025)
**MAJOR MILESTONE:** Microsoft.Build.Sql SDK entered General Availability in 2025!
### Latest Version: 2.0.0 (Production Ready)
**Breaking Change from Preview:**
- SDK is now production-ready and recommended for all new database projects
- No longer in preview status
- Full cross-platform support (Windows/Linux/macOS)
- Requires .NET 8+ (was .NET 6+ in preview)
### SQL Server 2025 Support
**Current Status:** SQL Server 2025 target platform support coming in future Microsoft.Build.Sql release (post-2.0.0).
**Workaround for SDK-Style Projects:**
```xml
<!-- Database.sqlproj (SDK-style with SQL Server 2025 compatibility) -->
<Project Sdk="Microsoft.Build.Sql/2.0.0">
<PropertyGroup>
<Name>MyDatabase</Name>
<!-- Use SQL Server 2022 (160) provider until 2025 provider available -->
<DSP>Microsoft.Data.Tools.Schema.Sql.Sql160DatabaseSchemaProvider</DSP>
<TargetFramework>net8.0</TargetFramework>
<SqlServerVersion>Sql160</SqlServerVersion>
<!-- SQL Server 2025 features will still work in runtime database -->
<!-- Only build-time validation uses Sql160 provider -->
</PropertyGroup>
<ItemGroup>
<Folder Include="Tables\" />
<Folder Include="Views\" />
<Folder Include="StoredProcedures\" />
</ItemGroup>
</Project>
```
### Visual Studio 2022 Support
**Requirement:** Visual Studio 2022 version 17.12 or later for SDK-style SQL projects.
**Note:** Side-by-side installation with original SQL projects (legacy SSDT) is NOT supported.
## SQL Server 2025 Release Status
**Current Status**: SQL Server 2025 (17.x) is in **Release Candidate (RC1)** stage as of October 2025. Public preview began May 2025.
**Predicted GA Date**: November 12, 2025 (based on historical release patterns - SQL Server 2019: Nov 4, SQL Server 2022: Nov 16). Expected announcement at Microsoft Ignite conference (November 18-21, 2025).
**Not Yet Production**: SQL Server 2025 is not yet generally available. All features described are available in RC builds for testing purposes only.
## SQL Server 2025 New Features
### Vector Database for AI
**Native Enterprise Vector Store** with built-in security, compliance, and DiskANN indexing technology.
**Key Capabilities:**
- **Up to 3,996 dimensions** per vector (half-precision 2-byte floating-point)
- **DiskANN indexing** - Disk-based approximate nearest neighbor for efficient large-scale vector search
- **Hybrid AI vector search** - Combine vectors with SQL data for semantic + keyword search
- **Built-in security & compliance** - Enterprise-grade data protection
**Vector Embedding & Text Chunking:**
```sql
-- Create table with vector column
CREATE TABLE Documents (
Id INT PRIMARY KEY IDENTITY,
Title NVARCHAR(200),
Content NVARCHAR(MAX),
-- Half-precision vectors support up to 3,996 dimensions
ContentVector VECTOR(1536) -- OpenAI ada-002: 1,536 dims
-- ContentVector VECTOR(3072) -- OpenAI text-embedding-3-large: 3,072 dims
-- ContentVector VECTOR(3996) -- Maximum: 3,996 dims
);
-- Insert vectors (literal shown for illustration; embeddings typically come from an AI model)
INSERT INTO Documents (Title, Content, ContentVector)
VALUES (
'AI Documentation',
'Azure AI services...',
CAST('[0.1, 0.2, 0.3, ...]' AS VECTOR(1536))
);
-- Semantic similarity search (cosine distance: 0 = identical, lower = more similar)
DECLARE @QueryVector VECTOR(1536) = CAST('[0.15, 0.25, ...]' AS VECTOR(1536));
SELECT TOP 10
Id,
Title,
Content,
VECTOR_DISTANCE('cosine', ContentVector, @QueryVector) AS Distance
FROM Documents
ORDER BY Distance;
-- Create DiskANN vector index for approximate nearest neighbor performance
CREATE VECTOR INDEX IX_Documents_Vector
ON Documents(ContentVector)
WITH (
METRIC = 'cosine', -- or 'euclidean', 'dot'
TYPE = 'diskann'
);
-- Hybrid search: combine vector distance with traditional filtering
SELECT TOP 10
Id,
Title,
VECTOR_DISTANCE('cosine', ContentVector, @QueryVector) AS Distance
FROM Documents
WHERE Title LIKE '%Azure%' -- Traditional keyword filter
ORDER BY Distance;
```
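For intuition, cosine distance as computed by `VECTOR_DISTANCE('cosine', ...)` is `1 - cosine similarity`, which is why lower values mean more similar and results are ordered ascending. A minimal Python sketch of the math (illustrative, not the engine's implementation):

```python
import math

def cosine_distance(a, b):
    # Cosine distance = 1 - cosine similarity:
    # 0.0 for vectors pointing the same way, 1.0 for orthogonal vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

print(cosine_distance([1.0, 0.0], [1.0, 0.0]))  # 0.0 (identical direction)
print(cosine_distance([1.0, 0.0], [0.0, 1.0]))  # 1.0 (orthogonal)
```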
### AI Model Integration
**Built into T-SQL** - Seamlessly integrate AI services with model definitions directly in the database.
**Supported AI Services:**
- Azure AI Foundry
- Azure OpenAI Service
- OpenAI
- Ollama (local/self-hosted models)
- Custom REST APIs
**Developer Frameworks:**
- LangChain integration
- Semantic Kernel integration
- Entity Framework Core support
- **GraphQL via Data API Builder (DAB)** - Expose SQL Server data through GraphQL endpoints
**External Models:**
```sql
-- Create an external model pointing at an AI inference endpoint
-- (endpoint URL and deployment name below are placeholders)
CREATE EXTERNAL MODEL MyEmbeddingModel
WITH (
LOCATION = 'https://myaccount.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings',
API_FORMAT = 'Azure OpenAI',
MODEL_TYPE = EMBEDDINGS,
MODEL = 'text-embedding-ada-002'
);
-- Generate embeddings through the model
SELECT AI_GENERATE_EMBEDDINGS(N'Hello world' USE MODEL MyEmbeddingModel) AS Embedding;
-- Grant model permissions (new SQL Server 2025 permissions)
GRANT CREATE ANY EXTERNAL MODEL TO [ModelAdmin];
GRANT ALTER ANY EXTERNAL MODEL TO [ModelAdmin];
GRANT EXECUTE ON EXTERNAL MODEL::MyEmbeddingModel TO [AppUser];
```
**AI Service Integration:**
```sql
-- Example: Azure OpenAI integration
-- Model definitions built directly into T-SQL
-- Access through REST APIs with built-in authentication
```
### Optimized Locking (Performance Enhancement)
**Key Innovation**: Dramatically reduces lock memory consumption and minimizes blocking for concurrent transactions.
**Two Primary Components**:
1. **Transaction ID (TID) Locking**:
- Each row labeled with last TID (Transaction ID) that modified it
- Single lock on TID instead of many row locks
- Locks released as soon as row is updated
- Only one TID lock held until transaction ends
- **Example**: Updating 1,000 rows requires 1,000 X row locks, but each is released immediately, and only one TID lock is held until commit
2. **Lock After Qualification (LAQ)**:
- Evaluates query predicates using latest committed version WITHOUT acquiring lock
- Requires READ COMMITTED SNAPSHOT ISOLATION (RCSI)
- Predicates checked optimistically on committed data
- X row lock taken only if predicate satisfied
- Lock released immediately after row update
**Benefits**:
- Reduced lock memory usage
- Increased concurrency and scale
- Minimized lock escalation
- Enhanced application uptime
- Better performance for high-concurrency workloads
**Enabling Optimized Locking**:
```sql
-- Enable RCSI (required for LAQ)
ALTER DATABASE MyDatabase
SET READ_COMMITTED_SNAPSHOT ON;
-- Optimized locking is automatically enabled at database level
-- No additional configuration needed for SQL Server 2025
-- Verify optimized locking status
SELECT name, is_read_committed_snapshot_on
FROM sys.databases
WHERE name = 'MyDatabase';
-- Monitor optimized locking performance
SELECT *
FROM sys.dm_tran_locks
WHERE request_session_id = @@SPID;
```
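A back-of-envelope model of the 1,000-row example above clarifies the difference (hypothetical helper, not an engine API): traditional locking accumulates every X row lock until commit, while TID locking holds at most one in-flight row lock plus the single TID lock.

```python
def peak_locks_held(rows_updated, optimized_locking):
    # Traditional: one exclusive (X) row lock per updated row, all held to commit.
    # Optimized (TID): each X row lock is released right after its row is updated,
    # so at any instant at most one row lock plus the single TID lock is held.
    if optimized_locking:
        return 2  # 1 in-flight row lock + 1 TID lock
    return rows_updated

print(peak_locks_held(1000, optimized_locking=False))  # 1000
print(peak_locks_held(1000, optimized_locking=True))   # 2
```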
### Microsoft Fabric Mirroring (Zero-ETL Analytics)
**Integration**: Near real-time replication of SQL Server databases to Microsoft Fabric OneLake for analytics.
**Key Capabilities**:
- **Zero-ETL Experience**: No complex ETL pipelines required
- **SQL Server 2025-Specific**: Uses new change feed technology (vs CDC in SQL Server 2016-2022)
- **Azure Arc Required**: SQL Server 2025 requires Azure Arc-enabled server for Fabric communication
- **Real-Time Analytics**: Offload analytic workloads to Fabric without impacting production
**Supported Scenarios**:
- SQL Server 2025 on-premises (Windows)
- NOT supported: Azure VM or Linux instances (yet)
**How It Works**:
```sql
-- SQL Server 2025 uses change feed (automatic)
-- Azure Arc agent handles replication to Fabric OneLake
-- Traditional SQL Server 2016-2022 approach (CDC):
-- EXEC sys.sp_cdc_enable_db;
-- EXEC sys.sp_cdc_enable_table ...
-- SQL Server 2025: Change feed is built-in, no CDC setup needed
```
**Benefits**:
- Free Fabric compute for replication
- Free OneLake storage (based on capacity size)
- Near real-time data availability
- BI and analytics without production load
- Integration with Power BI, Synapse, Azure ML
**Configuration**:
1. Enable Azure Arc on SQL Server 2025 instance
2. Configure Fabric workspace and OneLake
3. Enable mirroring in Fabric portal
4. Select database and tables to mirror
5. Data automatically replicated with change feed
**Monitoring**:
```sql
-- Monitor replication lag (illustrative DMV name; confirm in current Fabric mirroring docs)
SELECT
database_name,
table_name,
last_sync_time,
rows_replicated,
replication_lag_seconds
FROM sys.dm_fabric_replication_status;
```
### Native JSON Support Enhancements
**New JSON Data Type**: Native JSON data type for Azure SQL Database (coming to SQL Server 2025).
```sql
-- New JSON data type
CREATE TABLE Products (
Id INT PRIMARY KEY,
Name NVARCHAR(100),
Metadata JSON -- Native JSON type
);
-- Insert JSON (string literals convert implicitly to the JSON type)
INSERT INTO Products (Id, Name, Metadata)
VALUES (1, 'Laptop', '{"brand": "Dell", "ram": 16, "ssd": 512}');
-- Query JSON with improved performance
SELECT
Id,
Name,
JSON_VALUE(Metadata, '$.brand') AS Brand,
JSON_VALUE(Metadata, '$.ram') AS RAM
FROM Products;
```
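For simple `$.key` paths, `JSON_VALUE` behaves like a nested dictionary lookup. A rough Python equivalent (assuming only dot-separated object keys, as in the example above; T-SQL additionally returns scalars as strings):

```python
import json

def json_value(doc, path):
    # Mimics JSON_VALUE for simple '$.a.b' paths (object keys only).
    node = json.loads(doc)
    for key in path.lstrip('$.').split('.'):
        node = node[key]
    return node

metadata = '{"brand": "Dell", "ram": 16, "ssd": 512}'
print(json_value(metadata, '$.brand'))  # Dell
print(json_value(metadata, '$.ram'))    # 16
```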
### Regular Expression (RegEx) Support
**T-SQL RegEx Functions**: Validate, search, and manipulate strings with regular expressions.
```sql
-- RegEx matching
SELECT REGEXP_LIKE('test@example.com', '^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$') AS IsValidEmail;
-- RegEx replace
SELECT REGEXP_REPLACE('Phone: 555-1234', '\d+', 'XXX') AS MaskedPhone;
-- RegEx extract
SELECT REGEXP_SUBSTR('Order #12345', '\d+') AS OrderNumber;
```
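The three functions map closely onto Python's `re` module, which is a convenient place to prototype patterns before embedding them in queries (this assumes comparable regex dialects, which holds for the simple patterns above):

```python
import re

# REGEXP_LIKE equivalent: anchored match against the whole string
email = r'^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$'
print(bool(re.match(email, 'test@example.com')))  # True

# REGEXP_REPLACE equivalent: replace every run of digits
print(re.sub(r'\d+', 'XXX', 'Phone: 555-1234'))   # Phone: XXX-XXX

# REGEXP_SUBSTR equivalent: extract the first match
print(re.search(r'\d+', 'Order #12345').group())  # 12345
```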
### REST API Integration
**Built-in REST Capabilities**: Call external REST APIs directly from T-SQL.
```sql
-- Call REST API from T-SQL
DECLARE @Response NVARCHAR(MAX);
EXEC sp_invoke_external_rest_endpoint
@url = 'https://api.example.com/data',
@method = 'GET',
@headers = '{"Authorization": "Bearer token123"}',
@response = @Response OUTPUT;
SELECT @Response AS APIResponse;
-- Enrich database data with an external API response
DECLARE @CustomerData NVARCHAR(MAX);
EXEC sp_invoke_external_rest_endpoint
@url = 'https://api.example.com/customer/12345',
@method = 'GET',
@response = @CustomerData OUTPUT;
UPDATE Customers
SET EnrichedData = JSON_VALUE(@CustomerData, '$.result.data')
WHERE CustomerId = 12345;
```
### Optional Parameter Plan Optimization (OPPO)
**Performance Enhancement**: SQL Server 2025 introduces OPPO to enable optimal execution plan selection based on customer-provided runtime parameter values.
**Key Benefits:**
- Solves parameter sniffing issues
- Optimizes plans for specific runtime parameters
- Improves query performance with parameter-sensitive workloads
- Reduces need for query hints or plan guides
**Enabling OPPO:**
```sql
-- Enable at database level (database scoped configuration)
ALTER DATABASE SCOPED CONFIGURATION
SET OPTIONAL_PARAMETER_OPTIMIZATION = ON;
-- Check status
SELECT name, value
FROM sys.database_scoped_configurations
WHERE name = 'OPTIONAL_PARAMETER_OPTIMIZATION';
-- Monitor per-plan performance via Query Store
SELECT
q.query_id,
p.plan_id,
rs.count_executions,
rs.avg_duration
FROM sys.query_store_query q
JOIN sys.query_store_plan p ON q.query_id = p.query_id
JOIN sys.query_store_runtime_stats rs ON p.plan_id = rs.plan_id;
```
### Microsoft Entra Managed Identities
**Security Enhancement**: SQL Server 2025 adds support for Microsoft Entra managed identities for improved credential management.
**Key Benefits:**
- Eliminates hardcoded credentials
- Reduces security vulnerabilities
- Provides compliance and auditing capabilities
- Simplifies credential rotation
**Configuration:**
```sql
-- Create login with managed identity
CREATE LOGIN [managed-identity-name] FROM EXTERNAL PROVIDER;
-- Grant permissions
CREATE USER [managed-identity-name] FOR LOGIN [managed-identity-name];
GRANT CONTROL ON DATABASE::MyDatabase TO [managed-identity-name];
-- Use in connection strings
-- Connection string: Server=myserver;Database=mydb;Authentication=Active Directory Managed Identity;
```
### Enhanced Information Protection
Sensitivity classification and encryption:
```sql
-- Classify sensitive columns
ADD SENSITIVITY CLASSIFICATION TO
dbo.Customers.Email,
dbo.Customers.CreditCard
WITH (
LABEL = 'Confidential',
INFORMATION_TYPE = 'Financial'
);
-- Query classification
SELECT
schema_name(o.schema_id) AS SchemaName,
o.name AS TableName,
c.name AS ColumnName,
s.label AS SensitivityLabel,
s.information_type AS InformationType
FROM sys.sensitivity_classifications s
INNER JOIN sys.objects o ON s.major_id = o.object_id
INNER JOIN sys.columns c ON s.major_id = c.object_id AND s.minor_id = c.column_id;
```
## Deployment to SQL Server 2025
### Using SqlPackage
```bash
# Publish with 2025 features
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetConnectionString:"Server=tcp:server2025.database.windows.net;Database=MyDb;Authentication=ActiveDirectoryManagedIdentity;" \
/p:BlockOnPossibleDataLoss=True \
/p:IncludeCompositeObjects=True \
/p:DropObjectsNotInSource=False \
"/p:DoNotDropObjectTypes=Users;RoleMembership" \
/p:GenerateSmartDefaults=True \
/DiagnosticsFile:deploy.log
```
### Using MSBuild
```xml
<!-- Database.publish.xml -->
<Project>
<PropertyGroup>
<TargetConnectionString>Server=tcp:server2025.database.windows.net;Database=MyDb;Authentication=ActiveDirectoryManagedIdentity;</TargetConnectionString>
<BlockOnPossibleDataLoss>True</BlockOnPossibleDataLoss>
<TargetDatabaseName>MyDatabase</TargetDatabaseName>
<ProfileVersionNumber>1</ProfileVersionNumber>
</PropertyGroup>
<ItemGroup>
<SqlCmdVariable Include="Environment">
<Value>Production</Value>
</SqlCmdVariable>
</ItemGroup>
</Project>
```
```bash
# Deploy using MSBuild
msbuild Database.sqlproj \
/t:Publish \
/p:PublishProfile=Database.publish.xml \
/p:TargetPlatform=SqlServer2025
```
## CI/CD Best Practices 2025
### Key Principles
**State-Based Deployment (Recommended):**
- Source code represents current database state
- All objects (procedures, tables, triggers, views) in separate .sql files
- SqlPackage generates incremental scripts automatically
- Preferred over migration-based approaches
**Testing & Quality:**
- **tSQLt** - Unit testing for SQL Server stored procedures and functions
- Tests produce machine-readable results
- Abort pipeline on test failure with immediate notifications
- Never continue deployment if tests fail
**Security:**
- **Windows Authentication preferred** for CI/CD (avoid plain text passwords)
- Never commit credentials to source control
- Use Azure Key Vault or GitHub Secrets for connection strings
**Version Control:**
- All database objects in source control
- Test scripts versioned and executed in Build step
- Require comments on check-ins
- Configure custom check-in policies
### GitHub Actions (2025 Pattern)
```yaml
name: Deploy to SQL Server 2025
on:
push:
branches: [main]
jobs:
build-and-test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup .NET 8
uses: actions/setup-dotnet@v4
with:
dotnet-version: '8.0.x'
- name: Install SqlPackage 170.2.70
run: dotnet tool install -g Microsoft.SqlPackage --version 170.2.70
- name: Build DACPAC
run: dotnet build Database.sqlproj -c Release
- name: Run tSQLt Unit Tests
run: |
# Run unit tests and capture results
# Abort if tests fail
echo "Running tSQLt unit tests..."
# Add your tSQLt test execution here
- name: Generate Deployment Report
run: |
sqlpackage /Action:DeployReport \
/SourceFile:bin/Release/Database.dacpac \
/TargetConnectionString:"${{ secrets.SQL_CONNECTION_STRING }}" \
/OutputPath:deploy-report.xml \
/p:BlockOnPossibleDataLoss=True
- name: Publish to SQL Server 2025
run: |
sqlpackage /Action:Publish \
/SourceFile:bin/Release/Database.dacpac \
/TargetConnectionString:"${{ secrets.SQL_CONNECTION_STRING }}" \
/p:TargetPlatform=SqlServer2025 \
/p:BlockOnPossibleDataLoss=True \
/DiagnosticsFile:publish.log \
/DiagnosticsLevel:Verbose
- name: Upload Artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: deployment-logs
path: |
publish.log
deploy-report.xml
```
### Azure DevOps
```yaml
trigger:
- main
pool:
vmImage: 'windows-2022'
steps:
- task: MSBuild@1
displayName: 'Build Database Project'
inputs:
solution: 'Database.sqlproj'
configuration: 'Release'
- task: SqlAzureDacpacDeployment@1
displayName: 'Deploy to SQL Server 2025'
inputs:
azureSubscription: 'Azure Subscription'
authenticationType: 'servicePrincipal'
serverName: 'server2025.database.windows.net'
databaseName: 'MyDatabase'
deployType: 'DacpacTask'
deploymentAction: 'Publish'
dacpacFile: '$(Build.SourcesDirectory)/bin/Release/Database.dacpac'
additionalArguments: '/p:TargetPlatform=SqlServer2025'
```
## New SqlPackage Diagnostic Features
```bash
# Enable detailed diagnostics
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetServerName:server2025.database.windows.net \
/TargetDatabaseName:MyDatabase \
/DiagnosticsLevel:Verbose \
/DiagnosticsFile:diagnostics.log
# The diagnostics output captures:
# - Deployment logs
# - Performance metrics
# - Error details
# - Schema comparison results
```
## Microsoft Fabric Data Warehouse Support
**New in SqlPackage 162.5+:** Full support for SQL database in Microsoft Fabric.
**Fabric Deployment:**
```bash
# Deploy to Fabric Warehouse
sqlpackage /Action:Publish \
/SourceFile:Warehouse.dacpac \
/TargetConnectionString:"Server=tcp:myworkspace.datawarehouse.fabric.microsoft.com;Database=mywarehouse;Authentication=ActiveDirectoryInteractive;"
# Extract from Fabric
sqlpackage /Action:Extract \
/SourceConnectionString:"Server=tcp:myworkspace.datawarehouse.fabric.microsoft.com;Database=mywarehouse;Authentication=ActiveDirectoryInteractive;" \
/TargetFile:Fabric.dacpac
# New permission: ALTER ANY EXTERNAL MIRROR (Fabric-specific), granted via T-SQL:
#   GRANT ALTER ANY EXTERNAL MIRROR TO [FabricAdmin];
```
## Best Practices for SQL Server 2025
1. **Use Target Platform Specification:**
```xml
<PropertyGroup>
<TargetPlatform>SqlServer2025</TargetPlatform>
</PropertyGroup>
```
2. **Test Vector Operations:**
```sql
-- Verify vector support (illustrative property name; confirm in current docs)
SELECT SERVERPROPERTY('IsVectorSupported') AS VectorSupport;
```
3. **Monitor AI Model Performance:**
```sql
-- Track model execution (illustrative DMV; confirm in current documentation)
SELECT
model_name,
AVG(execution_time_ms) AS AvgExecutionTime,
COUNT(*) AS ExecutionCount
FROM sys.dm_exec_external_model_stats
GROUP BY model_name;
```
4. **Implement Sensitivity Classification:**
```sql
-- Classify all PII columns
ADD SENSITIVITY CLASSIFICATION TO dbo.Customers.Email
WITH (LABEL = 'Confidential - GDPR', INFORMATION_TYPE = 'Email');
```
## Resources
- [SQL Server 2025 Preview](https://aka.ms/sqlserver2025)
- [SqlPackage Documentation](https://learn.microsoft.com/sql/tools/sqlpackage/)
- [SDK-Style Projects](https://learn.microsoft.com/sql/tools/sql-database-projects/concepts/sdk-style-projects)
- [Vector Database](https://learn.microsoft.com/sql/relational-databases/vectors/)

skills/ssdt-cicd-best-practices-2025.md
---
name: ssdt-cicd-best-practices-2025
description: Modern CI/CD best practices for SQL Server database development with tSQLt, state-based deployment, and 2025 patterns
---
## 🚨 CRITICAL GUIDELINES
### Windows File Path Requirements
**MANDATORY: Always Use Backslashes on Windows for File Paths**
When using Edit or Write tools on Windows, you MUST use backslashes (`\`) in file paths, NOT forward slashes (`/`).
**Examples:**
- ❌ WRONG: `D:/repos/project/file.tsx`
- ✅ CORRECT: `D:\repos\project\file.tsx`
This applies to:
- Edit tool file_path parameter
- Write tool file_path parameter
- All file operations on Windows systems
### Documentation Guidelines
**NEVER create new documentation files unless explicitly requested by the user.**
- **Priority**: Update existing README.md files rather than creating new documentation
- **Repository cleanliness**: Keep repository root clean - only README.md unless user requests otherwise
- **Style**: Documentation should be concise, direct, and professional - avoid AI-generated tone
- **User preference**: Only create additional .md files when user specifically asks for documentation
---
# SSDT CI/CD Best Practices 2025
## Overview
This skill provides comprehensive guidance on implementing modern CI/CD pipelines for SQL Server database projects using SSDT, SqlPackage, and contemporary DevOps practices.
## Key Principles (2025 Recommended Approach)
### 1. State-Based Deployment (Recommended)
**Definition**: Source code represents the current database state, not migration scripts.
**How it Works**:
- All database objects (tables, procedures, views, functions) stored in separate .sql files
- SqlPackage automatically generates incremental deployment scripts
- Declarative approach: "This is what the database should look like"
- SSDT compares source to target and calculates differences
**Advantages**:
- Easier to maintain and understand
- No risk of missing migration scripts
- Git history shows complete object definitions
- Branching and merging simplified
- Rollback by redeploying previous version
**Implementation**:
```yaml
# GitHub Actions example
- name: Build DACPAC (State-Based)
run: dotnet build Database.sqlproj -c Release
- name: Deploy State to Target
run: |
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetConnectionString:"${{ secrets.SQL_CONN }}" \
/p:BlockOnPossibleDataLoss=True
```
**Contrast with Migration-Based**:
- Migration-based: Sequential scripts (001_CreateTable.sql, 002_AddColumn.sql)
- State-based: Object definitions (Tables/Customer.sql contains complete CREATE TABLE)
### 2. tSQLt Unit Testing (Critical for CI/CD)
**Why tSQLt**:
- Open-source SQL Server unit testing framework
- Write tests in T-SQL language
- Produces machine-readable XML/JSON results
- Integrates seamlessly with CI/CD pipelines
**Key Features**:
- **Automatic Transactions**: Each test runs in a transaction and rolls back
- **Schema Grouping**: Group related tests in schemas
- **Mocking**: Fake tables and procedures for isolated testing
- **Assertions**: Built-in assertion methods (assertEquals, assertEmpty, etc.)
**Pipeline Abort on Failure**:
```yaml
# GitHub Actions with tSQLt
- name: Run tSQLt Unit Tests
run: |
# Deploy test framework
sqlpackage /Action:Publish \
/SourceFile:DatabaseTests.dacpac \
/TargetConnectionString:"${{ secrets.TEST_SQL_CONN }}"
# Execute tests and capture results
sqlcmd -S test-server -d TestDB -Q "EXEC tSQLt.RunAll" -o test-results.txt
# Parse results and fail pipeline if tests fail
if grep -q "Failure" test-results.txt; then
echo "Unit tests failed!"
exit 1
fi
echo "All tests passed!"
- name: Deploy to Production (only runs if tests pass)
run: |
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetConnectionString:"${{ secrets.PROD_SQL_CONN }}"
```
**Test Structure**:
```sql
-- tSQLt test example
CREATE SCHEMA CustomerTests;
GO
CREATE PROCEDURE CustomerTests.[test Customer Insert Sets Correct Defaults]
AS
BEGIN
-- Arrange (keep column defaults so CreatedDate is populated)
EXEC tSQLt.FakeTable 'dbo.Customers', @Defaults = 1;
-- Act
INSERT INTO dbo.Customers (FirstName, LastName, Email)
VALUES ('John', 'Doe', 'john@example.com');
-- Assert (parameters must be scalar values, not subqueries)
DECLARE @RowCount INT = (SELECT COUNT(*) FROM dbo.Customers);
EXEC tSQLt.AssertEquals @Expected = 1, @Actual = @RowCount;
DECLARE @CreatedDate DATETIME = (SELECT TOP 1 CreatedDate FROM dbo.Customers);
EXEC tSQLt.AssertNotEquals @Expected = NULL, @Actual = @CreatedDate;
END;
GO
-- Run all tests
EXEC tSQLt.RunAll;
```
**Azure DevOps Integration**:
```yaml
- task: PowerShell@2
displayName: 'Run tSQLt Tests'
inputs:
targetType: 'inline'
script: |
# Execute tSQLt tests
$results = Invoke-Sqlcmd -ServerInstance $(testServer) `
-Database $(testDatabase) `
-Query "EXEC tSQLt.RunAll" `
-Verbose
# Check for failures
$failures = $results | Where-Object { $_.Class -eq 'Failure' }
if ($failures) {
Write-Error "Tests failed: $($failures.Count) failures"
exit 1
}
```
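In both pipelines the pass/fail gate reduces to counting result rows whose `Class` column is `'Failure'`. A hypothetical Python version of the same check (row shape assumed from tSQLt's TestResult output):

```python
def count_failures(rows):
    # rows: tSQLt TestResult-shaped dicts; Class is 'Success' or 'Failure'.
    return sum(1 for r in rows if r.get('Class') == 'Failure')

results = [
    {'TestCase': 'test A', 'Class': 'Success'},
    {'TestCase': 'test B', 'Class': 'Failure'},
]
failures = count_failures(results)
if failures:
    print(f'Tests failed: {failures} failures')  # a real pipeline would exit 1 here
```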
### 3. Windows Authentication Over SQL Authentication
**Security Best Practice**: Prefer Windows Authentication (Integrated Security) for CI/CD agents.
**Why Windows Auth**:
- No passwords stored in connection strings
- Leverages existing Active Directory infrastructure
- Service accounts with minimal permissions
- Audit trail via Windows Security logs
- No credential rotation needed
**Implementation**:
**Self-Hosted Agents (Recommended)**:
```yaml
# GitHub Actions with self-hosted Windows agent
runs-on: [self-hosted, windows, sql-deploy]
steps:
- name: Deploy with Windows Auth
run: |
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetConnectionString:"Server=prod-sql;Database=MyDB;Integrated Security=True;" \
/p:BlockOnPossibleDataLoss=True
```
**Azure DevOps with Service Connection**:
```yaml
- task: SqlAzureDacpacDeployment@1
inputs:
authenticationType: 'integratedAuth' # Uses Windows Auth
serverName: 'prod-sql.domain.com'
databaseName: 'MyDatabase'
dacpacFile: '$(Build.ArtifactStagingDirectory)/Database.dacpac'
```
**Alternative for Cloud Agents (Azure SQL)**:
```yaml
# Use Managed Identity instead of SQL auth
- name: Deploy with Managed Identity
run: |
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetConnectionString:"Server=tcp:server.database.windows.net;Database=MyDB;Authentication=ActiveDirectoryManagedIdentity;" \
/p:BlockOnPossibleDataLoss=True
```
**Never Do This**:
```yaml
# BAD: Plain text SQL auth password
TargetConnectionString: "Server=prod;Database=MyDB;User=sa;Password=P@ssw0rd123"
```
**If SQL Auth Required**:
```yaml
# Use secrets/variables (least preferred method)
- name: Deploy with SQL Auth (Not Recommended)
run: |
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetServerName:"${{ secrets.SQL_SERVER }}" \
/TargetDatabaseName:"${{ secrets.SQL_DATABASE }}" \
/TargetUser:"${{ secrets.SQL_USER }}" \
/TargetPassword:"${{ secrets.SQL_PASSWORD }}" \
/p:BlockOnPossibleDataLoss=True
# Still not as secure as Windows Auth!
```
### 4. Version Control Everything
**What to Version**:
```
DatabaseProject/
├── Tables/
│ ├── Customer.sql
│ └── Order.sql
├── StoredProcedures/
│ └── GetCustomerOrders.sql
├── Tests/
│ ├── CustomerTests/
│ │ └── test_CustomerInsert.sql
│ └── OrderTests/
│ └── test_OrderValidation.sql
├── Scripts/
│ ├── Script.PreDeployment.sql
│ └── Script.PostDeployment.sql
├── Database.sqlproj
├── Database.Dev.publish.xml
├── Database.Prod.publish.xml
└── .gitignore
```
**.gitignore**:
```
# Build outputs
bin/
obj/
*.dacpac
# User-specific files
*.user
*.suo
# Visual Studio folders
.vs/
# Never commit credentials
*.publish.xml.user
```
**Check-in Requirements**:
- Require code review for database changes
- Mandate comments on all commits
- Run automated tests before merge
- Enforce naming conventions via branch policies
### 5. Deployment Reports Always Required
**Before Production Deployment**:
```yaml
- name: Generate Deployment Report
run: |
sqlpackage /Action:DeployReport \
/SourceFile:Database.dacpac \
/TargetConnectionString:"${{ secrets.PROD_SQL_CONN }}" \
/OutputPath:deploy-report.xml \
/p:BlockOnPossibleDataLoss=True
- name: Parse and Review Report
run: |
# Extract key metrics from XML
echo "=== DEPLOYMENT REPORT ==="
# Parse XML for operations count
# Check for data loss warnings
# Display to user or post to PR
- name: Require Manual Approval
uses: trstringer/manual-approval@v1
with:
approvers: database-admins
minimum-approvals: 1
instructions: "Review deploy-report.xml before approving"
- name: Deploy After Approval
run: |
sqlpackage /Action:Publish \
/SourceFile:Database.dacpac \
/TargetConnectionString:"${{ secrets.PROD_SQL_CONN }}"
```
### 6. Environment Promotion Strategy
**Standard Flow**: Dev → QA → Staging → Production
**Consistent Deployment Options**:
```yaml
# Define environment-specific properties
environments:
dev:
blockOnDataLoss: false
dropObjectsNotInSource: true
backupBeforeChanges: false
qa:
blockOnDataLoss: true
dropObjectsNotInSource: false
backupBeforeChanges: true
staging:
blockOnDataLoss: true
dropObjectsNotInSource: false
backupBeforeChanges: true
production:
blockOnDataLoss: true
dropObjectsNotInSource: false
backupBeforeChanges: true
requireApproval: true
```
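The matrix above translates mechanically into SqlPackage `/p:` switches. A hypothetical helper sketching that mapping (the property names are real SqlPackage deployment properties; the function itself is illustrative):

```python
def sqlpackage_args(env):
    # Maps the environment matrix above to SqlPackage deployment properties.
    profiles = {
        'dev':        {'BlockOnPossibleDataLoss': 'False', 'DropObjectsNotInSource': 'True'},
        'qa':         {'BlockOnPossibleDataLoss': 'True',  'DropObjectsNotInSource': 'False'},
        'staging':    {'BlockOnPossibleDataLoss': 'True',  'DropObjectsNotInSource': 'False'},
        'production': {'BlockOnPossibleDataLoss': 'True',  'DropObjectsNotInSource': 'False'},
    }
    return [f'/p:{name}={value}' for name, value in profiles[env].items()]

print(sqlpackage_args('dev'))
# ['/p:BlockOnPossibleDataLoss=False', '/p:DropObjectsNotInSource=True']
```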
## Complete GitHub Actions Pipeline (2025 Best Practice)
```yaml
name: SQL Server CI/CD Pipeline
on:
push:
branches: [main, develop]
pull_request:
branches: [main]
env:
DOTNET_VERSION: '8.0.x'
SQLPACKAGE_VERSION: '170.2.70'
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup .NET 8
uses: actions/setup-dotnet@v4
with:
dotnet-version: ${{ env.DOTNET_VERSION }}
- name: Install SqlPackage
run: dotnet tool install -g Microsoft.SqlPackage --version ${{ env.SQLPACKAGE_VERSION }}
- name: Build Database Project
run: dotnet build src/Database.sqlproj -c Release
- name: Build Test Project
run: dotnet build tests/DatabaseTests.sqlproj -c Release
- name: Upload DACPAC Artifacts
uses: actions/upload-artifact@v4
with:
name: dacpacs
path: |
src/bin/Release/*.dacpac
tests/bin/Release/*.dacpac
test:
runs-on: windows-latest # tSQLt requires SQL Server
needs: build
steps:
- uses: actions/checkout@v4
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
name: dacpacs
- name: Setup Test Database
run: |
sqlcmd -S localhost -Q "CREATE DATABASE TestDB"
- name: Deploy Database to Test
run: |
sqlpackage /Action:Publish `
/SourceFile:Database.dacpac `
/TargetConnectionString:"Server=localhost;Database=TestDB;Integrated Security=True;"
- name: Deploy tSQLt Framework
run: |
sqlpackage /Action:Publish `
/SourceFile:DatabaseTests.dacpac `
/TargetConnectionString:"Server=localhost;Database=TestDB;Integrated Security=True;"
- name: Run tSQLt Unit Tests
run: |
$results = Invoke-Sqlcmd -ServerInstance localhost `
-Database TestDB `
-Query "EXEC tSQLt.RunAll" `
-Verbose
$failures = $results | Where-Object { $_.Class -eq 'Failure' }
if ($failures) {
Write-Error "Tests failed: $($failures.Count) failures"
exit 1
}
Write-Host "All tests passed!"
deploy-dev:
runs-on: [self-hosted, windows, sql-deploy]
needs: test
if: github.ref == 'refs/heads/develop'
environment: dev
steps:
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
name: dacpacs
- name: Deploy to Dev (Windows Auth)
run: |
sqlpackage /Action:Publish `
/SourceFile:Database.dacpac `
/TargetConnectionString:"Server=dev-sql;Database=MyDB;Integrated Security=True;" `
/p:BlockOnPossibleDataLoss=False `
/p:DropObjectsNotInSource=True
deploy-staging:
runs-on: [self-hosted, windows, sql-deploy]
needs: test
if: github.ref == 'refs/heads/main'
environment: staging
steps:
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
name: dacpacs
- name: Generate Deployment Report
run: |
sqlpackage /Action:DeployReport `
/SourceFile:Database.dacpac `
/TargetConnectionString:"Server=staging-sql;Database=MyDB;Integrated Security=True;" `
/OutputPath:deploy-report.xml
- name: Deploy to Staging (Windows Auth)
run: |
sqlpackage /Action:Publish `
/SourceFile:Database.dacpac `
/TargetConnectionString:"Server=staging-sql;Database=MyDB;Integrated Security=True;" `
/p:BlockOnPossibleDataLoss=True `
/p:BackupDatabaseBeforeChanges=True `
/p:DropObjectsNotInSource=False
deploy-production:
runs-on: [self-hosted, windows, sql-deploy]
needs: deploy-staging
environment: production
steps:
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
name: dacpacs
- name: Generate Deployment Report
run: |
sqlpackage /Action:DeployReport `
/SourceFile:Database.dacpac `
/TargetConnectionString:"Server=prod-sql;Database=MyDB;Integrated Security=True;" `
/OutputPath:prod-deploy-report.xml
- name: Manual Approval Required
uses: trstringer/manual-approval@v1
with:
approvers: database-admins,devops-leads
minimum-approvals: 2
instructions: "Review prod-deploy-report.xml and approve deployment"
- name: Deploy to Production (Windows Auth)
run: |
sqlpackage /Action:Publish `
/SourceFile:Database.dacpac `
/TargetConnectionString:"Server=prod-sql;Database=MyDB;Integrated Security=True;" `
/p:BlockOnPossibleDataLoss=True `
/p:BackupDatabaseBeforeChanges=True `
/p:DropObjectsNotInSource=False `
/p:DoNotDropObjectTypes=Users;Logins;RoleMembership `
/DiagnosticsFile:prod-deploy.log
- name: Upload Deployment Logs
if: always()
uses: actions/upload-artifact@v4
with:
name: production-deployment-logs
path: prod-deploy.log
```
## Azure DevOps Pipeline Example (2025)
```yaml
trigger:
branches:
include:
- main
- develop
pool:
vmImage: 'windows-2022'
variables:
buildConfiguration: 'Release'
dotnetVersion: '8.0.x'
sqlPackageVersion: '170.2.70'
stages:
- stage: Build
jobs:
- job: BuildDatabase
steps:
- task: UseDotNet@2
displayName: 'Install .NET 8'
inputs:
version: $(dotnetVersion)
- task: DotNetCoreCLI@2
displayName: 'Build Database Project'
inputs:
command: 'build'
projects: '**/*.sqlproj'
arguments: '-c $(buildConfiguration)'
- task: PublishBuildArtifacts@1
displayName: 'Publish DACPAC'
inputs:
PathtoPublish: '$(Build.SourcesDirectory)/bin/$(buildConfiguration)'
ArtifactName: 'dacpacs'
- stage: Test
dependsOn: Build
jobs:
- job: RunUnitTests
steps:
- task: DownloadBuildArtifacts@1
inputs:
artifactName: 'dacpacs'
- task: SqlAzureDacpacDeployment@1
displayName: 'Deploy to Test Database'
inputs:
authenticationType: 'integratedAuth'
serverName: 'test-sql-server'
databaseName: 'TestDB'
dacpacFile: '$(System.ArtifactsDirectory)/dacpacs/Database.dacpac'
- task: PowerShell@2
displayName: 'Run tSQLt Tests'
inputs:
targetType: 'inline'
script: |
$results = Invoke-Sqlcmd -ServerInstance 'test-sql-server' `
-Database 'TestDB' `
-Query "EXEC tSQLt.RunAll"
$failures = $results | Where-Object { $_.Class -eq 'Failure' }
if ($failures) {
throw "Tests failed: $($failures.Count) failures"
}
- stage: DeployProduction
dependsOn: Test
condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
jobs:
- deployment: DeployToProduction
environment: 'Production'
strategy:
runOnce:
deploy:
steps:
- task: SqlAzureDacpacDeployment@1
displayName: 'Generate Deployment Report'
inputs:
deployType: 'DeployReport'
authenticationType: 'integratedAuth'
serverName: 'prod-sql-server'
databaseName: 'ProductionDB'
dacpacFile: '$(Pipeline.Workspace)/dacpacs/Database.dacpac'
outputFile: 'deploy-report.xml'
- task: SqlAzureDacpacDeployment@1
displayName: 'Deploy to Production'
inputs:
authenticationType: 'integratedAuth'
serverName: 'prod-sql-server'
databaseName: 'ProductionDB'
dacpacFile: '$(Pipeline.Workspace)/dacpacs/Database.dacpac'
additionalArguments: '/p:BlockOnPossibleDataLoss=True /p:BackupDatabaseBeforeChanges=True'
```
## Best Practices Checklist
### Source Control
- [ ] All database objects in source control
- [ ] .gitignore configured for build outputs
- [ ] No credentials committed
- [ ] Test scripts versioned separately
- [ ] Branching strategy defined (gitflow, trunk-based, etc.)
### Testing
- [ ] tSQLt framework deployed
- [ ] Unit tests cover critical stored procedures
- [ ] Tests grouped logically in schemas
- [ ] Pipeline aborts on test failure
- [ ] Test results published to dashboard
### Security
- [ ] Windows Authentication used for CI/CD
- [ ] Service accounts follow principle of least privilege
- [ ] Secrets stored in Azure Key Vault / GitHub Secrets
- [ ] No plain text passwords
- [ ] Audit logging enabled
### Deployment
- [ ] State-based deployment strategy
- [ ] Deployment reports generated before production
- [ ] Manual approval gates for production
- [ ] Backup before changes (production)
- [ ] BlockOnPossibleDataLoss enabled (production)
- [ ] DoNotDropObjectTypes configured
- [ ] Rollback plan documented
### Monitoring
- [ ] Deployment logs captured
- [ ] Failed deployments trigger alerts
- [ ] Performance metrics tracked
- [ ] Schema drift detection automated
## Resources
- [tSQLt Official Site](https://tsqlt.org/)
- [Microsoft.Build.Sql Documentation](https://learn.microsoft.com/sql/tools/sql-database-projects/)
- [SqlPackage Reference](https://learn.microsoft.com/sql/tools/sqlpackage/)
- [Azure DevOps SQL Tasks](https://learn.microsoft.com/azure/devops/pipelines/tasks/deploy/sql-azure-dacpac-deployment)
- [GitHub Actions for SQL](https://github.com/marketplace?type=actions&query=sql+)

---
name: windows-git-bash-paths
description: Windows and Git Bash path handling for SSDT, SqlPackage, and DACPAC files with shell detection
---
# Windows and Git Bash Path Handling for SSDT
## Overview
SQL Server development is Windows-heavy, and many developers use Git Bash (MINGW/MSYS2) as their preferred shell on Windows. This creates unique path conversion challenges when working with Windows-native tools like SqlPackage, MSBuild, and Visual Studio that expect Windows-style paths.
This skill provides comprehensive guidance on handling path conversion issues, shell detection, and best practices for SSDT workflows on Windows with Git Bash.
## The Path Conversion Problem
### What Happens in Git Bash/MINGW
Git Bash automatically converts POSIX-style paths to Windows paths, but this can cause issues with command-line arguments:
**Automatic Conversions:**
- `/foo` → `C:/Program Files/Git/usr/foo`
- `/foo:/bar` → `C:\msys64\foo;C:\msys64\bar`
- `--dir=/foo` → `--dir=C:/msys64/foo`
**Problematic for SqlPackage:**
```bash
# Git Bash converts /Action to a path!
sqlpackage /Action:Publish /SourceFile:MyDB.dacpac
# Becomes: sqlpackage C:/Program Files/Git/usr/Action:Publish ...
```
### What Triggers Conversion
✓ Leading forward slash (/) in arguments
✓ Colon-separated path lists
✓ Arguments after `-` or `,` with path components
### What's Exempt
✓ Arguments containing `=` (variable assignments)
✓ Drive specifiers (`C:`)
✓ Arguments with `;` (already Windows format)
✓ Arguments starting with `//` (Windows switches)
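The trigger and exemption rules above can be sketched as a pure-shell predicate. This is illustrative only — the real logic lives inside the MSYS2 runtime and has more edge cases than shown here:

```shell
#!/bin/sh
# Illustrative predicate mimicking the conversion rules above.
# Returns 0 (true) if MSYS path conversion would rewrite the argument.
would_convert() {
  case "$1" in
    //*)        return 1 ;;  # double slash: treated as a Windows switch, exempt
    *=*)        return 1 ;;  # contains '=': variable assignment, exempt
    [A-Za-z]:*) return 1 ;;  # drive specifier (C:), exempt
    *\;*)       return 1 ;;  # ';' present: already a Windows path list, exempt
    /*)         return 0 ;;  # leading forward slash: converted
    *)          return 1 ;;
  esac
}

for arg in '/Action:Publish' '//Action:Publish' 'FOO=bar' 'C:/x' '/a;b'; do
  if would_convert "$arg"; then
    echo "$arg -> converted"
  else
    echo "$arg -> exempt"
  fi
done
# → only /Action:Publish is reported as converted
```

The first line of output is exactly the SqlPackage failure mode: `/Action:Publish` is the only argument the rules rewrite.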
## Solutions for SqlPackage in Git Bash
### Method 1: MSYS_NO_PATHCONV (Recommended)
Disable path conversion for specific commands:
```bash
# Temporarily disable path conversion
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"MyDatabase.dacpac" \
/TargetServerName:"localhost" \
/TargetDatabaseName:"MyDB"
# Works for all SqlPackage actions
MSYS_NO_PATHCONV=1 sqlpackage /Action:Extract \
/SourceConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;" \
/TargetFile:"MyDB_backup.dacpac"
```
**Important Notes:**
- The value doesn't matter - setting it to `0`, `false`, or empty still disables conversion
- All that matters is that the variable is DEFINED
- To re-enable conversion, unset it: `env -u MSYS_NO_PATHCONV <command>` or `unset MSYS_NO_PATHCONV`
### Method 2: Double Slash // (Alternative)
Use double slashes for SqlPackage parameters:
```bash
# Works in Git Bash and CMD
sqlpackage //Action:Publish \
//SourceFile:MyDatabase.dacpac \
//TargetServerName:localhost \
//TargetDatabaseName:MyDB
# Extract with double slashes
sqlpackage //Action:Extract \
//SourceConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;" \
//TargetFile:output.dacpac
```
**Advantages:**
- No environment variable needed
- Works across shells
- Shell-agnostic scripts
### Method 3: Use Windows-Style Paths with Quotes
Always quote paths with backslashes:
```bash
# Quoted Windows paths work in Git Bash
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"D:\Projects\MyDB\bin\Release\MyDB.dacpac" \
/TargetConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;"
# Or with forward slashes (Windows accepts both)
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"D:/Projects/MyDB/bin/Release/MyDB.dacpac" \
/TargetConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;"
```
### Method 4: Switch to PowerShell or CMD
For Windows-native tools, consider using native shells:
```powershell
# PowerShell (recommended for Windows SSDT workflows)
sqlpackage /Action:Publish `
/SourceFile:"MyDatabase.dacpac" `
/TargetServerName:"localhost" `
/TargetDatabaseName:"MyDB"
```
```cmd
:: CMD
sqlpackage /Action:Publish ^
/SourceFile:"MyDatabase.dacpac" ^
/TargetServerName:"localhost" ^
/TargetDatabaseName:"MyDB"
```
## Shell Detection for Scripts
### Bash Script Detection
```bash
#!/bin/bash
# Method 1: Check $OSTYPE
case "$OSTYPE" in
msys*) # MSYS/Git Bash/MinGW
export MSYS_NO_PATHCONV=1
SQLPACKAGE_ARGS="/Action:Publish /SourceFile:MyDB.dacpac"
;;
cygwin*) # Cygwin
export MSYS_NO_PATHCONV=1
SQLPACKAGE_ARGS="/Action:Publish /SourceFile:MyDB.dacpac"
;;
linux-gnu*) # Linux
SQLPACKAGE_ARGS="/Action:Publish /SourceFile:MyDB.dacpac"
;;
darwin*) # macOS
SQLPACKAGE_ARGS="/Action:Publish /SourceFile:MyDB.dacpac"
;;
esac
sqlpackage $SQLPACKAGE_ARGS
```
```bash
# Method 2: Check uname -s (most portable)
case "$(uname -s)" in
MINGW64*|MINGW32*)
# Git Bash
export MSYS_NO_PATHCONV=1
echo "Git Bash detected - path conversion disabled"
;;
MSYS_NT*)
# MSYS
export MSYS_NO_PATHCONV=1
echo "MSYS detected - path conversion disabled"
;;
CYGWIN*)
# Cygwin
export MSYS_NO_PATHCONV=1
echo "Cygwin detected - path conversion disabled"
;;
Linux*)
# Linux or WSL
echo "Linux detected"
;;
Darwin*)
# macOS
echo "macOS detected"
;;
esac
```
```bash
# Method 3: Check $MSYSTEM (Git Bash specific)
if [ -n "$MSYSTEM" ]; then
# Running in Git Bash/MSYS2
export MSYS_NO_PATHCONV=1
echo "MSYS environment detected: $MSYSTEM"
case "$MSYSTEM" in
MINGW64) echo "64-bit native Windows environment" ;;
MINGW32) echo "32-bit native Windows environment" ;;
MSYS) echo "POSIX-compliant environment" ;;
esac
fi
```
### Complete Build Script Example
```bash
#!/bin/bash
# build-and-deploy.sh - Cross-platform SSDT build script
set -e # Exit on error
# Detect shell and set path conversion
if [ -n "$MSYSTEM" ]; then
echo "Git Bash/MSYS2 detected - disabling path conversion"
export MSYS_NO_PATHCONV=1
fi
# Variables
PROJECT_NAME="MyDatabase"
BUILD_CONFIG="Release"
DACPAC_PATH="bin/${BUILD_CONFIG}/${PROJECT_NAME}.dacpac"
TARGET_SERVER="${SQL_SERVER:-localhost}"
TARGET_DB="${SQL_DATABASE:-MyDB}"
# Build
echo "Building ${PROJECT_NAME}..."
dotnet build "${PROJECT_NAME}.sqlproj" -c "$BUILD_CONFIG"
# Verify DACPAC exists
if [ ! -f "$DACPAC_PATH" ]; then
echo "ERROR: DACPAC not found at $DACPAC_PATH"
exit 1
fi
echo "DACPAC built successfully: $DACPAC_PATH"
# Deploy
echo "Deploying to ${TARGET_SERVER}/${TARGET_DB}..."
# Use double-slash method for maximum compatibility
sqlpackage //Action:Publish \
//SourceFile:"$DACPAC_PATH" \
//TargetServerName:"$TARGET_SERVER" \
//TargetDatabaseName:"$TARGET_DB" \
//p:BlockOnPossibleDataLoss=False
echo "Deployment complete!"
```
## Common SSDT Path Issues in Git Bash
### Issue 1: DACPAC File Paths
**Problem:**
```bash
# Git Bash mangles the path
sqlpackage /Action:Publish /SourceFile:./bin/Release/MyDB.dacpac
# Error: Cannot find file
```
**Solution:**
```bash
# Use MSYS_NO_PATHCONV
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"./bin/Release/MyDB.dacpac"
# OR use absolute Windows path
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"D:/Projects/MyDB/bin/Release/MyDB.dacpac"
# OR use double slashes
sqlpackage //Action:Publish //SourceFile:./bin/Release/MyDB.dacpac
```
### Issue 2: SQL Project File Paths
**Problem:**
```bash
# Path with spaces causes issues
dotnet build "D:/Program Files/MyProject/Database.sqlproj"
# Works in Git Bash (dotnet handles paths correctly)
# But MSBuild paths may fail
msbuild "D:/Program Files/MyProject/Database.sqlproj"
# May fail if not quoted properly
```
**Solution:**
```bash
# Always quote paths with spaces
dotnet build "D:/Program Files/MyProject/Database.sqlproj"
# Use backslashes for MSBuild on Windows
msbuild "D:\Program Files\MyProject\Database.sqlproj"
# Or use 8.3 short names (no spaces)
msbuild "D:/PROGRA~1/MyProject/Database.sqlproj"
```
### Issue 3: Publish Profile Paths
**Problem:**
```bash
# Publish profile not found
sqlpackage /Action:Publish \
/SourceFile:MyDB.dacpac \
/Profile:./Profiles/Production.publish.xml
```
**Solution:**
```bash
# Use MSYS_NO_PATHCONV with quoted paths
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"MyDB.dacpac" \
/Profile:"./Profiles/Production.publish.xml"
# Or absolute Windows path
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish \
/SourceFile:"D:/Projects/MyDB.dacpac" \
/Profile:"D:/Projects/Profiles/Production.publish.xml"
```
### Issue 4: Connection Strings
**Problem:**
```bash
# File paths in connection strings
/SourceConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;AttachDbFilename=D:/Data/MyDB.mdf"
# Path gets mangled
```
**Solution:**
```bash
# Quote entire connection string
MSYS_NO_PATHCONV=1 sqlpackage /Action:Extract \
/SourceConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;AttachDbFilename=D:\Data\MyDB.mdf" \
/TargetFile:"output.dacpac"
# Or use double backslashes in connection string
/SourceConnectionString:"Server=localhost;Database=MyDB;Integrated Security=True;AttachDbFilename=D:\\Data\\MyDB.mdf"
```
## CI/CD Considerations
### GitHub Actions with Git Bash
```yaml
name: SSDT Build and Deploy
on: [push]
jobs:
build:
runs-on: windows-latest
defaults:
run:
shell: bash # Use Git Bash
steps:
- uses: actions/checkout@v4
- name: Setup .NET
uses: actions/setup-dotnet@v4
with:
dotnet-version: '8.0.x'
- name: Install SqlPackage
run: dotnet tool install -g Microsoft.SqlPackage
- name: Build Database Project
run: dotnet build Database.sqlproj -c Release
- name: Deploy with Path Conversion Disabled
env:
MSYS_NO_PATHCONV: 1
run: |
sqlpackage /Action:Publish \
/SourceFile:"bin/Release/MyDatabase.dacpac" \
/TargetServerName:"localhost" \
/TargetDatabaseName:"MyDB"
```
### PowerShell Alternative (Recommended for Windows)
```yaml
jobs:
build:
runs-on: windows-latest
defaults:
run:
shell: pwsh # Use PowerShell - no path issues
steps:
- name: Deploy Database
run: |
sqlpackage /Action:Publish `
/SourceFile:"bin/Release/MyDatabase.dacpac" `
/TargetServerName:"localhost" `
/TargetDatabaseName:"MyDB"
```
## Best Practices Summary
### For Interactive Development
1. **Use PowerShell or CMD for SSDT on Windows** - avoids path conversion issues entirely
2. **If using Git Bash**, set `MSYS_NO_PATHCONV=1` in your shell profile for SSDT work
3. **Always quote paths** containing spaces or special characters
4. **Use absolute paths** when possible to avoid ambiguity
### For Scripts
1. **Detect shell environment** and set `MSYS_NO_PATHCONV=1` conditionally
2. **Use double-slash // syntax** for SqlPackage arguments (most portable)
3. **Prefer PowerShell for Windows-specific workflows** (build scripts, CI/CD)
4. **Test scripts on all target platforms** (Windows PowerShell, Git Bash, Linux)
### For CI/CD
1. **Use PowerShell shell in GitHub Actions** for Windows runners (`shell: pwsh`)
2. **Self-hosted Windows agents** - use native Windows paths and shells
3. **Set MSYS_NO_PATHCONV=1 as environment variable** if Git Bash required
4. **Prefer dotnet CLI over MSBuild** for cross-platform compatibility
### For Teams
1. **Document shell requirements** in repository README
2. **Provide scripts for all shells** (bash, PowerShell, CMD)
3. **Standardize on PowerShell** for Windows SSDT workflows when possible
4. **Use containerized builds** to avoid shell-specific issues
## Quick Reference
### Environment Variables
```bash
# Disable path conversion (Git Bash/MSYS2/Cygwin)
export MSYS_NO_PATHCONV=1

# Re-enable path conversion for a single command
env -u MSYS_NO_PATHCONV <command>

# Or remove it from the current shell entirely
unset MSYS_NO_PATHCONV
```
### SqlPackage Command Templates
```bash
# Git Bash - Method 1 (MSYS_NO_PATHCONV)
MSYS_NO_PATHCONV=1 sqlpackage /Action:Publish /SourceFile:"MyDB.dacpac" /TargetServerName:"localhost" /TargetDatabaseName:"MyDB"
# Git Bash - Method 2 (Double Slash)
sqlpackage //Action:Publish //SourceFile:MyDB.dacpac //TargetServerName:localhost //TargetDatabaseName:MyDB
# PowerShell (Recommended for Windows)
sqlpackage /Action:Publish /SourceFile:"MyDB.dacpac" /TargetServerName:"localhost" /TargetDatabaseName:"MyDB"
# CMD
sqlpackage /Action:Publish /SourceFile:"MyDB.dacpac" /TargetServerName:"localhost" /TargetDatabaseName:"MyDB"
```
### Shell Detection One-Liners
```bash
# Check if Git Bash/MSYS
[ -n "$MSYSTEM" ] && echo "Git Bash/MSYS2 detected"
# Check uname
[[ "$(uname -s)" =~ ^MINGW ]] && echo "Git Bash detected"
# Set path conversion conditionally
[ -n "$MSYSTEM" ] && export MSYS_NO_PATHCONV=1
```
## Resources
- [Git Bash Path Conversion Guide](https://www.pascallandau.com/blog/setting-up-git-bash-mingw-msys2-on-windows/)
- [MSYS2 Path Conversion Documentation](https://www.msys2.org/docs/filesystem-paths/)
- [SqlPackage Documentation](https://learn.microsoft.com/sql/tools/sqlpackage/)
- [Microsoft.Build.Sql SDK](https://www.nuget.org/packages/Microsoft.Build.Sql)
- [Git for Windows](https://gitforwindows.org/)
## Troubleshooting
### "Invalid parameter" errors
**Symptom:** SqlPackage reports "Invalid parameter" or "Unknown action"
**Cause:** Git Bash converting `/Action` to a file path
**Fix:** Use `MSYS_NO_PATHCONV=1` or double-slash `//Action`
### "File not found" with valid paths
**Symptom:** DACPAC or project file not found despite correct path
**Cause:** Path conversion mangling the file path
**Fix:** Quote paths and use `MSYS_NO_PATHCONV=1`
### Build succeeds but publish fails
**Symptom:** `dotnet build` works but `sqlpackage` fails
**Cause:** dotnet CLI handles paths correctly, but SqlPackage uses `/` parameters
**Fix:** Use `MSYS_NO_PATHCONV=1` for SqlPackage commands only
### Spaces in paths cause errors
**Symptom:** Paths with "Program Files" or other spaces fail
**Cause:** Unquoted paths split into multiple arguments
**Fix:** Always quote paths: `/SourceFile:"D:/Program Files/MyDB.dacpac"`
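The splitting behavior behind this fix is plain POSIX word splitting and is easy to demonstrate:

```shell
#!/bin/sh
# An unquoted expansion of a path containing a space becomes two arguments.
path="D:/Program Files/MyDB.dacpac"

count_args() { echo $#; }

unquoted=$(count_args $path)    # splits on the space
quoted=$(count_args "$path")    # stays one argument

echo "unquoted=$unquoted quoted=$quoted"   # → unquoted=2 quoted=1
```

SqlPackage sees the same thing: without quotes, `/SourceFile:D:/Program` and `Files/MyDB.dacpac` arrive as two separate arguments, and the first one names a file that doesn't exist.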