Initial commit

This commit is contained in:
Zhongwei Li
2025-11-30 08:38:44 +08:00
commit 0351cc1fd3
17 changed files with 8151 additions and 0 deletions

# Power Automate Error Patterns Reference
Quick reference guide for common Power Automate error patterns and their solutions.
## Authentication Errors (401/403)
### Pattern Recognition
- Status codes: 401, 403
- Error messages containing: "unauthorized", "forbidden", "access denied", "authentication failed"
- Common in: SharePoint, OneDrive, HTTP with authentication
### Research Targets
```
PowerAutomateDocs/{Connector}/overview.md → Authentication section
PowerAutomateDocs/{Connector}/actions.md → Permission requirements
```
### Common Root Causes
1. **Expired or invalid credentials**
- Connection needs re-authentication
- Credentials rotated but connection not updated
2. **Insufficient permissions**
- Service account lacks required permissions
- SharePoint: Need "Edit" or "Full Control" on list
- OneDrive: Need appropriate file/folder permissions
3. **Conditional access policies**
- Azure AD policies blocking service accounts
- MFA requirements not met
- Location-based restrictions
### Fix Patterns
```json
{
"actions": {
"Scope_Error_Handling": {
"type": "Scope",
"actions": {
"Get_Items": {
// Original action
}
},
"runAfter": {}
},
"Catch_Auth_Error": {
"type": "Compose",
"inputs": "Authentication failed - verify connection permissions",
"runAfter": {
"Scope_Error_Handling": ["Failed", "TimedOut"]
}
}
}
}
```
## Throttling Errors (429)
### Pattern Recognition
- Status code: 429
- Error messages: "TooManyRequests", "throttled", "rate limit exceeded"
- Common in: SharePoint (600/min), OneDrive (100/min), HTTP APIs
### Research Targets
```
PowerAutomateDocs/{Connector}/overview.md → API Limits section
PowerAutomateDocs/BuiltIn/control.md → Delay actions
```
### Connector-Specific Limits
| Connector | Limit | Per |
|-----------|-------|-----|
| SharePoint | 600 calls | 60 seconds per connection |
| OneDrive | 100 calls | 60 seconds per connection |
| HTTP | 600 calls | 60 seconds (default) |
| Apply to each | 50 iterations | Concurrent (default) |
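The arithmetic behind these limits is worth making explicit. A rough back-of-envelope sketch (illustrative Python, not flow syntax; it treats each iteration as one API call followed by an optional delay):

```python
def calls_per_minute(delay_seconds, concurrency=1):
    """Approximate sustained API call rate for a loop where each
    iteration makes one call and then waits delay_seconds."""
    per_branch = 60 / max(delay_seconds, 1)  # treat "no delay" as ~1 call/s per branch
    return per_branch * concurrency

# Sequential with a 1-second delay: 60 calls/min, well under SharePoint's 600/60s
print(calls_per_minute(1))        # 60.0
# Default 50 concurrent iterations with no delay: burst rate far beyond the limit
print(calls_per_minute(0, 50))    # 3000.0
```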
### Fix Patterns
**1. Add Delays Between Calls**
```json
{
"actions": {
"Delay": {
"type": "Wait",
"inputs": {
"interval": {
"count": 1,
"unit": "Second"
}
},
"runAfter": {
"Previous_Action": ["Succeeded"]
}
}
}
}
```
**2. Implement Exponential Backoff**
```json
{
"actions": {
"Do_Until_Success": {
"type": "Until",
"expression": "@equals(variables('Success'), true)",
"limit": {
"count": 5,
"timeout": "PT1H"
},
"actions": {
"Try_Action": {
"type": "ApiConnection",
"inputs": { /* action config */ }
},
"Check_Status": {
"type": "If",
"expression": {
"and": [
{
"equals": [
"@outputs('Try_Action')['statusCode']",
429
]
}
]
},
"actions": {
"Wait_Exponential": {
"type": "Wait",
"inputs": {
"interval": {
"count": "@mul(2, variables('RetryCount'))",
"unit": "Second"
}
}
}
}
}
}
}
}
}
```
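Exponential backoff means the wait doubles after each failed attempt. The resulting delay schedule can be sketched as (illustrative Python; in the flow this corresponds to the `Wait` interval driven by a delay variable):

```python
def backoff_schedule(base_seconds=1, retries=5, cap_seconds=60):
    """Delays produced by doubling the wait after each failed attempt,
    capped so a long outage doesn't produce hour-long waits."""
    delays, delay = [], base_seconds
    for _ in range(retries):
        delays.append(min(delay, cap_seconds))
        delay *= 2
    return delays

print(backoff_schedule())  # [1, 2, 4, 8, 16]
```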
**3. Reduce Concurrent Iterations**
```json
{
"Apply_to_each": {
"type": "Foreach",
"foreach": "@body('Get_Items')",
"runtimeConfiguration": {
"concurrency": {
"repetitions": 1
}
},
"actions": { /* ... */ }
}
}
```
## Data Format Errors
### Pattern Recognition
- Error messages: "InvalidTemplate", "Unable to process template", "cannot be evaluated", "property doesn't exist"
- Common in: Parse JSON, Compose, expressions with dynamic content
### Research Targets
```
PowerAutomateDocs/BuiltIn/data-operation.md → Parse JSON section
PowerAutomateDocs/BuiltIn/data-operation.md → Compose section
```
### Common Root Causes
1. **Missing Parse JSON Schema**
- Dynamic content not available without schema
- Schema doesn't match actual data structure
2. **Incorrect Expression Syntax**
- Invalid Power Automate expression functions
- Wrong property paths in JSON
- Type mismatches (string vs number vs array)
3. **Null/Undefined Values**
- Expressions trying to access null properties
- Missing optional fields
### Fix Patterns
**1. Add Parse JSON with Proper Schema**
```json
{
"Parse_JSON": {
"type": "ParseJson",
"inputs": {
"content": "@body('HTTP')",
"schema": {
"type": "object",
"properties": {
"id": {
"type": "string"
},
"name": {
"type": "string"
},
"items": {
"type": "array",
"items": {
"type": "object",
"properties": {
"value": {
"type": "string"
}
}
}
}
}
}
}
}
}
```
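Parse JSON fails when the schema and the payload disagree. This stdlib-only sketch mimics a small subset of the check (only `type` and `properties`; the real action implements full JSON Schema validation):

```python
def matches_schema(value, schema):
    """Tiny subset of JSON Schema: 'type' plus nested 'properties'/'items'."""
    t = schema.get("type")
    if t == "object":
        if not isinstance(value, dict):
            return False
        # Properties are optional unless 'required' is used; check present ones
        return all(k not in value or matches_schema(value[k], s)
                   for k, s in schema.get("properties", {}).items())
    if t == "array":
        if not isinstance(value, list):
            return False
        return all(matches_schema(v, schema.get("items", {})) for v in value)
    if t == "string":
        return isinstance(value, str)
    return True  # unconstrained

schema = {"type": "object", "properties": {"id": {"type": "string"}}}
print(matches_schema({"id": "42"}, schema))   # True
print(matches_schema({"id": 42}, schema))     # False -- number where string expected
```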
**2. Add Null Checks in Expressions**
```json
{
"Compose_Safe": {
"type": "Compose",
"inputs": "@if(not(empty(body('Parse_JSON')?['property'])), body('Parse_JSON')['property'], 'default_value')"
}
}
```
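The guard above mirrors a "get with default" lookup: the `?['property']` operator returns null instead of failing, and the `if(not(empty(...)))` wrapper substitutes a fallback. An illustrative Python analog (hypothetical function name):

```python
def safe_get(payload, key, default="default_value"):
    """Analog of @if(not(empty(body(...)?['key'])), ..., 'default_value'):
    return the default when the payload or key is missing, null, or empty."""
    value = (payload or {}).get(key)
    return default if value in (None, "", [], {}) else value

print(safe_get({"property": "hello"}, "property"))  # hello
print(safe_get({}, "property"))                     # default_value
print(safe_get(None, "property"))                   # default_value
```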
**3. Use Proper Type Conversions**
```json
{
"Convert_To_String": {
"type": "Compose",
"inputs": "@string(variables('NumberValue'))"
},
"Convert_To_Int": {
"type": "Compose",
"inputs": "@int(variables('StringValue'))"
}
}
```
## Timeout Errors
### Pattern Recognition
- Error messages: "timeout", "timed out", "operation took too long"
- Common in: Large file operations, long-running HTTP calls, Do until loops
### Research Targets
```
PowerAutomateDocs/{Connector}/overview.md → Known Limitations
PowerAutomateDocs/BuiltIn/control.md → Do until limits
```
### Connector-Specific Limits
| Operation | Timeout |
|-----------|---------|
| OneDrive file triggers | 50MB max file size |
| SharePoint attachments | 90MB max size |
| HTTP actions | 2 minutes default |
| Do until loops | 60 iterations / PT1H (defaults; adjust explicitly) |
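These limits are documented in megabytes but appear in expressions as raw byte counts, so a conversion makes the magic numbers legible (the 50 MB OneDrive limit is the `52428800` used in the file-size check pattern below):

```python
def mb_to_bytes(mb):
    """Connector limits are documented in MB but compared in bytes."""
    return mb * 1024 * 1024

print(mb_to_bytes(50))  # 52428800 -- OneDrive file trigger limit
print(mb_to_bytes(90))  # 94371840 -- SharePoint attachment limit
```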
### Fix Patterns
**1. Add Timeout Configuration**
```json
{
"Do_Until": {
"type": "Until",
"expression": "@equals(variables('Complete'), true)",
"limit": {
"count": 60,
"timeout": "PT1H"
},
"actions": { /* ... */ }
}
}
```
**2. Implement Chunking for Large Operations**
```json
{
"Apply_to_each_Batch": {
"type": "Foreach",
"foreach": "@chunk(body('Get_Items'), 100)",
"actions": {
"Process_Batch": {
"type": "Scope",
"actions": { /* Process smaller batch */ }
}
}
}
}
```
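The `chunk()` expression splits an array into fixed-size batches; its behavior is roughly:

```python
def chunk(items, size):
    """Analog of the chunk() expression: split a list into batches of
    at most `size` items, with a shorter final batch if needed."""
    return [items[i:i + size] for i in range(0, len(items), size)]

print(chunk(list(range(5)), 2))  # [[0, 1], [2, 3], [4]]
```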
**3. Add File Size Check**
```json
{
"Check_File_Size": {
"type": "If",
"expression": {
"and": [
{
"lessOrEquals": [
"@triggerBody()?['Size']",
52428800
]
}
]
},
"actions": {
"Process_File": { /* ... */ }
},
"else": {
"actions": {
"Handle_Large_File": { /* Alternative approach */ }
}
}
}
}
```
## Not Found Errors (404)
### Pattern Recognition
- Status code: 404
- Error messages: "not found", "does not exist", "cannot find"
- Common in: SharePoint Get Item, OneDrive Get File Content, HTTP calls
### Research Targets
```
PowerAutomateDocs/{Connector}/actions.md → Specific action requirements
PowerAutomateDocs/{Connector}/overview.md → Naming conventions
```
### Common Root Causes
1. **Incorrect Resource Paths/IDs**
- Hardcoded IDs that don't exist
- Wrong site URLs
- Invalid file paths
2. **Permissions**
- User lacks read access to resource
- Resource moved or deleted
3. **SharePoint-Specific Issues**
- List names with periods (.) cause errors
- Special characters in file names
- Spaces in URLs not encoded
### Fix Patterns
**1. Add Existence Check**
```json
{
"Try_Get_Item": {
"type": "Scope",
"actions": {
"Get_Item": {
"type": "ApiConnection",
"inputs": { /* ... */ }
}
}
},
"Check_If_Failed": {
"type": "If",
"expression": {
"and": [
{
"equals": [
"@result('Try_Get_Item')[0]['status']",
"Failed"
]
}
]
},
"runAfter": {
"Try_Get_Item": ["Failed", "Succeeded"]
},
"actions": {
"Handle_Not_Found": {
"type": "Compose",
"inputs": "Item not found - creating new one"
}
}
}
}
```
**2. Use Dynamic IDs from Previous Actions**
```json
{
"Get_Items": {
"type": "ApiConnection",
"inputs": { /* Get items first */ }
},
"Apply_to_each": {
"type": "Foreach",
"foreach": "@outputs('Get_Items')?['body/value']",
"actions": {
"Get_Item_Detail": {
"type": "ApiConnection",
"inputs": {
"host": {
"connectionName": "shared_sharepointonline"
},
"method": "get",
"path": "/datasets/@{encodeURIComponent(variables('SiteURL'))}/tables/@{encodeURIComponent(variables('ListID'))}/items/@{items('Apply_to_each')?['ID']}"
}
}
}
}
}
```
## Permission Errors (403)
### Pattern Recognition
- Status code: 403
- Error messages: "forbidden", "access denied", "insufficient permissions"
- Different from 401 (authentication vs authorization)
### Research Targets
```
PowerAutomateDocs/{Connector}/actions.md → Required permissions
PowerAutomateDocs/{Connector}/overview.md → Permission scopes
```
### Common Root Causes
1. **SharePoint Permissions**
- Need "Edit" for Create/Update/Delete
- Need "Read" for Get operations
- Site-level vs list-level permissions
2. **OneDrive Permissions**
- Need write access for file operations
- Shared folders require special handling
3. **Delegated vs Application Permissions**
- Service accounts need proper permission grants
- Azure AD application permissions
### Fix Patterns
**1. Verify and Document Required Permissions**
```json
{
"actions": {
"Comment_Permissions": {
"type": "Compose",
"inputs": "This flow requires: SharePoint site collection admin or list owner permissions"
},
"Try_Action_With_Permission": {
"type": "Scope",
"actions": { /* Action requiring permissions */ }
},
"Handle_Permission_Error": {
"type": "If",
"expression": {
"and": [
{
"equals": [
"@result('Try_Action_With_Permission')[0]['code']",
"Forbidden"
]
}
]
},
"runAfter": {
"Try_Action_With_Permission": ["Failed", "Succeeded"]
},
"actions": {
"Send_Permission_Request": {
"type": "ApiConnection",
"inputs": { /* Send email to admin */ }
}
}
}
}
}
```
## Invalid JSON/Syntax Errors
### Pattern Recognition
- Error messages: "Invalid JSON", "syntax error", "unexpected token"
- Common in: HTTP response parsing, JSON composition, dynamic expression building
### Research Targets
```
PowerAutomateDocs/BuiltIn/data-operation.md → JSON handling
PowerAutomateDocs/power-automate-json-format.md → Valid structure
```
### Fix Patterns
**1. Escape Special Characters**
```json
{
"Compose_JSON_String": {
"type": "Compose",
"inputs": "@{replace(variables('TextWithQuotes'), '\"', '\\\"')}"
}
}
```
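The doubled backslashes in the snippet reflect two escaping layers: one for the JSON file that holds the expression, and one for the JSON string being built at runtime. A Python illustration of the runtime effect:

```python
import json

text = 'He said "hi"'
# What the replace() expression does at runtime: prefix quotes with a backslash
escaped = text.replace('"', '\\"')
print(escaped)  # He said \"hi\"
# The escaped text can now be embedded in a JSON string and round-trips cleanly
print(json.loads('{"msg": "%s"}' % escaped))  # {'msg': 'He said "hi"'}
```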
**2. Validate JSON Before Parsing**
```json
{
"Try_Parse": {
"type": "Scope",
"actions": {
"Parse_JSON": {
"type": "ParseJson",
"inputs": {
"content": "@body('HTTP')"
}
}
}
},
"Handle_Invalid_JSON": {
"type": "If",
"expression": {
"and": [
{
"equals": [
"@result('Try_Parse')[0]['status']",
"Failed"
]
}
]
},
"runAfter": {
"Try_Parse": ["Failed", "Succeeded"]
},
"actions": {
"Log_Invalid_Response": {
"type": "Compose",
"inputs": "Invalid JSON received from API"
}
}
}
}
```
## Cross-Reference Matrix
| Error Code | Error Type | Primary Research Location | Common Connectors |
|------------|------------|--------------------------|-------------------|
| 401 | Authentication | {Connector}/overview.md | SharePoint, OneDrive, HTTP |
| 403 | Permission | {Connector}/actions.md | SharePoint, OneDrive |
| 404 | Not Found | {Connector}/actions.md | SharePoint, OneDrive, HTTP |
| 429 | Throttling | {Connector}/overview.md | SharePoint, OneDrive, HTTP |
| 500 | Server Error | {Connector}/overview.md | HTTP, SharePoint |
| N/A | Data Format | BuiltIn/data-operation.md | Parse JSON, Compose |
| N/A | Timeout | BuiltIn/control.md | Do Until, HTTP |
| N/A | Expression Error | BuiltIn/data-operation.md | All actions |
## Usage in Debugger Skill
This reference should be consulted during Phase 1 (Error Analysis) to:
1. Quickly classify the error type
2. Identify relevant research targets
3. Understand common root causes
4. Reference appropriate fix patterns
The patterns here are templates - always customize based on:
- Specific connector documentation from PowerAutomateDocs/
- Actual error details from erreur.json
- User's flow requirements and constraints

# Complete Debugger Skill Example
This document demonstrates the complete workflow of the power-automate-debugger skill from input to output.
## Example Scenario: SharePoint Throttling Error
### User Input
User provides `erreur_bloc.json`:
```json
{
"error": {
"code": "429",
"message": "Status code: 429, TooManyRequests - The request was throttled. Please retry after some time."
},
"failedAction": "Get_items",
"flowDefinition": {
"definition": {
"triggers": {
"manual": {
"type": "Request",
"kind": "Button",
"inputs": {}
}
},
"actions": {
"Apply_to_each": {
"type": "Foreach",
"foreach": "@range(0, 100)",
"actions": {
"Get_items": {
"type": "ApiConnection",
"inputs": {
"host": {
"connectionName": "shared_sharepointonline",
"operationId": "GetItems",
"apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline"
},
"parameters": {
"dataset": "https://contoso.sharepoint.com/sites/ProjectSite",
"table": "d8a0e3c2-4f61-4b89-9012-f3a8c5d6e7b9"
}
},
"runAfter": {}
}
},
"runAfter": {}
}
}
}
}
}
```
User message: "Debug this erreur_bloc.json - my SharePoint flow keeps failing"
---
## Skill Execution Workflow
### Phase 1: Error Analysis
**Skill identifies**:
- Error Code: 429
- Error Type: Throttling
- Failing Action: "Get_items" (SharePoint Get Items)
- Connector: SharePoint
- Context: Inside Apply to each loop with 100 iterations
- No delay between calls
**Classification**: Throttling error due to exceeding SharePoint API limits
---
### Phase 2: Documentation Research
**Skill launches Task tool with Explore agent**:
```
Task tool call:
subagent_type: "Explore"
thoroughness: "very thorough"
prompt: "Research Power Automate documentation for throttling error in SharePoint connector, specifically the GetItems action.
Search in PowerAutomateDocs/ for:
1. Connector overview (PowerAutomateDocs/SharePoint/overview.md) - find API limits
2. Action documentation (PowerAutomateDocs/SharePoint/actions.md) - find GetItems specifics
3. Control flow docs (PowerAutomateDocs/BuiltIn/control.md) - find Apply to each concurrency
Focus on finding:
- SharePoint API call limits
- Recommended delays between calls
- Concurrency settings for Apply to each
- Throttling mitigation strategies
Return specific file paths, section names, and exact limitations found."
```
**Research Agent Returns**:
```
Found in PowerAutomateDocs/SharePoint/overview.md:
Section: "API and Request Limits"
- SharePoint Online connector: 600 API calls per 60 seconds per connection
- Throttling occurs when limit exceeded
- Recommended: Implement delays and reduce concurrency
Found in PowerAutomateDocs/BuiltIn/control.md:
Section: "Apply to each"
- Default concurrency: 50 parallel iterations
- Can be reduced: "concurrency": { "repetitions": 1 }
- Recommendation: Set to 1 for API-heavy operations
Section: "Delay"
- Use "Wait" action type
- Intervals: seconds, minutes, hours
- Example: 1-2 second delay between API calls
```
---
### Phase 3: Solution Design
**Based on research, skill designs solution**:
**Root Cause**:
- Loop makes 100 rapid SharePoint API calls
- Default concurrency of 50 parallel iterations
- No delays between calls
- Exceeds 600 calls/60 seconds limit
**Solution Strategy**:
1. Reduce concurrency to 1 (sequential processing)
2. Add 1-second delay between iterations
3. Wrap in error handling scope
4. Add retry logic for remaining throttled requests
**Validation**:
- With 1-second delays: Max 60 calls per minute
- Well under 600 calls/60 seconds limit
- Sequential processing prevents burst
- Error handling catches any remaining throttling
---
### Phase 4: Fix Generation
**Skill launches Task tool with flow-builder agent**:
```
Task tool call:
subagent_type: "general-purpose"
prompt: "Generate complete fix_bloc.json for Power Automate flow with the following requirements:
Original Error: SharePoint throttling (429) in Get Items action within Apply to each loop
Root Cause: 100 rapid API calls with 50 concurrent iterations exceeding 600 calls/60 seconds limit
Required Fixes:
1. Reduce Apply to each concurrency to 1 (sequential)
2. Add 1-second delay after each Get Items call
3. Wrap in Scope for error handling
4. Add catch action for any remaining throttling errors
Create a complete, valid Power Automate flow JSON following PowerAutomateDocs/power-automate-json-format.md that:
- Maintains the manual trigger
- Processes 100 iterations sequentially with 1-second delays
- Includes proper GUIDs for all actions
- Has complete error handling structure
- Is ready for copy-paste into Power Automate
Return ONLY the complete JSON with no placeholders."
```
**Flow-Builder Agent Returns**: Complete valid JSON (shown in output below)
---
### Phase 5: Structured Output
**Skill generates final output using template**:
```xml
<debug_report>
<error_analysis>
**Error Type**: Throttling
**Failing Action/Trigger**: Get_items (SharePoint Get Items action)
**Error Message**:
```
Status code: 429, TooManyRequests - The request was throttled. Please retry after some time.
```
**Impact**: Flow fails when processing multiple SharePoint list items because the API rate limit is exceeded. The Apply to each loop attempts 100 API calls in rapid succession, exceeding SharePoint's limit of 600 calls per 60 seconds.
</error_analysis>
<root_cause>
**Primary Cause**: SharePoint API rate limit exceeded due to high-concurrency loop without throttling mitigation
**Documentation Reference**:
- File: `PowerAutomateDocs/SharePoint/overview.md`
- Section: "API and Request Limits"
- Key Limitation: "SharePoint Online connector enforces 600 API calls per 60 seconds per connection"
**Technical Details**:
- The Apply to each loop has default concurrency of 50 parallel iterations
- 100 Get Items calls attempted simultaneously without delays
- Each Get Items call counts toward the 600/60s limit
- Burst of parallel calls triggers throttling response (429)
- No retry logic to handle transient throttling
**Contributing Factors**:
- File: `PowerAutomateDocs/BuiltIn/control.md`
- Section: "Apply to each - Concurrency Control"
- Default: 50 concurrent iterations (too high for API operations)
- Recommendation: Set to 1 for API-heavy operations
</root_cause>
<solution>
## Recommended Fix
**Approach**: Reduce concurrency to sequential processing with delays between API calls
### Step-by-Step Implementation
1. **Reduce Apply to each Concurrency**
- Modify: `runtimeConfiguration.concurrency.repetitions`
- From: `50` (default)
- To: `1` (sequential)
- Reason: Prevents burst of parallel API calls that trigger throttling
2. **Add Delay After Each API Call**
- Add: New "Wait" action after Get_items
- Duration: 1 second
- Purpose: Ensures maximum 60 calls per minute (well under 600/60s limit)
- Documentation: `PowerAutomateDocs/BuiltIn/control.md` - Delay section
3. **Wrap in Error Handling Scope**
- Add: Scope action wrapping the loop
- Purpose: Catches any remaining throttling errors
- Benefit: Graceful failure handling with logging
4. **Add Throttling Error Handler**
- Add: Action configured to run after Scope fails
- Purpose: Logs throttling events for monitoring
- Alternative: Could implement retry logic for complete automation
### Additional Improvements
- **Error Handling**: Scope with "Configure run after" to catch failures
- **Performance**: With 1-second delays, processes 60 items/minute (stable rate)
- **Reliability**: Sequential processing eliminates race conditions
- **Monitoring**: Error handler provides visibility into throttling events
### Validation Checklist
- [x] Concurrency reduced to 1 (sequential processing)
- [x] 1-second delay added between iterations
- [x] API call rate: ~60/minute (under 600/60s limit)
- [x] Error handling scope implemented
- [x] Throttling catch action added
- [x] All GUIDs valid and unique
- [x] runAfter dependencies correct
- [x] JSON syntax valid
</solution>
<fixed_json>
## Complete Fixed JSON
**Instructions**: Copy the entire JSON block below and paste directly into Power Automate using "Paste code" feature.
```json
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"manual": {
"type": "Request",
"kind": "Button",
"inputs": {
"schema": {
"type": "object",
"properties": {}
}
}
}
},
"actions": {
"Scope_Process_Items": {
"type": "Scope",
"actions": {
"Apply_to_each": {
"type": "Foreach",
"foreach": "@range(0, 100)",
"runtimeConfiguration": {
"concurrency": {
"repetitions": 1
}
},
"actions": {
"Get_items": {
"type": "ApiConnection",
"inputs": {
"host": {
"connectionName": "shared_sharepointonline",
"operationId": "GetItems",
"apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline"
},
"parameters": {
"dataset": "https://contoso.sharepoint.com/sites/ProjectSite",
"table": "d8a0e3c2-4f61-4b89-9012-f3a8c5d6e7b9"
}
},
"runAfter": {},
"metadata": {
"operationMetadataId": "a1b2c3d4-e5f6-4a8b-9c0d-1e2f3a4b5c6d"
}
},
"Delay_Between_Calls": {
"type": "Wait",
"inputs": {
"interval": {
"count": 1,
"unit": "Second"
}
},
"runAfter": {
"Get_items": [
"Succeeded"
]
},
"metadata": {
"operationMetadataId": "b2c3d4e5-f6a7-4b8c-9d0e-1f2a3b4c5d6e"
}
}
},
"runAfter": {},
"metadata": {
"operationMetadataId": "c3d4e5f6-a7b8-4c9d-0e1f-2a3b4c5d6e7f"
}
}
},
"runAfter": {},
"metadata": {
"operationMetadataId": "d4e5f6a7-b8c9-4d0e-1f2a-3b4c5d6e7f8a"
}
},
"Handle_Throttling_Error": {
"type": "Compose",
"inputs": {
"status": "Throttling error encountered",
"message": "Some items may not have been processed due to API limits",
"recommendation": "Review flow run history for details",
"timestamp": "@utcNow()"
},
"runAfter": {
"Scope_Process_Items": [
"Failed",
"TimedOut"
]
},
"metadata": {
"operationMetadataId": "e5f6a7b8-c9d0-4e1f-2a3b-4c5d6e7f8a9b"
}
}
},
"outputs": {}
},
"schemaVersion": "1.0.0.0"
}
```
**Changes Applied**:
1. **Concurrency Reduced**: Added `runtimeConfiguration.concurrency.repetitions: 1` to Apply_to_each
2. **Delay Added**: New "Delay_Between_Calls" action with 1-second wait after each Get_items
3. **Error Scope**: Wrapped loop in "Scope_Process_Items" for structured error handling
4. **Throttling Handler**: Added "Handle_Throttling_Error" Compose action with helpful error message
5. **Metadata GUIDs**: All actions have unique operationMetadataId values
6. **runAfter Dependencies**: Proper sequential execution configured
**Configuration Notes**:
- After pasting, verify SharePoint connection is configured
- Confirm site URL and list ID match your environment
- Test with smaller range (e.g., `range(0, 10)`) before full run
- Monitor first execution to confirm throttling resolved
- The Wait interval count must be a whole number, so 1 second is the practical minimum delay; at ~60 calls/minute this already leaves ample headroom under the limit
</fixed_json>
<alternative_approaches>
## Alternative Solutions
### Alternative 1: Batch Processing
**When to use**: Need faster processing while respecting limits
**Pros**:
- Processes items faster (groups of 10 with delays)
- Still respects API limits
**Cons**:
- More complex to implement
- Requires additional scope for batch logic
**Implementation**:
1. Use nested Apply to each
2. Outer loop chunks items into batches of 10
3. Inner loop processes batch with concurrency 10
4. Add 10-second delay between batches
### Alternative 2: Scheduled Batch Flow
**When to use**: Not time-sensitive, large volume of items
**Pros**:
- Eliminates throttling entirely
- Processes all items reliably
**Cons**:
- Not real-time
- Requires separate tracking of processed items
**Implementation**:
1. Change to scheduled trigger (every 15 minutes)
2. Get only unprocessed items (filtered query)
3. Process up to 600 items per run
4. Mark items as processed
### Alternative 3: Multiple Connections
**When to use**: Extremely high volume requirements
**Pros**:
- Multiplies rate limit (600 per connection)
- Faster processing
**Cons**:
- Requires multiple SharePoint connections
- More complex configuration
- Higher licensing considerations
**Implementation**:
1. Create multiple SharePoint connections
2. Distribute items across connections (modulo operation)
3. Process each subset with separate Apply to each
4. Merge results
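Step 2's modulo distribution can be sketched as follows (illustrative only; in the flow the bucket index selects which SharePoint connection an item uses):

```python
def distribute(items, n_connections):
    """Round-robin items across connections: item i goes to bucket i % n."""
    buckets = [[] for _ in range(n_connections)]
    for i, item in enumerate(items):
        buckets[i % n_connections].append(item)
    return buckets

print(distribute(list(range(7)), 3))  # [[0, 3, 6], [1, 4], [2, 5]]
```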
</alternative_approaches>
<prevention>
## Preventing Similar Errors
**Best Practices**:
1. **Always set concurrency** for API-heavy Apply to each loops
- Default 50 concurrent is too high for most API operations
- Use 1 for critical operations, 5-10 for less strict APIs
2. **Add delays proactively** when making repeated API calls
- 1-second delay for SharePoint (600/min limit)
- 3-second delay for OneDrive (100/min limit)
- Check connector documentation for specific limits
3. **Implement error handling** for all API interactions
- Use Scope actions with "Configure run after"
- Log throttling events for monitoring
- Consider retry logic for transient failures
**Monitoring**:
- Watch for: 429 status codes in flow run history
- Metrics to track:
- Number of API calls per flow run
- Average flow duration (sudden increases indicate throttling)
- Failure rate over time
- Regular checks:
- Review flow analytics weekly
- Adjust delays if approaching limits
- Optimize queries to reduce API calls
**Design Guidelines**:
- **Filter at source**: Use $filter in Get Items to reduce results
- **Properties-only**: Use "Get Items (properties only)" when content not needed
- **Batch operations**: Group operations when connector supports it
- **Cache data**: Store frequently accessed data in variables
- **Parallel flows**: Distribute load across multiple flows if appropriate
</prevention>
</debug_report>
```
---
## Key Success Factors
### Documentation Research
- Research agent found exact API limit (600/60s)
- Identified specific sections in PowerAutomateDocs/
- Referenced control flow documentation for concurrency
### Solution Quality
- Addresses root cause (high concurrency)
- Implements proven pattern (sequential + delays)
- Adds comprehensive error handling
- Provides monitoring capability
### JSON Completeness
- Flow-builder agent generated valid JSON
- All GUIDs unique and properly formatted
- runAfter dependencies correct
- Ready for immediate paste into Power Automate
- No placeholders or TODOs
### User Experience
- Clear explanation of problem
- Step-by-step fix instructions
- Multiple alternative approaches
- Prevention guidance for future
- Complete working solution
---
## Verification
After user pastes the JSON:
1. **Immediate checks**:
- [ ] JSON accepted by Power Automate (no syntax errors)
- [ ] All connections show properly
- [ ] Actions display correctly in designer
2. **Configuration**:
- [ ] SharePoint connection authenticated
- [ ] Site URL and List ID correct for environment
- [ ] Test with small range first (range(0, 5))
3. **Test run**:
- [ ] Flow completes without 429 errors
- [ ] Items processed sequentially (check run history)
- [ ] Delays visible in action history (~1 second between calls)
- [ ] Total run time reasonable (~100 seconds for 100 items)
4. **Production readiness**:
- [ ] Scale up to full range if test successful
- [ ] Monitor first few runs
- [ ] Confirm no throttling in analytics
- [ ] Document configuration for team
---
This example demonstrates the complete capability of the power-automate-debugger skill to transform an error report into a production-ready solution.

---
name: automation-debugger
description: Expert automation platform error debugger for Power Automate, n8n, Make, Zapier and other workflow platforms. Analyzes JSON flow definitions with error messages, researches official documentation, and generates complete fixed JSON ready for copy-paste. Triggers when user provides error JSON files, workflow JSON with errors, error messages, debug requests, or failing automation content. Returns structured debug report with root cause analysis and working fixed JSON.
---
# Automation Debugger
Expert system for debugging automation workflow errors across multiple platforms and generating production-ready fixes.
## Supported Platforms
- **Power Automate** (Microsoft)
- **n8n** (Open-source automation)
- **Make** (formerly Integromat)
- **Zapier**
- **Other JSON-based workflow platforms**
## Purpose
This skill provides comprehensive debugging for automation workflows by:
1. Analyzing error messages and failing JSON flow definitions
2. Researching official documentation to find root causes (Docs/ directory)
3. Generating complete, valid fixed JSON ready for copy-paste (outputs as fixed_flow.json or similar)
4. Avoiding hallucinations by strictly referencing platform documentation
5. Adapting solutions to platform-specific requirements
## When This Skill Activates
Automatically activates when user provides:
- **Error JSON files** - Any JSON file containing error information (examples: error.json, erreur.json, error_bloc.json, workflow_error.json)
- **Workflow JSON with errors** - Flow/workflow JSON content with error description (any platform)
- **Error messages** - Error messages from workflow runs, logs, or execution history
- **Debug requests** - Explicit requests to "debug", "fix", "analyze", or "troubleshoot" automation errors
- **Status codes** - HTTP status codes (401, 403, 404, 429, 500, etc.) with workflow context
- **Platform-specific errors** - "Power Automate error", "n8n workflow failing", "Make scenario issue", "Zapier zap broken"
- **Failure descriptions** - "My flow keeps failing", "this automation stopped working", "getting errors in workflow"
## Common Pitfalls to AVOID
### ❌ CRITICAL ERROR #1: Assuming Data Structures (Most Common!)
**The Mistake**:
```json
// You see a filter in the error flow
"where": "@contains(toLower(item()), 'keyword')"
// And you ASSUME it's filtering objects, so you fix with:
"value": "@items('Loop')?['PropertyName']" // ← WRONG IF ARRAY OF STRINGS!
```
**How to Avoid**:
1. **ALWAYS analyze filter syntax first**: `item()` vs `item()?['Prop']`
2. `item()` without property → Array of **primitives** (strings/numbers)
3. `item()?['Property']` → Array of **objects**
4. **Ask user when uncertain** - never guess data structures!
5. Trace data sources back to understand actual types
**Our Real Bug Example**:
```json
// Filter showed it was strings:
"where": "@contains(toLower(item()), 'cnesst')"
// But I incorrectly suggested accessing property:
"value": "@items('LoopCNESST')?['Nom']" // ← BUG!
// Correct fix was:
"value": "@items('LoopCNESST')" // ← Just the string!
```
---
### ❌ ERROR #2: Guessing Connector Outputs
**The Mistake**: Assuming SharePoint GetFileItems returns objects with 'Name' property without checking
**How to Avoid**:
1. Search Docs/{Platform}_Documentation/ for output schemas
2. Use WebSearch for official documentation if local docs don't have it
3. Ask user to confirm property names if uncertain
4. Check Select action mappings (they reveal structure)
---
### ❌ ERROR #3: Not Validating Fixes Against Data Flow
**The Mistake**: Creating fixes that work syntactically but fail with actual data types
**How to Avoid**:
1. Complete Phase 0 (Data Structure Analysis) before fixing
2. Trace entire data flow from source to error point
3. Validate item access patterns match data types
4. Test logic against actual data structures
---
## Core Workflow
### Phase 0: Data Structure Analysis (CRITICAL - NEW!)
**NEVER assume data structures without verification**. Before debugging, ALWAYS analyze actual data structures in the failing flow.
1. **Examine the Error Context**
- Identify which action/variable contains the problematic data
- Look for variable declarations to understand types
- Check if error mentions property access on undefined/null
2. **Analyze Filter/Query Operations** (MOST IMPORTANT!)
- **CRITICAL CHECK**: Look at `where` clauses in Query actions
- `item()` without property → **Array of primitives (strings/numbers)**
```json
"where": "@contains(toLower(item()), 'keyword')"
// ↑ This means: Array of strings ["string1", "string2"]
```
- `item()?['PropertyName']` → **Array of objects**
```json
"where": "@contains(item()?['Name'], 'keyword')"
// ↑ This means: Array of objects [{"Name": "..."}, ...]
```
3. **Trace Data Sources**
- Follow `from` in Query actions back to source
- Check SharePoint/connector outputs
- Check Select action mappings (these define output structure!)
- Verify Compose action inputs
- Look for Array variable initializations
4. **Validate Item Access Consistency**
- In loops: Check if `items('LoopName')` or `items('LoopName')?['Property']`
- In filters: Check if `item()` or `item()?['Property']`
- **MISMATCH = BUG**: Filter uses `item()` but loop uses `items('Loop')?['Prop']` → ERROR!
5. **Ask User When Uncertain** (Use AskUserQuestion tool)
- If data structure is ambiguous after analysis
- If documentation doesn't clarify the structure
- If multiple valid interpretations exist
- If error message doesn't make structure clear
**Example questions**:
- "I see your filter uses `contains(item(), 'text')`. Is your variable an array of strings like `['text1', 'text2']`, or an array of objects?"
- "Can you confirm the output structure from [ActionName]? Is it strings or objects with properties?"
- "The flow accesses `items('Loop')?['Nom']` but the filter suggests strings. Which is correct?"
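The `item()` vs `item()?['Property']` distinction above can be reproduced outside Power Automate. A minimal Python analogue (hypothetical data, not real connector output) shows why the mismatch only surfaces at runtime:

```python
# Array of primitives: filter with bare item() — no property access needed.
liste_des_dossiers = ["CNESST 2025-01-15", "INVALIDITÉ 2025-02-20"]
filtered = [item for item in liste_des_dossiers if "cnesst" in item.lower()]
# filtered == ["CNESST 2025-01-15"]

# Array of objects: filter with item()?['Nom'] — property access is required.
liste_objets = [{"Nom": "CNESST 2025-01-15"}, {"Nom": "INVALIDITÉ 2025-02-20"}]
filtered_obj = [item for item in liste_objets if "cnesst" in item["Nom"].lower()]
# filtered_obj == [{"Nom": "CNESST 2025-01-15"}]

# The MISMATCH = BUG case: property access on a primitive fails at runtime,
# just as items('Loop')?['Nom'] fails when the loop element is a string.
try:
    filtered[0]["Nom"]
except TypeError as exc:
    print(f"mismatch detected: {exc}")
```

The filter succeeds either way; only the later property access crashes, which is why the error points at the loop action rather than the filter that created the wrong assumption.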
### Phase 1: Error Analysis
1. **Extract Error Information**
- Parse the provided JSON (error file, flow snippet, or workflow JSON)
- Identify the failing action or trigger
- Classify error type (Authentication/Throttling/Data Format/Timeout/Not Found/Permission)
- Extract exact error message text
2. **Identify Context**
- Determine which platform (Power Automate, n8n, Make, Zapier, etc.)
- Identify connector/node involved (SharePoint, HTTP, Database, etc.)
- Find the specific action or trigger causing the failure
- Note any relevant workflow configuration or parameters
### Phase 2: Documentation Research (Multi-Source Strategy - NEW!)
**CRITICAL**: Research thoroughly using multiple sources in order. Never skip or guess!
#### Step 1: Local Documentation Research (Use Task Tool with Explore Agent)
Launch research sub-agent with thorough investigation:
```
Use Task tool with subagent_type="Explore" and thoroughness="very thorough"
Prompt: "Research [PLATFORM] documentation for [ERROR_TYPE] error in [CONNECTOR/NODE_NAME], specifically the [ACTION_NAME] action/node.
Platform: [Power Automate / n8n / Make / Zapier / Other]
Search in Docs/ directory for platform-specific documentation:
1. Platform documentation (Docs/{Platform}_Documentation/)
2. Connector/node overview - find limitations and constraints
3. **Data structure specifications** (output schemas, property names, data types)
4. Action/node documentation - find parameter requirements
5. Common error patterns for this platform
Focus on finding:
- **DATA STRUCTURE DETAILS** (output schemas, property names, array vs object)
- Known limitations causing this error
- Required parameters and their formats (platform-specific)
- API limits or throttling constraints
- Authentication/permission requirements
- Platform-specific quirks or known issues
- Common error scenarios and solutions
Return specific file paths, section names, and exact limitations found."
```
**Expected Output from Research Agent**:
- **Data structure specifications** (schemas, property names, types)
- Specific documentation file paths (Docs/{Platform}_Documentation/)
- Relevant limitations or constraints
- Required parameter formats (platform-specific)
- Known workarounds or solutions
- Platform-specific considerations
#### Step 2: Web Research (Use WebSearch Tool - If Local Docs Insufficient)
**ONLY if local documentation doesn't provide needed information**, use WebSearch:
```
Use WebSearch tool with targeted queries:
Examples:
- "[PLATFORM] [CONNECTOR] output schema"
- "[PLATFORM] [ACTION_NAME] return properties"
- "[ERROR_CODE] [PLATFORM] [CONNECTOR] solution"
- "Microsoft Learn [PLATFORM] [CONNECTOR] documentation"
Search for:
- Official platform documentation (Microsoft Learn, n8n docs, etc.)
- Platform-specific community forums (verified answers only)
- Official API documentation
- Recent Stack Overflow solutions (check dates!)
```
**Validation Requirements**:
- Prefer official documentation over community posts
- Cross-reference multiple sources
- Verify information is current (check publication dates)
- Cite sources in output
#### Step 3: Ask User for Clarification (Use AskUserQuestion Tool)
**If both local docs and web search are unclear or contradictory**, ask the user:
```
Use AskUserQuestion tool with specific targeted questions:
Example:
{
"questions": [{
"question": "Your flow uses `contains(item(), 'CNESST')` in the filter. Can you confirm what type of data is in this variable?",
"header": "Data Type",
"options": [
{
"label": "Array of strings",
"description": "Like ['CNESST 2025-01', 'INVALIDITÉ 2025-02']"
},
{
"label": "Array of objects",
"description": "Like [{'Nom': 'CNESST 2025-01'}, {'Nom': 'INVALIDITÉ 2025-02'}]"
}
],
"multiSelect": false
}]
}
```
**When to Ask**:
- Data structure is ambiguous after documentation search
- Multiple valid interpretations exist for the error
- Documentation contradicts observed flow patterns
- Risk of creating a fix that introduces new bugs
### Phase 3: Solution Design
Based on research findings:
1. **Root Cause Identification**
- Link error to specific documentation constraint
- Explain WHY the error occurs (technical reasoning)
- Reference exact file and section from Docs/{Platform}_Documentation/
- Consider platform-specific behaviors
2. **Solution Strategy**
- Design fix addressing root cause
- Consider connector limitations
- Include error handling patterns
- Add retry logic if needed for transient errors
- Optimize for API limits
3. **Validation**
- Verify solution against documentation constraints
- Check for unintended side effects
- Ensure compliance with Power Automate best practices
- Validate all parameters and data types
### Phase 4: Fix Generation (Use Task Tool with Flow-Builder Agent)
**CRITICAL**: Use the Task tool to launch a flow-builder agent for JSON generation.
Launch flow-builder sub-agent:
```
Use Task tool with subagent_type="Plan" or "general-purpose"
Prompt: "Generate complete fixed workflow JSON for [PLATFORM] with the following requirements:
Platform: [Power Automate / n8n / Make / Zapier / Other]
Original Error: [ERROR_DETAILS]
Root Cause: [ROOT_CAUSE_FROM_RESEARCH]
Required Fixes: [SPECIFIC_CHANGES_NEEDED]
Create a complete, valid workflow JSON following the platform-specific format in Docs/{Platform}_Documentation/ that:
1. Includes all fixes for the identified error
2. Maintains existing workflow logic that works
3. Adds proper error handling (platform-specific)
4. Includes retry logic if dealing with transient errors
5. Respects API limits with delays if needed
6. Uses valid IDs/GUIDs as required by platform
7. Has correct dependencies/execution order
8. Follows platform-specific syntax and structure
9. Is ready for copy-paste into platform's import/code feature
Return ONLY the complete JSON with no placeholders or TODOs."
```
**Expected Output from Flow-Builder Agent**:
- Complete, syntactically valid JSON (platform-specific format)
- All required structure elements for the platform
- Proper IDs/GUIDs for all operations
- Correct expression/formula syntax
- Complete execution chains/dependencies
### Phase 5: Structured Output
Generate final output using the template from `output-style/template-debug-output.md`:
1. **Error Analysis Section**
- Error type classification
- Failing action/trigger identification
- Original error message
- Impact description
2. **Root Cause Section**
- Primary cause explanation
- Documentation references with specific file paths
- Technical details of the issue
3. **Solution Section**
- Recommended fix approach
- Step-by-step implementation instructions
- Additional improvements (error handling, performance, reliability)
- Validation checklist
4. **Fixed JSON Section**
- Complete, working JSON ready for copy-paste
- Platform-specific format
- List of changes applied
- Configuration notes for after pasting/importing
5. **Alternative Approaches Section** (if applicable)
- Other viable solutions
- When to use each alternative
- Pros/cons comparison
6. **Prevention Section**
- Best practices to avoid similar errors
- Monitoring recommendations
- Regular maintenance tasks
## Output Format
**ALWAYS** follow the XML-structured format from `output-style/template-debug-output.md`:
```xml
<debug_report>
<error_analysis>
[Error classification and details]
</error_analysis>
<root_cause>
[Root cause with documentation references]
</root_cause>
<solution>
[Step-by-step fix instructions]
</solution>
<fixed_json>
[Complete valid JSON ready for copy-paste]
</fixed_json>
<alternative_approaches>
[Other viable solutions if applicable]
</alternative_approaches>
<prevention>
[Best practices and monitoring]
</prevention>
</debug_report>
```
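Completeness of the report can be checked mechanically before delivery. A hedged sketch (`alternative_approaches` is left out of the required list because the template marks it "if applicable"):

```python
import re

# Sections every debug report must contain (per the template above).
REQUIRED_TAGS = ["error_analysis", "root_cause", "solution", "fixed_json", "prevention"]

def missing_sections(report: str) -> list[str]:
    """Return the names of required template sections absent from the report."""
    return [tag for tag in REQUIRED_TAGS
            if not re.search(rf"<{tag}>.*?</{tag}>", report, re.DOTALL)]
```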
## Critical Requirements
### Documentation-First Approach
**NEVER hallucinate or guess**. Always:
1. Use Task tool with Explore agent to research Docs/{Platform}_Documentation/
2. Reference specific files and sections
3. Quote exact limitations from documentation
4. Verify solutions against documented constraints
5. Adapt to platform-specific requirements and syntax
### Complete JSON Output
**NEVER output partial or placeholder JSON**. Always:
1. Use Task tool with flow-builder agent to generate complete JSON
2. Include all required structure elements (platform-specific)
3. Generate valid IDs/GUIDs as required by platform
4. Ensure syntactic validity (balanced brackets, escaped strings)
5. Validate against platform-specific format documentation
6. Make JSON immediately copy-paste/import ready
7. Follow platform naming conventions and structure
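Points 4 and 6 above are automatable. A minimal pre-delivery check (the placeholder-marker list is an illustrative assumption, not an exhaustive rule):

```python
import json

def validate_output_json(raw: str) -> list[str]:
    """Return a list of problems found in a candidate fixed-JSON payload."""
    problems = []
    try:
        json.loads(raw)  # catches unbalanced brackets and unescaped strings
    except json.JSONDecodeError as exc:
        problems.append(f"invalid JSON: {exc}")
    # Markers that should never ship in copy-paste-ready output.
    for marker in ("TODO", "FIXME", "PLACEHOLDER", "<your-"):
        if marker.lower() in raw.lower():
            problems.append(f"placeholder marker found: {marker}")
    return problems

print(validate_output_json('{"actions": {}}'))  # → []
print(validate_output_json('{"id": "TODO"'))    # syntax error + placeholder flagged
```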
### Quality Assurance
Before delivering output, verify:
**Data Structure Validation** (CRITICAL - NEW!):
- [ ] Completed Phase 0 (Data Structure Analysis) before creating fix
- [ ] Analyzed all filter/query `where` clauses for data type indicators
- [ ] Verified `item()` vs `item()?['Property']` consistency in fix
- [ ] Checked loop `items()` usage matches actual data structure
- [ ] Traced variable sources to understand actual data types
- [ ] Asked user for clarification if structure was ambiguous
- [ ] No assumptions made about object properties without verification
- [ ] Fix validates against actual data flow (not just syntax)
**Research & Documentation**:
- [ ] Platform identified correctly
- [ ] Research agent consulted Docs/{Platform}_Documentation/
- [ ] Searched web if local docs insufficient (cited sources)
- [ ] Root cause references actual documentation files
- [ ] All claims backed by documentation (local or web)
**Technical Validation**:
- [ ] Flow-builder agent generated complete JSON
- [ ] JSON follows platform-specific format
- [ ] JSON is syntactically valid (no placeholders)
- [ ] All expressions syntactically correct
- [ ] Data type handling matches actual structures (primitives vs objects)
- [ ] Platform-specific syntax and conventions followed
**Output Quality**:
- [ ] All sections of debug report template populated
- [ ] Changes explained clearly with WHY (not just WHAT)
- [ ] Solution is actionable and specific
- [ ] Alternative approaches provided if applicable
- [ ] Prevention tips included to avoid future similar errors
## Error Pattern Knowledge Base
### Common Error Types
1. **Authentication (401/403)**
- Research: Docs/{Platform}_Documentation/{Connector}/overview.md → Authentication section
- Check: Permission requirements, connection credentials
- Fix: Proper authentication configuration, permission grants
2. **Throttling (429)**
- Research: Docs/{Platform}_Documentation/{Connector}/overview.md → API Limits section
- Check: Call frequency, batch operations
- Fix: Add delays, implement exponential backoff, use batch operations
3. **Data Format**
- Research: Docs/{Platform}_Documentation/BuiltIn/data-operation.md
- Check: Schema requirements, data types, required fields
- Fix: Proper Parse JSON schema, type conversions
4. **Timeout**
- Research: Docs/{Platform}_Documentation/{Connector}/overview.md → Known Limitations
- Check: File sizes, operation duration, Do until loops
- Fix: Implement chunking, set proper timeouts, optimize queries
5. **Not Found (404)**
- Research: Docs/{Platform}_Documentation/{Connector}/actions.md
- Check: Resource paths, IDs, permissions, resource existence
- Fix: Validate paths, check permissions, add existence checks
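The throttling fix in item 2 (delays plus exponential backoff) can be sketched generically; `call_api` is a hypothetical stand-in for the throttled connector call, not a real SDK function:

```python
import random
import time

def call_with_backoff(call_api, max_retries=5, base_delay=1.0):
    """Retry a throttled call (HTTP 429) with exponential backoff and jitter."""
    for attempt in range(max_retries):
        status, body = call_api()
        if status != 429:                  # anything but TooManyRequests: done
            return status, body
        # base, 2*base, 4*base, ... plus jitter so parallel runs de-synchronize
        time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
    raise RuntimeError("still throttled after max retries")
```

In an actual flow the same schedule is expressed with a Do until loop around the action plus a Delay action; the sketch only illustrates the timing, not platform syntax.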
## Sub-Agent Coordination
### Research Agent (Explore)
**Purpose**: Find relevant documentation and constraints
**Input Requirements**:
- Error type
- Connector name
- Action/trigger name
**Expected Output**:
- File paths: `Docs/{Platform}_Documentation/{Connector}/overview.md`
- Specific sections: "Known Limitations", "API Limits", etc.
- Exact constraints: "Max 50MB file size", "600 calls per 60 seconds"
**Invocation**:
```
Task tool, subagent_type="Explore", thoroughness="very thorough"
```
### Flow-Builder Agent
**Purpose**: Generate complete, valid Power Automate JSON
**Input Requirements**:
- Original error details
- Root cause analysis
- Specific fixes needed
- Documentation constraints
**Expected Output**:
- Complete JSON structure
- Valid GUIDs
- Proper runAfter chains
- No placeholders or TODOs
**Invocation**:
```
Task tool, subagent_type="general-purpose" or "Plan"
```
## Examples
### Example 1: Data Structure Bug (Real Case Study - NEW!)
**User Input**: "Fix the DossierPlusRécent variable - the filter works but loop assignment fails"
**Error Flow Snippet**:
```json
{
"Filtrer_dossier_CNESST": {
"type": "Query",
"inputs": {
"from": "@variables('ListeDesDossier')",
"where": "@contains(toLower(item()), 'cnesst')" // ← KEY: item() without property!
}
},
"LoopCNESST": {
"foreach": "@body('Filtrer_dossier_CNESST')",
"actions": {
"DossierPlusRécent_CNESST": {
"type": "SetVariable",
"inputs": {
"name": "DossierPlusRecent",
"value": "@items('LoopCNESST')?['Nom']" // ← BUG: Accessing property on string!
}
},
      "DEBUG_jour_CNESST": {
        "type": "Compose",
        "inputs": "@int(substring(split(items('LoopCNESST')?['Nom'], '.')[0], 8, 2))" // ← BUG: Also accessing 'Nom' on a string
      }
}
}
}
```
**Debugging Workflow**:
1. **Phase 0 - Data Structure Analysis**:
- ✅ Examined filter: `"where": "@contains(toLower(item()), 'cnesst')"`
- ✅ **CRITICAL INSIGHT**: `item()` without property → Array of **strings**!
- ✅ Not `item()?['Nom']` → So NOT array of objects
- ✅ **Conclusion**: `ListeDesDossier` = `["CNESST 2025-01-15", "CNESST 2025-02-20"]`
2. **Phase 1 - Error Analysis**:
- Loop iterates over strings from filter
- But tries to access: `items('LoopCNESST')?['Nom']`
- **ERROR**: Can't access property 'Nom' on string primitive!
- Same bug in DEBUG_jour_CNESST action
3. **Phase 2 - Research** (Skipped - structure analysis sufficient):
- Data structure analysis revealed the issue
- No need for documentation research for this bug type
4. **Phase 3 - Solution Design**:
- Remove `?['Nom']` property access throughout
- Access string directly: `@items('LoopCNESST')`
- Update all actions that incorrectly access 'Nom' property
5. **Phase 4 - Generate Fix**:
```json
{
"DossierPlusRécent_CNESST": {
"type": "SetVariable",
"inputs": {
"name": "DossierPlusRecent",
"value": "@items('LoopCNESST')" // ← FIXED: Direct string access
}
},
  "DEBUG_jour_CNESST": {
    "type": "Compose",
    "inputs": "@int(substring(split(items('LoopCNESST'), '.')[0], 8, 2))" // ← FIXED: Removed ?['Nom']
  }
}
```
**Key Lessons**:
- ✅ **ALWAYS check filter syntax first**: `item()` reveals data type
- ✅ `item()` = primitives, `item()?['Prop']` = objects
- ✅ Complete Phase 0 before jumping to solutions
- ✅ Trace data flow from source through transformations
- ❌ **NEVER assume** objects with properties without verification
**Prevention**:
- Use consistent data structures throughout flow
- If you need object properties, use Select action to create them
- Add comments explaining data structure at key points
- Validate filter patterns match loop usage
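The last prevention point (validate filter patterns match loop usage) can be approximated with a small lint pass over the raw workflow JSON. The regexes below are an assumption about expression syntax, not an official parser:

```python
import re

def lint_item_access(flow_json_text: str) -> list[str]:
    """Flag loops that read a property when the feeding filter used bare item()."""
    warnings = []
    # Bare item() in a where clause => the filtered array holds primitives.
    bare_filter = re.search(r'"where":\s*"@[^"]*\bitem\(\)[^?\[]', flow_json_text)
    # items('Loop')?['Prop'] => the loop body expects objects.
    prop_access = re.findall(r"items\('([^']+)'\)\?\['([^']+)'\]", flow_json_text)
    if bare_filter and prop_access:
        for loop, prop in prop_access:
            warnings.append(
                f"filter suggests primitives, but {loop} reads ?['{prop}']"
            )
    return warnings
```

Run against the snippet above, this would flag exactly the `?['Nom']` accesses that caused the bug; a filter that itself uses `item()?['Prop']` produces no warning.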
---
### Example 2: SharePoint Throttling Error
**Input**:
```json
{
"error": "Status code: 429, TooManyRequests",
"action": "sharepointonline_getitems",
"connector": "SharePoint"
}
```
**Workflow**:
1. Classify: Throttling error
2. Research: Launch Explore agent → Docs/{Platform}_Documentation/SharePoint/overview.md
3. Find: "600 API calls per 60 seconds per connection"
4. Design: Add delays between calls, implement paging
5. Build: Launch flow-builder agent → Generate complete JSON with delays
6. Output: Structured debug report with fixed JSON
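Step 4's "add delays between calls" follows directly from the documented limit: 600 calls per 60 seconds means at least 100 ms between sequential calls on one connection. A minimal pacing sketch (assuming a sequential loop; `call` is a hypothetical wrapper around the connector action):

```python
import time

SHAREPOINT_LIMIT = 600   # documented: 600 calls per 60 s per connection
WINDOW_SECONDS = 60
min_interval = WINDOW_SECONDS / SHAREPOINT_LIMIT   # 0.1 s between calls

def paced_calls(items, call):
    """Invoke call(item) for each item, never faster than the limit allows."""
    results = []
    for item in items:
        start = time.monotonic()
        results.append(call(item))
        elapsed = time.monotonic() - start
        if elapsed < min_interval:
            time.sleep(min_interval - elapsed)
    return results
```

In the flow itself this corresponds to a Delay action inside the Apply to each loop; parallel loop branches would need the interval multiplied by the degree of parallelism.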
### Example 3: OneDrive File Size Error
**Input**:
```json
{
"error": "File not processed",
"trigger": "onedrive_whenfilecreated",
"connector": "OneDrive"
}
```
**Workflow**:
1. Classify: File processing error
2. Research: Launch Explore agent → Docs/{Platform}_Documentation/OneDrive/overview.md
3. Find: "Files over 50MB skipped by triggers"
4. Design: Add file size check, alternative large file handling
5. Build: Launch flow-builder agent → Generate JSON with size validation
6. Output: Structured debug report with fixed JSON
## Supporting Files
See also:
- [Output Template](../../../output-style/template-debug-output.md) - Complete output format specification
- [Power Automate JSON Format](../../../Docs/{Platform}_Documentation/power-automate-json-format.md) - Valid JSON structure requirements
- [Repository Guide](../../../CLAUDE.md) - Full repository documentation and patterns
## Best Practices
1. **Always use sub-agents** - Never skip research or flow-builder phases
2. **Reference real documentation** - Every claim must cite Docs/{Platform}_Documentation/
3. **Output complete JSON** - No placeholders, no TODOs, production-ready
4. **Explain changes** - User must understand WHY each fix is needed
5. **Include error handling** - Every fix should have proper error handling
6. **Validate thoroughly** - Check JSON syntax, structure, and compliance
7. **Be specific** - Name exact parameters, values, file paths
8. **Provide alternatives** - Offer multiple approaches when applicable
## Skill Invocation Examples
User messages that trigger this skill:
- "Debug this error.json file"
- "I'm getting a 429 error in my SharePoint flow, here's the JSON: {...}"
- "Fix this Power Automate error: [error message]"
- "My OneDrive trigger isn't working, here's the error JSON"
- "Analyze this flow failure: [JSON content]"
- "Help me fix this authentication error in Power Automate"
- "My workflow is broken, here's the error file"
- "Debug my failing automation, error attached"
- "This n8n workflow keeps erroring out"
---
**Version**: 2.0
**Last Updated**: 2025-10-31
**Platforms**: Power Automate, n8n, Make, Zapier + extensible
## Changelog
### Version 2.0 (2025-10-31)
**Major Improvements**:
- ✅ Added **Phase 0: Data Structure Analysis** (CRITICAL) - Prevents incorrect fixes based on wrong data type assumptions
- ✅ Added **Multi-Source Research Strategy** - Local docs → Web → Ask user (never guess!)
- ✅ Added **Common Pitfalls section** - Highlights critical debugging errors to avoid
- ✅ Added **Real Case Study example** - Data structure bug from actual debugging session
- ✅ Enhanced **Quality Assurance Checklist** - Now includes comprehensive data structure validation
- ✅ Added **AskUserQuestion workflow** - Clarify ambiguous structures before creating fixes
- ✅ Added **WebSearch integration** - Fallback when local docs don't have the answer
**Key Changes**:
- Never assume array structures (strings vs objects) without verification
- Always analyze `item()` vs `item()?['Property']` patterns in filters/queries
- Ask user when uncertain rather than guessing and potentially creating bugs
- Search web for official documentation if local docs are insufficient
- Validate data type consistency throughout flow before generating fixes
- Trace data sources back to understand actual structures
**Lessons Learned from Real Bugs**:
- Mismatched data structure assumptions are the #1 source of incorrect debugging fixes
- Filter syntax is the most reliable indicator: `contains(item(), 'x')` = array of primitives
- Property access on primitives (`items('Loop')?['Prop']` on strings) causes runtime errors
- Always complete Phase 0 before jumping to solutions
- User confirmation beats confident but wrong assumptions