---
name: story-refiner
description: Analyzes Jira stories and proposes improvements for quality, clarity, acceptance criteria, and completeness
allowed-tools: mcp__atlassian__*
mcpServers: atlassian
---

Story Refiner Skill

This skill analyzes Jira stories and proposes specific, actionable improvements to enhance story quality, clarity, and completeness.

When This Skill is Invoked

Claude will automatically use this skill when you mention:

  • "refine story"
  • "improve story quality"
  • "enhance acceptance criteria"
  • "fix story description"
  • "story improvement suggestions"

Capabilities

1. Story Analysis

Fetch and analyze a Jira story to identify improvement opportunities.

Quality Dimensions Analyzed:

  • Acceptance Criteria: Completeness, testability, clarity
  • Description: User value, context, technical details
  • Story Format: User story structure (As a... I want... So that...)
  • Refinement: Story points, labels, components
  • Documentation: Supporting links, attachments, examples

2. Improvement Proposal Generation

Generate specific, actionable proposals for enhancing the story.

Proposal Structure:

## Proposed Changes for [STORY-KEY]

### Current State Assessment
- **Quality Score:** 65/100 (Good)
- **Main Issues:** Missing acceptance criteria, unclear scope
- **Strengths:** Clear user value, good description length

### Recommended Changes

#### 1. Add Acceptance Criteria (High Priority)
**Current:** No acceptance criteria defined
**Proposed:**

Acceptance Criteria:

  • Given a logged-in user When they click "Export Report" Then a PDF report downloads within 5 seconds
  • Report includes all data from selected date range
  • Error message shown if date range exceeds 1 year

**Impact:** Improves testability and clarity

#### 2. Enhance Description (Medium Priority)
**Current:** "Add export feature"
**Proposed:** Add technical context:

Technical Notes:

  • Use existing PDF library (pdfkit)
  • Add export button to dashboard header
  • Store exports in /tmp (auto-cleanup after 1 hour)
  • Max file size: 10MB

**Impact:** Reduces ambiguity, speeds development

#### 3. Add Story Points (Medium Priority)
**Recommendation:** Estimate as 5 points based on similar stories
**Rationale:** Comparable to PROJ-123 (PDF export feature)

3. Diff Generation

Show exact before/after for proposed changes to fields.

Example:

=== Description ===
- As a user, I want to export reports
+ As a user, I want to export dashboard reports to PDF format
+ So that I can share insights with stakeholders offline
+
+ Technical Implementation:
+ - Add "Export to PDF" button in dashboard header
+ - Use pdfkit library for PDF generation
+ - Include all visible widgets and date range
+ - Max export size: 10MB

=== Acceptance Criteria ===
+ Acceptance Criteria:
+ - Given a logged-in user viewing the dashboard
+   When they click "Export to PDF"
+   Then a PDF file downloads within 5 seconds
+ - Report includes dashboard title and selected date range
+ - Report includes all visible widgets with current data
+ - Error message displays if export exceeds size limit
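A field diff like the one above can be produced mechanically. A minimal sketch using Python's `difflib` — the `field_diff` helper and its `=== Field ===` framing are illustrative, not part of the skill's API:

```python
import difflib

def field_diff(field_name: str, before: str, after: str) -> str:
    """Render one Jira field as a '=== Field ===' block with -/+ lines."""
    out = [f"=== {field_name} ==="]
    for line in difflib.ndiff(before.splitlines(), after.splitlines()):
        # Keep removed (-), added (+), and unchanged lines; drop ndiff's
        # '? ' intraline hint rows, which don't belong in the preview.
        if line.startswith(("- ", "+ ", "  ")):
            out.append(line)
    return "\n".join(out)

print(field_diff(
    "Description",
    "As a user, I want to export reports",
    "As a user, I want to export dashboard reports to PDF format\n"
    "So that I can share insights with stakeholders offline",
))
```

Unchanged lines are kept as context, which makes small edits inside a long description easy to spot.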

4. Impact Assessment

Estimate the impact of proposed changes on story quality.

Metrics:

  • Quality Score Improvement: Before vs. After
  • Confidence Level: How likely the changes are to help (High/Medium/Low)
  • Effort to Implement: Time to apply changes (Low/Medium/High)

How to Use This Skill

Step 1: Fetch Story from Jira

Use Atlassian MCP to get story details:

mcp__atlassian__jira_get_issue(
  issueKey="PROJ-123",
  fields=["summary", "description", "acceptanceCriteria", "storyPoints", "labels", "components", "status", "priority", "assignee"]
)
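The fetched issue should be normalized before analysis, since optional fields are often absent. A sketch of that fallback handling — the helper name and the flat field names are assumptions; in a real Jira instance, acceptance criteria usually live in a `customfield_XXXXX` whose id varies per project:

```python
def normalize_issue(raw):
    """Flatten a fetched issue; missing optional fields become None/empty."""
    fields = raw.get("fields", {})
    return {
        "key": raw.get("key", "UNKNOWN"),
        "summary": fields.get("summary", ""),
        "description": fields.get("description") or "",
        "acceptance_criteria": fields.get("acceptanceCriteria"),  # None if unset
        "story_points": fields.get("storyPoints"),  # None means "no estimate"
        "labels": fields.get("labels", []),
    }
```

For example, `normalize_issue({"key": "PROJ-123", "fields": {"summary": "Add export feature"}})` yields `acceptance_criteria=None`, which later steps treat as "missing AC" rather than an error.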

Step 2: Analyze Story Quality

Assess each quality dimension:

  1. Acceptance Criteria Check:

    • Look for AC in custom fields (acceptanceCriteria, AC, acceptance_criteria, DoD)
    • Check description for "Acceptance Criteria:" section
    • Evaluate completeness: Are all paths covered?
    • Evaluate testability: Are criteria measurable?
  2. Description Quality:

    • Check for user story format (As a... I want... So that...)
    • Assess length and detail level (>100 words ideal)
    • Verify context explains WHY (user value/problem)
    • Check for technical details where appropriate
  3. Refinement Indicators:

    • Story points assigned?
    • Labels/components tagged?
    • Supporting documentation linked?
    • Sprint assigned?
  4. Clarity & Completeness:

    • Is scope clear (what's included/excluded)?
    • Are edge cases mentioned?
    • Is technical approach defined (if needed)?
    • Are dependencies noted?

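The checks above can be approximated with cheap text heuristics before any deeper judgment. A sketch — the function name, the regexes, and the 100-word threshold are illustrative, and a reviewer still sanity-checks the results:

```python
import re

# "As a ... I want ... so that ..." in any casing, possibly spanning lines
US_FORMAT = re.compile(r"as an? .+? i want .+? so that", re.IGNORECASE | re.DOTALL)
AC_HEADING = re.compile(r"acceptance criteria\s*:", re.IGNORECASE)

def quick_checks(description, ac_field=None):
    """First-pass heuristics for the Step 2 quality dimensions."""
    text = description or ""
    words = len(text.split())
    return {
        "has_ac": bool(ac_field) or bool(AC_HEADING.search(text)),
        "user_story_format": bool(US_FORMAT.search(text)),
        "word_count": words,
        "detailed_enough": words >= 100,  # ">100 words ideal" per above
    }
```

A miss on any check becomes a candidate improvement proposal in Step 3.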
Step 3: Generate Improvement Proposals

For each identified gap, create a specific proposal:

### [Priority Level] Improvement: [What to Fix]

**Current State:**
[What's currently in the story]

**Proposed Change:**
[Specific text/content to add or modify]

**Rationale:**
[Why this change improves the story]

**Impact:**
- Quality Score: +X points
- Development Clarity: High/Medium/Low
- Testing Clarity: High/Medium/Low

Step 4: Calculate Quality Impact

Before Refinement Score:

Acceptance Criteria: 0/30 (missing)
Description: 15/30 (minimal)
Refinement: 8/15 (partial)
Estimation: 0/15 (no points)
Documentation: 2/10 (minimal)
Total: 25/100 (Poor)

After Refinement Score (Estimated):

Acceptance Criteria: 28/30 (comprehensive)
Description: 26/30 (detailed with context)
Refinement: 14/15 (well-refined)
Estimation: 12/15 (points assigned)
Documentation: 8/10 (examples added)
Total: 88/100 (Excellent)

Improvement: +63 points
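The before/after totals follow from per-dimension caps. A sketch of the arithmetic, using the 30/30/15/15/10 weights from the worked example above (the weights are illustrative, not a fixed rubric):

```python
# Dimension caps taken from the worked example above
WEIGHTS = {
    "acceptance_criteria": 30,
    "description": 30,
    "refinement": 15,
    "estimation": 15,
    "documentation": 10,
}

def total_score(scores):
    """Sum per-dimension scores, clamping each at its cap; missing = 0."""
    return sum(min(scores.get(dim, 0), cap) for dim, cap in WEIGHTS.items())

before = {"description": 15, "refinement": 8, "documentation": 2}
after = {"acceptance_criteria": 28, "description": 26, "refinement": 14,
         "estimation": 12, "documentation": 8}
print(total_score(before), total_score(after))  # 25 88
```

Clamping at the cap means a single strong dimension cannot mask gaps elsewhere.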

Step 5: Format Output for User Review

Structure the response:

# Story Refinement Proposal: [STORY-KEY]

## Executive Summary
- **Current Quality:** 25/100 (Poor)
- **Projected Quality:** 88/100 (Excellent)
- **Improvement:** +63 points
- **Confidence:** High
- **Effort:** 15 minutes to apply changes

## Priority Improvements

### 🔴 Critical: Add Acceptance Criteria
[Details...]

### 🟡 Important: Enhance Description
[Details...]

### 🟢 Optional: Add Technical Context
[Details...]

## Full Diff Preview
[Show complete before/after...]

## Next Steps
1. Review proposed changes
2. Confirm or modify suggestions
3. Apply changes to Jira story

Refinement Templates

Template 1: User Story Format

As a [user type]
I want [capability]
So that [business value]

Context:
[Why this matters, background information]

Technical Approach:
[High-level implementation notes if needed]

Acceptance Criteria:
- Given [precondition]
  When [action]
  Then [expected outcome]
- [Additional criteria...]

Definition of Done:
- [ ] Code complete and reviewed
- [ ] Unit tests pass
- [ ] Integration tested
- [ ] Documentation updated

Template 2: Technical Story Format

Story: [Concise technical summary]

Problem:
[What technical issue or gap exists]

Proposed Solution:
[How to address it]

Technical Details:
- [Specific implementation notes]
- [Dependencies, libraries, approaches]

Acceptance Criteria:
- [Measurable technical outcomes]
- [Performance criteria if applicable]

Risks/Considerations:
- [Potential issues or dependencies]

Template 3: Bug Fix Story Format

Bug: [Clear summary of the issue]

Current Behavior:
[What's happening now]

Expected Behavior:
[What should happen]

Steps to Reproduce:
1. [Step 1]
2. [Step 2]
3. [Observe issue]

Root Cause (if known):
[Technical explanation]

Fix Approach:
[How to resolve]

Acceptance Criteria:
- Bug no longer reproducible with original steps
- [Additional validation criteria]
- [Regression tests added]

Common Improvement Patterns

Pattern 1: Missing Acceptance Criteria

Detection: No AC field populated, no "Acceptance Criteria:" in description

Proposal:

Add specific, testable acceptance criteria:
- Define happy path scenario
- Cover edge cases (empty state, max limits, errors)
- Include non-functional requirements (performance, accessibility)

Pattern 2: Vague Description

Detection: Description <50 words, no context on WHY

Proposal:

Enhance description with:
- User value statement (So that...)
- Background context (current state, pain point)
- Success criteria (what does "done" look like?)

Pattern 3: Missing Technical Context

Detection: Complex story without implementation notes

Proposal:

Add technical section:
- Libraries/frameworks to use
- Integration points (APIs, services)
- Data models affected
- Security/performance considerations

Pattern 4: No Estimation

Detection: Story points field empty

Proposal:

Suggest story points based on:
- Similar completed stories
- Complexity assessment
- Team velocity patterns
Recommend: [X] points

Pattern 5: Poor Story Format

Detection: Doesn't follow "As a... I want... So that..." format

Proposal:

Rewrite in user story format:
- Identify user persona
- State desired capability
- Explain business value
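The five detections above can be combined into one pass over a normalized story. A sketch — the field names, the `is_complex` flag (assumed to come from a separate complexity assessment), and the 50-word threshold are assumptions:

```python
def detect_patterns(story, is_complex=False):
    """Return the names of improvement patterns that apply to this story."""
    desc = (story.get("description") or "").lower()
    found = []
    if not story.get("acceptance_criteria") and "acceptance criteria" not in desc:
        found.append("missing_acceptance_criteria")   # Pattern 1
    if len(desc.split()) < 50 or "so that" not in desc:
        found.append("vague_description")             # Pattern 2
    if is_complex and "technical" not in desc:
        found.append("missing_technical_context")     # Pattern 3
    if story.get("story_points") is None:
        found.append("no_estimation")                 # Pattern 4
    if "as a" not in desc:
        found.append("poor_story_format")             # Pattern 5
    return found
```

These are substring heuristics, so each hit is a prompt for closer inspection rather than a verdict.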

Integration with Agent

The story-refiner agent will:

  1. Invoke this skill with a story key
  2. Receive improvement proposals
  3. Format for user review
  4. Confirm with user before applying changes
  5. Update Jira story with approved changes

This skill:

  • Fetches story data via Atlassian MCP
  • Analyzes quality dimensions
  • Generates specific improvement proposals
  • Returns structured recommendations
  • Does NOT update Jira (agent handles that after confirmation)

Best Practices

  1. Be Specific: Don't just say "improve description" - provide exact text
  2. Prioritize Changes: Mark critical vs. optional improvements
  3. Explain Rationale: Help user understand WHY each change helps
  4. Show Before/After: Make it easy to see the difference
  5. Estimate Impact: Quantify quality improvement
  6. Respect Existing Content: Build on what's there, don't replace unnecessarily
  7. Use Templates: Suggest story formats appropriate to type (feature/bug/tech)

Output Format

Always structure skill output as:

  1. Story Analysis Summary:

    • Current quality score
    • Key strengths
    • Main weaknesses
  2. Prioritized Improvements:

    • Critical (must-have)
    • Important (should-have)
    • Optional (nice-to-have)
  3. Detailed Proposals:

    • For each improvement: current state, proposed change, rationale, impact
  4. Full Diff:

    • Complete before/after view of all proposed changes
  5. Quality Projection:

    • Estimated quality score after changes
    • Confidence level
  6. Next Steps:

    • Clear actions for user

Error Handling

Story Not Found

**Error:** Story [STORY-KEY] not found in Jira

**Possible Causes:**
- Story key is incorrect (case-sensitive)
- Story doesn't exist
- No permission to access story

**Solution:** Verify story key and access permissions

Insufficient Permissions

**Error:** Cannot read story [STORY-KEY] - permission denied

**Solution:** Ensure ATLASSIAN_API_KEY has read access to this project

Story Already High Quality

**Analysis:** Story [STORY-KEY] has excellent quality (92/100)

**Assessment:**
- ✓ Comprehensive acceptance criteria
- ✓ Detailed description with context
- ✓ Well-refined with story points
- ✓ Supporting documentation attached

**Recommendation:** No changes needed. Story is ready for development.

Incomplete Story Data

**Warning:** Story [STORY-KEY] has limited data available

**Missing Fields:**
- Story points (not tracked in this project)
- Custom AC field (project doesn't use this field)

**Analysis:** Proceeding with available data
[Continue with analysis based on description field only...]

Data Quality Tracking

Always report (with `confidence` one of `high`, `medium`, or `low`):

{
  "story_key": "PROJ-123",
  "quality_score_before": 25,
  "quality_score_after": 88,
  "improvement": 63,
  "confidence": "high",
  "fields_analyzed": ["summary", "description", "acceptanceCriteria", "storyPoints"],
  "fields_missing": ["components"],
  "proposals_generated": 5,
  "critical_proposals": 2,
  "important_proposals": 2,
  "optional_proposals": 1
}
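A sketch of assembling that payload, with `improvement` and the proposal counts derived rather than hand-entered — the helper and its argument shapes are illustrative:

```python
def tracking_report(story_key, score_before, score_after, proposal_priorities):
    """Build the tracking payload; priorities are 'critical'/'important'/'optional'."""
    return {
        "story_key": story_key,
        "quality_score_before": score_before,
        "quality_score_after": score_after,
        "improvement": score_after - score_before,  # derived, never hand-set
        "proposals_generated": len(proposal_priorities),
        "critical_proposals": proposal_priorities.count("critical"),
        "important_proposals": proposal_priorities.count("important"),
        "optional_proposals": proposal_priorities.count("optional"),
    }

report = tracking_report("PROJ-123", 25, 88,
                         ["critical", "critical", "important", "important", "optional"])
print(report["improvement"])  # 63
```

Deriving the counts keeps the report consistent with the proposals actually generated.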

When invoked, this skill will analyze the specified Jira story and return detailed, actionable improvement proposals that the story-refiner agent can present for user review and confirmation.