Initial commit

Zhongwei Li
2025-11-29 18:30:25 +08:00
commit 92f7c7980a
14 changed files with 457 additions and 0 deletions


@@ -0,0 +1,35 @@
---
name: normalize-signals
description: Processes enriched datasets into unified schemas with identity resolution and tagging.
usage: /data-signal-enrichment:normalize-signals --source warehouse --outputs crm,cdp --taxonomy intent_v2
---
# Command: normalize-signals
## Inputs
- **source**: data origin (warehouse, csv, api, webhook).
- **outputs**: comma-separated destinations (crm, cdp, lake, orchestration).
- **taxonomy**: schema/taxonomy version to enforce.
- **window**: time window or batch ID to process.
- **dry-run**: true/false toggle for validation-only runs.
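
A minimal sketch of how the inputs above might be modeled; the `NormalizeSignalsConfig` name, its defaults, and the `from_args` parser are illustrative assumptions, not part of the command contract.

```python
# Illustrative only: field names mirror the inputs listed above;
# the dataclass name, defaults, and parser are assumptions.
from dataclasses import dataclass


@dataclass
class NormalizeSignalsConfig:
    source: str                  # warehouse | csv | api | webhook
    outputs: list[str]           # e.g. ["crm", "cdp"]
    taxonomy: str                # schema/taxonomy version, e.g. "intent_v2"
    window: str | None = None    # time window or batch ID
    dry_run: bool = False        # validation-only run

    @classmethod
    def from_args(cls, args: dict[str, str]) -> "NormalizeSignalsConfig":
        """Build a config from flag-style arguments like the usage line above."""
        return cls(
            source=args["source"],
            outputs=[o.strip() for o in args["outputs"].split(",")],
            taxonomy=args["taxonomy"],
            window=args.get("window"),
            dry_run=args.get("dry-run", "false").lower() == "true",
        )
```

With the usage line from the frontmatter, this would be `NormalizeSignalsConfig.from_args({"source": "warehouse", "outputs": "crm,cdp", "taxonomy": "intent_v2"})`.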
## Workflow
1. **Schema Detection**: inspect incoming fields, compare them to the taxonomy, and flag gaps.
2. **Identity Resolution**: match accounts/contacts/opportunities using rules + heuristics.
3. **Normalization**: standardize values, units, topics, and metadata.
4. **Tagging & Scoring**: add freshness, confidence, and signal-type tags.
5. **Distribution**: publish to requested destinations with lineage + control tables.
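
A hedged sketch of how the five steps above might compose in code, reusing the hypothetical `NormalizeSignalsConfig` from the Inputs section; the taxonomy table and the rules inside each stage are simplified assumptions, not the actual agent/skill logic.

```python
# Simplified stand-in for the five workflow stages; the taxonomy table and
# per-stage rules are assumptions chosen only to make the flow concrete.
from __future__ import annotations

from typing import Any

Record = dict[str, Any]

# Hypothetical field list per taxonomy version (not the real intent_v2 schema).
TAXONOMIES = {"intent_v2": {"account_id", "topic", "signal_type", "observed_at"}}


def normalize_signals(config: NormalizeSignalsConfig, records: list[Record]) -> dict[str, Any]:
    expected = TAXONOMIES[config.taxonomy]

    # 1. Schema Detection: flag expected fields missing from the incoming data.
    seen = set().union(*(r.keys() for r in records))
    schema_gaps = sorted(expected - seen)

    for r in records:
        # 2. Identity Resolution: naive rule, a lower-cased account key.
        r["resolved_account"] = str(r.get("account_id", "")).strip().lower() or None
        # 3. Normalization: standardize string values on taxonomy fields.
        for key in ("topic", "signal_type"):
            if key in r:
                r[key] = str(r[key]).strip().lower()
        # 4. Tagging & Scoring: attach a confidence tag based on resolution.
        r["confidence"] = 1.0 if r["resolved_account"] else 0.5

    # 5. Distribution: skipped on dry runs; here just a per-destination count
    #    standing in for publishing with lineage/control tables.
    lineage = {} if config.dry_run else {dest: len(records) for dest in config.outputs}

    return {"schema_gaps": schema_gaps, "records": records, "lineage": lineage}
```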
## Outputs
- Normalized dataset (per destination) with schema compliance report.
- Identity resolution summary (matches, conflicts, unresolved records).
- Taxonomy drift log + remediation checklist.
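
One way the three output artifacts could be shaped; the TypedDict names and fields below are illustrative assumptions, not a documented schema.

```python
# Illustrative shapes for the output artifacts; every field name is an assumption.
from typing import TypedDict


class SchemaComplianceReport(TypedDict):
    destination: str            # crm, cdp, lake, or orchestration
    taxonomy: str               # taxonomy version that was enforced
    missing_fields: list[str]   # gaps flagged during Schema Detection
    compliant: bool


class IdentityResolutionSummary(TypedDict):
    matches: int
    conflicts: int
    unresolved: int


class TaxonomyDriftEntry(TypedDict):
    field: str                  # field whose values drifted from the taxonomy
    observed_value: str
    expected_values: list[str]
    remediation: str            # checklist item for closing the drift
```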
## Agent/Skill Invocations
- `signal-integrator` runs normalization + distribution.
- `data-quality-steward` validates schema + scores.
- `provider-ops-lead` supplies provider metadata for lineage.
- `signal-taxonomy` skill enforces schema + naming rules.
- `identity-resolution` skill handles matching heuristics.
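
A hypothetical routing table a runner could use to map workflow stages onto the agents and skills above; the mapping structure itself is an assumption about the plugin runtime, and stages without an explicitly named skill are left as `None`.

```python
# Hypothetical stage-to-agent/skill routing; agent and skill names come from
# the list above, the mapping structure is an assumption.
STAGE_ROUTING: dict[str, dict[str, str | None]] = {
    "schema_detection":    {"agent": "data-quality-steward", "skill": "signal-taxonomy"},
    "identity_resolution": {"agent": "signal-integrator",    "skill": "identity-resolution"},
    "normalization":       {"agent": "signal-integrator",    "skill": "signal-taxonomy"},
    "tagging_scoring":     {"agent": "data-quality-steward", "skill": None},
    "distribution":        {"agent": "signal-integrator",    "skill": None},
    "lineage":             {"agent": "provider-ops-lead",    "skill": None},
}
```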
---