Initial commit

commit acd2279239
Author: Zhongwei Li
Date: 2025-11-30 08:44:39 +08:00

11 changed files with 1235 additions and 0 deletions

skills/subgraph-explorer/.gitignore

@@ -0,0 +1,61 @@
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
env/
venv/
ENV/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
# macOS
.DS_Store
.AppleDouble
.LSOverride
# Thumbnails
._*
# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent
# IDE
.vscode/
.idea/
*.swp
*.swo
*~
# Logs
*.log
# Temporary files
tmp/
temp/
*.tmp
# Test outputs
queries/
test_*.graphql
test_*.json

skills/subgraph-explorer/README.md

@@ -0,0 +1,160 @@
# Subgraph Explorer Skill
A Claude Code skill for exploring and querying blockchain subgraphs through a private MCP server running in Docker.
## Description
This skill enables exploration and querying of blockchain subgraphs through a private MCP server. It provides tools for managing the Docker-based server, exploring GraphQL schemas, executing queries against configured subgraphs, and exporting discovered queries for project integration.
## Features
- **Docker Management**: Start, stop, and check status of the MCP server
- **Schema Exploration**: Introspect GraphQL schemas from subgraphs
- **Query Execution**: Execute GraphQL queries against configured or ad-hoc subgraph endpoints
- **Query Export**: Save discovered queries in multiple formats (JS, Python, GraphQL, JSON)
- **Comprehensive Patterns**: Reference guide with common GraphQL query patterns for blockchain data
## Use Cases
- Exploring NFT transfers and ownership data
- Querying DEX swaps and trading volume
- Analyzing DeFi protocol metrics
- Examining subgraph schemas and entities
- Exporting queries for project integration
## Requirements
- Docker and Docker Compose
- Private Subgraph MCP Server (https://github.com/nschwermann/subgraph-mcp)
- Python 3 (for query export utility)
## Installation
### As a Claude Code Skill
1. Download the `subgraph-explorer.zip` file
2. Install via Claude Code plugin system, or
3. Extract to your Claude skills directory
### From Source
```bash
# Clone or download this repository
git clone <repository-url>
# The skill is ready to use
# Scripts are in scripts/
# Reference docs are in references/
# Main instructions are in SKILL.md
```
## Quick Start
### Starting the MCP Server
```bash
bash scripts/start_mcp_server.sh
```
This starts the Docker container with:
- **SSE endpoint**: `http://localhost:8000` (for MCP communication)
- **Metrics endpoint**: `http://localhost:9091/metrics` (for monitoring)
### Check Server Status
```bash
bash scripts/check_mcp_status.sh
```
### Stop the Server
```bash
bash scripts/stop_mcp_server.sh
```
## Exporting Queries
Export discovered GraphQL queries for project use:
### JavaScript/TypeScript
```bash
python3 scripts/export_query.py queries/myQuery.js --format js --name MyQuery
```
### Python
```bash
python3 scripts/export_query.py queries/myQuery.py --format py --name my_query
```
### GraphQL
```bash
python3 scripts/export_query.py queries/myQuery.graphql --format graphql
```
### JSON
```bash
python3 scripts/export_query.py queries/myQuery.json --format json --name MyQuery
```
## Structure
```
subgraph-explorer/
├── SKILL.md # Main skill documentation
├── scripts/
│ ├── start_mcp_server.sh # Start Docker MCP server
│ ├── stop_mcp_server.sh # Stop Docker MCP server
│ ├── check_mcp_status.sh # Check server status
│ └── export_query.py # Export queries to various formats
└── references/
└── graphql_patterns.md # GraphQL query patterns reference
```
## Configuration
The scripts default to `~/Workspace/subgraph-mcp` as the MCP server project path. Override by setting the `SUBGRAPH_MCP_PATH` environment variable:
```bash
export SUBGRAPH_MCP_PATH=/path/to/your/subgraph-mcp
bash scripts/start_mcp_server.sh
```
## MCP Server Tools
The skill works with the following MCP server tools:
**Registry-based:**
- `list_subgraphs` - List all configured subgraphs
- `search_subgraphs_by_keyword` - Search subgraphs by keyword
- `get_schema_by_id` - Get GraphQL schema for a subgraph
- `execute_query_by_id` - Execute query against a subgraph
- `get_query_examples_by_id` - Get query examples
- `get_subgraph_guidance_by_id` - Get subgraph-specific guidance
**Ad-hoc:**
- `get_schema_by_url` - Get schema from any GraphQL endpoint
- `execute_query_by_url` - Execute query against any GraphQL endpoint
## Documentation
See `SKILL.md` for comprehensive documentation including:
- Core workflows for subgraph exploration
- Query development process
- Data considerations and best practices
- Troubleshooting guide
- Tips and tricks
See `references/graphql_patterns.md` for:
- Pagination strategies
- Filtering and aggregation patterns
- Performance optimization techniques
- Common query scenarios
## License
MIT
## Related Projects
- [Private Subgraph MCP Server](https://github.com/nschwermann/subgraph-mcp) - The MCP server this skill interacts with
- [Claude Code](https://claude.com/claude-code) - The AI coding assistant platform

skills/subgraph-explorer/SKILL.md

@@ -0,0 +1,301 @@
---
name: subgraph-explorer
description: Explore and query blockchain subgraphs through a private MCP server running in Docker. Use this skill when exploring GraphQL subgraphs, querying blockchain data from subgraphs (NFT transfers, DEX swaps, DeFi metrics), examining subgraph schemas, or exporting discovered queries for project use. The skill manages Docker-based MCP server interaction and provides utilities for query development and export.
---
# Subgraph Explorer
## Overview
This skill enables exploration and querying of blockchain subgraphs through a private MCP server. It provides tools for managing the Docker-based server, exploring GraphQL schemas, executing queries against configured subgraphs, and exporting discovered queries for project integration.
## Quick Start
### Starting the MCP Server
Before using subgraph exploration features, ensure the MCP server is running:
```bash
bash scripts/start_mcp_server.sh
```
This starts the Docker container with:
- **SSE endpoint**: `http://localhost:8000` (for MCP communication)
- **Metrics endpoint**: `http://localhost:9091/metrics` (for monitoring)
Check server status:
```bash
bash scripts/check_mcp_status.sh
```
Stop the server:
```bash
bash scripts/stop_mcp_server.sh
```
**Note**: The scripts default to `~/Workspace/subgraph-mcp` as the project path. Set the `SUBGRAPH_MCP_PATH` environment variable to override this.
### MCP Server Connection
The MCP server runs in SSE mode and exposes the following tools via HTTP:
**Registry-based tools:**
- `list_subgraphs` - List all configured subgraphs
- `search_subgraphs_by_keyword` - Search subgraphs by keyword
- `get_schema_by_id` - Get GraphQL schema for a configured subgraph
- `execute_query_by_id` - Execute query against a configured subgraph
- `get_query_examples_by_id` - Get query examples for a subgraph
- `get_subgraph_guidance_by_id` - Get subgraph-specific guidance
**Ad-hoc tools:**
- `get_schema_by_url` - Get schema from any GraphQL endpoint (no registry needed)
- `execute_query_by_url` - Execute query against any GraphQL endpoint (no registry needed)
To interact with the MCP server, use the WebFetch tool to make HTTP requests to the SSE endpoint at `http://localhost:8000`.
## Core Workflows
### 1. Exploring Configured Subgraphs
When exploring subgraphs in the registry (`subgraphs.json`):
**Step 1: List or Search**
- Use `list_subgraphs` to see all available subgraphs
- Use `search_subgraphs_by_keyword` to find specific subgraphs by name/description
**Step 2: Understand the Schema**
- Use `get_schema_by_id` to retrieve the GraphQL schema
- Examine entity types, fields, and relationships
- Check `get_query_examples_by_id` for pre-built query templates
- Review `get_subgraph_guidance_by_id` for subgraph-specific tips
**Step 3: Execute Queries**
- Use `execute_query_by_id` to run GraphQL queries
- Start with simple queries and iterate
- Apply pagination for large result sets
- Reference `references/graphql_patterns.md` for common patterns
**Step 4: Export Useful Queries**
- Use `scripts/export_query.py` to save queries for project use
- Choose format: JavaScript, Python, GraphQL, or JSON
### 2. Ad-hoc Subgraph Exploration
For exploring subgraphs not in the registry:
**Direct URL Access:**
- Use `get_schema_by_url` with the GraphQL endpoint URL
- Optionally provide `auth_header` if authentication is required
- Use `execute_query_by_url` to run queries directly
Example workflow:
1. Get schema: `get_schema_by_url(url="https://example.com/graphql")`
2. Analyze available entities and fields
3. Build query based on schema
4. Execute: `execute_query_by_url(url="https://example.com/graphql", query="...", variables={...})`
### 3. Query Development Process
**Iterative Query Building:**
1. **Start Simple**: Query a single entity to understand data structure
```graphql
query SimpleQuery {
entity(id: "0x123") {
id
name
}
}
```
2. **Add Fields**: Gradually add more fields as needed
```graphql
query ExpandedQuery {
entity(id: "0x123") {
id
name
timestamp
relatedData {
field1
field2
}
}
}
```
3. **Apply Filters**: Use `where` clauses for specific criteria
```graphql
query FilteredQuery($minValue: String!) {
entities(where: { value_gte: $minValue }, first: 100) {
id
value
timestamp
}
}
```
4. **Optimize**: Use aggregated fields instead of large arrays
- Prefer: `contract.holders` (pre-calculated count)
- Avoid: Counting all `tokens` manually
**Reference**: See `references/graphql_patterns.md` for comprehensive query patterns including pagination, filtering, aggregation, and performance optimization.
## Exporting Queries
Use the export utility to save discovered queries for project integration:
### JavaScript/TypeScript Export
```bash
python3 scripts/export_query.py queries/getLatestSwaps.js --format js --name GetLatestSwaps --description "Fetch latest DEX swaps"
```
Then paste your GraphQL query when prompted.
Output:
```javascript
/**
* Fetch latest DEX swaps
*/
export const GetLatestSwaps = `
query GetLatestSwaps($first: Int!) {
swaps(first: $first, orderBy: timestamp, orderDirection: desc) {
id
timestamp
amountUSD
pair {
token0 { symbol }
token1 { symbol }
}
}
}
`;
```
### Python Export
```bash
python3 scripts/export_query.py queries/get_latest_swaps.py --format py --name get_latest_swaps
```
### GraphQL File Export
```bash
python3 scripts/export_query.py queries/latest_swaps.graphql --format graphql
```
### JSON Export (with metadata)
```bash
python3 scripts/export_query.py queries/latest_swaps.json --format json --name GetLatestSwaps --variables '{"first": 100}'
```
## Understanding Subgraph Data
### Common Entity Types
**DEX/Trading Subgraphs:**
- `Swap` - Individual trade transactions
- `Pair` - Trading pairs with liquidity and volume
- `Token` - Token information and metadata
- `DayData` / `PairDayData` - Aggregated daily metrics
**NFT Subgraphs:**
- `ERC721Transfer` / `ERC1155Transfer` - NFT transfer events
- `Account` - User accounts with balances
- `ERC721Token` / `ERC1155Token` - Individual NFT tokens
- `ERC721Contract` - NFT collection contracts
**Common Patterns:**
- Most entities have `id`, `timestamp`, and `blockNumber` fields
- Use relationship fields (e.g., `pair { token0 { symbol } }`) to navigate connections
- Aggregated fields (e.g., `totalSupply`, `holders`) provide pre-calculated stats
### Data Considerations
**Time-based Data:**
- Daily aggregates typically reset at midnight UTC
- For "today's" data, consider querying both current day and previous day
- Calculate partial day metrics: `(yesterday_value * hours_passed / 24) + today_value`
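A minimal sketch of that trailing 24-hour estimate, assuming `yesterday_value` and `today_value` have already been fetched from the subgraph's daily-aggregate entities (the names here are illustrative):
```python
from datetime import datetime, timezone

def rolling_24h_estimate(yesterday_value: float, today_value: float) -> float:
    """Blend the still-relevant fraction of yesterday's daily aggregate with
    today's partial value to approximate a trailing 24-hour figure
    (daily entities reset at midnight UTC)."""
    now = datetime.now(timezone.utc)
    hours_passed = now.hour + now.minute / 60  # hours elapsed since midnight UTC
    return yesterday_value * (24 - hours_passed) / 24 + today_value

# Example: 6 hours into the UTC day, 18/24 of yesterday still falls in the window.
print(rolling_24h_estimate(yesterday_value=120_000.0, today_value=30_000.0))
```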
**Pagination:**
- Maximum `first` parameter is typically 1000, recommended 100
- Use `skip` for offset-based pagination
- Use cursor-based pagination (`id_gt`) for large datasets
**Performance:**
- Avoid deep nesting of relationships
- Use aggregated fields when available
- Apply specific filters to reduce result sets
- Request only needed fields, not entire objects
## Troubleshooting
### MCP Server Issues
**Container won't start:**
- Check if ports 8000 or 9091 are already in use
- Verify `subgraphs.json` exists in the subgraph-mcp directory
- View logs: `docker logs subgraph-mcp-server`
**Server not responding:**
- Run: `bash scripts/check_mcp_status.sh`
- Verify Docker is running
- Check firewall settings for localhost access
**Configuration errors:**
- Verify `SUBGRAPH_MCP_PATH` points to correct directory
- Ensure `subgraphs.json` is valid JSON
- Check subgraph URLs are accessible
### Query Issues
**Schema introspection fails:**
- Verify the subgraph endpoint is accessible
- Check authentication headers if required
- Ensure the endpoint is a valid GraphQL API
**Query timeout:**
- Simplify the query (reduce nesting, fewer fields)
- Add more specific filters
- Reduce `first` parameter value
- Use pagination for large datasets
**Type errors:**
- Check field types in schema (String vs Int vs BigInt)
- Ensure variable types match schema requirements
- Use quotes for String types in variables: `{"id": "0x123"}`
**Empty results:**
- Verify entity IDs are correct (case-sensitive)
- Check filter conditions aren't too restrictive
- For time-based queries, verify timestamp format (usually Unix seconds)
- Confirm data exists in the subgraph (check subgraph sync status)
## Resources
### scripts/
**Docker Management:**
- `start_mcp_server.sh` - Start the MCP server in Docker
- `stop_mcp_server.sh` - Stop the MCP server
- `check_mcp_status.sh` - Check server status and health
**Query Export:**
- `export_query.py` - Export GraphQL queries to various formats (JS, Python, GraphQL, JSON)
### references/
- `graphql_patterns.md` - Comprehensive guide to GraphQL query patterns for subgraphs
- Pagination strategies
- Filtering patterns
- Aggregation techniques
- Performance optimization
- Common query scenarios
## Tips
- **Always start with schema exploration** - Use `get_schema_by_id` or `get_schema_by_url` first
- **Check for query examples** - Use `get_query_examples_by_id` for configured subgraphs
- **Read subgraph guidance** - Use `get_subgraph_guidance_by_id` for subgraph-specific tips
- **Test queries incrementally** - Build complex queries step by step
- **Export working queries** - Save successful queries for reuse in projects
- **Monitor performance** - Check metrics endpoint (`http://localhost:9091/metrics`) for server health
- **Use aggregated data** - Prefer pre-calculated fields over manual aggregation
- **Consider UTC timezone** - Daily data typically resets at midnight UTC

skills/subgraph-explorer/references/graphql_patterns.md

@@ -0,0 +1,324 @@
# GraphQL Query Patterns for Subgraphs
This reference provides common GraphQL query patterns for blockchain subgraph exploration.
## Pagination Patterns
### Basic Pagination
```graphql
query PaginatedQuery($first: Int!, $skip: Int!) {
entities(first: $first, skip: $skip, orderBy: timestamp, orderDirection: desc) {
id
timestamp
# ... other fields
}
}
```
Variables:
```json
{
"first": 100,
"skip": 0
}
```
### Cursor-based Pagination
```graphql
query CursorPagination($first: Int!, $lastId: String!) {
entities(first: $first, where: { id_gt: $lastId }, orderBy: id) {
id
# ... other fields
}
}
```
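A hedged sketch of driving this pattern from Python; the endpoint URL and the `entities` field are placeholders for a real subgraph, and the `requests` package is assumed to be available:
```python
import requests

SUBGRAPH_URL = "https://example.com/subgraphs/name/example"  # placeholder endpoint

QUERY = """
query CursorPagination($first: Int!, $lastId: String!) {
  entities(first: $first, where: { id_gt: $lastId }, orderBy: id) {
    id
  }
}
"""

def fetch_all(page_size: int = 100) -> list:
    """Walk the collection in id order, using the last id seen as the cursor."""
    last_id, results = "", []
    while True:
        resp = requests.post(
            SUBGRAPH_URL,
            json={"query": QUERY, "variables": {"first": page_size, "lastId": last_id}},
            timeout=30,
        )
        resp.raise_for_status()
        page = resp.json()["data"]["entities"]
        if not page:
            return results
        results.extend(page)
        last_id = page[-1]["id"]
```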
## Filtering Patterns
### Time Range Queries
```graphql
query TimeRangeQuery($startTime: Int!, $endTime: Int!) {
entities(
where: {
timestamp_gte: $startTime
timestamp_lte: $endTime
}
orderBy: timestamp
orderDirection: desc
) {
id
timestamp
# ... other fields
}
}
```
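Subgraph timestamps are usually Unix seconds, so the variables for a query like this can be computed with the standard library; a small sketch for "the last 7 days":
```python
import time

SECONDS_PER_DAY = 86_400
end_time = int(time.time())                  # now, in Unix seconds
start_time = end_time - 7 * SECONDS_PER_DAY  # 7 days ago

variables = {"startTime": start_time, "endTime": end_time}
```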
### Address-based Queries
```graphql
query UserActivity($userAddress: String!) {
# Query by exact address
transactions(where: { from: $userAddress }) {
id
to
value
timestamp
}
# Query by account ID (addresses are typically stored lowercase)
accounts(where: { id: $userAddress }) {
id
balance
# ... other fields
}
}
```
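Entity IDs are matched exactly, and most subgraphs store addresses lowercased, so it is usually safest to normalize a checksummed address before passing it as `$userAddress` (a minimal sketch):
```python
user_address = "0xAb5801a7D398351b8bE11C439e05C5B3259aec9B"  # checksummed input (example value)
variables = {"userAddress": user_address.lower()}            # IDs are typically stored lowercase
```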
### Complex Filters
```graphql
query ComplexFilter($minAmount: String!, $tokenAddress: String!) {
swaps(
where: {
amountUSD_gte: $minAmount
pair_: {
token0: $tokenAddress
}
}
first: 100
orderBy: timestamp
orderDirection: desc
) {
id
amountUSD
timestamp
pair {
token0 { symbol }
token1 { symbol }
}
}
}
```
## Aggregation Patterns
### Daily Aggregates
```graphql
query DailyStats($dayId: Int!) {
dayData(id: $dayId) {
id
date
dailyVolumeUSD
dailyTxns
totalLiquidityUSD
}
}
```
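In many Uniswap-style subgraphs the day entity's ID is derived from the Unix timestamp integer-divided by 86400; check the specific schema, but assuming that convention, today's `$dayId` can be computed as:
```python
import time

day_id = int(time.time()) // 86_400  # whole days since the Unix epoch
variables = {"dayId": day_id}
```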
### Sum and Count
```graphql
query Aggregations {
protocol(id: "1") {
totalValueLockedUSD
txCount
pairCount
totalVolumeUSD
}
}
```
## Relationship Navigation
### Nested Queries
```graphql
query NestedRelationships($pairId: String!) {
pair(id: $pairId) {
id
token0 {
id
symbol
name
decimals
}
token1 {
id
symbol
name
decimals
}
reserve0
reserve1
totalSupply
# Related entities
swaps(first: 10, orderBy: timestamp, orderDirection: desc) {
id
timestamp
amount0In
amount1In
amount0Out
amount1Out
amountUSD
}
}
}
```
## Performance Best Practices
### Limit Response Size
- Use `first` parameter to limit results (max typically 1000, recommended 100)
- Paginate large datasets using `skip` or cursor-based pagination
- Avoid querying all items in large collections
### Use Specific Fields
```graphql
# Good - only request needed fields
query EfficientQuery {
tokens(first: 10) {
id
symbol
name
}
}
# Avoid - requesting unnecessary nested data
query InefficientQuery {
tokens(first: 10) {
id
symbol
name
pairs {
swaps {
id
# Selecting nested swap lists like this can return massive amounts of data
}
}
}
}
```
### Use Aggregated Fields
```graphql
# Good - use pre-aggregated data
query GoodAggregation($contractId: String!) {
erc721Contract(id: $contractId) {
holders # Pre-calculated count
stakedHolders # Pre-calculated count
totalSupply { value }
}
}
# Avoid - counting individual records
query BadAggregation($contractId: String!) {
erc721Tokens(where: { contract: $contractId }) {
# Counting these manually is inefficient
id
}
}
```
## Common Query Scenarios
### Latest Activity
```graphql
query LatestActivity($limit: Int = 20) {
transactions(
first: $limit
orderBy: timestamp
orderDirection: desc
) {
id
timestamp
blockNumber
# ... transaction details
}
}
```
### User Portfolio
```graphql
query UserPortfolio($userAddress: String!) {
account(id: $userAddress) {
id
# ERC20 balances
erc20Balances(where: { value_gt: "0" }) {
token {
id
symbol
name
}
value
}
# ERC721 (NFT) balances
erc721Balances(where: { value_gt: "0" }) {
contract {
id
name
symbol
}
value
}
}
}
```
### Price and Volume Analysis
```graphql
query PriceVolume($pairAddress: String!, $days: Int = 7) {
pair(id: $pairAddress) {
token0 { symbol }
token1 { symbol }
token0Price
token1Price
volumeUSD
reserveUSD
# Daily historical data
pairDayData(
first: $days
orderBy: date
orderDirection: desc
) {
date
dailyVolumeUSD
reserveUSD
}
}
}
```
## Error Handling
### Common GraphQL Errors
1. **Field doesn't exist**: Check schema with introspection
2. **Type mismatch**: Ensure variable types match schema (String vs Int vs BigInt)
3. **Query too complex**: Simplify nested queries or reduce `first` parameter
4. **Timeout**: Reduce query complexity or add more specific filters
### Debugging Queries
Use introspection to explore the schema:
```graphql
query IntrospectTypes {
__schema {
types {
name
kind
}
}
}
query IntrospectType($typeName: String!) {
__type(name: $typeName) {
name
fields {
name
type {
name
kind
}
}
}
}
```
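Outside the MCP tools, the same introspection query can be posted directly to an endpoint. A sketch assuming the `requests` package and a placeholder URL, printing only the subgraph's own types (GraphQL's built-ins start with `__`):
```python
import requests

SUBGRAPH_URL = "https://example.com/subgraphs/name/example"  # placeholder endpoint

INTROSPECTION = "query IntrospectTypes { __schema { types { name kind } } }"

resp = requests.post(SUBGRAPH_URL, json={"query": INTROSPECTION}, timeout=30)
resp.raise_for_status()

for t in resp.json()["data"]["__schema"]["types"]:
    if not t["name"].startswith("__"):  # skip introspection/internal types
        print(f'{t["kind"]:<12} {t["name"]}')
```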

skills/subgraph-explorer/scripts/check_mcp_status.sh

@@ -0,0 +1,51 @@
#!/bin/bash
#
# Check the status of the Subgraph MCP Server
#
# This script checks if the subgraph-mcp Docker container is running
# and verifies that the endpoints are accessible.
set -e
echo "🔍 Checking Subgraph MCP Server status..."
echo ""
# Check if container is running
if docker ps | grep -q "subgraph-mcp-server"; then
    echo "✅ Container: Running"

    # Get container uptime
    UPTIME=$(docker ps --filter "name=subgraph-mcp-server" --format "{{.Status}}")
    echo " Status: $UPTIME"

    # Check SSE endpoint
    echo ""
    echo "🌐 Checking endpoints..."
    if curl -s -f http://localhost:8000 > /dev/null 2>&1; then
        echo " ✅ SSE endpoint (port 8000): Accessible"
    else
        echo " ⚠️ SSE endpoint (port 8000): Not responding"
    fi

    # Check metrics endpoint
    if curl -s -f http://localhost:9091/metrics > /dev/null 2>&1; then
        echo " ✅ Metrics endpoint (port 9091): Accessible"
    else
        echo " ⚠️ Metrics endpoint (port 9091): Not responding"
    fi

    echo ""
    echo "📊 Recent logs:"
    docker logs --tail 10 subgraph-mcp-server
elif docker ps -a | grep -q "subgraph-mcp-server"; then
    echo "❌ Container: Stopped"
    echo ""
    echo " Start with: scripts/start_mcp_server.sh"
    echo " Or: docker-compose up -d (from subgraph-mcp directory)"
else
    echo "❌ Container: Not found"
    echo ""
    echo " The subgraph-mcp-server container doesn't exist."
    echo " Start it with: scripts/start_mcp_server.sh"
fi

skills/subgraph-explorer/scripts/export_query.py

@@ -0,0 +1,160 @@
#!/usr/bin/env python3
"""
Export GraphQL Query Utility
This script helps export discovered GraphQL queries into reusable formats
for easy integration into projects. It supports multiple output formats:
- JavaScript/TypeScript module
- Python module
- Plain GraphQL file
- JSON format with metadata
Usage:
python3 export_query.py <output_file> [--format js|py|graphql|json] [--name QueryName]
Then paste your GraphQL query when prompted, and press Ctrl+D (Unix) or Ctrl+Z (Windows) when done.
Examples:
# Export as JavaScript
python3 export_query.py queries/myQuery.js --format js --name GetLatestSwaps
# Export as Python
python3 export_query.py queries/myQuery.py --format py --name get_latest_swaps
# Export as plain GraphQL
python3 export_query.py queries/myQuery.graphql --format graphql
"""
import sys
import argparse
import json
from pathlib import Path
from datetime import datetime
def format_js(query: str, name: str, description: str = "") -> str:
    """Format query as JavaScript/TypeScript module."""
    comment = f"/**\n * {description}\n */\n" if description else ""
    return f"""{comment}export const {name} = `
{query}
`;
"""
def format_python(query: str, name: str, description: str = "") -> str:
    """Format query as Python module."""
    # Emit the description as a module-level docstring so the generated file stays valid Python.
    docstring = f'"""{description}"""\n\n' if description else ""
    return f'''{docstring}{name} = """
{query}
"""
'''
def format_graphql(query: str, description: str = "") -> str:
    """Format as plain GraphQL with optional comment."""
    comment = f"# {description}\n\n" if description else ""
    return f"{comment}{query}\n"
def format_json(query: str, name: str, description: str = "", variables: dict = None) -> str:
    """Format as JSON with metadata."""
    data = {
        "name": name,
        "description": description,
        "query": query,
        "variables": variables or {},
        "exported_at": datetime.now().isoformat()
    }
    return json.dumps(data, indent=2)
def read_multiline_input(prompt: str) -> str:
    """Read multiline input from stdin."""
    print(prompt)
    print("(Press Ctrl+D on Unix/Mac or Ctrl+Z on Windows when done)")
    print("-" * 60)
    lines = []
    try:
        while True:
            line = input()
            lines.append(line)
    except EOFError:
        pass
    print("-" * 60)
    return "\n".join(lines).strip()
def main():
    parser = argparse.ArgumentParser(
        description="Export GraphQL queries into various formats for project use",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog=__doc__
    )
    parser.add_argument("output", help="Output file path")
    parser.add_argument(
        "--format",
        choices=["js", "py", "graphql", "json"],
        help="Output format (default: inferred from file extension)"
    )
    parser.add_argument("--name", help="Query name (required for js/py/json formats)")
    parser.add_argument("--description", default="", help="Query description")
    parser.add_argument("--variables", help="JSON string of query variables (for json format)")
    args = parser.parse_args()

    # Infer format from extension if not provided
    output_path = Path(args.output)
    format_type = args.format
    if not format_type:
        ext = output_path.suffix.lower()
        format_map = {".js": "js", ".ts": "js", ".py": "py", ".graphql": "graphql", ".gql": "graphql", ".json": "json"}
        format_type = format_map.get(ext)
        if not format_type:
            print(f"❌ Error: Cannot infer format from extension '{ext}'. Please specify --format", file=sys.stderr)
            sys.exit(1)

    # Validate name requirement
    if format_type in ["js", "py", "json"] and not args.name:
        print(f"❌ Error: --name is required for {format_type} format", file=sys.stderr)
        sys.exit(1)

    # Read the GraphQL query
    query = read_multiline_input("\n📝 Paste your GraphQL query:")
    if not query:
        print("❌ Error: No query provided", file=sys.stderr)
        sys.exit(1)

    # Parse variables if provided
    variables = None
    if args.variables:
        try:
            variables = json.loads(args.variables)
        except json.JSONDecodeError as e:
            print(f"❌ Error: Invalid JSON in --variables: {e}", file=sys.stderr)
            sys.exit(1)

    # Format the output
    formatters = {
        "js": lambda: format_js(query, args.name, args.description),
        "py": lambda: format_python(query, args.name, args.description),
        "graphql": lambda: format_graphql(query, args.description),
        "json": lambda: format_json(query, args.name, args.description, variables)
    }
    output = formatters[format_type]()

    # Create parent directory if it doesn't exist
    output_path.parent.mkdir(parents=True, exist_ok=True)

    # Write the output
    output_path.write_text(output)
    print(f"\n✅ Query exported successfully to: {output_path}")
    print(f" Format: {format_type}")
    if args.name:
        print(f" Name: {args.name}")


if __name__ == "__main__":
    main()

skills/subgraph-explorer/scripts/start_mcp_server.sh

@@ -0,0 +1,62 @@
#!/bin/bash
#
# Start the Subgraph MCP Server using Docker Compose
#
# This script starts the subgraph-mcp server in Docker, which exposes:
# - Port 8000: SSE endpoint for MCP communication
# - Port 9091: Prometheus metrics endpoint
#
# The server runs in SSE mode and uses the subgraphs.json configuration
# from the subgraph-mcp project directory.
set -e
# Default path to subgraph-mcp project
SUBGRAPH_MCP_PATH="${SUBGRAPH_MCP_PATH:-$HOME/Workspace/subgraph-mcp}"
echo "🚀 Starting Subgraph MCP Server..."
echo " Project path: $SUBGRAPH_MCP_PATH"
# Check if the directory exists
if [ ! -d "$SUBGRAPH_MCP_PATH" ]; then
    echo "❌ Error: Directory not found: $SUBGRAPH_MCP_PATH"
    echo " Set SUBGRAPH_MCP_PATH environment variable to the correct path"
    exit 1
fi

# Check if docker-compose.yml exists
if [ ! -f "$SUBGRAPH_MCP_PATH/docker-compose.yml" ]; then
    echo "❌ Error: docker-compose.yml not found in $SUBGRAPH_MCP_PATH"
    exit 1
fi

# Check if subgraphs.json exists
if [ ! -f "$SUBGRAPH_MCP_PATH/subgraphs.json" ]; then
    echo "⚠️ Warning: subgraphs.json not found in $SUBGRAPH_MCP_PATH"
    echo " The server may not work properly without configuration"
fi

# Change to the project directory
cd "$SUBGRAPH_MCP_PATH"

# Start the server using docker-compose
echo " Starting Docker container..."
docker-compose up -d

# Wait for the server to be ready
echo " Waiting for server to be ready..."
sleep 3

# Check if the container is running
if docker ps | grep -q "subgraph-mcp-server"; then
    echo "✅ Subgraph MCP Server started successfully"
    echo " SSE endpoint: http://localhost:8000"
    echo " Metrics endpoint: http://localhost:9091/metrics"
    echo ""
    echo " View logs: docker logs -f subgraph-mcp-server"
    echo " Stop server: docker-compose down (from $SUBGRAPH_MCP_PATH)"
else
    echo "❌ Failed to start server. Check logs with:"
    echo " docker logs subgraph-mcp-server"
    exit 1
fi

skills/subgraph-explorer/scripts/stop_mcp_server.sh

@@ -0,0 +1,28 @@
#!/bin/bash
#
# Stop the Subgraph MCP Server
#
# This script stops the subgraph-mcp Docker container and cleans up resources.
set -e
# Default path to subgraph-mcp project
SUBGRAPH_MCP_PATH="${SUBGRAPH_MCP_PATH:-$HOME/Workspace/subgraph-mcp}"
echo "🛑 Stopping Subgraph MCP Server..."
echo " Project path: $SUBGRAPH_MCP_PATH"
# Check if the directory exists
if [ ! -d "$SUBGRAPH_MCP_PATH" ]; then
    echo "❌ Error: Directory not found: $SUBGRAPH_MCP_PATH"
    echo " Set SUBGRAPH_MCP_PATH environment variable to the correct path"
    exit 1
fi
# Change to the project directory
cd "$SUBGRAPH_MCP_PATH"
# Stop the server
docker-compose down
echo "✅ Subgraph MCP Server stopped"