Initial commit

Author: Zhongwei Li
Date: 2025-11-30 08:42:03 +08:00
Commit: e32589a11a
8 changed files with 387 additions and 0 deletions

commands/extract.md

@@ -0,0 +1,36 @@
---
description: Extract structured data from a URL using LLM
argument-hint: <url> --prompt "..." [--schema '{...}']
---
# Firecrawl Extract
Extract data from: `$ARGUMENTS`
## Instructions
1. Parse the arguments (a parsing sketch follows these steps):
- First argument is the URL to extract from
- Required `--prompt "text"` - What to extract
- Optional `--schema '{...}'` - JSON Schema for structured output
2. Run the CLI command:
```bash
cd /Users/nathanvale/code/side-quest-marketplace/plugins/firecrawl && bun run src/cli.ts extract $ARGUMENTS
```
3. Present the extracted data to the user
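As a reference for step 1, here is a minimal TypeScript sketch of the argument split, assuming a hypothetical `parseExtractArgs` helper; it is not part of the plugin's actual CLI.
```typescript
// Hypothetical sketch of step 1: split the tokens in $ARGUMENTS into the URL,
// the required --prompt value, and the optional --schema JSON. Illustrative only.
interface ExtractArgs {
  url: string;
  prompt: string;
  schema?: Record<string, unknown>;
}

function parseExtractArgs(tokens: string[]): ExtractArgs {
  const [url, ...rest] = tokens;
  if (!url) throw new Error("A URL is required");

  let prompt: string | undefined;
  let schema: Record<string, unknown> | undefined;

  for (let i = 0; i < rest.length; i++) {
    if (rest[i] === "--prompt") prompt = rest[++i];
    else if (rest[i] === "--schema") schema = JSON.parse(rest[++i] ?? "{}");
  }

  if (!prompt) throw new Error("--prompt is required");
  return { url, prompt, schema };
}
```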
## Example Usage
```
/firecrawl:extract https://example.com --prompt "Extract the main heading and description"
/firecrawl:extract https://store.example.com/product --prompt "Extract price and name" --schema '{"price": {"type": "number"}, "name": {"type": "string"}}'
```
## Notes
- Extraction is asynchronous - the command polls for completion (see the sketch below)
- May take 10-30 seconds depending on page complexity
- Use `--schema` for type-safe structured output
- Requires the `FIRECRAWL_API_KEY` environment variable
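Because the notes describe extraction as asynchronous and polled, here is a generic poll-until-done sketch. The `checkJob` callback, interval, and timeout are assumptions for illustration and do not describe the Firecrawl API itself.
```typescript
// Generic polling sketch (illustrative only): call checkJob until it reports
// completion or the timeout is hit. The job-status shape is assumed.
type JobStatus<T> = { done: boolean; data?: T };

async function pollUntilDone<T>(
  checkJob: () => Promise<JobStatus<T>>,
  intervalMs = 2_000,
  timeoutMs = 60_000,
): Promise<T> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const status = await checkJob();
    if (status.done && status.data !== undefined) return status.data;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Extraction did not complete within the timeout");
}
```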

commands/map.md

@@ -0,0 +1,35 @@
---
description: Discover all URLs on a website
argument-hint: <url> [--limit N]
---
# Firecrawl Map
Map URLs on the website: `$ARGUMENTS`
## Instructions
1. Parse the arguments:
- First argument is the base URL to map
- Optional `--limit N` flag (default: 100)
2. Run the CLI command:
```bash
cd /Users/nathanvale/code/side-quest-marketplace/plugins/firecrawl && bun run src/cli.ts map $ARGUMENTS
```
3. Present the discovered URLs to the user
## Example Usage
```
/firecrawl:map https://docs.example.com
/firecrawl:map https://example.com --limit 50
```
## Notes
- Returns up to 50 URLs in the output (configurable via `--limit`)
- Includes page titles when available
- Useful for understanding site structure before scraping specific pages (see the workflow sketch below)
- Requires the `FIRECRAWL_API_KEY` environment variable
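To illustrate the map-then-scrape workflow mentioned above, here is a sketch that shells out to the same CLI twice via `node:child_process`. The assumption that `map` prints one URL per line, and the `/docs/` filter, are illustrative only.
```typescript
// Illustrative workflow sketch: discover URLs with `map`, then scrape a subset.
// Assumes the CLI prints one URL per line; adjust the parsing to the real output.
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const run = promisify(execFile);
const pluginDir = "/Users/nathanvale/code/side-quest-marketplace/plugins/firecrawl";

async function mapThenScrape(baseUrl: string): Promise<void> {
  // Discover URLs first (same CLI call as above).
  const mapped = await run("bun", ["run", "src/cli.ts", "map", baseUrl, "--limit", "50"], {
    cwd: pluginDir,
  });

  // Keep only documentation pages and scrape just a few of them.
  const docUrls = mapped.stdout
    .split("\n")
    .filter((line) => line.startsWith("http") && line.includes("/docs/"))
    .slice(0, 3);

  for (const url of docUrls) {
    const page = await run("bun", ["run", "src/cli.ts", "scrape", url], { cwd: pluginDir });
    console.log(page.stdout);
  }
}
```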

commands/scrape.md

@@ -0,0 +1,34 @@
---
description: Scrape content from a URL as markdown
argument-hint: <url> [--format markdown|summary]
---
# Firecrawl Scrape
Scrape content from the URL: `$ARGUMENTS`
## Instructions
1. Parse the arguments:
- First argument is the URL to scrape
- Optional `--format` flag: `markdown` (default) or `summary`
2. Run the CLI command:
```bash
cd /Users/nathanvale/code/side-quest-marketplace/plugins/firecrawl && bun run src/cli.ts scrape $ARGUMENTS
```
3. Present the results to the user in a clean format
## Example Usage
```
/firecrawl:scrape https://docs.example.com/api
/firecrawl:scrape https://example.com --format summary
```
## Notes
- Returns markdown content by default
- Content is truncated at 8000 characters to save tokens (see the sketch below)
- Requires the `FIRECRAWL_API_KEY` environment variable
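As a rough illustration of the truncation note above, here is a sketch of trimming markdown to a character budget. The 8000-character limit matches the note; cutting at the last newline is an assumption about how one might avoid splitting a line.
```typescript
// Illustrative sketch: trim scraped markdown to a character budget. The
// cut-at-last-newline behaviour is an assumption, not the plugin's actual logic.
function truncateMarkdown(markdown: string, maxChars = 8_000): string {
  if (markdown.length <= maxChars) return markdown;

  const slice = markdown.slice(0, maxChars);
  const lastNewline = slice.lastIndexOf("\n");
  const cut = lastNewline > 0 ? slice.slice(0, lastNewline) : slice;

  return `${cut}\n\n> [content truncated at ${maxChars} characters]`;
}
```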

commands/search.md

@@ -0,0 +1,43 @@
---
description: Search the web and get results with content
argument-hint: <query> [--limit N]
---
# Firecrawl Search
Search the web for: `$ARGUMENTS`
## Instructions
1. Parse the arguments:
- All non-flag arguments form the search query (see the sketch after these steps)
- Optional `--limit N` flag (default: 5)
2. Run the CLI command:
```bash
cd /Users/nathanvale/code/side-quest-marketplace/plugins/firecrawl && bun run src/cli.ts search $ARGUMENTS
```
3. Present the search results to the user
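Here is a minimal sketch of the split described in step 1: non-flag tokens join to form the query and `--limit` is read as a number, falling back to the documented default of 5. The helper name is illustrative.
```typescript
// Illustrative sketch of step 1: non-flag tokens become the query string;
// --limit is parsed as a number with the documented default of 5.
function parseSearchArgs(tokens: string[]): { query: string; limit: number } {
  const queryParts: string[] = [];
  let limit = 5;

  for (let i = 0; i < tokens.length; i++) {
    if (tokens[i] === "--limit") {
      limit = Number(tokens[++i]) || 5;
    } else {
      queryParts.push(tokens[i]);
    }
  }

  return { query: queryParts.join(" "), limit };
}

// parseSearchArgs(['"react hooks"', "--limit", "10"])
// -> { query: '"react hooks"', limit: 10 }
```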
## Example Usage
```
/firecrawl:search typescript error handling best practices
/firecrawl:search "react hooks" --limit 10
/firecrawl:search site:github.com firecrawl examples
```
## Search Operators
- `"exact phrase"` - Match exact phrase
- `-term` - Exclude term
- `site:example.com` - Limit to domain
- `intitle:word` - Word in page title
- `inurl:word` - Word in URL
## Notes
- Returns up to 10 results with titles, URLs, and descriptions (a formatting sketch follows)
- Includes scraped markdown content when available
- Requires the `FIRECRAWL_API_KEY` environment variable
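To make step 3 concrete, here is a sketch of rendering results for the user. The `SearchResult` field names are inferred from the notes above (title, URL, description, optional markdown) and are assumptions, not the CLI's actual output schema.
```typescript
// Illustrative sketch: render search results as a numbered markdown list.
// The field names are assumed from the notes, not taken from the CLI output.
interface SearchResult {
  title: string;
  url: string;
  description: string;
  markdown?: string;
}

function formatResults(results: SearchResult[]): string {
  return results
    .map((r, i) => {
      const snippet = r.markdown ? `\n   > ${r.markdown.slice(0, 200)}...` : "";
      return `${i + 1}. [${r.title}](${r.url})\n   ${r.description}${snippet}`;
    })
    .join("\n\n");
}
```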