Initial commit
12
.claude-plugin/plugin.json
Normal file
@@ -0,0 +1,12 @@
{
  "name": "ai-news-crawler",
  "description": "Trigger phrase \"watching news!\"; create a .json file named by creation time only in the current directory, fetch the 10 most recent AI news articles in descending order by time, and output a Chinese summary.",
  "version": "1.0.0",
  "author": {
    "name": "zijie",
    "email": "zijie@feedmob.com"
  },
  "skills": [
    "./skills/SKILL.md"
  ]
}
3
README.md
Normal file
@@ -0,0 +1,3 @@
# ai-news-crawler

Trigger phrase "watching news!"; create a .json file named by creation time only in the current directory, fetch the 10 most recent AI news articles in descending order by time, and output a Chinese summary.
45
plugin.lock.json
Normal file
@@ -0,0 +1,45 @@
{
  "$schema": "internal://schemas/plugin.lock.v1.json",
  "pluginId": "gh:feed-mob/claude-code-marketplace:plugins/ai-news-crawler",
  "normalized": {
    "repo": null,
    "ref": "refs/tags/v20251128.0",
    "commit": "fbf2e0437a012d7bee519ec2bd0aa7da73b8bdf0",
    "treeHash": "8edf912be8cbfac1aba5d313adefb808481978167e6be3f9def705dcf9b34319",
    "generatedAt": "2025-11-28T10:16:52.795563Z",
    "toolVersion": "publish_plugins.py@0.2.0"
  },
  "origin": {
    "remote": "git@github.com:zhongweili/42plugin-data.git",
    "branch": "master",
    "commit": "aa1497ed0949fd50e99e70d6324a29c5b34f9390",
    "repoRoot": "/Users/zhongweili/projects/openmind/42plugin-data"
  },
  "manifest": {
    "name": "ai-news-crawler",
    "description": "Trigger phrase \"watching news!\"; create a .json file named by creation time only in the current directory, fetch the 10 most recent AI news articles in descending order by time, and output a Chinese summary.",
    "version": "1.0.0"
  },
  "content": {
    "files": [
      {
        "path": "README.md",
        "sha256": "38c8f5a6e19615e2b757f5bf43a00f630e5e6408dd3f63baf31e5651f72865a6"
      },
      {
        "path": ".claude-plugin/plugin.json",
        "sha256": "060457ec87281f322e0a759dc3cf8e8f5b327ad75d0c225718ccb8cd3f8514af"
      },
      {
        "path": "skills/SKILL.md",
        "sha256": "0c4844f2e236b4678d9bc5abdaf520a9e163a9092638de0ae0787b6d524f06ff"
      }
    ],
    "dirSha256": "8edf912be8cbfac1aba5d313adefb808481978167e6be3f9def705dcf9b34319"
  },
  "security": {
    "scannedAt": null,
    "scannerVersion": null,
    "flags": []
  }
}
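The content block in this lockfile pins each packaged file to a sha256 digest. A minimal verification sketch in Python, assuming each per-file digest is a plain SHA-256 over the file bytes relative to the plugin root; the dirSha256/treeHash aggregation is specific to publish_plugins.py and is not reproduced here:

```python
import hashlib
import json
from pathlib import Path

def verify_plugin_lock(lock_path: str = "plugin.lock.json", root: str = ".") -> bool:
    """Recompute each packaged file's SHA-256 and compare it with the lockfile entry."""
    lock = json.loads(Path(lock_path).read_text(encoding="utf-8"))
    ok = True
    for entry in lock["content"]["files"]:
        file_path = Path(root) / entry["path"]
        digest = hashlib.sha256(file_path.read_bytes()).hexdigest()
        if digest != entry["sha256"]:
            print(f"sha256 mismatch for {entry['path']}")
            ok = False
    # dirSha256 / treeHash aggregation is tool-specific and not checked here.
    return ok
```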
52
skills/SKILL.md
Normal file
@@ -0,0 +1,52 @@
---
name: ai-news-crawler
description: Trigger phrase "watching news!"; create a JSON file named with the current timestamp only in the current directory, crawl the 10 most recent AI-related news articles in descending order of time, and output Chinese summaries
---

# Purpose
Crawl the latest AI-related news from multiple websites, merge and deduplicate them, select 10 in descending order of time, unify summaries into Chinese, and write to a JSON file.

# When to Trigger
When the request contains the phrase "watching news!".

# Inputs
- keywords (optional): topic keywords
- time_window (optional): time window (default: 144h)

# Output
- news_json_file_only

# Constraints
- Create only one file, <YYYYMMDD_HHMMSS>.json, in the current working directory; do not output any text or paths in the conversation.
- Each record must include the fields title, source, url, published_at (ISO 8601), summary_zh, and language (zh/en/mixed); see the record sketch after this list.
- Fixed quantity: 10 entries, sorted in descending order of published_at; strict deduplication (using url and title as keys).
- Summaries of English content must be in Chinese; retain the original language identifier in the language field.
- Avoid crawling paywalled or login-required pages; exclude ads and sponsored content; do not write sensitive information or keys.
- If the publication time is missing, use the explicit time on the page; if none is available, exclude the entry or mark published_at: null and lower its priority; do not guess.

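A minimal sketch of the record shape and the deduplication key implied by the constraints above; the field names come from this file, while NewsRecord and dedup_key are illustrative helpers, not part of the plugin:

```python
from typing import Optional, TypedDict

class NewsRecord(TypedDict):
    """One entry of the output JSON array, using the required field names."""
    title: str
    source: str
    url: str
    published_at: Optional[str]  # ISO 8601 string, or None when no explicit time is found
    summary_zh: str
    language: str                # "zh", "en", or "mixed"

def dedup_key(record: NewsRecord) -> tuple[str, str]:
    """Deduplication uses url and title as keys, per the constraints."""
    return (record["url"].strip().lower(), record["title"].strip().lower())
```
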
# Candidate Sources
- The Verge / AI tag
- TechCrunch / AI tag
- Liangziwei (Quantum Bits) / AI section
- 36Kr / Artificial Intelligence tag
- Official Announcements / Anthropic, Google AI Blog (select latest announcements)

# Steps
1. Confirm time_window and keywords (if provided).
2. Iterate through the candidate sources, crawl their recent content lists, and parse title/url/published_at and body summaries.
3. Condense and translate English summaries into Chinese, maintaining neutrality and information density.
4. Normalize timestamps to ISO 8601, merge the lists, and deduplicate by url/title.
5. Sort in descending order of published_at and select the top 10 entries; write them to <YYYYMMDD_HHMMSS>.json as a JSON array (see the sketch after these steps).
6. Do not output any text in the conversation; only create the file.

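A minimal sketch of steps 4-6, assuming step 2 has already produced a list of record dicts with the fields defined in the constraints; write_top_ten is an illustrative helper, not part of the plugin:

```python
import json
from datetime import datetime, timezone

def write_top_ten(records: list[dict]) -> str:
    """Deduplicate, sort descending by published_at, keep 10, and write <YYYYMMDD_HHMMSS>.json."""
    seen: set[tuple[str, str]] = set()
    unique: list[dict] = []
    for rec in records:
        key = (rec["url"].strip().lower(), rec["title"].strip().lower())
        if key not in seen:
            seen.add(key)
            unique.append(rec)

    # Records with published_at: null sort last, so they are trimmed first when cutting to 10.
    unique.sort(key=lambda r: r.get("published_at") or "", reverse=True)
    top = unique[:10]

    filename = datetime.now(timezone.utc).strftime("%Y%m%d_%H%M%S") + ".json"
    with open(filename, "w", encoding="utf-8") as fh:
        json.dump(top, fh, ensure_ascii=False, indent=2)
    return filename  # per the constraints, the skill itself prints nothing to the conversation
```
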
# Output Example (Structure)
[
  {
    "title": "示例标题",
    "source": "TechCrunch",
    "url": "https://example.com/news/123",
    "published_at": "2025-11-16T08:30:00Z",
    "summary_zh": "用中文概括要点,80-150字,覆盖背景、事件与影响。",
    "language": "en"
  }
]