Intelligent daily news digest skill. Supports two news-acquisition modes: (1) active search, which retrieves information across the web via multiple AI APIs based on the user's topics, keywords, and followed people; (2) RSS subscription, which automatically fetches news from configured RSS/Atom sources. Supports smart deduplication, interest management, AI summarization, and scheduled push delivery. Trigger keywords: "newsman", "新闻简报", ...
Security Analysis
Medium confidence. The skill's code and SKILL.md match its stated news-aggregator purpose, but the registry metadata omits several required sensitive environment variables (multiple LLM API keys and a gateway token); that mismatch, combined with broad external API use, warrants caution.
The name/description (news aggregator with AI search, RSS, and push delivery) aligns with the included scripts (RSS fetcher, summarizer, search engine, digest generator). However, the registry metadata claims no required environment variables or primary credential, while SKILL.md and multiple code files clearly expect several API keys (KIMI_API_KEY, MINIMAX_API_KEY, ANTHROPIC_API_KEY) and push gateway credentials. The presence of multiple LLM provider clients is coherent for a multi-provider search aggregator, but the metadata omission is an inconsistency and should be corrected or clarified.
Runtime instructions and code focus on fetching RSS feeds, running web search via the configured LLM APIs, summarizing and formatting digests, caching results, and optionally pushing digests to channels (Slack/OpenClaw gateway). SKILL.md and the scripts reference only local config, feed sources, caches, and the configured push endpoints; nothing instructs the agent to read arbitrary system files or hidden data. The cron examples schedule local script execution. Reviewing the remaining truncated orchestrator scripts (newsman.py, search_engine.py, result_aggregator.py) is recommended to confirm there are no unexpected operations, but nothing in the provided files indicates clandestine data collection beyond the configured sources and delivery channels.
No install spec in the registry (instruction-only install). The README instructs creating a venv and pip-installing requirements.txt, a normal, low-risk install path. requirements.txt lists well-known packages (openai, anthropic, requests, beautifulsoup4). There are no downloads from arbitrary URLs and no extract/install steps that would write unknown binaries to disk.
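A quick sanity check on the dependency list can be sketched as follows. The package names come from the review above; the requirements.txt written here is an illustrative fixture, so point the check at the real file from the skill instead:

```shell
# Sketch: before running `pip install -r requirements.txt`, confirm the file
# lists only the packages the review names. The fixture below is illustrative;
# replace it with the skill's actual requirements.txt.
workdir=$(mktemp -d)
printf 'openai\nanthropic\nrequests\nbeautifulsoup4\n' > "$workdir/requirements.txt"
# Any line that is not one of the expected package names (optionally with a
# version specifier) is flagged for manual review:
unexpected=$(grep -vE '^(openai|anthropic|requests|beautifulsoup4)([<>=~! ].*)?$' \
             "$workdir/requirements.txt" || true)
if [ -z "$unexpected" ]; then
  verdict="requirements OK"
else
  verdict="review these entries: $unexpected"
fi
echo "$verdict"
```

This catches typosquatted or silently added dependencies before anything is installed; it does not replace reviewing the packages themselves.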
Requiring API keys for multiple LLM/search providers is reasonable for a multi-backend news search/summarization tool, but the package metadata advertises no required env vars while SKILL.md and the code require several sensitive values (KIMI_API_KEY, MINIMAX_API_KEY, ANTHROPIC_API_KEY, plus GATEWAY_URL/GATEWAY_TOKEN for push). The gateway token in particular could allow posting messages through an OpenClaw gateway or similar; ensure any token you provide is narrowly scoped. The number and sensitivity of the required credentials should be documented in the registry metadata so users can decide before installing.
The skill does not request 'always: true' and is user-invocable only; it does provide cron examples for scheduled runs, which require explicit user action to set up. There is no evidence it attempts to modify other skills or system-wide agent settings. Autonomous invocation is allowed by platform defaults but is not combined here with other high-risk indicators.
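The scheduled-run setup mentioned above can be sketched as a crontab entry. newsman.py is named in the package, but the working directory, venv path, and log file here are assumptions; the entry is printed for review rather than installed:

```shell
# Illustrative daily 07:00 run. Paths and the log file are assumptions;
# only the script name newsman.py comes from the package.
CRON_LINE='0 7 * * * cd "$HOME/claw-news" && .venv/bin/python newsman.py >> digest.log 2>&1'
echo "$CRON_LINE"
# After reviewing the line, install it explicitly with:
#   (crontab -l 2>/dev/null; echo "$CRON_LINE") | crontab -
```

Keeping the install step manual matches the review's point: scheduling requires explicit user action and is never set up by the skill itself.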
Guidance
What to check before installing:
- Metadata mismatch: the registry lists no required env vars, but SKILL.md and the code require multiple API keys (KIMI_API_KEY, MINIMAX_API_KEY, ANTHROPIC_API_KEY) and gateway credentials (GATEWAY_URL, GATEWAY_TOKEN). Ask the publisher to correct the metadata or document the required secrets clearly.
- Limit credential scope: provide only API keys with minimal permissions and monitor their usage. For the gateway token, prefer one scoped to a single delivery channel, or create a dedicated service account for digests.
- Run in isolation: install and run the skill in a dedicated Python virtualenv or other isolated environment, and avoid reusing high-privilege credentials used elsewhere.
- Review push code: before handing over GATEWAY_TOKEN, inspect the delivery scripts (search for code that posts to GATEWAY_URL or sends to Slack) to confirm they transmit only the configured digest content and do not exfiltrate other local files or secrets.
- Dry-run first: use the provided --dry-run options and run summarization in extractive/local mode (no API keys) to verify behavior before enabling external APIs.
- Audit network traffic: if possible, monitor outbound connections during an initial run to confirm requests go only to expected endpoints (the configured RSS feeds and the declared API hosts such as api.moonshot.cn, api.minimax.chat, api.anthropic.com).
- Request missing details: if you need higher assurance, ask the skill author for the full content of the orchestration scripts (newsman.py, search_engine.py, result_aggregator.py); they were truncated in the packaged review and may contain additional behaviors.

Given the metadata inconsistency and the number of sensitive keys involved, proceed cautiously and only after these checks.
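The secret-usage review above can be done with a plain grep over the unpacked skill. This is a hypothetical audit sketch, demonstrated on a throwaway fixture; point SKILL_DIR at the real checkout instead:

```shell
# Pre-install audit: find every file that reads one of the sensitive
# variables named in the review. The fixture file push.py is created here
# only so the sketch is self-contained.
SKILL_DIR=$(mktemp -d)
printf 'import os\ntoken = os.environ["GATEWAY_TOKEN"]\n' > "$SKILL_DIR/push.py"
# -r recurse, -l list matching files only, -n would show line numbers instead:
secret_files=$(grep -rl -e 'KIMI_API_KEY' -e 'MINIMAX_API_KEY' \
                        -e 'ANTHROPIC_API_KEY' -e 'GATEWAY_URL' \
                        -e 'GATEWAY_TOKEN' "$SKILL_DIR" || true)
echo "files touching secrets: $secret_files"
```

Comparing this file list against the registry's declared env vars makes the metadata mismatch concrete; the same pattern with `GATEWAY_URL` and `requests.post` surfaces the delivery code worth reading before handing over a token.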
Latest Release
v1.0.0
Claw-News 1.0.0: release of the intelligent daily news digest.
- Three news-acquisition modes: active search, RSS subscription, and targeted crawling
- Interest management, smart deduplication, AI summarization, and scheduled push delivery
- Federated retrieval across multiple APIs (Kimi, MiniMax, Claude), with support for interest keywords and followed people
- Customizable RSS sources and tracking of trending Kickstarter crowdfunding projects
- News digest delivery via Slack/Channel push, plus cron scheduled tasks
- Detailed command and configuration docs for quick onboarding and extension
Published by @russellfei on ClawHub