Unified search skill with Intelligent Auto-Routing. Uses multi-signal analysis to automatically select between Serper (Google), Tavily (Research), Exa (Neura...
Security Analysis
High confidence: The skill's code, instructions, and optional environment variables are consistent with a multi-provider web search aggregator. Nothing in the bundle is disproportionate to that purpose, though there are a few operational privacy considerations to review before use.
The name and description match the delivered files and runtime behavior: the included Python scripts implement multi-provider search and auto-routing. The required binaries (python3, bash) are expected for running the provided scripts. The declared optional environment variables are provider API keys appropriate for the described providers (Serper, Tavily, Exa, You.com, Kilo/Perplexity, SearXNG).
SKILL.md directs the agent/operator to run the supplied scripts (setup.py, search.py), copy config.example.json, and set API keys. These instructions stay within the skill's purpose, but the runtime behavior includes: (1) auto-loading a .env file from the skill directory, and (2) writing/reading a local cache under the skill directory (.cache/) that stores queries and provider metadata. Those actions are expected for this type of tool but are relevant privacy/operational considerations.
There is no external install spec or remote binary download—this is a scripts-and-docs package shipped as-is. That keeps install risk low: nothing is fetched from arbitrary URLs during installation. Running the included Python scripts executes shipped code (normal for an instruction-only skill with bundled scripts).
No required credentials are demanded up-front. The optional environment variables listed (SERPER_API_KEY, TAVILY_API_KEY, EXA_API_KEY, YOU_API_KEY, KILOCODE_API_KEY, SEARXNG_INSTANCE_URL) directly map to the providers the skill advertises. The number of optional keys is reasonable for an aggregator that supports multiple backends. Note: the code auto-loads a local .env file if present (it only sets variables not already in the environment).
The skill does not request always:true or system-wide privileges. It writes cache files into a .cache/ directory inside the skill folder (or WSP_CACHE_DIR if overridden) and stores provider health data there; this is normal for caching but means query data and provider metadata persist on disk. The skill does not appear to modify other skills or global agent config.
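The cache-directory resolution described above (a `WSP_CACHE_DIR` override, otherwise `.cache/` inside the skill folder) might look like this minimal sketch; the function name is hypothetical:

```python
import os
from pathlib import Path

def resolve_cache_dir(skill_dir):
    """Return WSP_CACHE_DIR if set, else <skill_dir>/.cache."""
    override = os.environ.get("WSP_CACHE_DIR")
    return Path(override) if override else Path(skill_dir) / ".cache"
```

Pointing `WSP_CACHE_DIR` at a controlled location is the lever for keeping persisted queries and provider metadata out of the skill folder.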
Guidance
This skill appears to do what it claims, but review and accept these operational details before installing:

- The package runs local Python code (scripts/setup.py, scripts/search.py). Only run it if you trust or have inspected the shipped code; running setup.py/search.py executes the included logic (no remote install is needed).
- It auto-loads a .env file from the skill directory if present. Avoid placing high-value secrets in that file unless you intend to and the file is properly protected and gitignored.
- By default it caches search requests (queries, routing/provider metadata, cached responses) under .cache/ inside the skill folder. If this is sensitive, set WSP_CACHE_DIR to a secure path or use --no-cache, and periodically clear the cache (python3 scripts/search.py --clear-cache).
- Perplexity access goes through a Kilo gateway and requires KILOCODE_API_KEY; only provide keys you control and understand. All other provider keys are optional and map directly to the stated providers.
- The registry metadata lists 'source: unknown' even though package files reference a GitHub/ClawHub URL; if provenance matters, verify the upstream repository and maintainers before use.

If you want additional assurance, you can: (a) inspect scripts/setup.py and scripts/search.py for unexpected outbound endpoints before running them, (b) run the skill in an isolated environment or container, and (c) set the cache dir to a controlled location and ensure .env and config.json are not committed to version control.
Latest Release
v2.8.6
Documented Perplexity Sonar Pro usage and refreshed release docs.
Published by @robbyczgw-cla on ClawHub