
      Safety Report

      DeepSeek — DeepSeek-V3, DeepSeek-R1, DeepSeek-Coder on Your Local Devices

      @twinsgeeks

DeepSeek models on your local fleet — DeepSeek-V3, DeepSeek-V3.2, DeepSeek-R1, DeepSeek-Coder routed across multiple devices via Ollama Herd. 7-signal scoring…

120 Downloads · 0 Installs · 3 Stars · 1 Version
Categories: AI & Machine Learning (3,159) · Networking & DNS (2,106)

      Security Analysis

Medium confidence
Clean (0.04 risk)

      The skill's requirements and instructions are coherent with its stated purpose of running DeepSeek models on a local fleet, but installing the third‑party package and pulling large models carries the usual supply‑chain and disk/network risks you should review first.

Mar 30, 2026 · 1 file · 1 concern
Purpose & Capability: ok

      Name/description (running DeepSeek via an Ollama Herd router) align with the runtime instructions: installing ollama-herd, running herd/herd-node, and using ollama pull to fetch models. Declared binaries (curl/wget, optional python/pip) make sense for interacting with local HTTP endpoints and installing the Python package.

Instruction Scope: ok

      SKILL.md contains only setup and usage steps for a local fleet router and examples showing how to call localhost endpoints. It does not instruct reading or exfiltrating unrelated system files or environment variables; it even warns not to delete/edit ~/.fleet-manager. Sample code points at localhost (http://localhost:11435).
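The localhost-only calls described above can be sketched in a few lines. The endpoint `http://localhost:11435` is quoted from the skill itself; the `/v1/chat/completions` path is assumed from the release notes' OpenAI-compatibility claim, and the model name is a placeholder for whatever you pulled:

```python
import json
import urllib.request

# Router endpoint quoted in SKILL.md.
HERD_URL = "http://localhost:11435"

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble an OpenAI-style chat payload without sending anything."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def send(payload: dict, path: str = "/v1/chat/completions") -> dict:
    """POST to the local router; this fails unless herd is actually running."""
    req = urllib.request.Request(
        HERD_URL + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Build (but do not send) a request, so the traffic stays under your control.
payload = build_chat_request("deepseek-r1", "Say hello in one word.")
print(json.dumps(payload))
```

Keeping request construction separate from `send` makes it easy to inspect exactly what would leave the machine before anything is transmitted.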

Install Mechanism: note

      Installation is via pip install ollama-herd (PyPI) and running local binaries (herd, herd-node). Using PyPI is a common approach but carries moderate supply‑chain risk — the package and its GitHub repo should be reviewed before installation.
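One way to reduce that supply-chain risk is to fetch the artifact first (e.g. with `pip download ollama-herd --no-deps`), review it, record its digest, and only then install from the reviewed file. A minimal verification sketch, with the filename and digest as placeholders:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hex digest of a file, read in chunks so large sdists/wheels are fine."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: Path, expected: str) -> bool:
    """Compare against a digest recorded from a source you trust."""
    return sha256_of(path) == expected

# Demo with a stand-in file; a real check would point at the downloaded
# ollama-herd artifact and a digest pinned after your review.
demo = Path("demo.bin")
demo.write_bytes(b"example")
print(verify(demo, hashlib.sha256(b"example").hexdigest()))  # True
demo.unlink()
```

Installing from the verified local file (rather than re-resolving from PyPI) guarantees the code you reviewed is the code you run.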

Credentials: ok

      The skill declares no required environment variables or unrelated credentials. Metadata lists config paths under ~/.fleet-manager, which are consistent with a fleet manager and are not excessive for the stated purpose.
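Since ~/.fleet-manager is the skill's declared state directory, a read-only audit of what actually lands there is cheap. A small sketch (the directory name is from the report; nothing is modified):

```python
from pathlib import Path

def list_tree(root: Path) -> list[str]:
    """Relative paths of every file under root, sorted for a stable audit log."""
    return sorted(
        str(p.relative_to(root)) for p in root.rglob("*") if p.is_file()
    )

# Point this at the skill's config directory after first run; read-only.
cfg = Path.home() / ".fleet-manager"
if cfg.exists():
    for entry in list_tree(cfg):
        print(entry)
else:
    print("no ~/.fleet-manager yet")
```

Comparing this listing before and after granting the skill broader access makes any unexpected writes easy to spot.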

Persistence & Privilege: ok

      No 'always' privilege requested; the skill is user‑invocable only. It does not request writing to other skills' configs or system‑wide settings in the instructions.

      Guidance

      This skill appears to be what it claims: a guide to running DeepSeek models locally via an Ollama Herd router. Before installing, verify the ollama-herd PyPI package and its GitHub repository (review code, recent activity, and maintainers). Be prepared for large downloads and big disk/RAM usage when pulling models. Run installations on a trusted machine or isolated environment, check network access (model pulls will download large artifacts), and inspect the ~/.fleet-manager directory and any created services before granting broader network access. If you need higher assurance, review the package source or run it in a VM/container first.
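Because model pulls download large artifacts, a disk preflight check before any `ollama pull` avoids filling the volume mid-download. A minimal sketch; the 50 GiB threshold is illustrative, not a figure from the skill:

```python
import shutil

def free_gib(path: str = ".") -> float:
    """Free space at path, in GiB."""
    return shutil.disk_usage(path).free / 2**30

# DeepSeek-class weights run to tens of GiB; pick a threshold that fits
# the specific model tag you intend to pull.
needed_gib = 50
if free_gib() < needed_gib:
    print(f"only {free_gib():.1f} GiB free; skip the pull")
else:
    print("enough space to attempt a pull")
```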

      Latest Release

      v1.0.1

Initial public release of DeepSeek models on local hardware through Ollama Herd.
- Run DeepSeek-V3, V3.2, R1, and Coder models locally on Apple Silicon or Linux, with zero cloud costs.
- Automatic fleet routing: selects the best node for each request via 7-signal scoring, with seamless failover and VRAM-aware fallback.
- Compatible with OpenAI and Ollama APIs for chat, code, image generation, speech-to-text, and embeddings.
- Includes setup instructions, recommended hardware guidance, and dashboard monitoring at a unified endpoint.
- Prioritizes privacy, local performance, and user control over model management.
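The release notes mention "7-signal scoring" without publishing the signals, so the names and weights below are invented purely to illustrate how weighted node selection of this kind typically works:

```python
# Hypothetical signals and weights; the real ollama-herd scoring is not public.
SIGNALS = ("free_vram", "free_ram", "cpu_idle", "net_latency",
           "queue_depth", "recent_errors", "model_cached")
WEIGHTS = {"free_vram": 3.0, "free_ram": 1.5, "cpu_idle": 1.0,
           "net_latency": -2.0, "queue_depth": -1.5,
           "recent_errors": -2.5, "model_cached": 4.0}

def score(node: dict) -> float:
    """Weighted sum over normalized 0..1 signals; higher is better."""
    return sum(WEIGHTS[s] * node[s] for s in SIGNALS)

def pick_node(nodes: dict[str, dict]) -> str:
    """Route the request to the highest-scoring node."""
    return max(nodes, key=lambda n: score(nodes[n]))

nodes = {
    "mac-studio": {"free_vram": 0.9, "free_ram": 0.8, "cpu_idle": 0.7,
                   "net_latency": 0.1, "queue_depth": 0.0,
                   "recent_errors": 0.0, "model_cached": 1.0},
    "linux-box":  {"free_vram": 0.3, "free_ram": 0.5, "cpu_idle": 0.9,
                   "net_latency": 0.2, "queue_depth": 0.4,
                   "recent_errors": 0.1, "model_cached": 0.0},
}
print(pick_node(nodes))  # mac-studio
```

Negative weights on latency, queue depth, and errors penalize busy or flaky nodes, which also gives the failover behaviour the notes describe: if the best node degrades, its score drops and the next request routes elsewhere.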

      More by @twinsgeeks

Social Analytics.

      3 stars

Dating - First Date.

      3 stars

Love - Find Love.

      3 stars

      Ollama Proxy

      3 stars

      Latin — Experience Latin Music: 29 Layers of Audio, Lyrics & Equations

      3 stars

First Date - Dating.

      3 stars

      Published by @twinsgeeks on ClawHub

© 2026 Zappush

      Something feels unusual? We want to help: [email protected]