Option 1 — npm SDK (simplest)
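This option amounts to wrapping your client object so that every call is reported to the gateway before being forwarded unchanged. A self-contained sketch of that wrap pattern (the wrapper, the stand-in client, and the `report` callback below are illustrations, not the real SDK surface):

```typescript
// Sketch of the wrap-your-client pattern. The real SDK's names will differ;
// this only illustrates the shape: a Proxy that forwards every method call
// to the underlying client and reports each call out-of-band.
function wrapClient<T extends object>(
  client: T,
  report: (method: string) => void,
): T {
  return new Proxy(client, {
    get(target, prop, receiver) {
      const value = Reflect.get(target, prop, receiver);
      if (typeof value === "function") {
        return (...args: unknown[]) => {
          report(String(prop)); // e.g. send the call to the decision feed
          return value.apply(target, args); // forward unchanged
        };
      }
      return value;
    },
  });
}

// Stand-in for an Anthropic / OpenAI client:
const fakeClient = { complete: (prompt: string) => `echo: ${prompt}` };
const calls: string[] = [];
const wrapped = wrapClient(fakeClient, (m) => calls.push(m));
console.log(wrapped.complete("hi")); // behaves exactly like the unwrapped client
```

Because the wrapper is transparent, existing call sites need no changes, which is what "no infrastructure changes" means here.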
Wrap your existing Anthropic / OpenAI client. No infrastructure changes.

Option 2 — Docker (self-hosted)
Run the gateway as a container and point your client's base URL at http://localhost:8080 instead of the LLM provider's API URL.
Best for: regulated environments, on-prem, self-hosted Kubernetes.
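For the Docker option, a minimal compose file can sketch the setup. The image name, port mapping, and environment variable here are assumptions for illustration, not published values:

```yaml
services:
  gateway:
    image: inviolet/gateway:latest            # assumed image name
    ports:
      - "8080:8080"                           # clients point their base URL here
    environment:
      - INVIOLET_API_KEY=${INVIOLET_API_KEY}  # assumed variable name
```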
Option 3 — Vercel Edge
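In this option the gateway sits in an edge function that proxies your LLM traffic. A framework-free sketch using the Web-standard Request/Response API that Vercel Edge functions are built on (the gateway origin and route below are assumptions, not documented values):

```typescript
// Assumed self-chosen gateway host, not a documented endpoint:
const GATEWAY_ORIGIN = "https://gateway.example.com";

// Rewrite an incoming request URL to target the gateway,
// keeping the original path and query string.
export function toGatewayUrl(requestUrl: string): string {
  const url = new URL(requestUrl);
  return new URL(url.pathname + url.search, GATEWAY_ORIGIN).toString();
}

// Edge function entry point: forward the request to the gateway
// instead of the LLM provider.
export default async function handler(req: Request): Promise<Response> {
  return fetch(new Request(toGatewayUrl(req.url), req));
}
```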
Deploy the gateway as an edge function alongside your Next.js app.

Verify install
After install, make any LLM tool call. Within 30 seconds it should appear at app.inviolet.ai/decision-feed.

Read next
- Connect your first data source
- Quickstart for the end-to-end walkthrough