
Pricing

Pay for what gets resolved. Choose Hosted (we supply the LLM keys) or BYOK (bring any LLM: OpenAI, Anthropic, OpenRouter, Ollama, or your own endpoint; you pay the provider, and we charge a platform-only fee).

|                  | Free      | Starter     | Growth (Most popular) | Scale          | Enterprise    |
|------------------|-----------|-------------|-----------------------|----------------|---------------|
| Hosted           | n/a       | €49         | €299                  | €1,499         | Contact       |
| BYOK             | €0        | €25         | €129                  | €599           | Contact       |
| Bugs / month     | 5         | 25          | 150                   | 1,000          | Custom        |
| Reviewers        | 3         | 5           | 5 + custom            | 1–20 + quorum  | Custom        |
| SSO + SCIM       |           |             |                       |                |               |
| On-prem option   |           |             |                       |                | ✓             |
| Overage (Hosted) | n/a       | €3 / bug    | €2.50 / bug           | €2 / bug       | Custom        |
| Overage (BYOK)   | hard cap  | €1.50 / bug | €1.25 / bug           | €1 / bug       | Custom        |
|                  | Start free | Start trial | Start trial          | Start trial    | Contact sales |
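The overage rows combine with the base fee linearly: you pay the plan price, plus the per-bug rate for every resolved bug beyond the monthly quota. A minimal sketch of that arithmetic, using the Growth (Hosted) numbers from the table:

```python
def monthly_cost(base_fee: float, included_bugs: int,
                 overage_per_bug: float, bugs_resolved: int) -> float:
    """Plan base fee plus per-bug overage beyond the included quota."""
    extra = max(0, bugs_resolved - included_bugs)
    return base_fee + extra * overage_per_bug

# Growth (Hosted): €299 base, 150 bugs included, €2.50 / bug overage.
# Resolving 180 bugs costs 299 + 30 * 2.50 = 374.0
print(monthly_cost(299, 150, 2.50, 180))
```

Under the quota the overage term is zero, so the same call with 100 bugs returns the flat €299.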

Hosted vs BYOK

Hosted: Autonom4 supplies the LLM keys and pays the inference bill — one number on your invoice. BYOK: bring any LLM provider — OpenAI, Anthropic, OpenRouter, a self-hosted Ollama, your own custom endpoint, anything LiteLLM speaks. You pay the provider directly. We charge a lower platform-only fee — and on Free, that fee is €0. Free is BYOK-only. Switch any time from settings.
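Because BYOK accepts anything LiteLLM speaks, a provider entry can be written as a standard LiteLLM-style model definition. The model names and the local Ollama URL below are illustrative assumptions, not Autonom4 defaults:

```python
# Illustrative LiteLLM-style model list for a BYOK setup.
# Model names, aliases, and endpoints are placeholder assumptions.
byok_models = [
    # Cloud provider: you pay OpenAI directly for inference.
    {"model_name": "reviewer",
     "litellm_params": {"model": "openai/gpt-4o"}},
    # Self-hosted Ollama behind your own endpoint.
    {"model_name": "reviewer-local",
     "litellm_params": {"model": "ollama/llama3",
                        "api_base": "http://localhost:11434"}},
]

for entry in byok_models:
    print(entry["model_name"], "->", entry["litellm_params"]["model"])
```

The `model_name` alias is what a caller would reference; `litellm_params` carries the provider-prefixed model and any custom `api_base`, which is how LiteLLM distinguishes hosted providers from self-hosted endpoints.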

Add-ons

Available on Growth and above. Stack any combination on top of your plan.

Priority lane

+€199 / mo

Skip the queue on every run.

Custom reviewer prompts

+€499 / mo

Author and ship reviewer prompts tuned to your codebase.

Dedicated Slack + 24h SLA

+€799 / mo

Shared Slack channel with a 24h response SLA.

On-prem gateway (Enterprise only)

€5,000 setup + €999 / mo support

Run the Autonom4 gateway inside your VPC.