What's your USP compared to OpenRouter? I can't see one.
Looking at your pricing, you charge monthly fees with no commission, but that's not really a feature on its own. Did you know Cloudflare offers this too? Check out AI Gateway.
Hey,
We're building Portkey AI in a similar space but with a different focus. We've open-sourced our gateway that routes to any LLM provider (OpenAI, Anthropic, etc.) using your own API keys.
While you're going for the monthly fee model, we've found that enterprises really value features like:
Detailed observability and analytics for each request
Governance over AI usage and RBAC
Semantic caching to cut API costs, plus budget limits
Compliance tools (PII redaction, audit logs, SSO)
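To make the "your own API keys" part concrete: using the gateway roughly means pointing an OpenAI-compatible client at it and passing your own provider key. The sketch below is illustrative, not our exact API; the local port and the provider-selection header are assumed placeholders, so check the gateway docs for the real names.

    # Minimal sketch: calling a self-hosted, OpenAI-compatible gateway with your
    # own provider key. The base URL and header name are illustrative placeholders.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8787/v1",       # assumed address of the local gateway
        api_key="sk-your-own-openai-key",          # your provider key, not a gateway key
        default_headers={"x-provider": "openai"},  # hypothetical header picking the upstream provider
    )

    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Hello through the gateway"}],
    )
    print(resp.choices[0].message.content)

Because the endpoint stays OpenAI-compatible, swapping providers is mostly a matter of changing that header and the key, while observability and caching happen in the gateway layer.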
So the big question is: how does the pricing compare to OpenRouter?
We're not VC-backed, so we obviously have to charge more, but some models like DeepSeek are 50-60% cheaper than what OpenRouter is offering (as I'm writing this). And we're open source, with good documentation to help you self-host the gateway yourself.
So basically a LiteLLM proxy?