
You can now deploy the LiteLLM server on Vercel, giving developers an OpenAI-compatible gateway to any supported LLM provider, including Vercel AI Gateway.
To route a single model through Vercel AI Gateway, add the following configuration to litellm_config.yaml:
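A minimal sketch of such a configuration, assuming LiteLLM's `vercel_ai_gateway/` provider prefix and a placeholder model name and environment variable; substitute the model and credentials for your deployment:

```yaml
model_list:
  - model_name: my-model            # alias clients request via the gateway
    litellm_params:
      model: vercel_ai_gateway/anthropic/claude-sonnet-4   # provider-prefixed model route
      api_key: os.environ/VERCEL_AI_GATEWAY_API_KEY        # read key from env at runtime
```

Clients can then call the deployed server with any OpenAI-compatible SDK, passing `my-model` as the model name.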
Deploy LiteLLM on Vercel or learn more in our documentation.