LiteLLM server now supported on Vercel

Vercel News Mar 16, 2026

You can now deploy LiteLLM server on Vercel, giving developers an OpenAI-compatible gateway to any supported LLM provider, including Vercel AI Gateway.

To route a single model through Vercel AI Gateway, use the below configuration in litellm_config.yaml:
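A minimal sketch of such a configuration is shown below. The `vercel_ai_gateway/` model prefix, the `openai/gpt-4o` upstream model slug, and the `VERCEL_AI_GATEWAY_API_KEY` environment variable name are illustrative assumptions; check the LiteLLM provider docs for the exact values for your setup.

```yaml
# litellm_config.yaml -- sketch, names/prefixes are assumptions
model_list:
  - model_name: gpt-4o                      # alias your clients request
    litellm_params:
      # route the request through Vercel AI Gateway
      model: vercel_ai_gateway/openai/gpt-4o
      # read the gateway key from the environment (assumed variable name)
      api_key: os.environ/VERCEL_AI_GATEWAY_API_KEY
```

Clients can then call the LiteLLM server with the standard OpenAI SDK, pointing `base_url` at the deployed server and using `gpt-4o` as the model name.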

Deploy LiteLLM on Vercel, or learn more in our documentation.
