Getting started
In this guide, you will learn how to set up and use your first AI Gateway.
Before making requests, you need two things:
- Your Account ID — find it in the Cloudflare dashboard.
- A Cloudflare API token — create an API token with AI Gateway - Read and AI Gateway - Edit permissions. The example below also uses Workers AI, so add Workers AI - Read as well.
Run the following command to make your first request through AI Gateway:
```sh
curl -X POST https://gateway.ai.cloudflare.com/v1/$CLOUDFLARE_ACCOUNT_ID/default/compat/chat/completions \
  --header "cf-aig-authorization: Bearer $CLOUDFLARE_API_TOKEN" \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "workers-ai/@cf/meta/llama-3.3-70b-instruct-fp8-fast",
    "messages": [
      { "role": "user", "content": "What is Cloudflare?" }
    ]
  }'
```

Create a gateway manually
You can also create gateways manually with a custom name and configuration through the dashboard or API.
- Log into the Cloudflare dashboard ↗ and select your account.
- Go to AI > AI Gateway.
- Select Create Gateway.
- Enter your Gateway name. Note: Gateway names are limited to 64 characters.
- Select Create.
To set up an AI Gateway using the API:
1. Create an API token with the following permissions: AI Gateway - Read and AI Gateway - Edit.
2. Get your Account ID.
3. Using that API token and Account ID, send a POST request to the Cloudflare API.
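As a sketch, the gateway-creation request might look like the following. The endpoint path follows the Cloudflare v4 REST API; the gateway name "my-gateway" is a placeholder, and you should check the API reference for the full set of supported configuration fields.

```sh
# Create a gateway named "my-gateway" via the Cloudflare REST API.
# Replace {account_id} and supply your API token in $CLOUDFLARE_API_TOKEN.
curl -X POST "https://api.cloudflare.com/client/v4/accounts/{account_id}/ai-gateway/gateways" \
  --header "Authorization: Bearer $CLOUDFLARE_API_TOKEN" \
  --header "Content-Type: application/json" \
  --data '{ "id": "my-gateway" }'
```

On success, the API responds with a JSON object describing the new gateway, which you can then use as the {gateway_id} in request URLs.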
Authenticate with your upstream AI provider using one of the following options:
- Unified Billing: Use the AI Gateway billing to pay for and authenticate your inference requests. Refer to Unified Billing.
- BYOK (Store Keys): Store your own provider API Keys with Cloudflare, and AI Gateway will include them at runtime. Refer to BYOK.
- Request headers: Include your provider API key in the request headers as you normally would (for example, Authorization: Bearer <OPENAI_API_KEY>).
The easiest way to get started with AI Gateway is through our OpenAI-compatible /chat/completions endpoint. This allows you to use existing OpenAI SDKs and tools with minimal code changes while gaining access to multiple AI providers.
https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat/chat/completions
Key benefits:
- Drop-in replacement for the OpenAI API; works with existing OpenAI SDKs and other OpenAI-compatible clients
- Switch between providers by changing the model parameter
- Dynamic Routing: define complex routing scenarios with conditional logic, run A/B tests, set rate and budget limits, and more
Using the OpenAI SDK with Unified Billing or BYOK, authenticate with your Cloudflare API token and select a provider through the model parameter:

```ts
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "{cf_api_token}",
  baseURL: "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat",
});

const response = await client.chat.completions.create({
  model: "openai/gpt-5.2",
  messages: [{ role: "user", content: "Hello, world!" }],
});
```

The same code works for any supported model string, for example anthropic/claude-4-5-sonnet, google/gemini-2.5-pro, grok/grok-4, workers-ai/@cf/meta/llama-3.3-70b-instruct-fp8-fast, or a dynamic route such as dynamic/customer-support.

To send your own provider key in the request headers instead, use it as the SDK's apiKey and pass the gateway token in the cf-aig-authorization header:

```ts
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "{openai_api_token}",
  defaultHeaders: {
    // if gateway is authenticated
    "cf-aig-authorization": `Bearer {cf_api_token}`,
  },
  baseURL: "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat",
});

const response = await client.chat.completions.create({
  model: "openai/gpt-5.2",
  messages: [{ role: "user", content: "Hello, world!" }],
});
```

With the Vercel AI SDK, use the ai-gateway-provider package. The unified provider accepts the same provider-prefixed model strings:

```ts
import { createAiGateway } from "ai-gateway-provider";
import { createUnified } from "ai-gateway-provider/providers/unified";
import { generateText } from "ai";

const aigateway = createAiGateway({
  accountId: "{CLOUDFLARE_ACCOUNT_ID}",
  gateway: "{GATEWAY_NAME}",
  apiKey: "{CF_AIG_TOKEN}",
});

// With Unified Billing or BYOK, no provider key is needed.
// To send your own key, use: createUnified({ apiKey: "{API_KEY}" })
const unified = createUnified();

const { text } = await generateText({
  model: aigateway(unified("openai/gpt-5.2")),
  prompt: "What is Cloudflare?",
});
```

Dedicated provider modules are also available: createOpenAI (from ai-gateway-provider/providers/openai), createAnthropic, createGoogle, and createXai. Each accepts an optional apiKey and takes the bare model name, for example openai.chat('gpt-5.2'), anthropic('claude-4-5-sonnet'), google('gemini-2.5-pro'), or xai('grok-4'). Workers AI models (@cf/meta/llama-3.3-70b-instruct-fp8-fast) and dynamic routes (customer-support) go through the unified provider without a prefix.

With curl:

```sh
curl -X POST https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat/chat/completions \
  --header 'cf-aig-authorization: Bearer {CF_AIG_TOKEN}' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "openai/gpt-5.2",
    "messages": [
      { "role": "user", "content": "What is Cloudflare?" }
    ]
  }'
```

To send your own provider key, add it as an Authorization header, for example --header 'Authorization: Bearer {openai_api_token}'.

Refer to Unified API to learn more about OpenAI compatibility.
For direct integration with specific AI providers, use dedicated endpoints that maintain the original provider's API schema while adding AI Gateway features.
https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/{provider}

Available providers:
- OpenAI - GPT models and embeddings
- Anthropic - Claude models
- Google AI Studio - Gemini models
- Workers AI - Cloudflare's inference platform
- AWS Bedrock - Amazon's managed AI service
- Azure OpenAI - Microsoft's OpenAI service
- and more...
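For example, a request to the OpenAI provider endpoint uses OpenAI's own request schema rather than the unified one, so the model name carries no provider prefix. A sketch (the model name is illustrative):

```sh
# Call OpenAI through the gateway's provider-specific endpoint.
# The request body follows OpenAI's native schema, not the unified one.
curl -X POST "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/openai/chat/completions" \
  --header "Authorization: Bearer $OPENAI_API_KEY" \
  --header "Content-Type: application/json" \
  --data '{
    "model": "gpt-5.2",
    "messages": [{ "role": "user", "content": "What is Cloudflare?" }]
  }'
```

Because the provider's schema is preserved, you can point an existing integration at the gateway by changing only its base URL.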
- Learn more about caching to speed up requests and save costs, and rate limiting to control how your application scales.
- Explore how to set up model or provider fallbacks, rate limits, and A/B tests for resiliency.
- Learn how to use low-cost, open-source models on Workers AI, our AI inference service.