Azure Open AI Model Configuration
  • 12 Aug 2025

Connection

To integrate your Unleash self-hosted environment with Azure OpenAI, your Azure OpenAI instance must have deployments for the models Unleash uses to operate:

| Name | Model | Version | Min Recommended Rate Limit |
| --- | --- | --- | --- |
| gpt4o | gpt-4o | 2024-11-20 | 200K Tokens per Minute |
| gpt4o-mini | gpt-4o-mini | 2024-07-18 | 200K Tokens per Minute |
| gpt4.1 | gpt-4.1 | 2025-04-14 | 500K Tokens per Minute |
| gpt4.1-mini | gpt-4.1-mini | 2025-04-14 | 200K Tokens per Minute |
| gpt4.1-nano | gpt-4.1-nano | 2025-04-14 | 200K Tokens per Minute |
| gpt5 | gpt-5 | 2025-08-07 | 500K Tokens per Minute |
| gpt5-mini | gpt-5-mini | 2025-08-07 | 200K Tokens per Minute |
| text-embedding-ada-003 | text-embedding-3-large | 1 | 300K Tokens per Minute |
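If you provision with the Azure CLI, each deployment above can be created with `az cognitiveservices account deployment create`. The sketch below is one way to do this, not an official Unleash script: the resource name and resource group are placeholders, and for the Standard SKU the `--sku-capacity` value is expressed in units of 1,000 tokens per minute (so 200 corresponds to roughly 200K TPM) — adjust it to your subscription's quota.

```shell
# Sketch: create the gpt4o deployment from the table above.
# "my-unleash-openai" and "my-rg" are placeholder names.
az cognitiveservices account deployment create \
  --name my-unleash-openai \
  --resource-group my-rg \
  --deployment-name gpt4o \
  --model-name gpt-4o \
  --model-version "2024-11-20" \
  --model-format OpenAI \
  --sku-name Standard \
  --sku-capacity 200  # capacity in 1K-TPM units: 200 = ~200K TPM
```

Repeat the command for each row in the table, substituting the deployment name, model, version, and capacity.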

Regions

To get the best from Unleash, we recommend deploying in the following regions:

  • EU deployments should be hosted in the swedencentral region

  • US deployments should be hosted in the eastus or eastus2 regions
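For an EU deployment, for example, the underlying Azure OpenAI resource would be created in swedencentral. A hedged Azure CLI sketch (resource and group names are again placeholders):

```shell
# Sketch: create the Azure OpenAI resource in the recommended EU region.
az cognitiveservices account create \
  --name my-unleash-openai \
  --resource-group my-rg \
  --kind OpenAI \
  --sku S0 \
  --location swedencentral
```

For a US deployment, substitute eastus or eastus2 for the `--location` value.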

Please provide your Unleash representative with the following information:

  1. Instance URL

  2. API Key
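Before handing these over, you may want to confirm the instance URL and API key actually reach one of the deployments. The following is a minimal sketch using only the Python standard library; the API version shown is an assumption (a generally-available Azure OpenAI data-plane version), and the resource and deployment names are placeholders.

```python
# Sketch: verify an Azure OpenAI deployment is reachable with a given
# instance URL and API key. Names and API version are placeholders.
import json
import urllib.request

API_VERSION = "2024-10-21"  # assumed GA data-plane API version

def chat_completions_url(instance_url: str, deployment: str,
                         api_version: str = API_VERSION) -> str:
    """Build the chat-completions URL for a given deployment."""
    return (f"{instance_url.rstrip('/')}/openai/deployments/"
            f"{deployment}/chat/completions?api-version={api_version}")

def ping_deployment(instance_url: str, api_key: str, deployment: str) -> int:
    """Send a one-token request and return the HTTP status code."""
    body = json.dumps({
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,
    }).encode()
    req = urllib.request.Request(
        chat_completions_url(instance_url, deployment),
        data=body,
        headers={"api-key": api_key, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

For example, `ping_deployment("https://my-resource.openai.azure.com", key, "gpt4o")` should return 200 when the deployment from the table above exists and the key is valid.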

