
Adding an AI Provider

Adding an AI provider is how you give your elvex users access to the large language models (LLMs) that power elvex. Without at least one AI provider, users cannot access any LLMs, rendering elvex non-functional.


Note: Only elvex Admin users have permission to add AI providers.

Initial Setup Requirement

Important: When your organization is first invited to elvex, the person who receives the initial invitation from the elvex team must set up at least one AI provider with a valid API key before anyone can access elvex. Until this initial provider is configured, elvex will not be accessible to any users in your organization.

After the initial AI provider is set up, additional providers can be added at any time through Settings by users with Admin permissions.

Supported AI Providers

elvex currently supports the following AI providers:

  • OpenAI

  • Azure OpenAI

  • Azure AI Foundry

  • Anthropic

  • Google Gemini

  • Cohere

  • Mistral

  • AWS Bedrock

  • xAI (Grok)

Additional providers will be supported in the future.


Step-by-Step Guide to Adding an AI Provider

  1. Navigate to Settings by clicking on "Settings" from the left navigation menu

  2. Select AI Providers in the Settings menu

  3. Click Add a Provider to begin adding a new provider

  4. Enter Provider Details by filling in the required fields:

    • Provider Name: If left blank, this field defaults to the provider's name (e.g., OpenAI or Anthropic). You can add a more descriptive name if needed

    • API Key: The API key provided by the AI provider

  5. Create a New Agent to test the integration; a simple passthrough agent to GPT-4o will suffice

  6. Send a Test Message using the newly created agent to ensure that the AI provider is functioning correctly
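
Before stepping through agent creation, you can also sanity-check a key directly. The sketch below builds an authenticated request to OpenAI's model-listing endpoint (`/v1/models`) using only the Python standard library; a 200 response means the key works, a 401 means it does not. The function only constructs the request, so you decide when to send it; the key value shown is a placeholder.

```python
import urllib.request

def build_key_check_request(api_key: str) -> urllib.request.Request:
    """Build an authenticated request to OpenAI's model-listing endpoint.

    Sending it and receiving HTTP 200 indicates a usable key; 401 means
    the key is invalid or revoked.
    """
    return urllib.request.Request(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

# To actually send the check (requires network access and a real key):
# with urllib.request.urlopen(build_key_check_request("sk-your-key")) as resp:
#     print(resp.status)  # 200 indicates a usable key
```

Other providers expose similar list endpoints; check your provider's API reference for the equivalent URL and header format.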

Finding API Keys for Common Providers

For instructions on how to create an API key, refer to your chosen provider's documentation.

Important Notes on Generating an API Key from an AI Provider

Read & Write Access: elvex needs read and write access to connect to your chosen AI provider. If the API key you generate does not grant read and write access, the connection between elvex and that provider will fail.

Fund Your Provider Account: The account from which you generate the API key must be funded and connected to a valid credit card; otherwise, requests made with the key will return an error.

Getting AWS Bedrock Credentials and Model ID

If you wish to use AWS Bedrock to host your models, we recommend following the AWS Getting Started document.

For a quicker walkthrough, the steps below cover getting the Access Key ID, Secret Access Key, and Model ID you'll need to use AWS Bedrock.

1. Create an IAM User (Programmatic Access)

  1. Go to the AWS Management Console → IAM → Users → Create user

  2. Enable Programmatic access (this generates access keys)

  3. Attach a policy that allows Bedrock use:

    • For testing: AmazonBedrockFullAccess

    • For production: create a least-privilege policy with only the actions you need (bedrock:InvokeModel, bedrock:ListFoundationModels, etc.)
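
A least-privilege policy for production might look like the sketch below. This is an illustrative starting point, not an official elvex policy: it allows synchronous and streaming invocation of foundation models plus model discovery, and nothing else. Tighten the `Resource` ARNs to specific model IDs if you know which models you will use.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBedrockInvoke",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:*::foundation-model/*"
    },
    {
      "Sid": "AllowModelDiscovery",
      "Effect": "Allow",
      "Action": "bedrock:ListFoundationModels",
      "Resource": "*"
    }
  ]
}
```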

2. Create (or View) the Access Keys

  1. After creating the IAM user, open the user page

  2. Go to the Security credentials tab → Create access key

  3. Copy and save both the Access Key ID and Secret Access Key

  4. Store these access keys securely for later use

Note: The Secret Access Key is only shown once. If lost, you must generate a new one.
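
Because the secret key is shown only once, store both values somewhere durable rather than hardcoding them. Environment variables are a common minimal pattern; the sketch below reads the standard AWS variable names and fails loudly if either is missing. The function name and return shape are illustrative, not part of any AWS or elvex API.

```python
import os

def load_bedrock_credentials(env=os.environ):
    """Read Bedrock credentials from the environment instead of source code.

    Raises KeyError if either variable is missing, so a misconfigured
    environment fails loudly rather than sending empty credentials.
    """
    return {
        "access_key_id": env["AWS_ACCESS_KEY_ID"],
        "secret_access_key": env["AWS_SECRET_ACCESS_KEY"],
    }
```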

3. Enable Model Access

  1. In the AWS Console, go to Amazon Bedrock

  2. Navigate to Model access (under Bedrock configurations)

  3. Click Modify model access and request access for the models you need

  4. Accept any required agreements (e.g., provider EULAs)

  5. Wait for access to be granted (usually a few minutes)

4. Find the Model ID

  1. Go to the Bedrock console → Base models list

  2. Click a model to view its Model ID
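
Bedrock model IDs generally follow the shape `<provider>.<model>[:<revision>]`, for example `anthropic.claude-3-5-sonnet-20240620-v1:0`. The sketch below is an illustrative sanity check (not AWS's official grammar) that catches the usual copy/paste mistakes: surrounding whitespace, or pasting a full ARN instead of the bare Model ID.

```python
import re

# Illustrative pattern for "<provider>.<model>[:<revision>]" model IDs;
# not AWS's official grammar.
_MODEL_ID = re.compile(r"^[a-z0-9-]+\.[a-z0-9.-]+(:[0-9a-z-]+)*$")

def looks_like_model_id(value: str) -> bool:
    """Return True if value resembles a bare Bedrock Model ID.

    Rejects strings with leading/trailing whitespace and anything that
    looks like an ARN rather than a Model ID.
    """
    return value == value.strip() and bool(_MODEL_ID.fullmatch(value))
```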

5. Set up an elvex Provider

  1. Go to elvex Settings → AI Providers

  2. Click Add a Provider

  3. Choose Bedrock

  4. Enter an optional Name, your AWS Access Key ID, AWS Secret Access Key, and Model ID

  5. Click Add

You will now be able to choose this model as a provider in your elvex agents!

Adding Azure OpenAI

When adding Azure OpenAI, you will need to provide specific configuration details from your Azure portal in addition to the standard fields.

  1. Select Azure OpenAI from the Provider dropdown menu

  2. Enter a name to help identify this provider (if left blank, it defaults to "OpenAI")

  3. Enter the Endpoint URL for your resource

  4. Enter your API Key

    • You can find this value in the Keys & Endpoint section when examining your resource from the Azure portal

  5. Enter the Deployment Name

    • This value corresponds to the custom name you chose for your deployment when you deployed a model

    • You can find this value under Resource Management > Deployments in the Azure portal or alternatively under Management > Deployments in Azure OpenAI Studio

  6. Select a Model from the dropdown menu

    • Note: This model selector is for display purposes only. Each deployment has a model associated with it on the Azure side. The actual model used by the agent will be whatever model is configured in your Azure deployment settings, regardless of what is selected here
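
The display-only model selector makes sense once you see how Azure OpenAI requests are addressed. The sketch below builds the deployment-scoped chat-completions URL Azure expects; note that the model name never appears in it, so the deployment alone determines which model answers. The default `api-version` value is an assumption here, since Azure revises it over time.

```python
def azure_openai_chat_url(endpoint: str, deployment: str,
                          api_version: str = "2024-02-01") -> str:
    """Build the deployment-scoped chat-completions URL Azure OpenAI expects.

    The model never appears in the URL: the deployment name alone selects
    the model, which is why elvex's model dropdown is display-only.
    """
    return (f"{endpoint.rstrip('/')}/openai/deployments/"
            f"{deployment}/chat/completions?api-version={api_version}")
```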

Adding Azure AI Foundry

When adding Azure AI Foundry, you will need to provide specific configuration details from your Azure AI Foundry portal. Azure AI Foundry allows you to deploy and access models from various providers, including Anthropic's Claude models.

Finding Your Azure AI Foundry Configuration Details

Before adding Azure AI Foundry as a provider in elvex, you'll need to gather the following information from your Azure AI Foundry deployment:

  1. Navigate to your Azure AI Foundry project in the Azure portal

  2. Select your deployment from the deployments list

  3. Click on the Details tab to view your deployment information

  4. Locate the following values:

    • Target URI - This is your endpoint URL

    • Key - This is your API key

    • Name (under Deployment info) - This is your deployment name

Adding Azure AI Foundry in elvex

  1. Go to Settings > AI Providers

  2. Click Add a Provider

  3. Select Azure AI Foundry from the Provider dropdown menu

  4. Enter a name (Optional)

    • Enter a name to help identify this provider. If left blank, it defaults to "Azure AI Foundry"

  5. Enter the Endpoint URL

    • Use the Target URI value from your Azure AI Foundry deployment details. This typically follows the format: https://{resource-name}.services.ai.azure.com/anthropic/v1/messages

    • Important: Do not use the "Project's deployment endpoint" URL. You must use the Target URI that ends with /anthropic/v1/messages or the appropriate path for your model provider

  6. Enter your API Key

    • Copy the Key value from your Azure AI Foundry deployment details. Click the copy button in the Azure portal to ensure you capture the complete key without any missing characters

  7. Enter the Model Name

    • Enter the Name value from the Deployment info section of your Azure AI Foundry deployment details (for example: hhcs-insights-claude-opus-4-6)

    • Important: This should be your deployment name, not the underlying model name. Make sure there are no leading or trailing spaces in this field

  8. Select the Output Modality

    • Choose the appropriate output modality for your deployment:

      • Text - For text-based models

      • Image - For image generation models

  9. Click Add
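
The two most common mistakes flagged in steps 5 and 7 (wrong endpoint, stray whitespace) can be caught before clicking Add. The sketch below assumes an Anthropic deployment whose Target URI ends in `/v1/messages`; the function name and return shape are illustrative.

```python
def validate_foundry_settings(endpoint: str, deployment: str) -> list[str]:
    """Return a list of likely configuration problems (empty means OK).

    Mirrors the warnings above: the endpoint must be the Target URI
    (ending in the provider's API path, e.g. /anthropic/v1/messages for
    Claude deployments), and the deployment name must have no stray spaces.
    """
    problems = []
    if not endpoint.rstrip("/").endswith("/v1/messages"):
        problems.append("endpoint does not look like a Target URI "
                        "(expected a path ending in /v1/messages)")
    if deployment != deployment.strip():
        problems.append("deployment name has leading or trailing whitespace")
    return problems
```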

Common Issues and Troubleshooting

Wrong Endpoint URL

  • Issue: You receive an error stating the deployment does not exist

  • Solution: Verify you are using the Target URI from your Azure AI Foundry deployment details, not the "Project's deployment endpoint." The correct URL should end with the model provider's API path (e.g., /anthropic/v1/messages)

Extra Spaces in Model Name

  • Issue: elvex cannot find your deployment even though it exists in Azure

  • Solution: Check that there are no leading or trailing spaces in the Model Name field. The deployment name should be entered exactly as it appears in Azure without any extra whitespace

Incomplete API Key

  • Issue: You receive a validation error when adding the provider

  • Solution: Ensure you copied the complete API key from Azure. Use the copy button in the Azure portal rather than manually selecting the text to avoid missing characters

Model Not Available

  • Issue: The provider is added successfully but the model doesn't work in agents

  • Solution: Verify that your Azure AI Foundry deployment is active and the model has been successfully deployed. Check the deployment status in the Azure portal
