
Adding an AI Provider

This article explains how to connect elvex with AI Providers like OpenAI, Anthropic, and others, so you can make use of their LLMs.


Adding an AI provider is how you give your elvex users access to the large language models (LLMs) that power the agents you build. Without an AI provider, agents have no LLM to call and are non-functional. Here’s how you can add one or more AI providers to your account.

Note: Only elvex Admin users have the authority to add AI providers.

Supported AI Providers

elvex currently supports the following AI providers:

  • OpenAI

  • Azure OpenAI

  • Anthropic

  • Google Gemini

  • Cohere

  • Mistral

  • AWS Bedrock

  • Additional providers will be supported in the future

Step-by-Step Guide to Adding an AI Provider

  1. Navigate to Settings: Click on "Settings" from the left navigation menu

  2. Select "AI Providers": In Settings, click the "AI Providers" option

  3. Click "Add a Provider": On the AI Providers page, click the "Add a Provider" button

  4. Enter Provider Details: Fill in the required fields

    • Provider Name: If left blank, this field defaults to the name of the provider (e.g., OpenAI, Anthropic). You can keep the default or enter a more descriptive name.

    • API Key: The API key provided by the AI provider.

    Refer to your AI provider's documentation for instructions on how to create an API key.

  5. Create a New Agent: To test the integration, create a new agent. A simple passthrough agent to GPT-4o will suffice. See the article on creating agents to learn more.

  6. Send a Test Message: Use the newly created agent to send a test message to ensure that the AI provider is functioning correctly.

Important Notes on Generating an API Key from an AI Provider

  • Read & Write Access: elvex will need read and write access to connect with the AI Provider of your choice. If the API key you generate for the AI Provider does not have read and write access, the connection between elvex and that provider will not be successful.

  • Fund Your Provider Account: The account you generate the AI provider API key from must be funded and connected to a valid credit card; otherwise, requests made with the key will return an error. If you'd like to confirm a key works before adding it to elvex, see the sketch below.
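
If you want to sanity-check a key before adding it to elvex, a quick call to the provider's API will confirm it works. The sketch below assumes a Python environment with the requests package and an OpenAI key exported as OPENAI_API_KEY; other providers use their own endpoints and auth headers.

```python
# Minimal sketch: verify an OpenAI API key before adding it to elvex.
# Assumes the key is exported as OPENAI_API_KEY.
import os
import requests

api_key = os.environ["OPENAI_API_KEY"]

resp = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {api_key}"},
    timeout=30,
)

if resp.ok:
    print("Key works; models visible:", len(resp.json()["data"]))
else:
    # A 401 usually means a bad key; a 429 can mean an unfunded account or quota issue.
    print("Key check failed:", resp.status_code, resp.text)
```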

Getting AWS Bedrock Credentials and Model ID

If you wish to use AWS Bedrock to host your models we recommend following the AWS Getting Started document here: https://docs.aws.amazon.com/bedrock/latest/userguide/getting-started.html

For a quick walkthrough, follow these steps to get the Access Key ID, Secret Access Key, and Model ID you’ll need to use AWS Bedrock.

1. Create an IAM User (Programmatic Access)

  1. Go to the AWS Management Console → IAM → Users → Create user

  2. Enable Programmatic access (this generates access keys)

  3. Attach a policy that allows Bedrock use:

    • For testing: AmazonBedrockFullAccess

    • For production: create a least-privilege policy with only the actions you need (`bedrock:InvokeModel`, `bedrock:ListFoundationModels`, etc.); see the sketch after this list
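
For reference, a least-privilege setup might look like the sketch below. The policy name, user name, and exact action list are assumptions to adapt to your environment; it uses boto3 and requires credentials that are allowed to manage IAM.

```python
# Sketch: create a Bedrock invoke-only policy and attach it to the IAM user
# that elvex will use. Names and the action list are examples; adjust as needed.
import json
import boto3

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
                "bedrock:ListFoundationModels",
            ],
            "Resource": "*",
        }
    ],
}

iam = boto3.client("iam")
policy = iam.create_policy(
    PolicyName="ElvexBedrockInvokeOnly",  # hypothetical policy name
    PolicyDocument=json.dumps(policy_document),
)
iam.attach_user_policy(
    UserName="elvex-bedrock-user",  # hypothetical IAM user name
    PolicyArn=policy["Policy"]["Arn"],
)
```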

2. Create (or View) the Access Keys

  1. After creating the IAM user, open the user page

  2. Go to the Security credentials tab → Create access key

  3. Copy and save both the Access Key ID and Secret Access Key

  4. Store these access keys securely for later use

Note: The Secret Access Key is only shown once. If lost, you must generate a new one.
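
As a quick sanity check before entering the keys in elvex, you can confirm they are valid with a single STS call. This is a sketch assuming boto3 is installed; the key values are placeholders.

```python
# Sketch: confirm the new access keys are valid by asking AWS who they belong to.
# Uses only STS, so it works even before Bedrock model access is enabled.
import boto3

sts = boto3.client(
    "sts",
    aws_access_key_id="YOUR_ACCESS_KEY_ID",          # placeholder
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",  # placeholder
)
print("Keys belong to:", sts.get_caller_identity()["Arn"])
```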

3. Enable Model Access

  1. In the AWS Console go to Amazon Bedrock

  2. Navigate to Model access (under Bedrock configurations)

  3. Click Modify model access and request access for the models you need

  4. Accept any required agreements (e.g., provider EULAs)

  5. Wait for access to be granted (usually a few minutes)

4. Find the Model ID

Go to the Bedrock console → Base models list. Click a model to view its Model ID
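
If you prefer to look the Model ID up programmatically, Bedrock can list its foundation models for you. This is a sketch assuming boto3 and configured credentials; the region is an example and should be one where you enabled model access.

```python
# Sketch: list Bedrock foundation models and their Model IDs.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")  # example region

for model in bedrock.list_foundation_models()["modelSummaries"]:
    print(model["modelId"], "-", model["modelName"])
```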

5. Set up an elvex Provider

  1. In elvex, go to Settings → AI Providers

  2. Click Add a Provider

  3. Choose Bedrock

  4. Enter an optional Name, your AWS Access Key ID, AWS Secret Access Key, and Model ID

  5. Click Add

You will now be able to choose this model as a provider in your elvex agents!
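
If you'd like to confirm the credentials and Model ID work outside of elvex first, a short test call to the Bedrock Converse API is enough. This is a sketch; the region and model ID are examples, so substitute the model you enabled.

```python
# Sketch: send a one-off test message directly to Bedrock to confirm the
# credentials and Model ID work. The model ID below is only an example.
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")  # example region

response = runtime.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example Model ID
    messages=[{"role": "user", "content": [{"text": "Say hello in one sentence."}]}],
)
print(response["output"]["message"]["content"][0]["text"])
```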

Adding Azure OpenAI

When adding Azure OpenAI, you will need to provide specific configuration details from your Azure portal in addition to the standard fields.

  1. Select Azure OpenAI: From the Provider dropdown menu, select Azure OpenAI

  2. Enter a name to help identify this provider. If left blank, it defaults to "OpenAI"

  3. Enter the Endpoint URL for your resource.

    • This typically follows the format: https://{resource-name}.openai.azure.com

  4. Enter your API Key

    • You can find this value in the Keys & Endpoint section when examining your resource from the Azure portal

  5. Enter the Deployment Name

    • This value corresponds to the custom name you chose for your deployment when you deployed a model.

    • You can find this value under Resource Management > Deployments in the Azure portal or alternatively under Management > Deployments in Azure OpenAI Studio.

  6. Select a Model from the dropdown menu

Note: This model selector is for display purposes only. Each deployment has a model associated with it on the Azure side. The actual model used by the agent will be whatever model is configured in your Azure deployment settings, regardless of what is selected here.
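
To see how the Endpoint URL, API key, and Deployment Name fit together, here is a sketch using the official openai Python package's Azure client. The resource name, key, deployment name, and API version are placeholders; use the values from your own Azure portal.

```python
# Sketch: how the Azure OpenAI endpoint, API key, and deployment name relate.
# All values are placeholders. The deployment name is passed as the "model",
# and the model actually used is whatever that deployment points to in Azure.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # Endpoint URL
    api_key="YOUR_AZURE_OPENAI_KEY",                        # Keys & Endpoint section
    api_version="2024-02-01",                               # example API version
)

response = client.chat.completions.create(
    model="my-gpt4o-deployment",  # Deployment Name, not the base model name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```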
