LLM Models
An overview of the LLM providers and models you can use with the Voice Agent API.
Defines the LLM (Large Language Model) to be used with your Agent. The provider.type field specifies the format or protocol of the API. For example, open_ai means the API follows OpenAI’s Chat Completions format. This option can be used with OpenAI, Azure OpenAI, or Amazon Bedrock, as long as the endpoint behaves like OpenAI’s Chat Completions API.
You can set your Voice Agent’s LLM model in the Settings Message. See the docs for more information.
Supported LLM providers
If you don’t specify agent.think.provider.type, the Voice Agent will use Deepgram’s default LLM.
The agent.think.endpoint field is optional or required based on the provider type:
- For open_ai and anthropic, the endpoint field is optional because Deepgram provides managed LLMs for these providers.
- For all other provider types, endpoint is required because Deepgram does not manage those LLMs.
- If an endpoint is provided, the url is required but headers are optional.
Example Payload
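The exact shape of the Settings message can vary by API version, so treat the following as a minimal sketch rather than a definitive payload. It configures agent.think with a Deepgram-managed open_ai provider; the model value and the prompt field are illustrative assumptions.

```json
{
  "agent": {
    "think": {
      "provider": {
        "type": "open_ai",
        "model": "gpt-4o-mini"
      },
      "prompt": "You are a helpful voice assistant."
    }
  }
}
```

Because open_ai is a managed provider, no endpoint block is needed in this case.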
Passing a custom LLM through a Cloud Provider
You can use a custom LLM hosted by a third-party Cloud Provider by setting provider.type to one of the supported provider values and setting the endpoint.url and endpoint.headers fields to the correct values for your Cloud Provider.
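As a hedged illustration, the sketch below points the same open_ai provider type at an Azure OpenAI deployment that exposes an OpenAI-compatible Chat Completions endpoint. The url, header name, and model/deployment values are placeholders; substitute whatever your Cloud Provider actually requires.

```json
{
  "agent": {
    "think": {
      "provider": {
        "type": "open_ai",
        "model": "your-deployment-name"
      },
      "endpoint": {
        "url": "https://your-resource.openai.azure.com/openai/deployments/your-deployment-name/chat/completions?api-version=2024-06-01",
        "headers": {
          "api-key": "YOUR_AZURE_OPENAI_API_KEY"
        }
      }
    }
  }
}
```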