Configure LLM Connector
Use this topic to configure the LLM connector in Routing Director.

The LLM connector connects large language models (LLMs) to the Routing Director infrastructure. This connection lets you query your network for troubleshooting and monitoring through Routing Director in conversational English, bypassing the need for traditional CLI commands. For example: "What is the status of the OSPF neighbors of device-a?"

Before you can query your network through Routing Director, you must configure the LLM connector in Routing Director.
| Field | Description |
|---|---|
| Name* | Enter a name for the LLM model. The name does not need to be unique within an organization. For example, demo-llm. |
| Provider* | Select the provider of the LLM; for example, Azure OpenAI or OpenAI. OpenAI is the recommended LLM provider in this release. |
| Model* | Enter the LLM from the selected provider to use; for example, GPT-4o or GPT-4.1 (recommended). |
| API Key* | Enter an API key for the LLM. The LLM uses the API key to authenticate requests from the LLM connector. |
| Base URL* | Enter the base URL of the Azure OpenAI or OpenAI model that you use. You obtain the base URL when you deploy your LLM. This field is mandatory if you use Azure OpenAI as your LLM provider. For example, https://demo-openai-azure.com/. |
| API Version* | Enter the API version of the AI model. You obtain the API version when you deploy your LLM. This field is mandatory for Azure OpenAI. |
| Active | Click to enable (default) or disable the LLM. Only one LLM configuration can be active at a time. |
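
If you prefer to script the configuration rather than use the UI, the minimal sketch below shows how the fields in the table above might map to a REST payload. The endpoint path, payload keys, access token, and API key value are assumptions for illustration only, not a documented Routing Director API; check your deployment's API reference for the actual schema.

```python
import requests

# Hypothetical sketch: endpoint path, field names, and auth header are
# assumptions for illustration, not a documented Routing Director API.
ROUTING_DIRECTOR = "https://routing-director.example.com"  # assumed host

llm_connector = {
    "name": "demo-llm",          # Name*: does not need to be unique in the organization
    "provider": "openai",        # Provider*: OpenAI is recommended in this release
    "model": "gpt-4.1",          # Model*: GPT-4.1 (recommended) or GPT-4o
    "api_key": "<your-api-key>", # API Key*: authenticates requests to the LLM
    "base_url": "https://demo-openai-azure.com/",  # Base URL*: mandatory for Azure OpenAI
    "api_version": "2024-06-01", # API Version*: mandatory for Azure OpenAI (assumed value)
    "active": True,              # Only one LLM configuration can be active at a time
}

response = requests.post(
    f"{ROUTING_DIRECTOR}/api/v1/llm-connectors",  # assumed endpoint path
    json=llm_connector,
    headers={"Authorization": "Bearer <access-token>"},  # assumed auth scheme
    timeout=30,
)
response.raise_for_status()
print(response.json())
```

Because only one LLM configuration can be active at a time, activating a new configuration in this way would implicitly supersede any previously active one.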