Configure LLM Connector

Use this topic to configure the LLM connector in Routing Director.

The LLM connector connects large language models (LLMs) to the Routing Director infrastructure. This connection allows a user to query their network for troubleshooting and monitoring through Routing Director by using conversational English, bypassing the need for traditional CLI commands. For example, "What is the status of the OSPF neighbors of device-a?"

Before you can query your network through Routing Director, you must configure the LLM connector in Routing Director.

To configure LLM Connector:
  1. Click Settings Menu > System Settings on the banner.

    The Organization Settings page appears.

  2. On the Configure LLM Connector tile, click the + (Add) icon.

    The LLM Connector Configuration page appears.

  3. Enter values by referring to Table 1.

    Fields marked * are mandatory.

  4. Click Create.

    The values are listed in the Configure LLM Connector tile.
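The mandatory-field rules that apply in step 3 can be summarized in a short validation sketch. This is a hypothetical helper for illustration only, not part of Routing Director; the field names (name, provider, model, api_key, base_url, api_version) are assumptions that mirror the fields described in Table 1:

```python
def validate_llm_config(cfg):
    """Return the list of mandatory fields missing from an LLM connector config.

    Rules mirrored from Table 1: name, provider, model, and API key are
    always required; the base URL is also required for Azure Open AI and
    Ollama; the API version is also required for Azure Open AI.
    """
    missing = [f for f in ("name", "provider", "model", "api_key") if not cfg.get(f)]
    if cfg.get("provider") in ("Azure Open AI", "Ollama") and not cfg.get("base_url"):
        missing.append("base_url")
    if cfg.get("provider") == "Azure Open AI" and not cfg.get("api_version"):
        missing.append("api_version")
    return missing

# An Open AI configuration needs no base URL or API version:
print(validate_llm_config({
    "name": "demo-llm", "provider": "Open AI",
    "model": "GPT-4", "api_key": "sk-example",
}))  # []
```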
Table 1: Fields to configure the LLM connector

Name*

    Enter a name for the LLM model. The name need not be unique within an organization. For example, demo-llm.

Provider*

    Select the provider of the LLM: Azure Open AI, Open AI, or Ollama.

    Open AI is the recommended LLM provider in this release.

Model*

    Select the LLM from the selected provider to be used. For example, GPT-3.5, GPT-4, GPT-4o, or Llama3.1.

    GPT-4 and GPT-4o are the recommended models in this release.

API Key*

    Enter an API key for the LLM. The LLM uses the API key to authenticate requests from the LLM connector.

Base URL*

    Enter the base URL of the Azure Open AI model or Ollama model used. You obtain the base URL when you deploy your LLM. This field is mandatory if you use Azure Open AI or Ollama as your LLM provider.

    For example, https://demo-openai-azure.com/

API Version*

    Enter the API version of the AI model. You obtain the API version when you deploy your LLM. This field is mandatory if you use Azure Open AI as your LLM provider.

Active

    Click to enable (default) or disable the LLM. Only one LLM configuration can be active at a time.
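For Azure Open AI, the Base URL and API Version fields combine into the endpoint that requests are ultimately sent to. A minimal sketch of that composition, following Azure's documented chat-completions URL shape (the deployment name and API version shown here are illustrative assumptions, not values from Routing Director):

```python
def azure_chat_url(base_url, deployment, api_version):
    """Build an Azure Open AI chat-completions URL from the base URL,
    a model deployment name, and an api-version query parameter."""
    return (f"{base_url.rstrip('/')}/openai/deployments/{deployment}"
            f"/chat/completions?api-version={api_version}")

# Using the example base URL from Table 1:
print(azure_chat_url("https://demo-openai-azure.com/", "gpt-4o", "2024-02-01"))
# https://demo-openai-azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-02-01
```

This is why both fields are mandatory for Azure Open AI: without either value, the connector cannot form a complete request URL.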