
LLM Connector Overview

The LLM Connector tool, embedded in Routing Director, is an AI-driven feature that leverages your own large language models (LLMs), also known as bring your own LLM, to streamline network monitoring operations.

Note:

The LLM Connector tool is a beta feature in this release.

LLM Connector lets you use natural language to query network status and obtain troubleshooting information, without the need for traditional CLI commands. For example, if you encounter an issue with the OSPF neighbors of a device, you can ask LLM Connector What is the status of OSPF neighbors of device A?. Using the knowledge that the LLM (defined and configured by you) has obtained from technical documentation, and its ability to execute functions, LLM Connector runs the show ospf neighbor command on the device and presents you with the relevant information for troubleshooting.

When you initiate a query, LLM Connector connects with an LLM that you configure, such as OpenAI GPT-4o or Llama 3.1, to provide contextual assistance. The LLM is trained on extensive documentation to interpret your query and prompts LLM Connector to execute the necessary "read-only" network commands. The LLM also parses the result of the executed command and presents it in an easy-to-understand format.
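Conceptually, the function-calling flow above maps a tool request from the LLM to an allow-listed read-only command before anything runs on the device. The sketch below illustrates that pattern with hypothetical names (`TOOLS`, `handle_tool_call`, `run_command`); it is not the product's actual API.

```python
# Illustrative sketch of allow-listed, read-only function calling.
# All names here are hypothetical, not LLM Connector internals.

# Registry mapping tool names the LLM may request to read-only commands.
TOOLS = {
    "ospf_neighbors": "show ospf neighbor",
    "interface_stats": "show interfaces statistics",
}

def run_command(command: str) -> str:
    """Stand-in for executing a Junos OS operational command on a device."""
    return f"(output of '{command}')"

def handle_tool_call(tool_name: str) -> str:
    """Execute only allow-listed commands; reject anything else."""
    if tool_name not in TOOLS:
        raise ValueError(f"'{tool_name}' is not a permitted read-only command")
    return run_command(TOOLS[tool_name])
```

Because the registry contains only "show" commands, the LLM can never trigger a configuration change, which is the safety property the read-only design provides.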

The natural language interface supports a variety of query types, ranging from simple status checks to more complex troubleshooting commands. For example, you can ask, Show me the interface statistics or Show insights on input traffic rate on et-0/0/0 on device A.

LLM Connector can help you with:

  • Retrieving device information.

  • Executing Junos OS operational commands.

  • Getting insights based on the telemetry collected from the device.

  • Retrieving a list of all VPNs in your network and their details, metrics, and health information.

  • Fetching information about customers and their associated service instances.

  • Plotting graphs related to KPIs. For example, plot graphs for CPU utilization, fan RPM, and device temperature.

To use the LLM Connector tool, you need to set up the LLM and provide parameters such as API keys and model details. For information about configuring LLM, see Configure LLM Connector.
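The setup parameters mentioned above can be thought of as a small configuration record. The field names below are illustrative only; refer to Configure LLM Connector for the actual configuration fields.

```python
# Hypothetical sketch of the parameters an LLM configuration needs.
# Field names are illustrative, not the product's configuration schema.
llm_settings = {
    "provider": "OpenAI",          # or "Ollama" for a locally hosted model
    "model": "gpt-4o",             # model identifier from the provider
    "api_key": "<your-api-key>",   # credential issued by the LLM provider
    "temperature": 0.1,            # default sampling temperature (0.1 to 1)
    "top_p": 0.1,                  # default nucleus-sampling threshold (0.1 to 1)
}
```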

You can rate and provide feedback on the responses. Routing Director stores your feedback for up to a year.

You can access LLM Connector by clicking the LLM Connector icon displayed at the bottom-right corner of the GUI. You can also move the icon and anchor it to any of the four corners of the GUI.

LLM Connector provides a chat window where you can enter your queries. LLM Connector retains your queries in a session so that you can continue with them after a break, if needed. A session is active for an hour. If your session is idle for more than an hour, LLM Connector creates a new session for your next set of queries. To go back to any of your older queries, click the Chat History (clock) icon at the top-right corner of the LLM Connector window. The chat history lists all your previous conversations. Click any conversation to continue it.

LLM Connector Actions

LLM Connector provides the following action menu items on the LLM Connector window:

  • Chat History—Click the Chat History (clock) icon on the top of the LLM Connector window to view chat history. Click the Chat History icon again to close the chat history.

    The chat history is listed, grouped by the time of chat as Today, Yesterday, Last Week, and so on. Chat History is displayed on the left side of the LLM Connector window when the window is maximized.

    To open a new conversation, click the New Conversation icon (+).

  • Search—Enter a search text in the Search field to search for conversations and chats. LLM Connector matches the search text that you enter with your query text in the chat history and fetches the corresponding conversation.

    The Search field is present above the Chat History list and displayed when you view the chat history.

  • Settings—Configure the following settings for the LLM Connector:

    • Model—Select the model for use by LLM Connector.

      Available options: OpenAI or Ollama

    • Temperature—Set the LLM temperature. Temperature controls the randomness of the output: lower values make the LLM produce more focused and consistent text, whereas higher values produce more varied responses.

      Range: 0.1 (default) to 1

    • Top P—Set the Top P of the LLM Connector. Top P (nucleus sampling) restricts the LLM to the smallest set of likely tokens whose cumulative probability reaches the threshold, helping the LLM avoid inconsistent outputs.

      Range: 0.1 (default) to 1

  • Maximize—Click to maximize the LLM Connector window. When the LLM Connector window is maximized, chat history is listed on the left side of the window.

  • Close—Click to close the LLM Connector window.
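The Temperature and Top P settings correspond to standard LLM sampling parameters. The toy implementation below shows how they shape token selection on a small example distribution; it is illustrative only, not how the product or any particular LLM implements sampling.

```python
import math

def apply_temperature(logits, temperature):
    """Rescale logits by temperature and convert to probabilities (softmax)."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def top_p_filter(probs, top_p):
    """Keep the smallest set of tokens whose cumulative probability reaches top_p."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cumulative = [], 0.0
    for i in order:
        kept.append(i)
        cumulative += probs[i]
        if cumulative >= top_p:
            break
    return kept

logits = [2.0, 1.0, 0.1]           # example scores for three candidate tokens
probs = apply_temperature(logits, 0.1)  # low temperature sharpens the distribution
candidates = top_p_filter(probs, 0.1)   # low Top P keeps only the top token(s)
```

With both values at their 0.1 defaults, nearly all probability mass lands on the single best token, which is why low settings yield consistent, repeatable answers.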

Benefits of LLM Connector

  • Reduces Workload on NOC Teams: By automating repetitive troubleshooting tasks, LLM Connector significantly lowers the operational burden on Network Operations Center (NOC) teams, freeing them to focus on more complex issues.

  • Accelerates Problem Resolution: LLM Connector provides rapid, contextually relevant solutions by executing necessary commands and parsing results, which enables you to solve problems quickly.

  • Simplifies Training Requirements: LLM Connector minimizes the learning curve for new operators by allowing them to interact with Routing Director using natural language queries, reducing the time and effort required to become proficient.

  • Enhances Data Privacy and Security: By supporting Cloud, Private Cloud, and On-Premises deployments, LLM Connector offers flexible options that cater to various data privacy and security needs, ensuring secure operations.

  • Improves System Usability: The integration of function calling capabilities with a range of LLM models enables precise and efficient interactions with external systems, enhancing the overall usability of Routing Director as a troubleshooting tool.