Telnyx open-sources AI chatbot services & context class

Take a closer look at how our context class can help you build a more efficient and accurate chatbot.

By Ciaran Palmer

Learn how to build an efficient, accurate AI chatbot with our context class.

Last winter, we released our in-house AI Assistant, built to enhance customer satisfaction and increase efficiencies in our support bots.

Today, we're happy to announce that we are open-sourcing our chatbot service and its context class. Our public repo will be a valuable asset for developers interested in integrating their stored data with large language models (LLMs), especially those using the Retrieval-Augmented Generation (RAG) pattern.

The need for context classes in AI chatbots

Even in 2024, the gap between vast stores of unstructured data and the potent analytical capabilities of LLMs remains a challenge for many businesses.

From our open-source code, developers can learn how to harness the advanced document handling and analysis capabilities of the context class. By enhancing the efficiency and relevance of information retrieval and processing within their LLM services, they can pave the way for more sophisticated, context-aware, and responsive LLM applications.

How the context class works

The context class provides a sophisticated mechanism for managing and using stored data, such as documents stored in a Telnyx Storage bucket. It breaks down into several vital functionalities that work together to handle and interpret large volumes of text data efficiently. Let's take a look at how the context class operates.

1. Document chunking and embedding

Initially, documents uploaded to a Telnyx Storage bucket are split into chunks, and each chunk is embedded. This approach facilitates granular similarity searches, allowing users to find the most relevant sections of documents based on their query.
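The open-sourced service handles this step for you, but as a rough illustration of the idea, here is a minimal sketch of fixed-size chunking with an embedding placeholder. The function names, chunk size, and overlap are assumptions for illustration, not the repo's actual API.

```python
# Illustrative sketch only: fixed-size, overlapping word chunks plus a
# placeholder embedding step. The real service chunks and embeds documents
# from a Telnyx Storage bucket; names and sizes here are assumptions.
from typing import List


def chunk_document(text: str, chunk_size: int = 200, overlap: int = 40) -> List[str]:
    """Split a document into overlapping word-based chunks."""
    words = text.split()
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunk = " ".join(words[start:start + chunk_size])
        if chunk:
            chunks.append(chunk)
    return chunks


def embed(chunk: str) -> List[float]:
    """Placeholder: in practice each chunk is sent to an embedding API so it
    can later be compared against query embeddings."""
    raise NotImplementedError("Call your embedding endpoint here.")


document = "..."  # text pulled from a storage bucket
chunks = chunk_document(document)
# vectors = [embed(c) for c in chunks]  # stored for later similarity search
```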

2. Similarity search with raw_matches() and matches() methods

Users can perform similarity searches on the resultant chunks. The raw_matches() method retrieves the top N chunks most similar to an input query, while the matches() method goes further by reconstructing the original documents from the matched chunks.
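As a hypothetical usage sketch, the two methods might be called like this. The method names come from the open-sourced context class, but the import path, constructor arguments, and top_n keyword are illustrative stand-ins; check the repo's README for the real signatures.

```python
# Hypothetical usage of the two search methods described above.
from chatbot.context import Context  # illustrative import path, not the repo's

context = Context(bucket="support-docs")  # illustrative constructor

query = "How do I rotate my API key?"

# raw_matches(): the top N chunks most similar to the query.
chunks = context.raw_matches(query, top_n=5)

# matches(): the same search, with matched chunks stitched back into
# their original documents for fuller context.
documents = context.matches(query, top_n=5)
```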

3. High-level description with describe()

This method generates markdown text providing an overview of the matches returned by matches(), tailored to the document type. It's a preparatory step for more detailed analysis or response generation.
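Continuing the hypothetical usage above, the call itself is simple; the argument shown is illustrative.

```python
# describe() returns a markdown overview of the current matches, which can
# be logged or shown to a user before generating a full answer.
overview_md = context.describe(query)
print(overview_md)
```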

4. Contextual prompt creation with prompt()

The prompt() method arranges the matched documents or chunks to maximize relevance and information within a given token limit. This is crucial for feeding structured, concise, and relevant information into downstream processes or language models.
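To make the token-budgeting idea concrete, here is a standalone sketch of the packing logic, not the repo's implementation: take chunks in relevance order and stop before the budget is exceeded. count_tokens() is a crude stand-in for a real tokenizer.

```python
# Standalone sketch of token-budgeted packing; names and heuristics are ours,
# not the open-sourced prompt() implementation.
from typing import List


def count_tokens(text: str) -> int:
    # Rough heuristic; swap in a model-specific tokenizer in practice.
    return len(text.split())


def build_prompt(ranked_chunks: List[str], token_limit: int) -> str:
    selected, used = [], 0
    for chunk in ranked_chunks:  # chunks arrive most-relevant first
        cost = count_tokens(chunk)
        if used + cost > token_limit:
            break
        selected.append(chunk)
        used += cost
    return "\n\n".join(selected)


# Example: pack the most relevant chunks into a 50-token budget.
prompt_text = build_prompt(["chunk one ...", "chunk two ...", "chunk three ..."], token_limit=50)
```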

Why the context class is essential

The context class can increase the efficiency and accuracy of AI tools while reducing the computing power required to run them.

Efficiency in handling large text volumes

By breaking down documents into chunks and selectively reassembling them based on relevance to the query, the context class significantly reduces the computational load and improves response times for information retrieval and analysis tasks.

Flexibility in search and analysis

The class offers granular control over the similarity search process and the ability to zoom out for a broader document overview. This dual capability makes it versatile for various use cases, from detailed text analysis to high-level content summarization.

Optimization for language models

The ability to curate context-specific prompts within token limits makes this class especially valuable for feeding into language models. It ensures that the models operate on highly relevant and structured information, enhancing the quality of outputs.

Powered by the Telnyx AI Platform

In addition to open-sourcing our AI chatbot and context classes, we're excited to announce that our AI chatbot is powered by Telnyx Embedding and Inference APIs. These OpenAI-compatible APIs power the AI chatbots across our platform, reducing both our dependence on OpenAI and our operational costs, and putting Telnyx in the driver's seat.
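Because the APIs are OpenAI-compatible, an existing OpenAI client can typically be pointed at them by swapping the API key and base URL. The sketch below uses the official Python client; the base URL and model name are placeholders rather than real values, so check the Telnyx developer documentation for the correct ones.

```python
# Minimal sketch of calling an OpenAI-compatible endpoint with the official
# Python client. The base_url and model below are placeholders, not real values.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_TELNYX_API_KEY",
    base_url="https://<telnyx-inference-endpoint>",  # placeholder
)

response = client.chat.completions.create(
    model="<model-name>",  # placeholder
    messages=[{"role": "user", "content": "Summarize our refund policy."}],
)
print(response.choices[0].message.content)
```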

To get started building your own chatbot, follow our public repo and README.

If you’re interested in testing our suite of AI APIs, take a look at our developer documentation or reach out to a member of the team today.
