LLM Library
Choose from a range of state-of-the-art proprietary and open-source LLMs and stay on the bleeding edge of AI.
Open-source large language models (OS LLMs) foster collaboration between developers and researchers, accelerating the development of new models for a wide range of industries and use cases. OS LLMs are also cost-effective for businesses, broadening access to advanced AI and helping companies stay competitive without heavy investment in infrastructure or compute.
Our LLM Library makes it easy for developers to choose and use both open-source and proprietary LLMs in Inference. Our intuitive platform empowers companies to pick the AI models that best serve their unique needs, so they can enhance their products and improve operational efficiency.
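To give a feel for how a developer might call a model from the library, here is a minimal sketch that assumes an OpenAI-compatible chat-completions endpoint. The base URL, environment variable, and model name are illustrative assumptions, not documented Telnyx values; check the API documentation for the exact ones.

```python
# Minimal sketch: calling a model from the LLM Library through an
# OpenAI-compatible chat-completions endpoint. The base URL, env var,
# and model name below are illustrative assumptions.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["TELNYX_API_KEY"],    # assumed environment variable name
    base_url="https://api.telnyx.com/v2/ai",  # assumed endpoint
)

response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",  # example open-source model
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize retrieval-augmented generation in one sentence."},
    ],
)
print(response.choices[0].message.content)
```

Because the interface is OpenAI-compatible in this sketch, swapping between models in the library is just a change to the `model` string.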
Explore Our LLM Library
Discover the power and diversity of large language models available with Telnyx. Explore the options below to find the perfect model for your project.
BENEFITS
More choice
Build what you want, how you want it, with a vast range of models for all your use cases.
20+
models available
Limitless potential
Build your entire RAG pipeline on a single AI platform and focus on what you do best (sketched below).
1
platform
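As a rough illustration of the single-platform RAG claim above, the sketch below embeds a few documents, retrieves the closest one to a query, and grounds the model's answer on it. The endpoint, environment variable, and model names are assumptions for illustration only, and the in-memory similarity search stands in for a real vector store.

```python
# Minimal RAG sketch on one OpenAI-compatible platform: embed documents,
# retrieve the most similar one, and ground the generation on it.
# Endpoint, env var, and model names are illustrative assumptions.
import os
import math
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["TELNYX_API_KEY"],    # assumed environment variable name
    base_url="https://api.telnyx.com/v2/ai",  # assumed endpoint
)

documents = [
    "Inference exposes open-source and proprietary LLMs behind one API.",
    "Sources lets you upload documents that models can retrieve at query time.",
]

def embed(texts):
    # Assumed embedding model name; substitute one listed in the LLM Library.
    result = client.embeddings.create(model="thenlper/gte-large", input=texts)
    return [item.embedding for item in result.data]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

doc_vectors = embed(documents)

query = "How do I give a model access to my own documents?"
query_vector = embed([query])[0]

# Pick the single most similar document (stand-in for a real vector store).
best_doc = max(zip(documents, doc_vectors),
               key=lambda pair: cosine(query_vector, pair[1]))[0]

answer = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",  # example generation model
    messages=[
        {"role": "system", "content": f"Answer using only this context:\n{best_doc}"},
        {"role": "user", "content": query},
    ],
)
print(answer.choices[0].message.content)
```

The same client handles embedding, retrieval context, and generation, which is what keeping the whole pipeline on one platform looks like in practice.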
Get started
Check out our helpful tools to get you started.
Interested in building AI with Telnyx?
We’re looking for companies building AI products and applications to test our new Sources and Inference offerings while they're in beta. If you're interested, get in touch!