LLM Library

Choose from a range of state-of-the-art proprietary and open-source LLMs and stay on the bleeding edge of AI.

Browse hundreds of open-source LLMs in our model directory.
ABOUT

Open-source large language models (OS LLMs) foster collaboration between developers and researchers, accelerating the development of new models for a wide range of industries and use cases. OS LLMs are also cost-effective, broadening access to advanced AI and helping companies stay competitive without heavy investment in infrastructure or compute.

Our LLM Library makes it easy for developers to choose and use both open-source and proprietary LLMs with Telnyx Inference. The platform lets companies pick the models that best serve their needs, so they can enhance their products and improve operational efficiency while staying current with the latest AI technology.
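As a rough illustration of what "choose and use" a model looks like in practice, here is a minimal sketch of calling an OpenAI-style chat completions endpoint with a model from the library. The base URL, path, model ID, and environment variable name below are illustrative assumptions, not confirmed API details; check the Telnyx developer docs for the exact endpoint and model identifiers.

```python
# Minimal sketch: send one chat request to an assumed OpenAI-compatible endpoint.
import os
import requests

API_KEY = os.environ["TELNYX_API_KEY"]        # assumed environment variable name
BASE_URL = "https://api.telnyx.com/v2/ai"     # assumed base URL -- verify in the docs

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "meta-llama/Meta-Llama-3-8B-Instruct",  # any model ID from the library
        "messages": [
            {"role": "user", "content": "Summarize what an LLM library is."}
        ],
    },
    timeout=30,
)
response.raise_for_status()

# Assumes an OpenAI-style response shape.
print(response.json()["choices"][0]["message"]["content"])
```

Swapping models is then a one-line change to the `model` field, which is the point of having a single library behind one API.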

BENEFITS

More choice

Build what you want, how you want it, with a vast range of models for all your use cases.

20+ models available

Limitless potential

Build your entire RAG pipeline on a single AI platform and focus on what you do best; see the sketch below.

1 platform
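To make the "entire RAG pipeline on one platform" claim concrete, here is a minimal retrieval-augmented generation sketch: embed a few documents, retrieve the closest one for a question, and pass it as context to a chat model. The endpoint paths, model IDs, and response shapes are illustrative assumptions in the OpenAI-compatible style, not confirmed Telnyx API details.

```python
# Minimal RAG sketch under assumed OpenAI-style endpoints: embed, retrieve, generate.
import os
import numpy as np
import requests

API_KEY = os.environ["TELNYX_API_KEY"]        # assumed environment variable name
BASE = "https://api.telnyx.com/v2/ai"         # assumed base URL -- verify in the docs
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

def embed(texts):
    # Assumed OpenAI-style embeddings endpoint and model ID.
    r = requests.post(
        f"{BASE}/embeddings",
        headers=HEADERS,
        json={"model": "thenlper/gte-large", "input": texts},
        timeout=30,
    )
    r.raise_for_status()
    return np.array([item["embedding"] for item in r.json()["data"]])

docs = [
    "Telnyx Inference serves open-source and proprietary LLMs.",
    "The AI Playground lets you compare models in the browser.",
]
doc_vecs = embed(docs)

question = "Where can I compare models?"
q_vec = embed([question])[0]

# Cosine similarity to pick the most relevant document as context.
scores = doc_vecs @ q_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
context = docs[int(scores.argmax())]

r = requests.post(
    f"{BASE}/chat/completions",
    headers=HEADERS,
    json={
        "model": "meta-llama/Meta-Llama-3-8B-Instruct",  # assumed model ID
        "messages": [
            {"role": "user", "content": f"Context: {context}\n\nQuestion: {question}"}
        ],
    },
    timeout=30,
)
r.raise_for_status()
print(r.json()["choices"][0]["message"]["content"])
```

A production pipeline would add chunking, a vector store, and error handling, but the embed-retrieve-generate loop above is the core of what runs on the platform.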

HOW IT WORKS
Sign up to get started with the Telnyx model library.
RESOURCES

Get started

Check out these resources to help you get started.

  • Test in the portal

    Easily browse and select your preferred model in the AI Playground.

  • Explore the docs

    Don't wait to scale. Start today with our public API endpoints.

  • Stay up to date

    Keep an eye on our AI changelog so you don't miss a beat.

Interested in building AI with Telnyx?

We’re looking for companies that are building AI products and applications to test our new Sources and Inference products while they're in beta. If you're interested, get in touch!

Interested in testing the Inference API?

Start building your future with Telnyx AI