
Any-LLM: A unified API to access any LLM provider



When it comes to using Large Language Models (LLMs), it’s not always a question of which model to use: it’s also a matter of choosing who provides the LLM and where it is deployed. As we’ve written about previously, there are many options for how to access an LLM. The provider you choose can have implications for cost, latency, and security. Most AI labs offer their own provider platform (OpenAI, Google, Mistral, etc.), and other platforms (Azure, AWS, Cerebras, etc.) provide access to a wide variety of LLMs. But what if you want to build your LLM application without worrying about being locked into a particular provider?

Today, we’re happy to announce the release of our new Python library: any-llm! any-llm provides a simple unified interface to access the most popular providers. By changing only a single configuration parameter, you can easily switch between providers and models.

```python
import os

from any_llm import completion

# Make sure you have the appropriate environment variable set
assert os.environ.get('MISTRAL_API_KEY')

# Basic completion
response = completion(
    model="mistral/mistral-small-latest",  # <provider>/<model>
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)
```
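Because the provider is selected by the prefix of the model string, switching providers amounts to changing that one string. A minimal sketch of the "provider/model" naming convention (the helper below is illustrative only, not part of any-llm's public API):

```python
def split_model_id(model: str) -> tuple[str, str]:
    """Split a "provider/model" identifier into its two parts."""
    provider, _, model_name = model.partition("/")
    return provider, model_name

# The same call shape works for any provider; only the prefix changes.
print(split_model_id("mistral/mistral-small-latest"))
print(split_model_id("openai/gpt-4o-mini"))
```

Passing a different prefix, such as swapping `mistral/...` for `openai/...`, is all that is needed to route the same `completion()` call to another provider (given the matching API key is set).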

any-llm fills a gap in the LLM provider interface landscape through several key design principles:

- Use of provider SDKs: any-llm leverages official provider SDKs when available, reducing the maintenance burden and ensuring compatibility.
- Committed to active maintenance: any-llm is integrated with any-agent, one of our most community-engaged projects, so we're motivated to maintain it.
- No proxy or gateway server required: there's no need to stand up another service as a gateway. Install the any-llm SDK and you can communicate with all the supported providers directly, without routing your data through another third party.

You can view the list of our supported providers here.

OpenAI API Standard

The OpenAI API has become the de facto standard for LLM provider interfaces. While some providers offer full compatibility with it, others (like Mistral and Anthropic) diverge slightly in the input parameters they expect and the output values they return.

Making it easy to switch between these providers therefore requires lightweight wrappers that gracefully handle these differences while keeping the interface as consistent as possible.
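To illustrate the kind of normalization such a wrapper performs, here is a simplified sketch with made-up response shapes (this is not any-llm's actual implementation): one response follows the OpenAI layout, the other uses a list of typed content blocks, and both are mapped to a single consistent message type.

```python
from dataclasses import dataclass

# Hypothetical raw responses: one OpenAI-style, one that diverges.
openai_style = {"choices": [{"message": {"content": "Hi there"}}]}
other_style = {"content": [{"type": "text", "text": "Hi there"}]}

@dataclass
class Message:
    content: str

def normalize(raw: dict) -> Message:
    # OpenAI-style: the text lives under choices[0].message.content.
    if "choices" in raw:
        return Message(raw["choices"][0]["message"]["content"])
    # Divergent style: the text is split across typed content blocks.
    text = "".join(b["text"] for b in raw["content"] if b["type"] == "text")
    return Message(text)

print(normalize(openai_style).content)
print(normalize(other_style).content)
```

Callers then only ever see `Message`, regardless of which provider produced the response.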
