Show HN: My LLM CLI tool can run tools now, from Python code or plugins
Published on: 2025-06-19 03:53:03
Large Language Models can run tools in your terminal with LLM 0.26
LLM 0.26 is out with the biggest new feature since I started the project: support for tools. You can now use the LLM CLI tool—and Python library—to grant LLMs from OpenAI, Anthropic, Gemini, and local models from Ollama access to any tool that you can represent as a Python function.
LLM also now has tool plugins, so you can install a plugin that adds new capabilities to whatever model you are currently using.
There’s a lot to cover here, but here are the highlights:
LLM can run tools now! You can install tools from plugins and load them by name with --tool/-T name_of_tool. You can also pass in Python function code on the command-line with the --functions option.
The Python API supports tools too: llm.get_model("gpt-4.1").chain("show me the locals", tools=[locals]).text()
Tools work in both async and sync contexts.
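As a sketch of how the Python API fits together: any plain Python function can act as a tool, and the model decides when to call it during a chain() conversation. The lookup_population function below is a hypothetical example, not part of the library; the commented lines assume the llm package is installed and an OpenAI API key is configured.

```python
def lookup_population(country: str) -> int:
    """Hypothetical tool: return a population figure for a country.

    The docstring and type hints are what the model sees when
    deciding whether and how to call the tool.
    """
    populations = {"france": 68_000_000, "japan": 124_000_000}
    return populations.get(country.lower(), 0)

# With the llm package installed and an API key configured, passing the
# function in tools=[...] lets the model invoke it mid-conversation:
#
#   import llm
#   model = llm.get_model("gpt-4.1")
#   response = model.chain("What is the population of France?",
#                          tools=[lookup_population])
#   print(response.text())
```

Note that the function itself needs no special decorator or registration: the library introspects its signature and docstring to describe it to the model.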