Show HN: Use Third Party LLM API in JetBrains AI Assistant

Published on: 2025-07-26 20:52:12

ProxyAsLocalModel

Proxies a remote LLM API as a local model. Especially useful for using a custom LLM in JetBrains AI Assistant. Powered by Ktor and kotlinx.serialization, thanks to their reflection-free design.

Story of this project

Currently, JetBrains AI Assistant provides a free plan with a very limited quota. I tried it out, and my quota ran out quickly. I had already bought other LLM API tokens, such as Gemini and Qwen, so I started thinking about using them in AI Assistant. Unfortunately, only local models served by LM Studio and Ollama are supported. So I started working on this proxy application, which exposes third-party LLM APIs through LM Studio- and Ollama-style endpoints so that I can use them in my JetBrains IDEs.

This seemed like a simple task, so I started by using the official SDKs as clients and writing a simple Ktor server that provides LM Studio- and Ollama-compatible endpoints. The problem appeared when I tried to distribute it as a GraalVM native image: the official Java SDKs rely on too many dynamic features, making them hard to compile into a native image.
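To make the proxy idea concrete, here is a minimal Ktor sketch of the LM Studio side: a local server that speaks the OpenAI-compatible protocol LM Studio uses and forwards chat requests to a remote provider. This is a simplified illustration, not the project's actual code; REMOTE_BASE, API_KEY, and the model id remote-proxy are placeholder assumptions, and streaming responses are omitted for brevity.

```kotlin
import io.ktor.client.*
import io.ktor.client.engine.cio.*
import io.ktor.client.request.*
import io.ktor.client.statement.*
import io.ktor.http.*
import io.ktor.server.application.*
import io.ktor.server.engine.*
import io.ktor.server.netty.*
import io.ktor.server.request.*
import io.ktor.server.response.*
import io.ktor.server.routing.*

// Hypothetical upstream: point these at your actual provider.
const val REMOTE_BASE = "https://api.example.com/v1"
const val API_KEY = "sk-..."

val client = HttpClient(CIO)

fun main() {
    // 1234 is LM Studio's default port, which the IDE expects.
    embeddedServer(Netty, port = 1234) {
        routing {
            // Model listing, so the IDE sees a "local" model to pick.
            get("/v1/models") {
                call.respondText(
                    """{"object":"list","data":[{"id":"remote-proxy","object":"model"}]}""",
                    ContentType.Application.Json
                )
            }
            // Pass chat completions straight through to the remote API.
            // LM Studio's API is OpenAI-compatible, so an OpenAI-style
            // upstream needs no schema translation.
            post("/v1/chat/completions") {
                val upstream = client.post("$REMOTE_BASE/chat/completions") {
                    header(HttpHeaders.Authorization, "Bearer $API_KEY")
                    contentType(ContentType.Application.Json)
                    setBody(call.receiveText())
                }
                call.respondText(upstream.bodyAsText(), ContentType.Application.Json)
            }
        }
    }.start(wait = true)
}
```

With something like this running, AI Assistant's local-model settings can be pointed at http://localhost:1234 as if LM Studio itself were serving the model.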