# Elelem

**Chat Client with Tool-Calling Support**

An interactive C-based command-line chat client that enables AI models to execute real-world actions through a comprehensive tool-calling system.
## 🚀 Quick Start

### Prerequisites

**Ubuntu/Debian:**

```bash
sudo apt-get update
sudo apt-get install build-essential libglib2.0-dev libjson-glib-dev libsoup2.4-dev libreadline-dev
```
**Fedora/RHEL:**

```bash
sudo dnf install gcc glib2-devel json-glib-devel libsoup-devel readline-devel
```
**macOS:**

```bash
brew install glib json-glib libsoup readline pkg-config
```
### Building

```bash
git clone
cd gobject
make clean && make
```
### Setup API Keys

**For DeepSeek API:**

```bash
export DEEPSEEK_API_KEY="sk-your-key-here"
```
**For Ollama (local):**

```bash
# Install and start Ollama
curl -fsSL https://ollama.ai/install.sh | sh
ollama serve

# Pull a model (in another terminal)
ollama pull qwen3
```
## 📖 Usage

### Basic Commands

```bash
# Start with DeepSeek (requires API key)
./elelem

# Use Ollama with specific model
./elelem -p ollama -m qwen3

# Use custom Ollama server
./elelem -p ollama -u http://remote-server:11434 -m mistral
```
### Interactive Commands

| Command | Description | Example |
|---------|-------------|---------|
| `/tools` | List available tools | `/tools` |
| `/history` | Show conversation history | `/history` |
| `/save` | Save current conversation | `/save` |
| `/load` | Load previous conversation | `/load conversations/chat_2024-01-01_10-30-00.txt` |
| `/model` | Switch AI model | `/model llama3.1` |
| `/provider` | Switch between providers | `/provider ollama` |
| `/clear` | Clear conversation history | `/clear` |
| `/exit` | Exit application | `/exit` |
### Available Tools

The AI can execute these tools to help you:
- **`grep`** - Search for patterns in files recursively
  - Required: `pattern`
  - Optional: `path`, `file_pattern`, `case_sensitive`
- **`analyze_code`** - Analyze code structure and metrics
  - Optional: `path` (default: current), `language` (default: `c`)
- **`read_file`** - Read and display file contents
  - Required: `filename`
- **`write_file`** - Create or modify files
  - Required: `filename`, `content`
- **`list_directory`** - Browse directory contents
  - Optional: `path` (default: current directory)
- **`shell`** - Execute shell commands (sandboxed for safety)
  - Required: `command`
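Tool calls arrive from the model as JSON function-call objects. As an illustration only (the exact wire format depends on the provider; the field names below follow the common OpenAI-style schema rather than Elelem's actual internals), a `grep` invocation might look like:

```json
{
  "name": "grep",
  "arguments": {
    "pattern": "TODO",
    "path": ".",
    "file_pattern": "*.c",
    "case_sensitive": true
  }
}
```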
## 💡 Example Interactions
### Code Analysis

```
> Analyze this codebase and tell me about its structure
```

The AI will use the `analyze_code` tool to examine your project, count lines of code, identify functions, and provide architectural insights.
### File Search

```
> Find all TODO comments in C files
```

The AI will use the `grep` tool with pattern `TODO` and file_pattern `*.c` to search recursively through your codebase.
### File Operations

```
> Create a new header file called "utils.h" with basic includes
```

The AI will use the `write_file` tool to create the header file with appropriate content.
### System Operations

```
> What's in the current directory and what's the git status?
```

The AI will use the `list_directory` and `shell` tools to show the directory contents and run `git status`.
## 🏗️ Architecture

### Core Components
- `main.c` - CLI interface and main event loop
- `llm_interface.[ch]` - Abstract client interface
- `deepseek_client.[ch]` - DeepSeek API implementation
- `ollama_client.[ch]` - Ollama local API implementation
- `tool_manager.[ch]` - Tool orchestration system
- `tool_definition.[ch]` - Tool schema definitions
- `builtin_tools.c` - Built-in tool implementations
- `my_http_client.[ch]` - HTTP client utilities
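The abstract client interface follows the usual C pattern of a struct of function pointers, so `main.c` never needs to know which backend is active. The sketch below is illustrative only; the names, signatures, and the `echo` stub are hypothetical, not the actual contents of `llm_interface.h`:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical vtable-style interface: each provider
 * (DeepSeek, Ollama, ...) fills in its own function pointers. */
typedef struct LlmClient LlmClient;
struct LlmClient {
    const char *name;
    /* Send a prompt, return a heap-allocated reply (caller frees). */
    char *(*send)(LlmClient *self, const char *prompt);
};

/* A stub "provider" used only to demonstrate the dispatch pattern. */
static char *echo_send(LlmClient *self, const char *prompt) {
    size_t n = strlen(self->name) + strlen(prompt) + 3;
    char *out = malloc(n);
    snprintf(out, n, "%s: %s", self->name, prompt);
    return out;
}

LlmClient *make_echo_client(void) {
    LlmClient *c = malloc(sizeof *c);
    c->name = "echo";
    c->send = echo_send;
    return c;
}
```

Callers then invoke `client->send(client, prompt)` regardless of backend, which is what makes `/provider` switching cheap.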
### Tool-Calling Flow

1. **User Input** - User asks the AI to perform a task
2. **AI Planning** - Model decides which tools to use
3. **Tool Calls** - AI generates JSON function calls
4. **Validation** - System validates tool calls and parameters
5. **Execution** - Tools run in a sandboxed environment
6. **Results** - Tool output is fed back to the AI
7. **Response** - AI provides the final answer with context
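The validation and execution steps amount to a name-to-handler lookup: unknown tools are rejected before anything runs. A minimal sketch in plain C (illustrative; the real `tool_manager` works with GLib types and JSON arguments):

```c
#include <stddef.h>
#include <string.h>

typedef const char *(*ToolHandler)(const char *arg);

typedef struct {
    const char *name;
    ToolHandler handler;
} ToolEntry;

/* Stub tool used only to demonstrate the lookup. */
static const char *echo_tool(const char *arg) { return arg; }

static ToolEntry tools[] = {
    { "echo", echo_tool },
};

/* Validate that the requested tool exists, then execute it. */
const char *dispatch_tool(const char *name, const char *arg) {
    for (size_t i = 0; i < sizeof tools / sizeof tools[0]; i++)
        if (strcmp(tools[i].name, name) == 0)
            return tools[i].handler(arg);
    return NULL; /* validation failure: unknown tool */
}
```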
## 🔧 Configuration

### System Prompt Customization

The system prompt is loaded from `prompts/system_prompt.txt`. You can customize it to:

- Add new tool descriptions
- Modify AI behavior
- Change response format preferences
### Conversation Storage

- Conversations are auto-saved to the `conversations/` directory
- Files are named with timestamps: `chat_YYYY-MM-DD_HH-MM-SS.txt`
- Use the `/load` command to resume previous conversations
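The timestamped naming scheme can be produced with standard `strftime`; this is a sketch under that assumption (the actual implementation may well use GLib's `GDateTime` instead):

```c
#include <stdio.h>
#include <time.h>

/* Format "conversations/chat_YYYY-MM-DD_HH-MM-SS.txt" into buf. */
void conversation_filename(char *buf, size_t len, time_t now) {
    struct tm *tm = localtime(&now);
    strftime(buf, len, "conversations/chat_%Y-%m-%d_%H-%M-%S.txt", tm);
}
```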
## 🛡️ Security Features

- **Command Filtering** - Dangerous shell commands are blocked
- **Path Validation** - File operations validate and sanitize paths
- **Output Limits** - Large outputs are truncated to prevent memory issues
- **Sandboxed Execution** - Tools run with limited permissions
- **Input Validation** - All tool parameters are validated before execution
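Command filtering boils down to checking the requested command against a blocklist before it ever reaches the shell. A minimal sketch (the entries shown are hypothetical examples; the real filter in `builtin_tools.c` is more thorough):

```c
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

/* Illustrative blocklist; the real one is larger. */
static const char *blocked[] = { "rm -rf", "mkfs", "dd if=", ":(){" };

/* Reject any command containing a blocked substring. */
bool command_allowed(const char *cmd) {
    for (size_t i = 0; i < sizeof blocked / sizeof blocked[0]; i++)
        if (strstr(cmd, blocked[i]) != NULL)
            return false;
    return true;
}
```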
## 🔨 Development

### Adding a New Tool

1. Implement the handler in `builtin_tools.c`:
```c
static ToolResult *
my_custom_tool_handler(JsonObject *arguments, gpointer user_data)
{
    // Your implementation here
    ToolResult *result = g_new0(ToolResult, 1);
    result->success = TRUE;
    result->content = g_strdup("Tool output");
    return result;
}
```
2. Register the tool in `tool_manager_register_builtin_tools()`:
```c
tool = tool_definition_new("my_custom_tool",
                           "Description of what it does");
tool_definition_add_string_param(tool, "param_name",
                                 "Parameter description", TRUE);
tool_manager_register_tool(manager, tool,
                           my_custom_tool_handler, NULL, NULL);
```
3. Update the system prompt in `prompts/system_prompt.txt` to describe the new tool.
### Building with Debug Info

```bash
make clean
CFLAGS="-g -O0 -DDEBUG" make
```
### Running Tests

```bash
# Test with different providers
./elelem -p deepseek
./elelem -p ollama -m llama3.1

# Test tool functionality
echo "List the files in this directory" | ./elelem -p ollama -m llama3.1
```
## 🐛 Troubleshooting

### Common Issues

**Build Errors:**

- Ensure all dependencies are installed
- Check that pkg-config can find the libraries: `pkg-config --cflags glib-2.0`
**DeepSeek API Issues:**

- Verify the API key is set: `echo $DEEPSEEK_API_KEY`
- Check network connectivity and API quotas
**Ollama Issues:**

- Ensure the Ollama server is running: `curl http://localhost:11434/api/version`
- Verify the model is available: `ollama list`
**Tool Execution Issues:**

- Check file permissions for file operations
- Verify shell commands aren't blocked by the security filters
### Debug Mode

Set an environment variable for verbose output:

```bash
G_MESSAGES_DEBUG=all ./elelem
```
## 📊 Features

### ✅ Implemented

- Multi-provider support (DeepSeek, Ollama)
- Real-time streaming responses
- Tool-calling with 6 built-in tools
- Conversation history with save/load
- Command-line interface with readline support
- Security sandboxing and validation
- File-based system prompt configuration
### 🚧 Planned

- Tool result caching for performance
- Async tool execution for parallel operations
- Plugin system for third-party tools
- Tool dependency management and chaining
## 📄 License

GNU Affero General Public License v3