OllamaCode
An intelligent CLI companion for developers
An AI coding assistant that executes your requests instead of just generating code blocks
Why OllamaCode?
The problem with other AI assistants: they generate code blocks and leave the copying, pasting, and running to you.
The OllamaCode solution: it executes your requests directly, creating files, running commands, and managing git on your behalf.

⨠Key Features
šÆ Direct Tool Execution
LLM calls tools directly instead of generating code blocks. Create files, run commands, manage git - all automatically.
š Smart File Operations
Intelligent code generation and execution from natural language with built-in permission system for secure operations.
š§ Git Workflow Integration
Complete version control operations with AI assistance, from status checks to intelligent commit messages.
šØ Syntax Highlighting
Beautiful code display with auto-language detection supporting 14+ programming languages.
ā” Caching System
Fast responses with intelligent caching and enhanced error messages with actionable suggestions.
š Network Endpoints
Connect to remote Ollama servers for powerful models with persistent configuration management.
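Under the hood, Direct Tool Execution relies on Ollama's tool-calling interface. The sketch below is illustrative only, not OllamaCode's actual source: it assumes a recent ollama-python client (0.4+), and the write_file tool is a hypothetical example.

# Illustrative sketch of the tool-calling pattern (not OllamaCode's source)
import ollama

def write_file(path: str, content: str) -> str:
    """Write text content to a file on disk (hypothetical example tool)."""
    with open(path, "w") as f:
        f.write(content)
    return f"wrote {len(content)} characters to {path}"

response = ollama.chat(
    model="qwen3:latest",
    messages=[{"role": "user", "content": "create hello.txt containing 'hi'"}],
    tools=[write_file],  # the client derives a JSON schema from the function signature
)

# Instead of emitting a code block for you to copy, the model returns
# structured tool calls that the assistant executes directly.
for call in response.message.tool_calls or []:
    if call.function.name == "write_file":
        print(write_file(**call.function.arguments))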
Quick Start
Requirements
- Python 3.9+
- Ollama with a tool-calling model
- Recommended model: qwen3:latest
# Install from source
git clone https://github.com/tooyipjee/ollamacode.git
cd ollamacode
pip install -e .
# Make sure Ollama is running
ollama serve
# Pull a compatible model with tool calling support
ollama pull qwen3:latest
# Start coding!
ollamacode
Usage Examples
Interactive Mode
$ ollamacode
OllamaCode - Your AI Coding Companion
You: write a file that generates sine wave data and saves as CSV
OllamaCode creates: sine_wave_generator.py
Created a functional Python script with:
- numpy for sine wave generation
- pandas for CSV export
- configurable frequency and amplitude
- 5 seconds of 440 Hz sine wave data
You: now run it
Running python sine_wave_generator.py...
Saved 5000 data points to sine_wave.csv
You: Perfect!
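The generated script varies from run to run, but based on the summary above it would look roughly like this (a sketch, not verbatim OllamaCode output; the 1 kHz sample rate is an assumption chosen to match the 5000 data points):

# sine_wave_generator.py - sketch of the kind of script described above
import numpy as np
import pandas as pd

# Configurable parameters
frequency = 440.0     # Hz
amplitude = 1.0
duration = 5.0        # seconds
sample_rate = 1000    # samples per second -> 5000 data points total

t = np.arange(0, duration, 1 / sample_rate)
y = amplitude * np.sin(2 * np.pi * frequency * t)

pd.DataFrame({"time": t, "amplitude": y}).to_csv("sine_wave.csv", index=False)
print(f"Saved {len(t)} data points to sine_wave.csv")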
Command Examples
AI-Powered File Creation:
"create a Python REST API with FastAPI that handles user authentication"
Git Operations:
"help me write a commit message for these changes"
Code Analysis:
"explain @main.py and suggest improvements"
Direct Execution:
"optimize this function and run the tests"
Remote Endpoints
# Use a remote Ollama server
ollamacode --endpoint http://192.168.1.100:11434 "explain this algorithm"
# Set as default endpoint
ollamacode --set-endpoint http://gpu-server:11434
ollamacode --set-model llama3.1:70b
# Now all sessions use the remote server
ollamacode "help me optimize this code"
Advanced Features
Smart Auto-Completion
Intelligent suggestions for slash commands, file references, and common operations as you type.
Permission System
Safe file operations with granular control. Approve operations once, per session, or deny as needed.
Session Management
Auto-save and resume coding sessions. Never lose your conversation context or progress.
Enhanced Error Messages
Contextual error handling with actionable suggestions and helpful guidance.
Project Context
Automatically understands your project structure and provides relevant suggestions.
Slash Commands
Streamline workflow with built-in commands: /help, /model, /status, /clear, and more.
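In interactive mode, slash commands are typed at the same prompt as any other request; output is omitted here because it depends on your version:

You: /help
You: /model
You: /status
You: /clear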
Built With
- Python 3.9+
- Ollama API
- Rich (Terminal UI)
- Ripgrep
- Git Integration
- File System Tools
Perfect For
- Rapid prototyping
- Code exploration
- Learning new languages
- Automating workflows
- Debugging assistance
- Documentation
Why Choose OllamaCode?
Unlike other AI assistants, OllamaCode actually executes your requests:
- Creates and runs files automatically
- Manages git operations intelligently
- Provides real-time feedback
- Works completely offline with Ollama
Ready to supercharge your coding workflow?
Happy coding with AI!