Aider v0.74.0: Ollama Context & Model Support
Aider v0.74.0 brings dynamic Ollama context windows, better support for o3-mini and o1 models, Docker improvements, and multiple bugfixes. Aider wrote 77% of this release.
TL;DR
- Ollama context window now dynamically adjusts to fit your chat history
- Better support for o3-mini, o1, DeepSeek V3/R1, and secondary API providers
- Multiple bugfixes: filename generation, timeout handling, .gitignore parsing, and watch limits
New
- Dynamic Ollama context window — automatically resizes to hold your current chat instead of fixed limits
- Enhanced model support — improved compatibility with o3-mini, o1-mini, o1, DeepSeek V3 & R1 via secondary API providers
- Temperature control in config — specify `use_temperature: <float>` in model settings for fine-grained control
- Bedrock support in Docker — full container now includes `boto3` for AWS Bedrock integration
- Docker HOME persistence — containers set `HOME=/app` to persist `~/.aider` config across runs
- R1 response cleanup — removes `<think>` tags from R1 responses in commit messages and weak model contexts
- Improved .gitignore handling — honors existing ignores regardless of configuration, and checks `.env` only when the file is present
- Watch file optimization — fully ignores top-level directories listed in ignore files, preventing OS watch limit errors on large subtrees like `node_modules`
- Faster startup — improved performance with more providers and when model metadata is cached locally
- Yes/No prompt aliases — All/Skip now work as Y/N aliases even outside group confirmations
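The dynamic Ollama context window works with aider's existing Ollama configuration; nothing new needs to be set. A minimal sketch of such a setup (the model name and server address here are illustrative, not from the release notes):

```yaml
# .aider.conf.yml — illustrative Ollama setup
# Point the OLLAMA_API_BASE environment variable at your server,
# e.g. http://127.0.0.1:11434, then select an Ollama model:
model: ollama/llama3:70b
```

With v0.74.0, aider sizes the context window to fit the current chat history rather than using a fixed limit, so large conversations are no longer silently truncated by a too-small default.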
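The new `use_temperature` setting lives alongside aider's other per-model overrides. A hedged sketch of what such an entry might look like in the model settings file (the model name is an example, not from the release notes):

```yaml
# .aider.model.settings.yml — per-model overrides
- name: openai/o3-mini
  use_temperature: 0.7   # pass this explicit temperature on every request
```

Settings in this file apply only to the named model, so different models can run with different temperatures in the same session.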
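The R1 response cleanup removes the model's visible reasoning before the text is reused. A minimal sketch of the general technique (this is an illustration, not aider's actual implementation; the function name is hypothetical):

```python
import re

# Strip the <think>...</think> reasoning block that R1-style models
# prepend to their answers, so the remainder can be used directly,
# e.g. as a commit message.
THINK_RE = re.compile(r"<think>.*?</think>\s*", re.DOTALL)

def strip_think_tags(text: str) -> str:
    """Remove any <think>...</think> sections and trim whitespace."""
    return THINK_RE.sub("", text).strip()

raw = "<think>\nThe diff renames a helper.\n</think>\nRename parse_args to parse_cli"
print(strip_think_tags(raw))  # prints: Rename parse_args to parse_cli
```

The non-greedy match with `re.DOTALL` handles multi-line reasoning blocks, and a global substitution covers responses that emit more than one `<think>` section.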
Fixed
- Incorrect filename generation for reserved names like `python`, `php`
- `--timeout` flag handling
- `/model` command now correctly reports when the weak model is unchanged
- Multi-line mode now persists through ^C at confirmation prompts
Update: `pip install --upgrade aider-chat`
Source: Aider