Generate Code Reviews for Your PRs Using Ollama
We are thrilled to announce that ThinkReview now supports local LLMs via Ollama! Generate AI-powered code reviews for your GitLab and Azure DevOps pull requests completely offline and privately.
ThinkReview is Now Open Source & Supports Local AI 🚀
ThinkReview was built to make code reviews smarter and faster. By opening up our source code, we are inviting developers to inspect, contribute, and help shape the future of AI-assisted code reviews.
Whether you want to audit the security, add a new feature, or simply see how it works under the hood, ThinkReview is now yours to explore.
Generate Code Reviews with Ollama (Local LLMs) 🦙
Until version 1.3.10, ThinkReview relied exclusively on cloud-based models. Starting with v1.4.0, you can now connect ThinkReview directly to Ollama running on your local machine.
This means you can generate code reviews using your favorite local models, including:
- Llama 3
- Qwen Coder
- Codestral
- Deepseek
- Gemma
- ...and any other model supported by Ollama!
Why Generate Code Reviews Locally?
Switching to local AI for generating code reviews offers several critical advantages for teams and individual developers:
🔒 Absolute Privacy
Your code never leaves your machine or your local network, making it perfect for companies with strict security or compliance requirements.
💸 100% Free
No API keys, no subscription fees, and no token limits. You run the model on your hardware.
🏢 Enterprise Ready
Ideal for users on self-hosted GitLab instances or internal networks where cloud access is restricted.
How to Generate Code Reviews with Ollama
Setting up local code reviews is a simple, one-time process.
Install/Update
Ensure you have the latest version of the ThinkReview extension (v1.4.0+).
Configure Ollama
You need to allow the browser extension to access your local Ollama instance.
- Stop any running Ollama processes.
- Restart Ollama with the following environment variable to allow cross-origin requests:
OLLAMA_ORIGINS="*"
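For example, on macOS or Linux you can restart Ollama from a terminal like this (the exact commands vary by platform; on Windows, set the variable under System Environment Variables and restart the Ollama app):

```shell
# Allow cross-origin requests from the browser extension.
# "*" permits all origins; you can restrict this to your browser's
# extension origin for a tighter policy.
export OLLAMA_ORIGINS="*"

# Start the Ollama server with the new setting applied.
ollama serve
```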

Connect and Generate Reviews
- Go to ThinkReview Settings.
- Click Test Connection. You should see your downloaded models appear instantly.
- Select your preferred model.
That's it! You can now generate AI-powered code reviews directly inside your GitLab or Azure DevOps PR pages—completely offline and private.
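If Test Connection does not show your models, you can verify that Ollama is reachable from the command line. Assuming the default port 11434, this lists the models installed locally:

```shell
# Query the local Ollama API for installed models.
curl http://localhost:11434/api/tags
```

An empty `models` array in the response means Ollama is running but no models have been pulled yet (e.g. run `ollama pull llama3` first).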

Start Generating Code Reviews Today
ThinkReview works seamlessly on Chrome, Edge, and any Chromium-based browser.
- Install from Chrome Web Store: Get ThinkReview
- Contribute on GitHub: View Repository
Thank you for your support, and happy coding!
What's Next?
Since our initial launch, we've received fantastic feedback from the developer community. You asked for two things consistently: more transparency through open source and the ability to run models locally for better privacy and control.
We heard you, and today, we are delivering on both.
Community Feedback Drives Innovation
The requests we received show that developers care deeply about:
- Transparency - Understanding how tools work
- Privacy - Keeping code secure and private
- Control - Running models on their own infrastructure
- Flexibility - Choosing the AI models that work best for them
We're excited to see what the community will build, improve, and customize with this open source release.
Resources
- GitHub Repository: github.com/Thinkode/thinkreview-browser-extension
- Chrome Web Store: Install ThinkReview
- Website: thinkreview.dev
- Contact: thinkreview.dev/contact
Built with ❤️ by the Thinkode team. Now powered by local AI and the open source community.