
AFKmate Respects Your Privacy

Every time you use an AI tool to review your code, something happens that nobody puts in the headline: your code leaves your machine. It travels across the internet, lands on a server you've never seen, and gets processed under little more than a promise that your data is safe. That promise is worth examining.

The cloud LLM problem nobody wants to talk about

Here's how almost every AI code review tool works. You write code. The tool, whether it runs in your editor or on your CI pipeline, ships that code to a cloud API. OpenAI, Anthropic, Google, whoever the provider is. The model processes it, sends back an analysis, and the tool displays it. Clean, fast, convenient.
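To make the mechanics concrete, here is a minimal sketch of what a cloud-backed review request typically looks like. The endpoint, field names, and prompt are hypothetical stand-ins, not any specific provider's API; the point is simply that the request body is your source file, verbatim:

```python
import json

# Hypothetical endpoint -- a stand-in for any cloud provider's review API.
CLOUD_ENDPOINT = "https://api.example-llm-provider.com/v1/review"

def build_review_request(path: str) -> dict:
    """Assemble the HTTP request a typical cloud-backed tool would send."""
    with open(path, encoding="utf-8") as f:
        source = f.read()  # the entire file, not a summary
    return {
        "url": CLOUD_ENDPOINT,
        "headers": {"Authorization": "Bearer <api-key>"},
        "body": json.dumps({
            "prompt": "Review this code for bugs and style issues.",
            "code": source,  # your code, verbatim, leaving your machine
        }),
    }
```

Once that body is serialized and sent, everything in the file is on infrastructure you don't control.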

But that code sat on a server you don't control. And the question that most of these tools quietly paper over is: what happens to it there?

Many cloud LLM providers use customer inputs to improve their models. The default terms of service for popular APIs have changed more than once, and not always in the direction of more privacy. Opting out is sometimes possible. Sometimes it requires enterprise agreements. Sometimes it's buried six layers deep in a settings panel. And even when you opt out of explicit training pipelines, your data still passes through infrastructure that can be breached, subpoenaed, or simply mishandled.

The developers who build these tools aren't malicious. But they made a choice: build on cloud APIs because they're powerful, convenient, and cheap to integrate. Privacy became someone else's problem.

Your code is not just code

Think about what actually lives in a real codebase. Not a toy project, not a tutorial, but the code you write at work or for a business you're building.

There's authentication logic that describes exactly how your system validates users. There's a database schema that maps the shape of your data. There's API integration code with provider keys that someone, at some point, committed before the .gitignore was set up correctly. There's business logic that your company spent months refining and considers a genuine competitive advantage. There are internal service names, internal domain structures, internal decisions your team made that you'd rather not hand to a competitor.

When an AI code review tool sends your files to a cloud LLM, it sends all of that. Not a summary. Not a sanitized version. Your actual code, in whatever state it's in when the review runs.

Most developers know this abstractly. Most of them haven't thought through what it means in practice.

AI code reviewers make this worse

A general-purpose AI assistant only sees what you paste into it. You decide what to share. You can be deliberate about it: scrub credentials before pasting, anonymize business logic, keep sensitive parts to yourself.

An AI code reviewer doesn't work that way. That's the point of it. It analyzes your files automatically, without you having to think about what to include. That passivity is exactly what makes it useful, and exactly what makes cloud-based implementations so problematic. The moment it runs, everything in your open files is in scope.

A passive AI code reviewer that phones home to a cloud LLM is silently streaming your entire working context to external servers every time you step away. Not occasionally. Every idle period. Every review cycle. Every file you have open.

The more useful the tool, the more exposure you have.

How AFKmate handles this differently

AFKmate supports Ollama.

Ollama lets you run large language models entirely on your own hardware. No cloud. No API calls leaving your machine. The model runs locally, the analysis happens locally, and the results come back to your editor without a single packet touching an external server.

When you configure AFKmate to use Ollama, the entire review pipeline is self-contained. Your code goes to a local process running on your CPU or GPU and comes back as findings. It never leaves your device. There's no server to breach, no training pipeline to opt out of, no terms of service to read carefully. The data simply stays on your machine because it never had a reason to go anywhere else.
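For comparison, here is a sketch of the same request aimed at a local Ollama instance. The `/api/generate` path is Ollama's real endpoint; the model name and prompt are just examples, and this sketch only builds the request rather than sending it. Nothing about it resolves beyond localhost:

```python
import json
from urllib.parse import urlparse

# Ollama's default local endpoint -- no external host involved.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_local_review(source: str, model: str = "llama3") -> dict:
    """Assemble a review request for a locally running Ollama server."""
    payload = {
        "model": model,
        "prompt": f"Review this code for bugs:\n\n{source}",
        "stream": False,  # ask for a single response instead of a stream
    }
    # Sanity check: the request target is this machine, nothing else.
    host = urlparse(OLLAMA_URL).hostname
    assert host in ("localhost", "127.0.0.1"), "request must stay local"
    return {"url": OLLAMA_URL, "body": json.dumps(payload)}
```

The only transport involved is a loopback connection to a process you started yourself.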

For developers who want the most capable models, cloud providers are still available. AFKmate works with OpenAI, Anthropic, and others. The choice is yours. But the choice exists, and that's the part that matters.

Who this actually protects

This isn't a niche concern for the especially paranoid. It's a real issue for a wide range of developers:

  • Developers at companies with strict data handling policies. Many enterprises explicitly prohibit sending proprietary code to third-party cloud services. With cloud-only tools, you're either violating policy or not using the tool. With Ollama, there's no conflict.
  • Developers building in regulated industries. Healthcare, finance, legal. Industries where the data your code handles is sensitive by nature, and where the infrastructure that processes it gets scrutinized.
  • Solo developers with proprietary algorithms. If the thing you're building is the thing that gives you an edge, handing its internals to a cloud model is a real risk, even if a low-probability one.
  • Developers who just want to own their own stack. Not every privacy concern needs a threat model. Some people want control over their tools because control matters to them. That's a legitimate position.

Privacy shouldn't be an upsell

A lot of developer tools treat privacy as a premium feature. Want your data to stay on your machine? That'll be the enterprise tier. Want to opt out of training? Fill out a form and wait for a sales call.

That framing gets it backwards. Privacy isn't an add-on. It's the baseline you should be able to expect from a tool that sits inside your editor and reads everything you write.

AFKmate's Ollama support isn't a checkbox in a pricing table. It's a first-class configuration option available to every user. You install Ollama, pull a model, point AFKmate at it, and that's it. From that point on, your code stays yours. Completely.
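As a quick sanity check after setup, you can ask the local server which models you've pulled. `/api/tags` is Ollama's real endpoint for listing local models; the sketch below is an independent check, not how AFKmate itself discovers models, and it degrades gracefully when the server isn't running:

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

def list_local_models(base_url: str = "http://localhost:11434"):
    """Return (True, [model names]) if Ollama is up, else (False, reason)."""
    try:
        with urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return True, [m["name"] for m in data.get("models", [])]
    except (URLError, OSError) as exc:
        return False, f"Ollama not reachable: {exc}"

ok, result = list_local_models()
print(result)  # either your pulled models or a local-only error message
```

If the check succeeds, you have everything the review pipeline needs, and all of it lives on your machine.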

Your code is one of the most sensitive things on your machine. Treat it that way. Install AFKmate, configure Ollama, and let the reviews happen where they belong: on your hardware, under your control.

Ready to set it up? Learn how to enable Ollama support in AFKmate in the docs.