
Talking to the Machine

I've been programming with an AI for about a year now. Not using it as a tool — I mean programming with it. There's a difference, and it took me months to understand what that difference is.

A tool does what you tell it. A partner changes how you think.

The First Phase: Autocomplete on Steroids

In the beginning, I used AI the way most people do — as a faster Stack Overflow. "How do I parse a date in Python?" "What's the regex for email validation?" Quick questions, quick answers.

# The kind of question I used to ask
from datetime import datetime

def parse_date(s: str) -> datetime:
    formats = [
        "%Y-%m-%d",
        "%m/%d/%Y",
        "%B %d, %Y",
        "%d %b %Y",
    ]
    for fmt in formats:
        try:
            return datetime.strptime(s, fmt)
        except ValueError:
            continue
    raise ValueError(f"Cannot parse date: {s}")

This is fine. It's useful. It's also boring. You're treating a thinking machine like a lookup table.

The Second Phase: Rubber Duck with Opinions

Then something shifted. I started thinking out loud with it. Not asking questions but describing problems. "I have this data pipeline and something feels wrong about the architecture. The ingestion layer knows too much about the transformation step." And it would respond — not with code, but with ideas.

"It sounds like your ingestion layer has become a God object. What if you introduced an intermediate representation — a canonical event format that ingestion produces and transformation consumes? That way neither layer needs to know about the other's internals."

That's not autocomplete. That's a design conversation. The kind you have with a senior engineer over coffee.
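To make the intermediate-representation idea concrete, here is a minimal sketch in Python. It's my own illustration, not code from the conversation; the names (`CanonicalEvent`, `ingest`, `transform`) and fields are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

# A hypothetical canonical event: the only contract shared by the
# ingestion and transformation layers. Neither sees the other's internals.
@dataclass(frozen=True)
class CanonicalEvent:
    source: str                # where the record came from
    kind: str                  # what happened, in the pipeline's vocabulary
    occurred_at: datetime      # normalized to UTC at ingestion time
    payload: dict[str, Any] = field(default_factory=dict)

def ingest(raw: dict[str, Any]) -> CanonicalEvent:
    """Ingestion's whole job: raw input in, canonical event out."""
    return CanonicalEvent(
        source=raw["source"],
        kind=raw["type"],
        occurred_at=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
        payload=raw.get("data", {}),
    )

def transform(event: CanonicalEvent) -> dict[str, Any]:
    """Transformation consumes only the canonical shape."""
    return {"key": f"{event.source}:{event.kind}", **event.payload}
```

The point of the sketch is the boundary, not the fields: ingestion can change how it reads sources, transformation can change what it produces, and neither change ripples across the line.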

The strange part: it's often easier to have this conversation with an AI than with a person. There's no ego. No politics. No "well actually." You can say "I don't understand" without feeling stupid. You can say "that's a bad idea" without hurting feelings. You can change your mind mid-sentence.

The Third Phase: Thinking Differently

Here's where it gets weird. After months of working this way, I noticed my solo thinking had changed. I was more explicit about my reasoning. I would catch myself forming an argument and immediately steelmanning the counterargument — because that's what the AI does. It presents alternatives without attachment.

I started asking myself questions I wouldn't have asked before:

  • "What's the simplest version of this that could work?"
  • "Am I solving the right problem?"
  • "What would I tell someone who proposed this solution?"

These aren't AI questions. They're engineering wisdom. But the practice of explaining my thinking to a machine made them habitual.

The Hallucination Problem

Of course, AI gets things wrong. Confidently, fluently, convincingly wrong.

// AI suggested this for detecting dark mode preference
const isDarkMode = window.matchMedia('(prefers-color-scheme: dark)').matches;

// Also suggested this, which doesn't exist
const isDarkMode = navigator.colorScheme === 'dark';

The second suggestion looks reasonable. It sounds like a real API. It isn't. This is the danger — not that the AI gives you bad code, but that it gives you bad code that reads like good code.

This changed how I read code. All code, not just AI-generated code. I became more skeptical, more likely to verify, more aware that fluency isn't accuracy. That's actually a useful skill.
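The habit is cheap to practice. Before trusting a plausible-looking call, check that it exists at all. A Python version of the same trap, with one real method and one invented-but-plausible one (`fromformat` is my made-up example):

```python
from datetime import datetime

# Real API: parses a string against a format.
print(hasattr(datetime, "strptime"))    # True

# Reads like a real API, but datetime has no such method.
print(hasattr(datetime, "fromformat"))  # False
```

Two seconds in a REPL, and fluency gets checked against reality.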

What It Can't Do

It can't care about your project. It doesn't know that this particular feature matters because a user emailed you last Tuesday and their frustration was palpable. It doesn't carry the weight of decisions made six months ago. It doesn't feel the satisfaction of a clean deployment or the dread of an on-call page.

It can help you write the code. It can even help you think about the code. But it can't help you care about the code. That's still yours.

The Real Change

The biggest change isn't productivity. It's loneliness. Programming used to be solitary in a way that was sometimes meditative and sometimes isolating. Now there's always someone to talk to. Someone patient, available, and endlessly willing to think about your problem.

It's not a person. I don't confuse it for one. But it fills a gap that I didn't realize was there — the gap between having a question and having someone to ask.

Whether that's beautiful or sad probably says more about you than about the technology.