For parents & grandparents

Your brain. And theirs.

AI isn't going to take your kid's future. Using it badly might. This is the quieter worry nobody's screaming about — and it matters more than most of the loud ones.

What actually happens to brains

Researchers have been studying this for decades. When we offload a skill to a tool, we get weaker at that skill. It's not new and it's not a moral failing — it's how brains work.

None of this means AI is bad. It means the how matters more than the whether.

For you, as an adult

Delegate the tedium. Keep the judgment.

Let AI draft the email. Write the hard sentence yourself. Let it summarize the article. Form your own opinion on what it means. Let it do the first pass on your spreadsheet. Verify the numbers that matter.

The skills you want to keep are the ones you use to decide, not the ones you use to produce. Think of AI as a junior analyst who's fast, confident, often wrong, and never gets offended when you push back.

The rule for adults:

You can't delegate the part that makes you good at what you do. If you're a writer, write. If you analyze, analyze. Let AI do the surrounding work that drains your attention — formatting, drafting, summarizing, searching. Keep the thing you're actually trying to be good at.

For kids — why this matters more

Kids are still building the muscle. If they don't do the hard thinking — because AI will — the muscle never grows. That's not doom, that's just biology. The things we want adults to have as strengths (reading comprehension, writing clearly, solving problems, resilience when stuck) are built by struggling through them as a kid.

That doesn't mean "keep kids away from AI." That's unrealistic and unfair. They'll use it the way we used search engines, and the kids who figure out how to use it well will have an edge.

It means the homework question matters — and it has a simple test.

The rule for kids:

Ask one question before they use AI on schoolwork:

"Did you try first?"

If yes — AI becomes a private tutor that helps them understand. That's powerful.
If no — they're skipping the learning. That's the part that should concern you.

How to set kids up for the good version

Three practical conversations that work better than a blanket rule:

  1. Agree on "try first, then AI."
    "Work on it for 20 minutes yourself. If you're stuck, ask AI to explain the concept — not to give the answer. Then try again." The point is: struggle is how the brain builds the thing. AI arrives after the struggle, to help make sense of what they learned.
  2. Teach them the prompt that matters.
    Instead of "solve this problem," teach them to say: "Explain the concept behind this problem, and ask me one question that will help me figure out the next step." That prompt turns AI from a homework-doer into a tutor. (See the example on Try Tonight.)
  3. Show them AI being wrong.
    Kids trust AI way too easily — because it's confident and fast. You only need to do this once. Sit with them and ask AI something it will get wrong (there are lots of reliable failure modes — see the FAQ). Let them watch it be confidently wrong. They won't trust it the same way again, and that's a gift.

What not to worry about

A few things parents worry about that probably aren't the real issue:

The next conversation

Tonight, ask your kid one question: "When you've used ChatGPT, was it for help understanding, or to skip the work?" Listen without judgment. The answer tells you where the conversation needs to go next.
