AI isn't going to take your kid's future. Using it badly might. This is the quieter worry nobody's screaming about — and it matters more than most of the loud ones.
What actually happens to brains
Researchers have been studying this for decades. When we offload a skill to a tool, we get weaker at that skill. It's not new and it's not a moral failing — it's how brains work.
London cab drivers used to have visibly larger hippocampi — the part of the brain that handles spatial memory — from memorizing 25,000 streets. Now that everyone uses GPS, that advantage is fading. (Maguire et al., PNAS 2000, University College London.)
Before smartphones, most of us knew a dozen phone numbers by heart. Now, maybe two. We remember where information is stored, not the information itself. (Sparrow, Liu & Wegner, Science, 2011 — "Google Effects on Memory.")
AI accelerates this. A June 2025 MIT Media Lab EEG study ("Your Brain on ChatGPT") found that people using ChatGPT to write essays showed weaker brain connectivity than those using search engines or no tools at all. They got a good essay. They got less from writing it. (Note: the paper was a preprint at time of release, with a sample size of 54.)
None of this means AI is bad. It means the how matters more than the whether.
For you, as an adult
Delegate the tedium. Keep the judgment.
Let AI draft the email. Write the hard sentence yourself. Let it summarize the article. Form your own opinion on what it means. Let it do the first pass on your spreadsheet. Verify the numbers that matter.
The skills you want to keep are the ones you use to decide, not the ones you use to produce. Think of AI as a junior analyst who's fast, confident, often wrong, and never gets offended when you push back.
The rule for adults:
You can't delegate the part that makes you good at what you do. If you're a writer, write. If you analyze, analyze. Let AI do the surrounding work that drains your attention — formatting, drafting, summarizing, searching. Keep the thing you're actually trying to be good at.
For kids — why this matters more
Kids are still building the muscle. If they don't do the hard thinking — because AI will — the muscle never grows. That's not doom, that's just biology. The things we want adults to have as strengths (reading comprehension, writing clearly, solving problems, resilience when stuck) are built by struggling through them as a kid.
That doesn't mean "keep kids away from AI." That's unrealistic and unfair. They'll use it the way we used search engines, and the kids who figure out how to use it well will have an edge.
It means the homework question matters — and it has a simple test.
The rule for kids:
Ask one question before they use AI on schoolwork:
"Did you try first?"
If yes — AI becomes a private tutor that helps them understand. That's powerful. If no — they're skipping the learning. That's the part that should concern you.
How to set kids up for the good version
Three practical conversations that work better than a blanket rule:
Agree on "try first, then AI." "Work on it for 20 minutes yourself. If you're stuck, ask AI to explain the concept — not to give the answer. Then try again." The point is: struggle is how the brain builds the skill. AI arrives after the struggle, to help make sense of what was learned.
Teach them the prompt that matters. Instead of "solve this problem," teach them to say: "Explain the concept behind this problem, and ask me one question that will help me figure out the next step." That prompt turns AI from a homework-doer into a tutor. (See the example on Try Tonight.)
Show them AI being wrong. Kids trust AI far too easily, because it's confident and fast. So do this once: sit with them and ask AI something it will get wrong (there are lots of reliable failure modes — see the FAQ). Let them watch it be confidently wrong. They won't trust it the same way again, and that's a gift.
What not to worry about
A few things parents worry about that probably aren't the real issue:
"They'll never learn to write." They'll learn to write differently. Like every generation before. The question is whether they learn to think — and that requires struggle, whether the writing happens with pen, typewriter, or AI.
"AI will turn them into robots." It won't. They're still going to be themselves. But if they don't develop independent thinking, they'll have trouble distinguishing their ideas from AI's ideas. That's the actual risk — one worth talking about openly.
"They're smarter than me at this, I can't guide them." You don't need to know AI better than they do. You need to know judgment better than they do. That's what you actually teach anyway.
The next conversation
Tonight, ask your kid one question: "When you've used ChatGPT, was it for help understanding, or to skip the work?" Listen without judgment. The answer tells you where the conversation needs to go next.