Stop Making Students 'Chat' With AI: Why Educators Need Specialized Interfaces
Does your child use ChatGPT for homework?
If yes, you've probably witnessed that cringe-worthy scene: a child asks AI a math question, and the AI responds with five paragraphs—definitions, examples, and the actual answer buried somewhere in the third paragraph. The child ends up more confused than before.
This isn't because AI isn't smart enough. It's because the interface is wrong.
Recently, Wharton professor Ethan Mollick published a thought-provoking article arguing that AI capabilities far exceed what most people realize, but poor user interfaces are wasting that potential. He uses a vivid metaphor: we're holding a Swiss Army knife but only know how to use the dullest blade.
Why Chatbots Aren't Universal Solutions
Mollick cites a study involving financial professionals using GPT-4o for complex valuation tasks. The results showed that while AI did boost productivity, the cognitive burden imposed by chat interfaces nearly offset those gains.
What's the problem?
- Information overload: You ask a specific question, AI responds with five paragraphs, answer buried in the middle
- Topic drift: AI "helpfully" suggests three new directions you didn't ask for, disrupting your flow
- Conversation chaos: Once a conversation gets messy, AI mirrors your confusion, creating a downward spiral
Who suffers most? Beginners: the very people who could benefit most from AI assistance.
This maps directly onto education. When a student uses AI to learn math, hunting for the answer in a lengthy response while fending off unsolicited suggestions destroys learning efficiency.
The Rise of Specialized Interfaces
Mollick highlights several Google experiments:
Stitch: An AI interface for designers. Describe an app in natural language, get back multi-screen interactive prototypes—using design language, not prompting.
Pomelli: For marketers. Paste a website URL, automatically generate brand-consistent social media campaigns.
NotebookLM: For researchers. Integrate diverse information sources, present findings in structured formats.
The common thread? Each tool redesigns interaction for specific tasks.
What does this mean for education?
- Math learning AI → Should function like a collaborative whiteboard, guiding step-by-step, not chatting
- Writing tutor AI → Should behave like editorial comments, highlighting issues rather than rewriting everything
- Language practice AI → Should act like a conversation partner with clear roles and scenarios
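To make the "collaborative whiteboard" idea concrete, here is a minimal sketch of what a step-by-step math-tutor interface might look like, in contrast to free-form chat. Everything here is hypothetical illustration: the `TutorStep` and `WhiteboardSession` names, the example equation, and the stubbed logic are all assumptions, not any real product's API.

```python
from dataclasses import dataclass

@dataclass
class TutorStep:
    prompt: str      # what the tutor shows the student
    expected: str    # the answer required to advance

@dataclass
class WhiteboardSession:
    """A structured tutor interaction: one step at a time,
    no free-form chat, no topic drift."""
    steps: list
    index: int = 0

    def current_prompt(self) -> str:
        return self.steps[self.index].prompt

    def submit(self, answer: str) -> str:
        # The student can only advance by answering the current step,
        # so the "answer buried in five paragraphs" problem disappears.
        step = self.steps[self.index]
        if answer.strip() == step.expected:
            self.index += 1
            if self.index == len(self.steps):
                return "Done! You solved it step by step."
            return self.steps[self.index].prompt
        return "Not quite. Try this step again: " + step.prompt

# Usage: guiding a student through 2x + 3 = 11
session = WhiteboardSession(steps=[
    TutorStep("Subtract 3 from both sides. What is 2x equal to?", "8"),
    TutorStep("Divide both sides by 2. What is x?", "4"),
])
print(session.current_prompt())
print(session.submit("8"))   # advances to the next step
print(session.submit("4"))   # completes the session
```

The design choice worth noticing: the interface, not the model, enforces pacing and scope. A real system would generate the steps with an AI model, but the interaction contract stays the same.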
Khan Academy's recent test prep resources embody this approach: instead of letting students "ask AI how to prepare," they provide structured skill checklists and practice pathways.
Recommendations for Educators
1. Look at Interfaces, Not Just Models
Don't just ask "Is this GPT-4?" The same large model performs vastly differently in a chat box versus a specialized interface.
2. Beware the "Universal AI" Trap
If an AI claims to do everything, it probably does everything mediocrely. Educational contexts need specialized tools.
3. Monitor Cognitive Load
Good educational AI should reduce student cognitive burden, not increase it.
4. Cultivate "Interface Awareness"
Teach children: different tasks require different AI tools. Just as you wouldn't use Word for Excel tasks, you shouldn't use chat AI for everything.
Conclusion
The specialization of AI interfaces has only just begun. Mollick puts it bluntly: AI capabilities are already strong; what's limiting us isn't the technology, but how we interact with it.
For educators, this signals a shift: from "teaching kids to use AI" to "teaching kids to choose the right AI tool for the job."
After all, future competitiveness doesn't lie in whether you can chat with AI, but in whether you can find the most suitable AI interface for your current task—and make it work for you.
Source: One Useful Thing - "Claude Dispatch and the Power of Interfaces"