Does ChatGPT make students smarter—or just better at faking it?

Image Credits: Unsplash

You can tell when a sentence has no soul. It’s grammatically perfect. Structurally clean. But when you read it, it just… floats. No tension. No voice. No fingerprint. You can’t hear the writer thinking. And that’s how many university professors are now spotting AI-written essays in their inboxes. Not by the errors. But by the eerie smoothness.

It’s happening across campuses, quietly and often awkwardly. Students turn in essays written by large language models like ChatGPT. Teachers mark them—sometimes knowingly, sometimes not. But a deeper question is emerging from all this: If writing is how we think, reflect, and understand our place in the world… what happens when we stop doing it ourselves?

That question has started to haunt classrooms, especially in subjects like ethics, literature, sociology—the kinds of disciplines where personal experience and moral wrestling used to matter. Because when a machine writes your thoughts, what are you learning—really?

Dr. Jocelyn Leitzinger, who teaches business and society at the University of Illinois Chicago, gave her students an assignment to reflect on moments when they had witnessed discrimination. She expected stories that were raw, possibly imperfect, maybe even awkward. What she got instead was story after story featuring a woman named Sally.

No, not the same story. But the same name. Again and again. It didn’t take long to figure out why. ChatGPT, when prompted to “write a story about discrimination,” seemed to favor a default woman named Sally. She had become a placeholder. A fiction. A stand-in for a real experience that never came.

Leitzinger said the bigger problem wasn’t that students were using AI tools. It was that they weren’t even pretending to have their own ideas anymore. “They weren’t coming up with their own stories,” she told reporters. “They weren’t even trying.” The name Sally said more than it seemed to. It was a signal: the students had skipped the process of reflection altogether. They had offloaded their thinking—and their empathy.

Recent findings from MIT researchers shed light on what’s really happening when people write using ChatGPT. In a controlled study, students were divided into three groups. One wrote essays with ChatGPT. One used search engines. One used only their own brains. Each group had 20 minutes per essay, and each wore EEG monitors to track brain activity during the task.

The results? The ChatGPT group consistently performed worse than the others. Their essays were scored lower by teachers. More disturbingly, 80% of the ChatGPT users couldn’t recall anything from the essays they submitted—because they hadn’t actually constructed the ideas themselves.

The EEG scans told the same story. The ChatGPT group showed less neural connectivity—meaning their brains weren’t working as hard to synthesize or engage ideas. By the third session, researchers noticed something else. The ChatGPT group wasn’t even reading what was generated anymore. They were just copying and pasting. Efficient? Maybe. Educational? Not even close.

Writing is slow. Thinking hurts. Anyone who has ever stared at a blinking cursor for 20 minutes knows that generating an original sentence can feel like trying to push a thought uphill.

But that’s the point. Thinking is meant to be hard. It’s a cognitive workout. And writing is how we make the chaos in our minds cohere into something legible. Something testable. Something personal.

When you skip that process—when you let AI generate the argument, structure the paragraphs, fill in the evidence—you’re not just cheating the system. You’re skipping the chance to organize your own beliefs. That’s what Leitzinger noticed most about her students’ work post-ChatGPT. It wasn’t just the grammar that had changed—it was the absence of insight. Essays were clean but hollow. The misspellings were gone. But so were the original ideas.

Some educators are tempted to blame laziness or entitlement. But the real problem isn’t that students don’t want to think. It’s that they don’t know when or how they’re supposed to. At many universities, AI use is banned in one class, encouraged in another, and quietly tolerated in others. There’s no consistent rulebook. No shared ethic. Just a haze of anxiety.

So students do what students have always done: they adapt to survive. They paste prompts into ChatGPT. They edit the outputs a little. They change the font. Or don’t. And they hope that what they’re learning—how to shortcut the process—is enough to get them by.

In one sense, they’re responding to a system that hasn’t caught up with itself. We introduced generative AI to the world, but we didn’t rebuild the rituals that once made learning real.

Some people argue this shift is like when calculators first entered math classrooms. At first, there was resistance. Then, there was redesign. Educators began teaching the concept before allowing the tool.

But calculators don’t give you the answer unless you feed them the logic. ChatGPT, by contrast, offers an entire essay from a single sentence. That’s not support—it’s substitution. The danger isn’t that students will get better grades without learning. It’s that they won’t even notice they’re not learning.

They’ll assume they understand something because the answer reads well. But understanding isn’t about what you read. It’s about what you reconstruct in your own mind. And that’s what we lose when we let machines speak for us.

Here’s where things get interesting. In the same MIT study, researchers ran a fourth session. The group that had previously written unaided was now given access to ChatGPT. And this time, something surprising happened: their brain connectivity went up.

Why? Because they’d already done the mental work first. They weren’t relying on AI to think—they were using it to polish. To iterate. To reflect. This may be the path forward: treat AI not as an idea factory, but as a feedback loop. Educators could shift grading away from final products and toward the process: brainstorming notes, revision logs, annotated drafts.

Students could be taught not just to write—but to reflect on how they write. To identify their own thought patterns. To decide when AI is a tool—and when it’s a crutch.

Here’s what no one wants to say out loud. When students use AI to write personal essays, especially on topics like ethics, identity, or belief, they’re not just skipping a task. They’re deferring the act of becoming.

Because writing, at its core, is not just performance. It’s identity formation. It’s the moment a thought you didn’t know you had becomes real because you put it into words. When we remove that, we don’t just lose rigor. We lose voice. And over time, that does something to how a generation understands itself.

One anonymous British university student put it simply. He uses ChatGPT to compile notes and find sources. But when it comes to the actual essay, he draws the line.

“I think that using ChatGPT to write your work for you is not right,” he said. “Because it’s not what you’re supposed to be at university for.”

That sentence may seem quaint. But it captures something we risk forgetting in the age of AI: Learning is not just about efficiency. It’s about engagement. Struggle. Even boredom. It’s the gaps in our understanding that invite growth. The slowness that forces reflection. The revision process that builds resilience. No AI model can give us that. It can only simulate the outcome.

What’s at stake here isn’t just plagiarism or academic policy. It’s how we prepare people to contribute to conversations that matter. In a world where opinions are monetized, where platforms reward hot takes, and where algorithms flatten nuance, writing is still one of the few places where people can wrestle with their own beliefs in solitude before making them public.

If we allow that space to be outsourced entirely to AI, we’re not just creating smoother essays. We’re creating shallower humans.

Let students use AI. But make them reflect. Make them revise. Make them present drafts that show their thinking, not just their submission. Recenter the process—not the polish. Because when writing is treated as a mirror, not a mask, it teaches us something AI never will:

Who we are becoming as we think. And who we’re letting think for us.

