Does ChatGPT make students smarter—or just better at faking it?

Image Credits: Unsplash

You can tell when a sentence has no soul. It’s grammatically perfect. Structurally clean. But when you read it, it just… floats. No tension. No voice. No fingerprint. You can’t hear the writer thinking. And that’s how many university professors are now spotting AI-written essays in their inboxes. Not by the errors. But by the eerie smoothness.

It’s happening across campuses, quietly and often awkwardly. Students turn in essays written by large language models like ChatGPT. Teachers mark them—sometimes knowingly, sometimes not. But a deeper question is emerging from all this: If writing is how we think, reflect, and understand our place in the world… what happens when we stop doing it ourselves?

That question has started to haunt classrooms, especially in subjects like ethics, literature, sociology—the kinds of disciplines where personal experience and moral wrestling used to matter. Because when a machine writes your thoughts, what are you learning—really?

Dr. Jocelyn Leitzinger, who teaches business and society at the University of Illinois Chicago, gave her students an assignment to reflect on moments when they had witnessed discrimination. She expected stories that were raw, possibly imperfect, maybe even awkward. What she got instead was story after story featuring a woman named Sally.

No, not the same story. But the same name. Again and again. It didn’t take long to figure out why. ChatGPT, when prompted to “write a story about discrimination,” seemed to favor a default woman named Sally. She had become a placeholder. A fiction. A stand-in for a real experience that never came.

Leitzinger said the bigger problem wasn’t that students were using AI tools. It was that they weren’t even pretending to have their own ideas anymore. “They weren’t coming up with their own stories,” she told reporters. “They weren’t even trying.” The name Sally said more than it seemed to. It was a signal: the students had skipped the process of reflection altogether. They had offloaded their thinking—and their empathy.

Recent findings from MIT researchers shed light on what’s really happening when people write using ChatGPT. In a controlled study, students were divided into three groups. One wrote essays with ChatGPT. One used search engines. One used only their own brains. Each group had 20 minutes per essay, and each wore EEG monitors to track brain activity during the task.

The results? The ChatGPT group consistently performed worse than the other two. Their essays were scored lower by teachers. More disturbingly, 80% of the ChatGPT users couldn’t recall anything from the essays they submitted—because they hadn’t actually constructed the ideas themselves.

The EEG scans told the same story. The ChatGPT group showed less neural connectivity—meaning their brains weren’t working as hard to synthesize or engage ideas. By the third session, researchers noticed something else. The ChatGPT group wasn’t even reading what was generated anymore. They were just copying and pasting. Efficient? Maybe. Educational? Not even close.

Writing is slow. Thinking hurts. Anyone who has ever stared at a blinking cursor for 20 minutes knows that generating an original sentence can feel like trying to push a thought uphill.

But that’s the point. Thinking is meant to be hard. It’s a cognitive workout. And writing is how we make the chaos in our minds cohere into something legible. Something testable. Something personal.

When you skip that process—when you let AI generate the argument, structure the paragraphs, fill in the evidence—you’re not just cheating the system. You’re skipping the chance to organize your own beliefs. That’s what Leitzinger noticed most about her students’ work post-ChatGPT. It wasn’t just the grammar that had changed—it was the absence of insight. Essays were clean but hollow. The misspellings were gone. But so were the original ideas.

Some educators are tempted to blame laziness or entitlement. But the real problem isn’t that students don’t want to think. It’s that they don’t know when or how they’re supposed to. At many universities, AI use is banned in one class, encouraged in another, and quietly tolerated in others. There’s no consistent rulebook. No shared ethic. Just a haze of anxiety.

So students do what students have always done: they adapt to survive. They paste prompts into ChatGPT. They edit the outputs a little. They change the font. Or don’t. And they hope that what they’re learning—how to shortcut the process—is enough to get them by.

In one sense, they’re responding to a system that hasn’t caught up with itself. We introduced generative AI to the world, but we didn’t rebuild the rituals that once made learning real.

Some people argue this shift is like when calculators first entered math classrooms. At first, there was resistance. Then, there was redesign. Educators began teaching the concept before allowing the tool.

But calculators don’t give you the answer unless you feed them the logic. ChatGPT, by contrast, offers an entire essay from a single sentence. That’s not support—it’s substitution. The danger isn’t that students will get better grades without learning. It’s that they won’t even notice they’re not learning.

They’ll assume they understand something because the answer reads well. But understanding isn’t about what you read. It’s about what you reconstruct in your own mind. And that’s what we lose when we let machines speak for us.

Here’s where things get interesting. In the same MIT study, researchers ran a fourth trial. They let the group that had previously written unaided use ChatGPT. And this time, something surprising happened: their brain connectivity went up.

Why? Because they’d already done the mental work first. They weren’t relying on AI to think—they were using it to polish. To iterate. To reflect. This may be the path forward: treat AI not as an idea factory, but as a feedback loop. Educators could shift grading away from final products and toward the process: brainstorming notes, revision logs, annotated drafts.

Students could be taught not just to write—but to reflect on how they write. To identify their own thought patterns. To decide when AI is a tool—and when it’s a crutch.

Here’s what no one wants to say out loud. When students use AI to write personal essays, especially on topics like ethics, identity, or belief, they’re not just skipping a task. They’re deferring the act of becoming.

Because writing, at its core, is not just performance. It’s identity formation. It’s the moment a thought you didn’t know you had becomes real because you put it into words. When we remove that, we don’t just lose rigor. We lose voice. And over time, that does something to how a generation understands itself.

One anonymous British university student put it simply. He uses ChatGPT to compile notes and find sources. But when it comes to the actual essay, he draws the line.

“I think that using ChatGPT to write your work for you is not right,” he said. “Because it’s not what you’re supposed to be at university for.”

That sentence may seem quaint. But it captures something we risk forgetting in the age of AI: Learning is not just about efficiency. It’s about engagement. Struggle. Even boredom. It’s the gaps in our understanding that invite growth. The slowness that forces reflection. The revision process that builds resilience. No AI model can give us that. It can only simulate the outcome.

What’s at stake here isn’t just plagiarism or academic policy. It’s how we prepare people to contribute to conversations that matter. In a world where opinions are monetized, where platforms reward hot takes, and where algorithms flatten nuance, writing is still one of the few places where people can wrestle with their own beliefs in solitude before making them public.

If we allow that space to be outsourced entirely to AI, we’re not just creating smoother essays. We’re creating shallower humans.

Let students use AI. But make them reflect. Make them revise. Make them present drafts that show their thinking, not just their submission. Recenter the process—not the polish. Because when writing is treated as a mirror, not a mask, it teaches us something AI never will:

Who we are becoming as we think. And who we’re letting think for us.

