For decades, expertise formed the backbone of how modern societies solved problems. Doctors made health decisions. Engineers built bridges. Economists modeled risks. But the foundations of that system are cracking. From vaccine skeptics to climate deniers to online financial “gurus,” we are seeing a breakdown of trust in those who are trained, tested, and credentialed to know better. The result is not more empowerment, but more confusion. In rejecting experts, we’re not decentralizing knowledge—we’re destabilizing it. The real risk isn’t that experts are sometimes wrong. It’s that we no longer know how to tell who’s right.
The first and most visible sign of this shift is the collapse in public trust. Over the last two decades, surveys from the Pew Research Center and the Edelman Trust Barometer have tracked a consistent decline in confidence in traditional authorities: scientists, doctors, journalists, and government officials. In the US, trust in scientists dropped from 86% in 2019 to 73% in 2023—and fell even lower among younger and politically conservative demographics.
Multiple factors have contributed to this erosion:
- The politicization of science, especially during the COVID-19 pandemic, made public health advice seem partisan.
- Algorithmic information bubbles exposed people to anti-expert content more frequently.
- High-profile institutional failures, such as financial crises or medical scandals, led people to believe that expert credentials offered no real accountability.
What’s striking is not just the scale of this mistrust, but its emotional texture. Experts aren’t just viewed with skepticism—they’re often framed as elitist, out of touch, or even malicious. That’s a far cry from the social contract that once granted scientists and scholars a presumption of good faith, even when they were wrong.
Social media has upended the traditional gatekeeping of knowledge. In principle, this is a good thing: more access, more voices, more diversity in perspective. But with it has come a dangerous flattening of epistemic authority. A virologist’s thread summarizing peer-reviewed research can appear right next to a viral anti-vax TikTok, with no visual cues to distinguish their credibility.
This flattening fuels a false equivalence: the idea that all views deserve equal weight in all contexts. It’s one thing to debate policy preferences. It’s another to give equal airtime to flat-Earth theory or home remedies for cancer. When platforms don’t distinguish between credentialed expertise and user-generated speculation, audiences are left to fend for themselves—often without the tools to do so.
This problem compounds in areas where the “expert” answer is counterintuitive or inconvenient. It’s easier to believe you can outsmart inflation with crypto trading tips than to accept that financial planning takes decades. It’s more comforting to think vaccine side effects are a cover-up than to understand immunology. In each case, the less sensational—but more accurate—answer loses the attention battle.
The mantra “do your own research” has become a stand-in for intellectual independence. In theory, it encourages curiosity and skepticism. In practice, it often masks a rejection of expertise altogether. Most people don’t have the training, tools, or time to independently verify claims in fields like epidemiology, climate modeling, or semiconductor physics. And that’s not a failure of intelligence—it’s a matter of specialization.
Modern society functions because we trust certain people to know things we don’t. When you fly in a plane, you assume the pilot understands aerodynamics. When you eat at a restaurant, you assume the chef knows food safety standards. These assumptions are not blind faith—they’re necessary shortcuts in a world too complex to navigate alone.
When “do your own research” is used to override expert consensus—rather than complement it—it becomes a form of epistemic arrogance. Worse, it creates a post-truth environment where the loudest or most charismatic voice wins, not the most credible one.
This isn’t just a philosophical or cultural issue. The erosion of expert authority has material consequences. In public health, misinformation about vaccines and treatments has directly contributed to preventable deaths. In finance, the rise of “finfluencers” pushing options trading or crypto speculation has cost retail investors billions. In climate policy, political obstruction fueled by anti-science rhetoric has delayed vital action.
Even infrastructure suffers. Urban planners face backlash from residents who distrust environmental impact assessments. Engineers working on energy projects get caught in political crossfire. Teachers are scrutinized not for pedagogical quality but for perceived ideological content. The net result: paralysis, burnout, and growing risk aversion among professionals who used to lead public initiatives.
The longer-term risk is systemic: a world where evidence-based policy becomes politically untenable. If the public won’t accept expertise in areas like AI governance, pandemic preparedness, or biosecurity, we may face crises with no agreed-upon compass for action.
Rebuilding trust in expertise doesn’t mean demanding blind obedience. It means making the expert process more transparent, more inclusive, and more resilient.
Some key strategies include:
- Communication, not just publication: Experts need to learn how to translate their findings into plain language. This means better media training, public engagement, and digital literacy.
- Credential signaling online: Platforms could provide credibility badges, link to peer-reviewed sources, or highlight consensus positions the way Wikipedia does.
- Admit and explain uncertainty: One strength of science is its ability to update in light of new evidence. Experts should own that publicly. Doing so increases credibility rather than diminishing it.
- Bridge-building across class and culture: Many expert communities are still siloed—by education, wealth, and geography. Investing in more diverse pipelines into academia and professional fields helps reduce the perception of elitism.
Above all, we must distinguish between being wrong in good faith and being wrong in bad faith. Experts will make mistakes. That’s not a reason to dismiss them—it’s a reason to hold them to high, but fair, standards.
The future won’t get simpler. From climate adaptation to AI governance to public health, complexity will only increase. We need experts—not to dictate, but to guide. If we discard the institutions that train, vet, and hold experts accountable, we won’t be left with more democratic discourse. We’ll be left with confusion, manipulation, and polarization.
This isn’t just about science or media. It’s about whether we still believe in the idea that knowledge can be cumulative, that good judgment requires more than gut feeling, and that truth is not just opinion with better branding. The survival of expertise is not elitism. It’s insurance—against error, chaos, and regression. And once it’s gone, getting it back won’t be easy.