Wikipedia Is a Methane Fire
An invisible blaze is consuming the internet’s source of truth.
During the 1981 Indianapolis 500, Penske Racing driver Rick Mears pitted on lap 59. During the stop, a fuel hose that failed to seat properly gushed methanol all over Mears, the car, and the pit crew. Within seconds, heat from the turbocharger and exhaust ignited the fuel, and with it the half-dozen or so people it had soaked.
Mears is seen flapping his arms and dancing around, as if at a silent disco. If it hadn’t been such a harrowing moment, the footage itself would be almost comical. The reality is that Mears was on fire; you just couldn’t see it.
Methanol burns with an invisible flame, a close cousin to methane fires—phenomena both wondrous and terrifying. In Mears’ case, that invisibility nearly killed him.
Over the past two decades, Wikipedia has become a cornerstone of the internet. Through brilliant PR, it sold the story not just of a new encyclopedia but of a new knowledge system: crowdsourced, neutral, democratic.
It was never true. Wikipedia is sprawling and impressive, a remarkable human feat. But it is not neutral. Its articles are shaped not by consensus or debate, but by behind-the-scenes maneuvering. Editors skilled enough to master its dense, contradictory rules determine what the world reads. Outlast, outsmart, and outgun your opponent, and your version of reality wins.
As our lives shifted online, Wikipedia’s influence only grew. Its entries dominate Google results, feed AI training datasets, and serve as the “ground truth” against which bias is measured. It is the single most important source of information online.
But Wikipedia is on fire—you just can’t see it. Over the past year, I’ve documented just how far this has gone. I’ve reported on the “Gang of 40” editors controlling the Palestine-Israel topic area; shown how the Wikimedia Foundation (WMF) fell to ideological capture in 2017; and chronicled the shadow industry of paid editing shaping articles for corporations like Pfizer, news outlets like The New York Times, and even U.S. government officials.
The real story is what you don’t see: WMF’s near-total absence of oversight. Every major investigation I’ve published relied on user data or my own extensive digging, not WMF disclosures. The foundation has a $200 million annual budget, yet it has neither the tools nor the will to investigate manipulation.
The result: Wikipedia burns invisibly. Its sleek, minimalist interface hides a crisis turning into a conflagration. Mainstream media still repeats the PR fairy tale about a magical site built on trust. Everything looks perfect. But Wikipedia is engulfed in flames.
The problem is that it’s spreading. As part of the big transformation it underwent in 2017 (much more on that coming), Wikipedia quite deliberately turned itself into the “knowledge infrastructure” of the internet, the connective tissue linking mainstream reporting to social media and AI systems.
As a result, the flames tearing through Wikipedia’s “topic areas” are spreading to X, Facebook, Instagram, and Quora. More important still: as the single most important source of training and reference data for frontier LLMs, Wikipedia now threatens to set alight the AI shaping our future.
Despite this—aside from a few individuals running around, flapping their arms, and jumping up and down—the flames remain (mostly) undetected.
Wikipedia is on fire. It’s just that nobody knows.
A message from Ashley Rindsberg, NPOV Founder, Editor and Chief Investigative Officer:
As we set out on this mission, NPOV faces an uphill battle. Wikipedia is backed by a billion-dollar foundation. We’re just a small journalistic team trying to bring you the unvarnished truth. But you can help. Please share this article with 2 people who need to know the truth about what’s really driving our information ecosystem. Those two shares will make a huge difference. Thank you for your support.