Daniel Heinz clicked through each folder in the file drive, searching for the answers that had evaded him and his lab mates for years.
Heinz, a graduate student in Brenda Bloodgood’s lab at the University of California, San Diego (UCSD), was working on a Ph.D. project, part of which built on the work of a postdoctoral researcher who had left the lab and started his own lab a few years prior. The former postdoc studied how various types of electrical activity in the mouse hippocampus induce a gene called NPAS4 in different ways. One of his discoveries was that, in some situations, NPAS4 was induced in the far-reaching dendrites of neurons.
The postdoc’s work resulted in a paper in Cell, landed him more than $1.4 million in grants and an assistant professor position at the University of Utah, and spawned several follow-up projects in the lab. In other words, it was a slam dunk.
But no one else in the lab—including Heinz—could replicate the NPAS4 data. Other lab members always had a technical explanation for why the replication experiments failed, so for years the problem was passed from one trainee to another.
Which explains why, on this day in early April 2023, Heinz was poking around the postdoc’s raw data. What he eventually found would lead to a retraction, a resignation and a reckoning, but in the moment, Heinz says, he was not thinking about any of those possibilities. In fact, he had told no one he was doing this. He just wanted to figure out why his experiments weren’t working.
To visualize the location of NPAS4, the lab used immunohistochemistry, which tags a gene product with a tailored fluorescent antibody. Any part of the cell that expresses the gene should glow. In his replication attempts, Heinz says he struggled to see any expression, and when he saw indications of it, the signal was faint and noisy. So he wanted to compare his own images to the postdoc’s raw results rather than the processed images included in the 2019 Cell paper.
He clicked through each file folder until he found a batch of images that looked like they came from the appropriate imaging session, Heinz recalls. Then he sifted through them, trying to find one that resembled the images in the published paper.
Eventually, Heinz says, he recognized a dendrite section that looked like the mirror image of a dendrite from one of the figures. In the paper figure, the image illustrated that NPAS4 appeared only in the dendrites of some neurons. In the raw image, however, it seemed the signal was not restricted to the dendrites but instead filled entire cells.
Heinz immediately knew something was wrong, he says. The raw image looked more like a section of tissue from a mouse engineered to express green fluorescent protein (GFP) in a subset of neurons. Immunohistochemistry is much messier. Antibodies are notoriously dirty and bind to more than what they are designed to target, and background fluorescence often makes it harder to pull a signal out of the noise. But there was almost no noise in this image.
Heinz says he suspected that the postdoc had used the GFP fluorescence in the figure but called it the immunohistochemistry data. If his suspicions were correct, it meant the postdoc’s data did not support his story that NPAS4 was induced in the dendrites. It meant the lab had been heading down a dead-end path. It meant the postdoc had faked data.
In recent decades, scientific misconduct—formally defined as the falsification, fabrication or plagiarism of data—has lurched into the spotlight. Investigations have uncovered fraudulent data at the foundation of a prominent Alzheimer’s disease theory, toppled presidencies at elite universities and shuttered entire families of journals. Fake studies sully both the scientific record and the public’s opinion of science, and they waste time and tax dollars.
The exact prevalence is unknown—it is difficult to conduct a census because cases surface only when the perpetrator is caught, says Lex Bouter, professor emeritus of methodology and integrity at Amsterdam University Medical Center and Vrije Universiteit Amsterdam—but in surveys, about 4 percent of researchers admit to ever having falsified or fabricated data. “Usually, things go right,” Bouter says. It can be hard to fathom when things go wrong.
This is in part because trust and science are intimately intertwined, making scientific misconduct “very much akin to acts of perversion,” argued philosopher of science Michael Ruse in a 2005 essay. For most scientists, “to fake your results is just not understandable.” Plus, a romantic mythology often surrounds scientists. They are portrayed as “more than human, being like gods in their creativity, and also as less than human, being deprived in their work of the passions, attitudes and social ties given to ordinary men,” wrote Robert K. Merton, the founder of the sociology of science, in a 1970 essay. As a result, he wrote, the public idealizes and idolizes them.