A scientific fraud. An investigation. A lab in recovery.

Science is built on trust. What happens when someone destroys it?

By Calli McMurray
4 October 2024
Fallout zone: After misconduct occurs in a lab, the bystanders are left to grapple with the scientific and personal consequences.
Illustrations by Jialun Deng

Daniel Heinz clicked through each folder in the file drive, searching for the answers that had evaded him and his lab mates for years.

Heinz, a graduate student in Brenda Bloodgood’s lab at the University of California, San Diego (UCSD), was working on a Ph.D. project, part of which built on the work of a postdoctoral researcher who had left the lab and started his own a few years prior. The former postdoc studied how various types of electrical activity in the mouse hippocampus induce a gene called NPAS4 in different ways. One of his discoveries was that, in some situations, NPAS4 was induced in the far-reaching dendrites of neurons.

The postdoc’s work resulted in a paper in Cell, landed him more than $1.4 million in grants and an assistant professor position at the University of Utah, and spawned several follow-up projects in the lab. In other words, it was a slam dunk.

But no one else in the lab—including Heinz—could replicate the NPAS4 data. Other lab members always had a technical explanation for why the replication experiments failed, so for years the problem was passed from one trainee to another.

Which explains why, on this day in early April 2023, Heinz was poking around the postdoc’s raw data. What he eventually found would lead to a retraction, a resignation and a reckoning, but in the moment, Heinz says, he was not thinking about any of those possibilities. In fact, he had told no one he was doing this. He just wanted to figure out why his experiments weren’t working.

To visualize the location of NPAS4, the lab used immunohistochemistry, which tags a gene product with a tailored fluorescent antibody. Any part of the cell that expresses the gene should glow. In his replication attempts, Heinz says he struggled to see any expression, and when he saw indications of it, the signal was faint and noisy. So he wanted to compare his own images to the postdoc’s raw results rather than the processed images included in the 2019 Cell paper.

He clicked through each file folder until he found a batch of images that looked like they came from the appropriate imaging session, Heinz recalls. Then he sifted through them, trying to find one that resembled the images in the published paper.

Eventually, Heinz says, he recognized a dendrite section that looked like the mirror image of a dendrite from one of the figures. In the paper figure, the image illustrated that NPAS4 appeared only in the dendrites of some neurons. In the raw image, however, it seemed the signal was not restricted to the dendrites but instead filled entire cells.

Heinz immediately knew something was wrong, he says. The raw image looked more like a section of tissue from a mouse engineered to express green fluorescent protein (GFP) in a subset of neurons. Immunohistochemistry is much messier. Antibodies are notoriously dirty and bind to more than what they are designed to target. There is often background fluorescence that makes it harder to pull out a signal from the noise. But there was almost no noise in this image.

Heinz says he suspected that the postdoc had used the GFP fluorescence in the figure but called it the immunohistochemistry data. If his suspicions were correct, it meant the postdoc’s data did not support his story that NPAS4 was induced in the dendrites. It meant the lab had been heading down a dead-end path. It meant the postdoc had faked data.

In recent decades, scientific misconduct—formally defined as the falsification, fabrication or plagiarism of data—has lurched into the spotlight. Investigations have uncovered fraudulent data at the foundation of a prominent Alzheimer’s disease theory, toppled presidencies at elite universities and shuttered entire families of journals. Fake studies sully both the scientific record and the public’s opinion of science, and they waste time and tax dollars.

The exact prevalence is unknown—it is difficult to conduct a census because cases surface only when the perpetrator is caught, says Lex Bouter, professor emeritus of methodology and integrity at Amsterdam University Medical Center and Vrije Universiteit Amsterdam—but in surveys, about 4 percent of researchers admit to ever having falsified or fabricated data. “Usually, things go right,” Bouter says. It can be hard to fathom when things go wrong.

This is in part because trust and science are intimately intertwined, making scientific misconduct “very much akin to acts of perversion,” argued philosopher of science Michael Ruse in a 2005 essay. For most scientists, “to fake your results is just not understandable.” Plus, a romantic mythology often surrounds scientists. They are portrayed as “more than human, being like gods in their creativity, and also as less than human, being deprived in their work of the passions, attitudes and social ties given to ordinary men,” wrote Robert K. Merton, the founder of the sociology of science, in a 1970 essay. As a result, he wrote, the public idealizes and idolizes them.

Dead end: The newly uncovered discrepancies between the postdoctoral researcher’s raw data and paper figures confirmed that the lab had been chasing a false signal for years.

Framed in this way, it can seem implausible for scientists to ever fake data. But the scientific world is not powered by curiosity alone: It also runs on a credit system, Merton argued. The scientists who create new knowledge are rewarded with recognition. Jobs, funding, and sometimes awards and fame, follow. Under the credit system, misconduct starts to make more sense.

And when misconduct does occur, it creates a fallout zone in the lab. Certainly it did for Bloodgood’s group. That’s because misconduct is not just a scientific betrayal; it’s a personal one as well, says C.K. Gunsalus, director of the National Center for Principled Leadership and Research Ethics at the University of Illinois Urbana-Champaign. “It’s very hard for a lab to recover.”

This article is about that recovery, and what happens to the people left behind. So we won’t name the person who committed fraud in the Bloodgood Lab, or in any others. (The postdoc did not respond to requests to comment for this story.) Fraud happens every year, in labs all over the world. This story could be about anyone.

The postdoc whose work Heinz called into question joined Bloodgood’s lab in January 2015, shortly after finishing his Ph.D. During her own postdoctoral work, Bloodgood, associate professor of neurobiology, worked in a lab that was investigating activity-regulated genes, with a focus on immediate early genes, which are induced in cells right after the arrival of an outside signal. And they studied an immediate early gene that was specific to neurons: NPAS4. In some experiments, Bloodgood noticed whiffs of NPAS4 in the neuropil of the hippocampus.

The postdoc’s first project in Bloodgood’s lab was to explore this phenomenon. He claimed he had found that one kind of electrical stimulation induces NPAS4 in the cell body, another kind induces it in the dendrites, and that the different flavors of NPAS4 interact with DNA in different ways. That is the story he told in the Cell paper, the product of years of work.

These were “really exciting results,” says Pei-Ann Lin Acosta, a graduate student in Bloodgood’s lab at the time, who is now a management consultant. Acosta was working on a similar project, but she used optogenetic stimulation instead of electrophysiology. Yet despite the overlap between her work and the postdoc’s, Acosta says, she never managed to replicate his results.

The group investigated several potential causes for the failed replications. First, Bloodgood says she chalked it up to Acosta’s inexperience—she was a new graduate student, after all. Then, the team ran out of the initial supply of antibody they used to tag NPAS4, and they struggled to find an effective replacement. Eventually, Bloodgood suggested the postdoc and Acosta work side by side at the lab bench so they could figure out what was going wrong, Acosta recalls. He was “marginally helpful,” Acosta says, but they never discovered the source of the problem. She felt so frustrated that sometimes she cried, and eventually she switched to another lab.

Something similar happened to Andre DeSouza. He transferred into Bloodgood’s lab in the third year of his Ph.D. His project also built on the postdoc’s work. The postdoc had compared three frequencies of electrical stimulation: 0, 0.1 and 100 hertz; DeSouza says he wanted to test smaller increments of stimulation to find the threshold that would trigger NPAS4 expression.

Like Acosta, his first step was replicating part of the postdoc’s work. And as with Acosta, it never happened, he says. After a few years of failed replications and dead-end troubleshooting, compounded by some personal issues, DeSouza dropped out, leaving the Ph.D. program with just his master’s degree. “It sucks to feel like, ‘Oh, I was not a good scientist,’ and then realize, like, ‘Oh, I was trying to do something that was just never really going to work,’” DeSouza says.

Once Heinz had found the smoking gun in the postdoc’s raw data, it took him a couple of weeks to make “damn sure that I was right,” he says.

First, he needed to work through the logic of what he saw and what it meant, and “put it outside of my brain.” He started a document and spelled out each issue he found, attached screenshots, recorded file names and walked through what evidence would refute or support his hypothesis.

“Intuitively, perhaps, I didn’t have any doubt. But that’s not enough,” Heinz says. “I needed to be able to convince the very critical part of myself that there was no chance that what I was finding was not real.”

He scheduled a meeting with Bloodgood on 13 April to share what he had uncovered, as much as that scared him. “What I was terrified of was the monumental nature of the accusation, in that I was afraid that I would be right, that it would be true, and that many people’s lives and careers would be ruined,” Heinz says. “I was just really feeling the horror—the horror of the consequences of what I’d found.”

Heinz was also tormented personally. The postdoc was his close friend, he says, and he knew this revelation could destroy his friend’s career.

At the start of the meeting, Heinz got right to it and told Bloodgood he had found “a really big problem” with the postdoc’s paper.

“Oh no,” Bloodgood remembers saying as a feeling of heaviness sank in.

Heinz says he walked Bloodgood through his findings, telling her it appeared the postdoc had intentionally falsified the image. Bloodgood “didn’t disagree” with Heinz’s findings, she says, but she wanted to give the postdoc a chance to explain.

In the back of his mind, Heinz had been hoping that Bloodgood would “point out the obvious stupidity in what I was saying.” The fact that this didn’t happen shook him, he remembers, but as he was leaving the office, Bloodgood called out to him. He stopped and turned around. “Danny, it might not feel like this now,” he remembers her saying, “but someday, looking back, you’ll be glad you did this.”

The next week, Bloodgood spoke with the postdoc on Zoom. She says she showed him slides with the raw images Heinz had found and the figure from the paper, and she asked for an explanation. The postdoc said it must be a mistake, Bloodgood recalls, though she thought he sounded nervous. “This nervousness is either because he feels put on the spot,” Bloodgood remembers thinking, or because “he feels like he’s been caught.”

The postdoc emailed Bloodgood two days later. In that email, Bloodgood says he admitted he had manipulated images in one figure, but he stood by the findings the images represented. And he offered up an excuse: He wrote that he had felt pressure to produce a beautiful paper. But Bloodgood didn’t trust him anymore, she says. She asked him to send a spreadsheet detailing the name and location of every image file that had gone into making the figure and step-by-step instructions on how he had analyzed them. He complied, and Heinz got to work.

On 4 May, Bloodgood and her lab met for their weekly lab meeting. Normally, someone presented data from the experiments they had been working on. But that day, Bloodgood broke the news of the manipulated images instead.

When Bloodgood finished speaking, the room fell silent, says Chiaki Santiago, a current graduate student in the lab. Santiago says she sensed both sadness and shock in the silence, but also an odd sense of closure. The dendritic NPAS4 antibody experiments had been a “trap” for years, she says, and now they finally had an answer. The group wasn’t incompetent; they had been chasing a false signal. Knowing that felt at least like “a path forward to truth.”

Before the meeting disbanded, Bloodgood gave everyone a chance to ask questions and share reactions. More than one person expressed concern for the postdoc, several people present at the meeting recall: The trainees understood the gravity of what the postdoc had done, and that the consequences could “devastate a person,” says Anja Payne, a graduate student in the lab at the time. How was he doing? they wondered out loud. Was he suicidal, and did he need an emergency intervention?

Bloodgood, hearing this collective goodwill, felt a “huge warmth to the people in my lab,” she says.

Then, Santiago says, the lab trainees went out to lunch and took a walk on the beach together. “That was, like, perfect,” she adds. “It was very soothing and calming and a great reminder that this isn’t the end-all be-all; we’re going to figure out how to fix this, and we’re going to figure out ways to work through this together.”

Weathering someone else’s scientific misconduct can become a—if not the—defining moment of a career. For Kate Laskowski, it shaped the way she runs her lab.

Toward the end of 2019, Laskowski, assistant professor of evolution and ecology at the University of California, Davis, had just opened her lab when she discovered that three papers she had published in collaboration with a prominent spider biologist contained falsified data: The biologist had collected the data, and Laskowski had analyzed it. In the end, she retracted the papers and published a blog post detailing everything that had happened.

The experience did not sour Laskowski’s feelings about science, but it did shape her lab in “profound ways,” she says. She tells her students, “We live in a glass house; everything we do is going to be public,” she says. “I never want to relive this. And I know that the only reason I survived is because I was so transparent and open.” For example, her lab manual is available on her lab website and outlines detailed expectations for lab notebooks, data storage and analysis, and file organization. The top of the manual states the key mantras: “Don’t be a jerk” and “Don’t fabricate/fudge/alter data.”

A close brush with a colleague’s misconduct left Edward Ester with a lingering worry. In 2015, a few years after he started a postdoc, he says his former Ph.D. adviser told him a graduate student in the lab—and Ester’s close friend—had been accused of fraud in several of his papers. Ester took a closer look at some of the work he had done with the student, found evidence of data falsification in two papers and retracted them, he says.

Today Ester is assistant professor of psychology at the University of Nevada, Reno. When he first opened his lab there, he says he was “very paranoid” about his trainees making mistakes and spent a lot of time doing data analysis that he should have assigned to a student. Ester’s lab has transparency policies that are similar to Laskowski’s, and even now he finds himself “perhaps more of a helicopter [adviser] than I need to be in some instances.”

The experience also instilled in him a cynicism about the incentive structure in science, he says. When a scientist’s worth is measured by their h-index and grant dollars, that can “encourage fraud that might not otherwise occur. Because for some people, I think it’s just out of desperation. Or, for some people, it’s a desire to be the best, but they want to get there too fast, or they don’t care how they get there,” he says. “If you create perverse incentives, you’re going to create perverse behaviors.”

But Ester doesn’t let this awareness ruin his daily experience as a scientist, he says. He keeps his attention focused on his own work and his own lab, which is the only thing he can control. The structural flaws necessitate “a lot of vigilance” from individual researchers to ensure fraud doesn’t occur.

These consequences are amplified when the person faking data is a principal investigator. In 2005, a group of molecular biology graduate students at the University of Wisconsin-Madison discovered their PI had faked data in several grant applications. After months of deliberation, they turned her in, and the lab was shut down. Three of the students left with their master’s degrees. Three others switched to new labs to finish their Ph.D.s, including Mary Ann Allen, who says she initially wanted to leave research and get a computer science degree instead, because she couldn’t imagine trusting another stranger to be her adviser.

She stayed in biology only because a friend recommended his former adviser at the University of Colorado Boulder, she says. Allen moved to the new lab and is now a research associate professor at the university’s BioFrontiers Institute. She migrated from molecular biology into computational biology (in part, she says, because of the field’s data- and code-sharing norms), teaches responsible conduct of research courses to trainees and upholds transparency policies in her own lab.

Her trust in other scientists still ebbs and flows. “I was under the impression nobody committed misconduct. And then you go through this situation, and you start to wonder if everybody does,” Allen says.

As Heinz worked through the reanalysis of the postdoc’s paper, it became clear that hundreds of images were not accounted for in the spreadsheet the postdoc had sent, Bloodgood says. She asked the postdoc to send the missing images, and a few weeks later, on 9 June, he did.

Yet when Heinz looked at the images’ metadata—the immutable bits of information marking when an image was taken, and on what microscope—he discovered that what Bloodgood describes as “an overwhelming majority” had been taken within the past few weeks. The postdoc, it seemed, had faked more data to cover his tracks. This was awful news, Bloodgood says, but it carried an “echo of relief,” because it meant the group could stop investigating. Nothing could be explained away, and it left Bloodgood with only one choice, she says.

Bloodgood called an emergency meeting on Thursday, 15 June. Santiago, Heinz, Payne and the other trainees piled into Bloodgood’s office around her computer, and she broke the news. The postdoc could not be relied on to help correct the paper, she told them; it had to be retracted. Bloodgood also said she had emailed the postdoc to tell him that the following Tuesday she would notify the National Institutes of Health (NIH), Cell, her department chair and his department chair about what he had done. If he wanted to be the one to tell his chair, he would need to do so before then.

For Heinz, the discovery of the second fraud shifted him from “feeling guilt to feeling anger,” he says. If you give someone an opportunity to fix a mistake, and “they try to take advantage of you, that’s a different level of betrayal.”

Bloodgood was angry, too, she says. “It didn’t have to be this way,” she remembers thinking. “There are so many interesting things to discover in biology. You don’t have to make things up.”

On 15 June 2023, the postdoc confessed to the University of Utah, according to an “admission of research misconduct” statement and the university’s misconduct report, which The Transmitter obtained through a public records request. He admitted to manipulating images of NPAS4 using Photoshop, and he admitted to fabricating data in a set of genetic knockout experiments he never performed. He also admitted to incorporating fabricated data throughout the paper to increase the sample size of different experiments. He then used the fraudulent data in several NIH grant applications that led to more than $1.4 million in funding, and in the job talk that landed him an assistant professor position. Finally, he admitted to sending Bloodgood images that he took after the paper was published “in an initial attempt to conceal my misconduct,” he wrote. “In truth, this misrepresentation was a falsification of the research record.”

The University of Utah and UCSD conducted separate investigations, and both found that the postdoc had committed research misconduct, as did the U.S. Office of Research Integrity (ORI). The postdoc resigned from his position and entered a voluntary settlement agreement with the ORI—he agreed to be supervised by two to three senior faculty members for the next five years when conducting federally funded research.

On 12 June 2024, Bloodgood and the postdoc’s other co-authors retracted the 2019 paper from Cell, after UCSD concluded its investigation. “We do not stand by the conclusions drawn in this paper and are retracting it,” the retraction notice states. “We apologize to the scientific community for any loss of time, resources, and/or morale caused by this publication.”

When people trust someone, they make themselves vulnerable to being hurt, says Karen Frost-Arnold, professor of philosophy at Hobart and William Smith Colleges. Philosophers describe the feeling that comes from someone taking advantage of that vulnerability as “disrespected in your personhood,” Frost-Arnold says. “It can feel very dehumanizing.”

One component of the healing process involves trying to understand why the betrayal happened, Frost-Arnold says. The betrayed look at themselves and wonder why they trusted the wrong person. They look at the betrayer and try to decipher their motivations. If the motivations are unclear, they may chalk it up to a random act of cruelty from a bad person. And lastly, they look at structures and institutions—what is it about science in general, or their lab specifically, that allowed this to happen?

Upward spiral: Members of the Bloodgood Lab are still processing the repercussions of the fraud.

In the 18 months since the postdoc’s fabrication came to light, members of Bloodgood’s lab have wrestled through what happened to them and what it means for the rest of their careers. Heinz can’t write off the whole episode as a bad person being bad, because he knew the postdoc to be good, thoughtful and caring, he says. “This is not just, ‘There are some bad apples.’ It’s this specific person who I couldn’t have believed could have done this, did this. And it forces a different interrogation of the causality.”

Instead, Heinz has taken a close look at the scientific institution. He views its incentive structure as a “moral hazard” for scientists, he says, because it tells them that the only way to advance is to take big swings, but most big swings are misses, not home runs. So some may feel compelled to cut corners or fudge results to propel their status and career. Heinz sees a system in which labs push toward “personal brand goals” instead of biological truths. He loves being a scientist, he says, but he’s not sure if he can protect himself from the dissonance that comes from existing within the scientific world. Heinz hopes to defend his Ph.D. at the end of the year, and he doesn’t know yet what he’ll do next. He says he wants to do something that feels like a valuable contribution to society—maybe that’s in science; maybe it isn’t. But “I almost certainly would have been doing a postdoc if all of this hadn’t happened.”

Bloodgood is still working through what lessons might be generalizable, she says. She doesn’t want to treat all of her trainees like they may be faking data, because “it would be terribly unfair to them, and it would be an awful way for me to go through my life.” Still, her baseline trust in other scientists has dropped, she says, and she finds herself less confident in someone’s results when she perceives them to be an ambitious person.

The fraud in her lab has “extinguished a spark that I had for science,” Bloodgood says, and reigniting it has been “elusive.”

Santiago says she has also lost that spark. Last summer, when all of this took place, she had an internship at Neurocrine Biosciences. She noticed that when the Neurocrine scientists tried to replicate a finding reported in the academic literature, it often failed. This observation, combined with the fraud unfolding in her home lab, caused Santiago to lose faith in academia. Industry incentivizes rigor, she says, because drugs are tested in humans, and vast amounts of money are at stake. But in academia, she sees researchers wed themselves to exciting stories that might not be true.

After the internship, while attending a seminar at UCSD by a visiting professor, she remembers thinking, “I don’t know if that’s real.” Later that semester, she read a grant she had written for an assignment a few years prior. Her writing carried both excitement and pride as she described the lab’s work and the experiments she planned to do. She had even used an exclamation point. As she reread her old writing, Santiago says she realized that enthusiastic version of herself was gone, and the realization made her cry.

Yet she hasn’t lost all hope. After the second meeting with the lab, that terrible one in which the members learned that the postdoc had continued to fake data, Santiago left campus and drove north on Interstate 5 to get back to her internship. She had 30 minutes until her next meeting, so she exited the highway and made a detour to Bird Rock Coffee, a coffee shop across the street from where the marshland meets the beach. Santiago remembers she sat on a stool on the patio, drank her coffee, stared at a great blue heron that was hanging out in the marsh and wondered absently about how much energy it took for the bird to stand on one leg. And this moment, she says, somehow reassured her that everything was going to be alright.

For Payne, it took months to fully process the postdoc’s fraud, she says. The events unfolded not long before she was to defend her thesis and move across the country to Virginia, to start a postdoc at the Janelia Research Campus. She remembers sitting in Bloodgood’s office, thinking, “I can’t internalize this right now.” Her main thought was to defend, graduate and “get out, get out, get out,” she says.

At first, Payne says, she had felt compassion for the postdoc. “Truthfully, my reaction to it was a little bit this sense of like, ‘There but for the grace of God go I,’” she says. But in Virginia, when it was over and she had physically left it behind, she had more time to think. Then she started to feel angry. By October, she felt afraid—she worried that there was “something missing about my understanding of science,” she says. She had once felt that all scientists had “the same goals,” but after the fraud she doubted that.

Payne says she realized that she was grieving—something she had experienced when her brother died during her first quarter of graduate school. In the aftermath of that loss, she joined a graduate student grief group at UCSD to help herself cope. Eventually, she began to facilitate the group alongside a counselor.

While leading that group, she says she learned that working through grief is not about fully healing—the loss of her brother might forever feel raw. Instead, what she needed was to find a way to tolerate the wound in her life. “You truly do just have to get to a place of acceptance or go crazy. There’s not an in-between,” she says.

Payne says she doesn’t expect to understand why the postdoc did what he did, and she also does not expect that her attitudes about science will look the same as they did before. She considers her recovery from the fraud to be an “upward spiral” that will vary each day and won’t be linear.

Now, she says, she sees that the only thing she can control is the rigor of her own work. There is no way to prevent fraud. This realization is still painful at times, but she has accepted it. “It is just, unfortunately, a feature of humanity that we have to contend with,” she says. You have to “look the beast in its ugly face.”

If you or someone you know is having suicidal thoughts, help is available. Here is a worldwide directory of resources and hotlines that you can call for support.
