The cardboard box has been sitting in Maya's house in Ohio for months. The box, no bigger than a hardcover novel, contains six plastic tubes — one for Maya, one for her husband, Mark, and one for each of their four children, two of whom have autism. It also holds labels with each person's name, date of birth and a barcode printed on them, ready to be affixed to the tubes once the family has filled them with spit. (Maya requested that only her first name be used in this article, to protect her privacy.)
The box came from SPARK, the largest genetic study of autism to date. To participate, Maya will have to ship the family's samples back to a DNA-testing lab in Wisconsin. But she keeps wavering.
On the one hand, Maya applauds SPARK's mission to speed autism research by collecting genetic data from more than 50,000 families affected by the condition. (SPARK is funded by the Simons Foundation, Spectrum's parent organization.) She hopes the effort might lead to better means of early diagnosis and treatment. Mark did not know until college that he has autism; by contrast, their children, diagnosed at 23 and 32 months, benefited from early therapy.
But Maya also worries about giving her family's DNA and health information to a third party. When she was in graduate school, she was initially denied a job after a prospective employer found an article revealing that she has Marfan syndrome, a genetic condition that affects connective tissue.
The SPARK data are stripped of identifiers, such as a person's name and birth date. And with rare exceptions, none of the DNA data are shared without a participant's consent. But Maya questions how well those protections work. Could unauthorized individuals gain access to the data and find a way to identify her and her family? Could that affect her children's future? Most autism research databases allow participants to later withdraw their data. But if those data have already been used in a study, they generally cannot be removed, because doing so could change the study's results, experts say.
“I want to be really sure that the data will be anonymous,” Maya says. “I don’t want my decisions now to affect my child’s employability 10, 20 years from now.”
Maya is not alone in her unease. Many families who are enthusiastic about participating in autism research also fear that their personal health information could leak out online or get into the wrong hands, exposing them to stigma or discrimination. Their concern is not entirely unjustified: Privacy laws in the United States do nothing to stop a small employer or life-insurance company from discriminating against someone based on their genetic information. And even when data are anonymized, scientists have shown how hackers can match names to genomes and brain scans stored in databases.
But sharing data with a research institution is less risky than sharing them with healthcare providers or with many commercial genetic-testing companies, experts say. Research databases have more safeguards in place, such as encrypting data and restricting access to trusted researchers — measures that have largely deterred hackers so far. “Researchers are definitely the best and direct-to-consumer companies in general are definitely the worst, because there are dozens of these companies, and many either don’t have a privacy policy or don’t follow it,” says Mark Rothstein, director of the Institute for Bioethics, Health Policy and Law at the University of Louisville in Kentucky.
No matter where DNA or brain-imaging data go, they are never completely secure — sticking people like Maya with a difficult decision. For now, most participants should feel reassured. “If the scientific databases are properly protected, the risk of data theft is relatively low,” says Jean-Pierre Hubaux, who heads the data-security laboratory at Ecole Polytechnique Fédérale de Lausanne in Switzerland. But researchers need to stay ahead of the curve if they want to preserve their study participants’ trust.