Genetic Test Scores Predicting Intelligence Are Not the New Eugenics

"A world where people are slotted according to their inborn ability – well, that is Gattaca. That is eugenics."

This was the assessment of Dr. Catherine Bliss, a sociologist who wrote a new book on social science genetics, when asked by MIT Technology Review about polygenic scores that can predict a person's intelligence or performance in school. Like a credit score, a polygenic score is a statistical tool that combines a lot of information about a person's genome into a single number. Fears about using polygenic scores for genetic discrimination are understandable, given this country's ugly history of using the science of heredity to justify atrocities like forcible sterilization. But polygenic scores are not the new eugenics. And rushing to discuss polygenic scores in dystopian terms only contributes to widespread public misunderstanding about genetics.


Let's begin with some background on how polygenic scores are developed. In a genome-wide association study, researchers conduct millions of statistical tests to identify small differences in people's DNA sequence that are correlated with differences in a target outcome (beyond what can be attributed to chance or ancestry differences). Successful studies of this sort require enormous sample sizes, but companies like 23andMe are now contributing genetic data from their consumers to research studies, and national biorepositories like U.K. Biobank have put genetic information from hundreds of thousands of people online. When applied to studying blood lipids or myopia, this kind of study strikes people as a straightforward and uncontroversial scientific tool. But it can also be conducted for cognitive and behavioral outcomes, like how many years of school a person has completed. When researchers have finished a genome-wide association study, they are left with a dataset with millions of rows (one for each genetic variant analyzed) and one column with the correlations between each variant and the outcome being studied.
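To picture what that dataset looks like, here is a minimal sketch in Python. Every variant ID and effect size below is invented for illustration; a real summary-statistics file would contain millions of rows, most with effects very close to zero.

```python
# Hypothetical GWAS "summary statistics": one row per genetic variant, paired
# with its estimated correlation (effect size) with years of education completed.
# All identifiers and numbers are made up for illustration.
gwas_results = [
    ("rs0000001", 0.012),   # tiny positive association with the outcome
    ("rs0000002", -0.008),  # tiny negative association
    ("rs0000003", 0.003),
    # ...a real study would list millions of these rows
]
```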

The trick to polygenic scoring is to use these results and apply them to people who weren't participants in the original study. Measure the genes of a new person, weight each one of her millions of genetic variants by its correlation with educational attainment from a genome-wide association study, and then simply add everything up into a single number. Voila! -- you've created a polygenic score for educational attainment. On its face, the idea of "scoring" a person's genotype does immediately suggest Gattaca-type applications. Can we now start screening embryos for their "inborn ability," as Bliss called it? Can we start genotyping toddlers to identify the budding geniuses among them?
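In code, this "weight and add" step is little more than a dot product. Here is a minimal sketch, again with invented numbers, of scoring one hypothetical person:

```python
# Invented effect sizes from a hypothetical genome-wide association study.
weights = {"rs0000001": 0.012, "rs0000002": -0.008, "rs0000003": 0.003}

# One new person's genotype: how many copies (0, 1, or 2) of each variant she carries.
genotype = {"rs0000001": 2, "rs0000002": 0, "rs0000003": 1}

def polygenic_score(genotype, weights):
    """Weight each variant by its GWAS effect size, then add everything up."""
    return sum(copies * weights[variant] for variant, copies in genotype.items())

print(polygenic_score(genotype, weights))  # 0.012*2 + (-0.008)*0 + 0.003*1 = 0.027
```

A real score sums over millions of variants rather than three, but the arithmetic is no more complicated than this.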

The short answer is no. Here are four reasons why dystopian projections about polygenic scores are out of touch with the current science:


First, a polygenic score currently predicts the life outcomes of an individual child with a great deal of uncertainty. The amount of uncertainty around polygenic predictions will decrease in the future, as genetic discovery samples get bigger and genetic studies include more of the variation in the genome, including rare variants that are particular to a few families. But for now, knowing a child's polygenic score predicts his ultimate educational attainment about as well as knowing his family's income, and slightly worse than knowing how far his mother went in school. These pieces of information are also readily available about children before they are born, but no one is writing breathless think-pieces about the dystopian outcomes that will result from knowing whether a pregnant woman graduated from college.

Second, using polygenic scoring for embryo selection requires parents to create embryos using reproductive technology, rather than conceiving them by having sex. The prediction that many women will endure medically unnecessary IVF in order to select the embryo with the highest polygenic score glosses over the invasiveness, indignity, pain, and heartbreak that these hormonal and surgical procedures can entail.

Third, and counterintuitively, a polygenic score might be using DNA to measure aspects of the child's environment. Remember, a child inherits her DNA from her parents, who typically also shape the environment she grows up in. And children's environments respond to their unique personalities and temperaments. One Icelandic study found that parents' polygenic scores predicted their children's educational attainment, even when the score was constructed using only the half of the parental genome that the child didn't inherit. For example, imagine mom has genetic variant X that makes her more likely to smoke during her pregnancy. Prenatal exposure to nicotine, in turn, affects the child's neurodevelopment, leading to behavior problems in school. The school responds to his behavior problems with suspension, causing him to miss out on instructional content. A genome-wide association study will collapse this long and winding causal path into a simple correlation -- "genetic variant X is correlated with academic achievement." But a child's polygenic score, which includes variant X, will partly reflect his likelihood of being exposed to adverse prenatal and school environments.

Finally, the phrase "DNA tests for IQ" makes for an attention-grabbing headline, but it's scientifically meaningless. As I've written previously, it makes sense to talk about a bacterial test for strep throat, because strep throat is a medical condition defined as having streptococcal bacteria growing in the back of your throat. If your strep test is positive, you have strep throat, no matter how serious your symptoms are. But a polygenic score is not a test "for" IQ, because intelligence is not defined at the level of someone's DNA. It doesn't matter how high your polygenic score is, if you can't reason abstractly or learn from experience. Equating your intelligence, a cognitive capacity that is tested behaviorally, with your polygenic score, a number that is a weighted sum of genetic variants discovered to be statistically associated with educational attainment in a hypothesis-free data mining exercise, is misleading about what intelligence is and is not.


So, if we're not going to build a Gattaca-style genetic hierarchy, what are polygenic scores good for? They are not useless. In fact, they give scientists a valuable new tool for studying how to improve children's lives. The task for many scientists like me, who are interested in understanding why some children do better in school than other children, is to disentangle correlations from causation. The best way to do that is to run an experiment where children are randomized to environments, but often a true experiment is unethical or impractical. You can't randomize children to be born to a teenage mother or to go to school with inexperienced teachers. By statistically controlling for some of the relevant genetic differences between people using a polygenic score, scientists are better able to identify potential environmental causes of differences in children's life outcomes. As we have seen with other methods from genetics, like twin studies, understanding genes illuminates the environment.
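To illustrate that "statistical control" idea, here is a minimal sketch with simulated data. The effect sizes are invented, and the simulation simply builds in the confounding described above: genes shape both the environment and the outcome, so a naive analysis overstates the environment's effect, while adding the polygenic score as a covariate recovers it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # simulated children; all effect sizes below are invented

pgs = rng.normal(size=n)              # each child's polygenic score
env = 0.5 * pgs + rng.normal(size=n)  # an environment partly shaped by (parental) genes
outcome = 0.3 * env + 0.4 * pgs + rng.normal(size=n)  # true environmental effect: 0.3

def ols_coefs(y, *predictors):
    """Ordinary least squares; returns fitted coefficients (intercept first)."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    return np.linalg.lstsq(X, y, rcond=None)[0]

print(ols_coefs(outcome, env))       # naive estimate: ~0.46, inflated by confounding
print(ols_coefs(outcome, env, pgs))  # controlling for the score: close to the true 0.3
```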

Research that examines genetics in relation to social inequality, such as differences in higher education outcomes, will obviously remind people of the horrors of the eugenics movement. Wariness regarding how genetic science will be applied is certainly warranted. But, polygenic scores are not pure measures of "inborn ability," and genome-wide association studies of human intelligence and educational attainment are not inevitably ushering in a new eugenics age.

Paige Harden
Dr. Paige Harden is a tenured professor of clinical psychology at the University of Texas at Austin, where she is the Principal Investigator of the Developmental Behavior Genetics lab and co-director of the Texas Twin Project. Dr. Harden has published over 100 scientific articles on genetic influences on complex human behavior, including child cognitive development, academic achievement, risk-taking, mental health, sexual activity, and childbearing. In 2017, she was honored with a prestigious national award from the American Psychological Association for her distinguished scientific contributions to the study of genetics and human individual differences. Dr. Harden's research is supported by a Jacobs Foundation Early Career Research Fellowship, the Templeton Foundation, and the National Institutes of Health. She is a Public Voices Fellow with the Op-Ed Project.