Who Qualifies as an “Expert” And How Can We Decide Who Is Trustworthy?
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
Expertise is a slippery concept. Who has it, who claims it, and who attributes or yields it to whom is a culturally specific, sociological process. During the COVID-19 pandemic, we have witnessed a remarkable emergence of legitimate and not-so-legitimate scientists publicly claiming, or being credited with, academic expertise in precisely my field: infectious disease epidemiology. From any vantage point, it is clear that charlatans abound, garnering TV coverage and hundreds of thousands of Twitter followers on the strength of loud opinions and flimsy credentials. What is more interesting as an insider is the gradient of expertise beyond these obvious fakers.
A person's expertise is not a fixed attribute; it is a hierarchical trait defined relative to others. Despite my protestations, I am the go-to expert on every aspect of the pandemic to my family. To a reporter, I might do my best to answer a question about the immune response to SARS-CoV-2, noting that I'm not an immunologist. Among other academic scientists, my expertise is more well-defined as a subfield of epidemiology, and within that as a particular area within infectious disease epidemiology. There's a fractal quality to it; as you zoom in on a particular subject, a differentiation of expertise emerges among scientists who, from farther out, appear to be interchangeable.
We all have our scientific domain and are less knowledgeable outside it, of course, and we are often asked to comment on a broad range of topics. But many scientists without a track record in the field have become favorites among university administrators, senior faculty in unrelated fields, policymakers, and science journalists, using institutional prestige or social connections to promote themselves. This phenomenon leads to a distorted representation of science—and of academic scientists—in the public realm.
Predictably, white male voices have been disproportionately amplified, and men are certainly over-represented in the category of those who use their connections to inappropriately claim expertise. Generally speaking, we are missing women, racial minorities, and global perspectives. This is not only important because it misrepresents who scientists are and reinforces outdated stereotypes that place white men in the Global North at the top of a credibility hierarchy. It also matters because it can promote bad science, and it passes over scientists who can lend nuance to the scientific discourse and give global perspectives on this quintessentially global crisis.
Also at work, in my opinion, are two biases within academia: the conflation of institutional prestige with individual expertise, and the bizarre hierarchy among scientists that attributes greater credibility to those in quantitative fields like physics. Regardless of mathematical expertise or institutional affiliation, lack of experience working with epidemiological data can lead to over-confidence in the deceptively simple mathematical models that we use to understand epidemics, as well as the inappropriate use of uncertain data to inform them. Prominent and vocal scientists from different quantitative fields have misapplied the methods of infectious disease epidemiology during the COVID-19 pandemic so far, creating enormous confusion among policymakers and the public. Early forecasts that predicted the epidemic would be over by now, for example, led to a sense that epidemiological models were all unreliable.
Meanwhile, legitimate scientific uncertainties and differences of opinion, as well as fundamentally different epidemic dynamics arising in diverse global contexts and in different demographic groups, appear in the press as an indistinguishable part of this general chaos. This leads many people to question whether the field has anything worthwhile to contribute, and muddies the facts about COVID-19 policies for reducing transmission that most experts agree on, like wearing masks and avoiding large indoor gatherings.
So how do we distinguish an expert from a charlatan? I believe a willingness to say "I don't know" and to openly describe uncertainties, nuances, and limitations of science are all good signs. Thoughtful engagement with questions and new ideas is also an indication of expertise, as opposed to arrogant bluster or a bullish insistence on a particular policy strategy regardless of context (which is almost always an attempt to hide a lack of depth of understanding). Trustworthy experts will direct you to others in their field who know more about particular topics, and will tend to be honest about what is and what isn't "in their lane." Some expertise is quite specific to a given subfield: epidemiologists who study non-infectious conditions or nutrition, for example, use different methods from those of infectious disease experts, because they generally don't need to account for the exponential growth that is inherent to a contagion process.
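That exponential growth, and why the underlying models are deceptively simple, can be seen in a minimal sketch of the classic SIR (Susceptible-Infectious-Recovered) model. This is an illustration only, not a model from the article: the parameter values here are arbitrary and not fitted to any real epidemic.

```python
# Minimal SIR model, stepped one day at a time (Euler method).
# Illustrative only: beta (transmission rate) and gamma (recovery rate)
# are arbitrary example values, not estimates for any real disease.
def sir(beta=0.3, gamma=0.1, n=1_000_000, i0=10, days=60):
    s, i = n - i0, i0          # susceptible and infectious counts
    r = 0                      # recovered count
    history = []
    for _ in range(days):
        new_inf = beta * s * i / n   # new infections this day
        new_rec = gamma * i          # new recoveries this day
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        history.append(i)
    return history

curve = sir()
# While s is still close to n, infections grow by roughly a factor
# of 1 + beta - gamma (here ~1.2) per day, i.e. exponentially.
```

The simplicity is the trap: three lines of arithmetic produce the curve, but the outputs are exquisitely sensitive to parameters that must be estimated from noisy, incomplete data, which is where experience with epidemiological data matters.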
Academic scientists have a specific, technical contribution to make in containing the COVID-19 pandemic and in communicating research findings as they emerge. But the liminal space between scientists and the public is subject to the same undercurrents of sexism, racism, and opportunism that society and the academy have always suffered from. Although none of the proxies for expertise described above are fool-proof, they are at least indicative of integrity and humility—two traits the world is in dire need of at this moment in history.
By now you have probably heard something about CRISPR, the simple and relatively inexpensive method of precisely editing the genomes of plants, animals, and humans.
Through CRISPR and other methods of gene editing, scientists have engineered crops to be more nutritious, better able to resist pests, and more tolerant of drought; modified animals ranging from fruit flies to monkeys to make them better suited for scientific study; and experimentally treated HIV, hepatitis B, and leukemia in human patients.
There are also currently FDA-approved trials to treat blindness, cancer, and sickle cell disease in humans using gene editing, and there is consensus that CRISPR's therapeutic applications will grow significantly in the coming years.
While the treatment of human disease through gene editing is not without its medical and ethical concerns, editing embryos to avoid disease is far more fraught. Nonetheless, Nature reported in November that He Jiankui, a scientist in China, had edited twin embryos to disable a gene called CCR5, in hopes of preventing transmission of HIV from their HIV-positive father.
Though there are questions about the effectiveness and necessity of this intervention, He reported that sequencing showed his embryonic gene edits were successful and that the twins were "born normal and healthy," although his claims have not been independently verified.
More recently, Denis Rebrikov, a Russian scientist, announced his plans to disable the same gene in embryos to be implanted in HIV-positive women later this year. Futuristic as it may seem, prenatal gene editing is already here.
The treatment of disease in fetuses, the liminal category of life between embryos and humans, poses the next frontier. Numerous conditions—some minor, some resulting in a lifetime of medical treatment, some incompatible with life outside of the womb—can be diagnosed through use of prenatal diagnostic testing. There is promising research suggesting doctors will soon be able to treat or mitigate at least some of them through use of fetal gene editing.
This research could soon present women carrying genetically anomalous fetuses with a third option beyond termination or birthing a child who will likely face a challenging and uncertain medical future: fetal genetic intervention.
However, genetic intervention will open the door to a host of ethical considerations, particularly with respect to the relationship between pregnant women and prenatal genetic counselors. In theory, counselors provide objective information and answer questions rather than advising a pregnant client on whether to continue her pregnancy, despite the risks, or to have an abortion.
In practice, though, prenatal genetic counseling is most often directive, and the nature of the counseling pregnant women receive can depend on numerous factors, including their religious and cultural beliefs, their perceived ability to handle a complicated pregnancy and subsequent birth, and their financial status. Introducing the possibility of a fetal genetic intervention will exacerbate counselor reliance upon these considerations and in some cases lead to counseling that is even more directive.
Future counselors will have to figure out under what circumstances it is even appropriate to broach the subject. Should they only discuss therapies that are FDA-approved, or should they mention experimental treatments? What about interventions that are available in Europe or Asia, but banned in the United States? Or, even in the best-case scenario of an FDA-approved treatment, should a counselor make reference to it if she knows for a fact that her client cannot possibly afford it?
Beyond the basic question of what information to share, counselors will have to confront the fact that the very notion of fixing or "editing" offspring will be repugnant to many women, and inherent in the suggestion is the stigmatization of individuals with disabilities. Prenatal genetic counselors will be on the forefront of debates surrounding which fetuses should remain as they are and which ones should be altered.
Despite these concerns, some women in the near future will face the choice of whether to abort, keep, or treat a genetically anomalous fetus in utero. Take, for example, a woman who learns during prenatal testing that her fetus has Angelman syndrome, a genetic disorder characterized by intellectual disability, speech impairment, loss of muscle control, epilepsy, and a small head. There is currently no human treatment for Angelman syndrome, which is caused by a loss of function in a single gene, UBE3A.
But scientists at the University of North Carolina have been able to treat Angelman syndrome in fetal mice by reactivating UBE3A through use of a single injection. The therapy has also proven effective in cultured human brain cells. This suggests that a woman might soon have to consider injecting her fetus's brain with a CRISPR concoction custom-designed to target UBE3A, rather than terminate her pregnancy or bring her fetus to term unaltered.
Assuming she receives adequate information to make an informed choice, she too will face an ethical conundrum. There will be the inherent risks of injecting anything into a developing fetus's brain, including the possibility of infection, brain damage, and miscarriage. But there are also risks specific to gene editing, such as so-called off-target effects, the possibility of altering genes other than the intended one. Such effects are highly unpredictable and can be difficult to detect. Nor is it possible to predict how altering UBE3A might lead to other genetic and epigenetic changes after the baby is born.
A woman deciding how to act in this scenario must balance these risks against the potential benefits of the therapy, layered on top of her belief system, resources, and personal ethics. The calculus will be different for every woman, and even the same woman might change her mind from one pregnancy to the next based on the severity of the condition diagnosed and other available medical options.
Her genetic counselor, meanwhile, must be sensitive to all of these concerns in helping her make her decision, keeping up to date on the possible new treatments, and carefully choosing which information to disclose in striving to be neutral. There are no easy answers to the many questions that will arise in this space, but better to start thinking about them now, before it is too late.
Agriculture in the 21st century is not as simple as it once was. With a population seven billion strong, a climate in crisis, and sustainability in farming practices on everyone's radar, figuring out how to feed the masses without destroying the Earth is a pressing concern.
In addition to low-emission cows and drone pollinators, there's a promising new solution on the table. How does "lab-grown insect meat" grab you?
Writing in Frontiers in Sustainable Food Systems, researchers at Tufts University say insects that are fed plants and genetically modified for maximum growth, nutrition, and flavor could be the best, greenest alternative to our current livestock farming practices. This lab-grown protein source could produce high volume, nutritious food without the massive resources required for traditional animal agriculture.
"Due to the environmental, public health, and animal welfare concerns associated with our current livestock system, it is vital to develop more sustainable food production methods," says lead author Natalie Rubio. Could insect meat be the key?
Next Up
New sustainable food production includes what's called "cellular agriculture," an emerging industry and field of study in which meat and dairy are produced via cells in a lab instead of whole animals. So far, scientists have primarily focused on bovine, porcine, and avian cells to create this "cultured meat."
But the Tufts scientists argue that insect cells may be better suited to lab-created meat protein than traditional farm animal cells.
"Compared to cultured mammalian, avian, and other vertebrate cells, insect cell cultures require fewer resources and less energy-intensive environmental control, as they have lower glucose requirements and can thrive in a wider range of temperature, pH, oxygen, and osmolarity conditions," reports Rubio.
"Alterations necessary for large-scale production are also simpler to achieve with insect cells, which are currently used for biomanufacturing of insecticides, drugs, and vaccines," she adds.
They still have some details to hash out, however, including how to make cultured insect meat more like the steak and chicken we're all familiar with.
"Despite this immense potential, cultured insect meat isn't ready for consumption," says Rubio. "Research is ongoing to master two key processes: controlling development of insect cells into muscle and fat, and combining these in 3D cultures with a meat-like texture." They are currently experimenting with mushroom-derived fiber to tackle the latter.
Open Questions
As the report points out, one thing that makes cellular agriculture an attractive alternative to high-density animal farming is that it doesn't require consumers to change their behaviors. People would still be able to eat meat—it would just come from a different source.
But the big question remains: How will lab-grown insect meat taste? Will the buggers really taste as good as burgers?
And, of course, there's the "ew" factor. Meat alternatives have proven to work for some people—Tofurky is still in business, after all—but it may be a hard sell to get the masses to jump on board with eating bugs. Consuming creepy crawlies sounds simply unpalatable to many, and the term "lab-grown, cellular insect meat" doesn't help much. Perhaps an entirely new nomenclature is in order.
Another question is whether people will trust such scientifically created food. The term "frankenfood" is already used to disparage genetic modification, even though the vast majority of the corn and soybeans planted in the U.S. today are genetically engineered, and other major crops with GM varieties include potatoes, apples, squash, and papayas. Still, combining GM technology with eating insects may be a hard sell.
However, we're all going to have to get used to trying new things if we want to leave a habitable home for our children. If a lab-grown bug burger can save the planet, maybe it's worth a shot.