Your Questions Answered About Kids, Teens, and Covid Vaccines
Kira Peikoff was the editor-in-chief of Leaps.org from 2017 to 2021. As a journalist, her work has appeared in The New York Times, Newsweek, Nautilus, Popular Mechanics, The New York Academy of Sciences, and other outlets. She is also the author of four suspense novels that explore controversial issues arising from scientific innovation: Living Proof, No Time to Die, Die Again Tomorrow, and Mother Knows Best. Peikoff holds a B.A. in Journalism from New York University and an M.S. in Bioethics from Columbia University. She lives in New Jersey with her husband and two young sons. Follow her on Twitter @KiraPeikoff.
This virtual event convened leading scientific and medical experts to address the public's questions and concerns about Covid-19 vaccines in kids and teens.
DATE:
Thursday, May 13th, 2021
12:30 p.m. - 1:45 p.m. EDT
Dr. H. Dele Davies, M.D., MHCM
Senior Vice Chancellor for Academic Affairs and Dean for Graduate Studies at the University of Nebraska Medical Center (UNMC). He is an internationally recognized expert in pediatric infectious diseases and a leader in community health.
Dr. Emily Oster, Ph.D.
Professor of Economics at Brown University. She is a best-selling author and parenting guru who has pioneered a method of assessing school safety.
Dr. Tina Q. Tan, M.D.
Professor of Pediatrics at the Feinberg School of Medicine, Northwestern University. She has been involved in several vaccine survey studies that examine the awareness, acceptance, barriers and utilization of recommended preventative vaccines.
Dr. Inci Yildirim, M.D., Ph.D., M.Sc.
Associate Professor of Pediatrics (Infectious Disease); Medical Director, Transplant Infectious Diseases at Yale School of Medicine; Associate Professor of Global Health, Yale Institute for Global Health. She is an investigator for the multi-institutional COVID-19 Prevention Network's (CoVPN) Moderna mRNA-1273 clinical trial for children 6 months to 12 years of age.
About the Event Series
This event is the second of a four-part series co-hosted by Leaps.org, the Aspen Institute Science & Society Program, and the Sabin–Aspen Vaccine Science & Policy Group, with generous support from the Gordon and Betty Moore Foundation and the Howard Hughes Medical Institute.
Society Needs Regulations to Prevent Research Abuses
[Editor's Note: Our Big Moral Question this month is, "Do government regulations help or hurt the goal of responsible and timely scientific innovation?"]
Government regulations help more than hurt the goal of responsible and timely scientific innovation. Opponents might argue that without regulations, researchers would be free to do whatever they want. But without ethics and regulations, scientists have performed horrific experiments. In Nazi concentration camps, for instance, doctors forced prisoners to stay in the snow to see how long it took for them to freeze to death. These researchers also amputated prisoners' limbs in attempts to develop techniques for reattaching body parts, but all of the experiments failed.
Due to these atrocities, after the war, the Nuremberg Tribunal established the first ethical guidelines for research, mandating that all study participants provide informed consent. Yet many researchers, including those at leading U.S. academic institutions and government agencies, failed to follow these dictates. The U.S. government, for instance, secretly infected Guatemalan men with syphilis in order to study the disease, and it experimented on soldiers, exposing them without consent to biological and chemical warfare agents. In the 1960s, researchers at New York's Willowbrook State School purposely fed intellectually disabled children stool extracts infected with hepatitis in order to study the disease. In 1966, in the New England Journal of Medicine, Henry Beecher, a Harvard anesthesiologist, described 22 cases of unethical research published in the nation's leading medical journals; most of these studies were conducted without informed consent, and some harmed participants without offering them any benefit.
Despite heightened awareness and enhanced guidelines, abuses continued. Until a 1972 journalistic exposé, the U.S. government continued to fund the now-notorious Tuskegee syphilis study of infected poor African-American men in rural Alabama, refusing to offer these men penicillin even after it became available as an effective treatment for the disease.
In response, Congress passed the National Research Act of 1974, establishing research ethics committees, or Institutional Review Boards (IRBs), to guide scientists, allowing them to innovate while protecting study participants' rights. IRBs now routinely detect unethical studies and prevent them from starting.
Still, even with these regulations, researchers have at times conducted unethical investigations. In 1999 at the Los Angeles Veterans Affairs Hospital, for example, a patient twice refused to participate in a study that would prolong his surgery. The researcher experimented on him anyway, using an electrical probe in his heart to collect data.
Part of the problem and consequent need for regulations is that researchers have conflicts of interest and often do not recognize ethical challenges their research may pose.
Pharmaceutical company scandals involving Avandia, Neurontin, and other drugs raise added concerns. In marketing Vioxx, OxyContin, and tobacco, corporations have hidden findings that might undercut sales.
Regulations become increasingly critical as drug companies and the NIH conduct growing amounts of research in the developing world. In 1996, Pfizer conducted a study of bacterial meningitis in Nigeria in which 11 children died. The families sued. Pfizer produced a Nigerian IRB approval letter, but the letter turned out to have been forged; no Nigerian IRB had ever approved the study. Fourteen years later, WikiLeaks revealed that Pfizer had hired detectives to find evidence of corruption against the Nigerian Attorney General in order to pressure him to drop the lawsuit.
Researchers in not only industry but also academia have violated research participants' rights. Arizona State University scientists wanted to investigate the genes of a Native American group, the Havasupai, who were concerned about their high rates of diabetes. The investigators also wanted to study the group's rates of schizophrenia, but feared that the tribe would oppose such a study, given the stigma. Hence, these researchers decided to mislead the tribe, stating that the study was only about diabetes. The university's research ethics committee knew of the scientists' plan to study schizophrenia, but approved the study, including the consent form, which did not mention any psychiatric diagnoses. The Havasupai gave blood samples, but later learned that the researchers had published articles about the tribe's rates of schizophrenia and alcoholism and about its genetic origins in Asia (whereas the Havasupai believed they originated in the Grand Canyon, where they now lived and which they thus argued they owned). A 2010 legal settlement required the university to return the blood samples to the tribe, which then destroyed them. Had the researchers instead worked with the tribe more respectfully, they could have advanced science in many ways.
Such violations threaten to lower public trust in science, particularly among vulnerable groups that have historically been systemically mistreated, diminishing public and government support for research and for the National Institutes of Health, the National Science Foundation, and the Centers for Disease Control and Prevention, all of which conduct large numbers of studies.
In popular culture, myths of immoral science and technology loom large, from Frankenstein to Big Brother and Dr. Strangelove.
Admittedly, regulations involve inherent tradeoffs. Following certain rules can take time and effort, and certain regulations may in fact limit research that could potentially advance knowledge but would be grossly unethical. For instance, if our society's sole goal were to have scientists innovate as much as possible, we might let them pay healthy people to have needles stuck into their brains to remove cells, an offer that many vulnerable, poor people might find hard to refuse. But such studies would clearly pose major ethical problems.
Research that has failed to follow ethics has in fact impeded innovation. In 1999, the death of a young man, Jesse Gelsinger, in a gene therapy experiment whose lead investigator was later found to have major conflicts of interest set back the field of gene therapy for years.
Without regulations, companies might market products that prove dangerous, leading to massive lawsuits that could also ultimately stifle further innovation within an industry.
The key question is not whether regulations help or hurt science alone, but whether they help or hurt science that is both "responsible and innovative."
We don't want "over-regulation." Rather, the right amount of regulation is needed – neither too much nor too little. Hence, policy makers in this area have worked to develop regulations in fair and transparent ways and to reduce the burden on researchers – for instance, by allowing a single IRB to review a multi-site study, rather than requiring separate review by an IRB at every site, which can create obstacles.
In sum, society requires a proper balance of regulations to ensure ethical research, avoid abuses, and ultimately aid us all by promoting responsible innovation.
[Ed. Note: Check out the opposite viewpoint here, and follow LeapsMag on social media to share your perspective.]
Regulation Too Often Shackles the Hands of Innovators
[Editor's Note: Our Big Moral Question this month is, "Do government regulations help or hurt the goal of responsible and timely scientific innovation?"]
After biomedical scientists demonstrated that they could make dangerous viruses like influenza even more dangerous, the National Institutes of Health (NIH) implemented a three-year moratorium on funding such research. But in December 2017, the moratorium was lifted, and a tight set of rules was put in its place, including a mandate for oversight panels.
"The sort of person who thinks like a bureaucratic regulator isn't the sort of person who thinks like a scientist."
The prospect of engineering a deadly pandemic virus in a laboratory suggests that only a fool would wish away government regulation entirely.
However, as a whole, regulation has done more harm than good in the arena of scientific innovation. The reason is that the sort of person who thinks like a bureaucratic regulator isn't the sort of person who thinks like a scientist. The sad fact of the matter is that those most interested in the regulatory process tend to be motivated by politics and ideology rather than scientific inquiry and technological progress.
Consider genetically engineered crops and animals, for instance. Beyond any reasonable doubt, data have consistently shown them to be safe, yet they are routinely held in regulatory limbo. It took 20 years, for example, for the AquAdvantage salmon, which grows faster than ordinary salmon, to gain approval from the FDA. What investor in his right mind would fund an entrepreneurial scientist who wishes to create genetically engineered consumer goods, knowing that any such product could be subjected to two decades of arbitrary and pointless bureaucratic scrutiny?
Other well-intentioned regulations have created enormous problems for society. Medicine costs too much. One reason is that there is no international competition in the U.S. marketplace because it is nearly impossible to import drugs from other countries. The FDA's overcautious attitude toward approving new medications has ushered in a grassroots "right-to-try" movement, in which terminal patients are demanding access to potentially life-saving (but also potentially dangerous) treatments that are not yet federally approved. The FDA's sluggishness in approving generics also allowed the notorious former hedge fund manager Martin Shkreli to jack up the price of a drug for HIV patients because there were no competitors on the market. Thankfully, the FDA and politicians are now aware of these self-inflicted problems and are proposing possible solutions.
"Other well-intentioned regulations have created enormous problems for society."
The regulatory process itself drags on far too long and consists of procedural farces, none more so than public hearings and the solicitation of public comments. Hearings are often dominated by activists who are more concerned with theatrics and making the front page of a newspaper than with contributing meaningfully to the scientific debate.
It is frankly absurd to believe that scientifically untrained laypeople have anything substantive to say on matters like biomedical regulation. The generals at the Pentagon quite rightly do not seek the public's counsel before they draw up battlefield plans, so why should scientists be subjected to an unjustifiable level of public scrutiny? Besides, there is a good chance that a substantial proportion of the feedback is fake, anyway: a Wall Street Journal investigation uncovered that thousands of posts on federal websites seeking public comment on topics like net neutrality are fraudulent.
In other cases, out-of-date regulations remain on the books, holding back progress. For more than 20 years, the Dickey-Wicker Amendment has tied the hands of the NIH, essentially preventing it from funding any research that requires creating human embryos or deriving new embryonic stem cell lines. This seriously impedes progress in regenerative medicine and dampens the revolutionary potential of CRISPR, a genome editing tool that could someday be used in adult gene therapy or to "fix" unhealthy human embryos.
"Regulators and especially politicians give the false impression that any new scientific innovation should be made perfectly safe before it is allowed on the market."
Biomedicine isn't the only science to suffer at the hands of regulators. For years, the Nuclear Regulatory Commission (NRC) – an organization ostensibly concerned with nuclear safety – has instead played politics with nuclear power, particularly over a proposed waste storage facility at Yucca Mountain. Going all the way back to the Reagan administration, Yucca has been subjected to partisan assaults, culminating in the Obama administration's mothballing of the project. Under the Trump administration, the NRC is once again reconsidering its future.
Perhaps the biggest problem that results from overregulation is a change in the culture. Regulators and especially politicians give the false impression that any new scientific innovation should be made perfectly safe before it is allowed on the market. This notion is known as the precautionary principle, and it is the law in the European Union. The precautionary principle is a form of technological timidity that is partially to blame for Europe's lagging behind America in groundbreaking research.
Besides, perfect safety is an impossible goal. Nothing in life is perfectly safe. The same people who drive to Whole Foods to avoid GMOs and synthetic pesticides seem not to care that automobiles kill 30,000 Americans every single year.
Government regulation is necessary because people rightfully expect a safe place to work and live. However, charlatans and lawbreakers will always exist, no matter how many new rules are added. The proliferation of safety regulations, therefore, often results in increasing the burden on innovators without any concomitant increase in safety. Like an invasive weed, government regulation has spread far beyond its proper place in the ecosystem. It's time for a weedkiller.
[Ed. Note: Check out the opposite viewpoint here, and follow LeapsMag on social media to share your perspective.]