Blood Money: Paying for Convalescent Plasma to Treat COVID-19

A bag of plasma that Tom Hanks donated back in April 2020 after his coronavirus infection. (He was not paid to donate.)

Credit: Tom Hanks' Instagram

Convalescent plasma – first used to treat diphtheria in 1890 – has been dusted off the shelf to treat COVID-19. Does it work? Should we rely strictly on the altruism of donors or should people be paid for it?

The biologic theory is that a person who has recovered from a disease carries substances in their blood, most likely antibodies, that contributed to their recovery, and that transfusing those substances into someone who is sick might help that person recover as well. Whole blood won't work because a single unit contains too few antibodies and the body can hold only so much of it.

Plasma comprises about 55 percent of whole blood and is what's left once you take out the red blood cells that carry oxygen and the white blood cells of the immune system. Most of it is water, but the rest is a complex mix of fats, salts, signaling molecules and proteins produced by the immune system, including antibodies.

A process called apheresis circulates the donor's blood through a machine that separates out the desired components and returns the rest to the donor. Cycling enough blood through the machine takes several times as long as a regular whole-blood donation. The end product is a yellowish concentrate called convalescent plasma.
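
A rough back-of-the-envelope calculation shows why the procedure takes so long. Assuming a standard whole-blood donation of about 500 mL and a typical apheresis plasma collection of about 700 mL (ballpark figures assumed for illustration, not drawn from this article), the machine has to process the plasma content of roughly two to three whole-blood units:

\[
0.55 \times 500\ \text{mL} \approx 275\ \text{mL of plasma per unit of whole blood},
\qquad
\frac{700\ \text{mL collected}}{275\ \text{mL per unit}} \approx 2.5\ \text{units}
\]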

Bob Roehr
Bob Roehr is a biomedical journalist based in Washington, DC. Over the last twenty-five years he has written extensively for The BMJ, Scientific American, PNAS, Proto, and myriad other publications. He is primarily interested in HIV, infectious disease, immunology, and how growing knowledge of the microbiome is changing our understanding of health and disease. He is working on a book about the ways the body can at least partially control HIV and how that has influenced (or not) the search for a treatment and cure.

Staying well in the 21st century is like playing a game of chess

The control of infectious diseases was considered to be one of the “10 Great Public Health Achievements.” What we didn’t take into account was the very concept of evolution: as we built better protections, our enemies eventually boosted their attacking prowess, so soon enough we found ourselves on the defensive once again.

Credit: Adobe Stock

This article originally appeared in One Health/One Planet, a single-issue magazine that explores how climate change and other environmental shifts are increasing vulnerabilities to infectious diseases by land and by sea. The magazine probes how scientists and leaders in other fields are making progress toward solutions that embrace diverse perspectives and the interconnectedness of all lifeforms and the planet.

On July 30, 1999, the Centers for Disease Control and Prevention published a report comparing data on the control of infectious disease at the beginning and the end of the 20th century. The data showed that deaths from infectious diseases had declined markedly. In the early 1900s, pneumonia, tuberculosis and diarrheal diseases were the three leading killers, accounting for one-third of all deaths in the U.S., with 40 percent of those deaths occurring in children under five.

Mass vaccination, the discovery of antibiotics, and widespread sanitation and hygiene measures eventually eradicated smallpox, beat down polio, made cholera curable, nearly rid the world of tuberculosis and extended U.S. life expectancy by 25 years. By 1997, population health in the U.S. had shifted: cancer, diabetes and heart disease were now the leading causes of death.

The control of infectious diseases is considered to be one of the “10 Great Public Health Achievements.” Yet on the brink of the 21st century, new trouble was already brewing. Hospitals were seeing periodic cases of antibiotic-resistant infections. Novel viruses, ones that previously didn’t afflict humans, began to emerge, causing outbreaks of West Nile, SARS, MERS and swine flu.

In the years that followed, tuberculosis made a comeback, at least in certain parts of the world. What we didn’t take into account was the very concept of evolution: as we built better protections, our enemies eventually boosted their attacking prowess, so soon enough we found ourselves on the defensive once again.

Lina Zeldovich

Lina Zeldovich has written about science, medicine and technology for Popular Science, Smithsonian, National Geographic, Scientific American, Reader’s Digest, the New York Times and other major national and international publications. A Columbia J-School alumna, she has won several awards for her stories, including the ASJA Crisis Coverage Award for COVID reporting, and has been a contributing editor at Nautilus Magazine. In 2021, Zeldovich released her first book, The Other Dark Matter, published by the University of Chicago Press, about the science and business of turning waste into wealth and health. You can find her at http://linazeldovich.com/ and @linazeldovich.

Alzheimer’s prevention may be less about new drugs, more about income, zip code and education

(Left to right) Vickie Naylor, Bernadine Clay, and Donna Maxey read a memory prompt as they take part in the Sharing History through Active Reminiscence and Photo-Imagery (SHARP) study, September 20, 2017.

Credit: OHSU/Kristyna Wentz-Graff

That your risk of Alzheimer’s disease depends on your salary, what you ate as a child, or the block where you live may seem implausible. But researchers are discovering that social determinants of health (SDOH) play an outsized role in Alzheimer’s disease and related dementias, possibly a larger one than age, and new strategies are emerging to address these factors.

At the 2022 Alzheimer’s Association International Conference, a series of presentations offered evidence that a string of socioeconomic factors—such as employment status, social support networks, education and home ownership—significantly affected dementia risk, even after adjusting for genetic risk. What’s more, memory declined more rapidly in people who earned lower wages and more slowly in people whose parents had higher socioeconomic status.

Eve Glicksman
Eve Glicksman is a freelance writer and editor in Silver Spring, MD. She writes for multiple media outlets and associations on health care, trends, culture, psychology, lifestyle, and travel. To see her work in the Washington Post, WebMD, and U.S. News & World Report, visit eveglicksman.com.