Should We Use Technologies to Enhance Morality?

Should we welcome biomedical technologies that could enhance our ability to tell right from wrong and curb behaviors widely considered immoral, such as dishonesty, prejudice and antisocial aggression?

Photo by Asa Rodger on Unsplash

Our moral ‘hardware’ evolved more than 100,000 years ago, while humans were still scratching out an existence on the savannah. The perils we encountered back then were radically different from those that confront us now. To survive and flourish in the face of complex future challenges, our archaic operating systems might need an upgrade – in non-traditional ways.

Morality refers to standards of right and wrong when it comes to our beliefs, behaviors, and intentions. Broadly, moral enhancement is the use of biomedical technology to improve moral functioning. This could include augmenting empathy, altruism, or moral reasoning, or curbing antisocial traits like outgroup bias and aggression.

The claims made for moral enhancement are grand and polarizing: it has been tendered as a solution to humanity’s existential crises and bluntly dismissed as an armchair hypothesis. So, does the concept have any purchase? The answer depends heavily on how we define it and what we expect from it.

Cohen Marcus Lionel Brown
Cohen Marcus Lionel Brown teaches and researches ethics and applied philosophy at UOW in Greater Sydney, Australia. Specifically, he works on questions in neuroethics, moral psychology, aggression studies, and human enhancement. He is a current member of the Australasian Association of Philosophy Postgraduate Committee, Sydney Health Ethics Network, and the International Society for Research on Aggression. Cohen also works as a judge of the International Ethics Olympiad, and volunteers with the not-for-profit organization Primary Ethics. Find him on Twitter @CohenMarcusLio1
Staying well in the 21st century is like playing a game of chess


Adobe Stock

This article originally appeared in One Health/One Planet, a single-issue magazine that explores how climate change and other environmental shifts are increasing vulnerabilities to infectious diseases by land and by sea. The magazine probes how scientists are making progress with leaders in other fields toward solutions that embrace diverse perspectives and the interconnectedness of all lifeforms and the planet.

On July 30, 1999, the Centers for Disease Control and Prevention published a report comparing data on the control of infectious disease from the beginning of the 20th century to its end. The data showed that deaths from infectious diseases had declined markedly. In the early 1900s, pneumonia, tuberculosis and diarrheal diseases were the three leading killers, accounting for one-third of total deaths in the U.S., and 40 percent of those deaths were among children under five.

Mass vaccinations, the discovery of antibiotics, and broad sanitation and hygiene measures eventually eradicated smallpox, beat down polio, curbed cholera, nearly rid the world of tuberculosis and extended U.S. life expectancy by 25 years. By 1997, the leading causes of death in the U.S. had shifted to cancer, diabetes and heart disease.

The control of infectious diseases is considered to be one of the “10 Great Public Health Achievements.” Yet on the brink of the 21st century, new trouble was already brewing. Hospitals were seeing periodic cases of antibiotic-resistant infections. Novel viruses, or those that previously didn’t afflict humans, began to emerge, causing outbreaks of West Nile, SARS, MERS or swine flu.

In the years that followed, tuberculosis made a comeback, at least in certain parts of the world. What we didn’t take into account was the very concept of evolution: as we built better protections, our enemies eventually boosted their attacking prowess, so soon enough we found ourselves on the defensive once again.

Lina Zeldovich

Lina Zeldovich has written about science, medicine and technology for Popular Science, Smithsonian, National Geographic, Scientific American, Reader’s Digest, the New York Times and other major national and international publications. A Columbia J-School alumna, she has won several awards for her stories, including the ASJA Crisis Coverage Award for Covid reporting, and has been a contributing editor at Nautilus Magazine. In 2021, Zeldovich released her first book, The Other Dark Matter, published by the University of Chicago Press, about the science and business of turning waste into wealth and health. You can find her on http://linazeldovich.com/ and @linazeldovich.

Alzheimer’s prevention may be less about new drugs, more about income, zip code and education

(Left to right) Vickie Naylor, Bernadine Clay, and Donna Maxey read a memory prompt as they take part in the Sharing History through Active Reminiscence and Photo-Imagery (SHARP) study, September 20, 2017.

OHSU/Kristyna Wentz-Graff

That your risk of Alzheimer’s disease depends on your salary, what you ate as a child, or the block where you live may seem implausible. But researchers are discovering that social determinants of health (SDOH) play an outsized role in Alzheimer’s disease and related dementias, possibly more than age, and new strategies are emerging for how to address these factors.

At the 2022 Alzheimer’s Association International Conference, a series of presentations offered evidence that a string of socioeconomic factors—such as employment status, social support networks, education and home ownership—significantly affected dementia risk, even when the data were adjusted for genetic risk. What’s more, memory declined more rapidly in people who earned lower wages and more slowly in people whose parents had higher socioeconomic status.

Eve Glicksman
Eve Glicksman is a freelance writer and editor in Silver Spring, MD. She writes for multiple media outlets and associations on health care, trends, culture, psychology, lifestyle, and travel. To see her work in the Washington Post, WebMD, and U.S. News & World Report, visit eveglicksman.com.