“Coming Back from the Dead” Is No Longer Science Fiction
Last year, there were widespread reports of a 53-year-old Frenchman who suffered a cardiac arrest and "died," but was then resuscitated 18 hours after his heart had stopped.
This was thought to have been possible in part because his body had cooled progressively after his heart stopped, through exposure to the cold outside. The medical team who revived him were reported as being "stupefied" that they had been able to bring him back to life, particularly since he had not even suffered brain damage.
Interestingly, this man represents one of a growing number of extraordinary cases in which people who would otherwise be declared dead have now been revived. It is a testament to the incredible impact of resuscitation science -- a science that is providing opportunities to literally reverse death, and in doing so, shedding light on the age-old question of what happens when we die.
Death: Past and Present
Throughout history, the boundary between life and death was marked by the moment a person's heart stopped, breathing ceased, and brain function shut down. A person became motionless, lifeless, and was deemed irreversibly dead. This is because once the heart stops beating, blood flow stops and oxygen is cut off from all the body's organs, including the brain. Consequently, within seconds, breathing stops and brain activity comes to a halt. Since the cessation of the heart literally occurs in a "moment," the philosophical notion of a specific point in time of "irreversible" death still pervades society today. The law, for example, relies on "time of death," which corresponds to when the heart stops beating.
The advent of cardiopulmonary resuscitation (CPR) in the 1960s was revolutionary, demonstrating that the heart could potentially be restarted after it had stopped, and what had been a clear black-and-white line was shown to be potentially reversible in some people. What was once called death, the ultimate end point, was now widely called cardiac arrest, and became a starting point.
From then on, it was only if somebody had requested not to be resuscitated or when CPR was deemed to have failed that people would be declared dead by "cardiopulmonary criteria." Biologically, cardiac arrest and death by cardiopulmonary criteria are the same process, albeit marked at different points in time depending on when a declaration of death is made.
Clearly, contrary to many people's perceptions, cardiac arrest is not a heart attack; it is the final step in death irrespective of cause, whether it be a stroke, a heart attack, a car accident, an overwhelming infection or cancer. This is how roughly 95 percent of the population are declared dead.
The only exception is the small proportion of people who may have suffered catastrophic brain injuries, but whose hearts can be artificially kept beating for a period of time on life-support machines. These people can be legally declared dead based on brain death criteria before their hearts have stopped. This is because the brain can die either from oxygen starvation after cardiac arrest or from massive trauma and internal bleeding. Either way, the brain dies hours or possibly longer after these injuries have taken place and not just minutes.
A Profound Realization
What has become increasingly clear is that the apparent irreversibility of death as we know it may not necessarily reflect true irretrievable cellular damage inside the body. This is consistent with a mounting understanding: it is only after a person actually dies that the cells in the body start to undergo their own process of death. Intriguingly, this process is something that can now be manipulated through medical intervention. Being cold is one of the factors that slows down the rate of cellular decay. The 53-year-old Frenchman's case and the other recent cases of resuscitation after prolonged periods of time illustrate this new understanding.
Last week's earth-shattering announcement by neuroscientist Dr. Nenad Sestan and his team out of Yale, published in the prestigious scientific journal Nature, provides further evidence that a time gap exists between actual death and cellular death in cadavers. In this seminal study, these researchers were able to restore partial function in pig brains four hours after their heads were severed from their bodies. These results follow from the pioneering work in 2001 of geneticist Fred Gage and colleagues from the Salk Institute, also published in Nature, which demonstrated the possibility of growing human brain cells in the laboratory by taking brain biopsies from cadavers in the mortuary up to 21 hours post-mortem.
The once black-and-white line between life and death is now blurrier than ever. Some people may argue this means these humans and pigs weren't truly "dead." However, that is like saying the people who were guillotined during the French Revolution were also not dead. Clearly, that is not the case. They were all dead. The problem is not death; it's our reliance on an outdated philosophical, rather than biological, notion of death.
But the distinction between irreversibility from a medical perspective and biological irreversibility may not matter much from a pragmatic perspective today. If medical interventions do not exist at any given time or place, then of course death cannot be reversed.
However, it is crucial to distinguish between biological and medical irreversibility: when "irreversible" loss of function arises from inadequate treatment, a person could potentially be brought back in the future when an alternative therapy becomes available, or even today if he or she dies in a location where novel treatments can slow the rate of cell death. When truly irreversible loss of function arises from a biological perspective, however, no treatment will ever be able to reverse the process, whether today, tomorrow, or in a hundred years.
Probing the "Grey Zone"
Today, thanks to modern resuscitation science, death can no longer be considered an absolute moment but rather a process that can be reversed even many hours after it has taken place. How many hours? We don't really know.
One of the wider implications of our medical advances is that we can now study what happens to the human mind and consciousness after people enter the "grey zone," which marks the time after the heart stops, but before irreversible and irretrievable cell damage occurs, and people are then brought back to life. Millions have been successfully revived and many have reported experiencing a unique, universal, and transformative mental state.
Were they "dead"? Yes, according to all the criteria we have ever used. But they were able to be brought back before their "dead" bodies had reached the point of permanent, irreversible cellular damage. This reflects the period of death for all of us. So rather than a "near-death experience," I prefer a new terminology to describe these cases -- "an actual-death experience." These survivors' unique experiences are providing eyewitness testimonies of what we will all be likely to experience when we die.
Such an experience reportedly includes seeing a warm light, the presence of a compassionate perfect individual, deceased relatives, a review of their lives, a judgment of their actions and intentions as they pertain to their humanity, and in some cases a sensation of seeing doctors and nurses working to resuscitate them.
Are these experiences compatible with hallucinations or illusions? No -- in part because these people have described real, verifiable events, which, by definition, are not hallucinations, and in part because their experiences are not compatible with the confused and delirious memories that characterize oxygen deprivation.
For instance, it is hard to classify a structured meaningful review of one's life and one's humanity as hallucinatory or illusory. Instead, these experiences represent a new understanding of the overall human experience of death. As an intensive care unit physician for more than 10 years, I have seen numerous cases where these reports have been corroborated by my colleagues. In short, these survivors have been known to come back with reports of full consciousness, with lucid, well-structured thought processes and memory formation.
The challenge for us scientifically is understanding how this is possible at a time when all our science tells us the brain shuts down. The fact that these experiences occur is a paradox and suggests that the undiscovered entity we call the "self," "consciousness," or "psyche" -- the thing that makes us who we are -- may not be annihilated at the point of so-called death.
At New York University, the State University of New York, and across 20 hospitals in the U.S. and Europe, we have brought together a new multi-disciplinary team of experts across many specialties, including neurology, cardiology, and intensive care. Together, we hope to improve cardiac arrest prevention and treatment, as well as to address the impact of new scientific discoveries on our understanding of what happens at death.
One of our first studies, Awareness during Resuscitation (AWARE), published in the medical journal Resuscitation in 2014, confirmed that some cardiac arrest patients report a perception of awareness without recall; others report detailed memories and experiences; and a few report full auditory and visual awareness and consciousness of their experience, from a time when brain function would be expected to have ceased.
While you probably have some opinion or belief about this based upon your own philosophical, religious, or cultural background, you may not realize that exploring what happens when we die is now a subject that science is beginning to investigate.
There is no question more intriguing to humankind. And for the first time in our history, we may finally uncover some real answers.
By now you have probably heard something about CRISPR, the simple and relatively inexpensive method of precisely editing the genomes of plants, animals, and humans.
Through CRISPR and other methods of gene editing, scientists have produced crops that are more nutritious, better able to resist pests, and tolerant of droughts; engineered animals ranging from fruit flies to monkeys to make them better suited for scientific study; and experimentally treated HIV, hepatitis B, and leukemia in human patients.
There are also currently FDA-approved trials to treat blindness, cancer, and sickle cell disease in humans using gene editing, and there is consensus that CRISPR's therapeutic applications will grow significantly in the coming years.
While the treatment of human disease through use of gene editing is not without its medical and ethical concerns, the avoidance of disease in embryos is far more fraught. Nonetheless, Nature reported in November that He Jiankui, a scientist in China, had edited twin embryos to disable a gene called CCR5 in hopes of avoiding transmission of HIV from their HIV-positive father.
Though there are questions about the effectiveness and necessity of this therapy, He reported that sequencing showed his embryonic gene edits were successful and that the twins were "born normal and healthy," although his claims have not been independently verified.
More recently, Denis Rebrikov, a Russian scientist, announced his plans to disable the same gene in embryos to be implanted in HIV-positive women later this year. Futuristic as it may seem, prenatal gene editing is already here.
The treatment of disease in fetuses, the liminal category of life between embryos and humans, poses the next frontier. Numerous conditions—some minor, some resulting in a lifetime of medical treatment, some incompatible with life outside of the womb—can be diagnosed through use of prenatal diagnostic testing. There is promising research suggesting doctors will soon be able to treat or mitigate at least some of them through use of fetal gene editing.
This research could soon present women carrying genetically anomalous fetuses with a third option beyond terminating the pregnancy or birthing a child who will likely face a challenging and uncertain medical future: undergoing a fetal genetic intervention.
However, genetic intervention will open the door to a host of ethical considerations, particularly with respect to the relationship between pregnant women and prenatal genetic counselors. In theory, counselors today provide objective information and answer questions rather than advising their pregnant clients whether to continue a pregnancy, despite the risks, or to have an abortion.
In practice, though, prenatal genetic counseling is most often directive, and the nature of the counseling pregnant women receive can depend on numerous factors, including their religious and cultural beliefs, their perceived ability to handle a complicated pregnancy and subsequent birth, and their financial status. Introducing the possibility of a fetal genetic intervention will exacerbate counselor reliance upon these considerations and in some cases lead to counseling that is even more directive.
Future counselors will have to figure out under what circumstances it is even appropriate to broach the subject. Should they only discuss therapies that are FDA-approved, or should they mention experimental treatments? What about interventions that are available in Europe or Asia, but banned in the United States? Or even in the best-case scenario of an FDA-approved treatment, should a counselor mention it if she knows for a fact that her client cannot possibly afford it?
Beyond the basic question of what information to share, counselors will have to confront the fact that the very notion of fixing or "editing" offspring will be repugnant to many women, and inherent in the suggestion is the stigmatization of individuals with disabilities. Prenatal genetic counselors will be on the forefront of debates surrounding which fetuses should remain as they are and which ones should be altered.
Despite these concerns, some women in the near future will face the choice of whether to abort, keep, or treat a genetically anomalous fetus in utero. Take, for example, a woman who learns during prenatal testing that her fetus has Angelman syndrome, a genetic disorder characterized by intellectual disability, speech impairment, loss of muscle control, epilepsy, and a small head. There is currently no human treatment for Angelman syndrome, which is caused by a loss of function in a single gene, UBE3A.
But scientists at the University of North Carolina have been able to treat Angelman syndrome in fetal mice by reactivating UBE3A through use of a single injection. The therapy has also proven effective in cultured human brain cells. This suggests that a woman might soon have to consider injecting her fetus's brain with a CRISPR concoction custom-designed to target UBE3A, rather than terminate her pregnancy or bring her fetus to term unaltered.
Assuming she receives adequate information to make an informed choice, she too will face an ethical conundrum. There will be the inherent risks of injecting anything into a developing fetus's brain, including the possibility of infection, brain damage, and miscarriage. But there are also risks specific to gene editing, such as so-called off-target effects -- the possibility of altering genes other than the intended one. Such effects are highly unpredictable and can be difficult to detect. It is likewise impossible to predict how altering UBE3A might lead to other genetic and epigenetic changes once the baby is born.
A woman deciding how to act in this scenario must balance these risks against the potential benefits of the therapy, layered on top of her belief system, resources, and personal ethics. The calculus will be different for every woman, and even the same woman might change her mind from one pregnancy to the next based on the severity of the condition diagnosed and other available medical options.
Her genetic counselor, meanwhile, must be sensitive to all of these concerns in helping her make her decision, keeping up to date on the possible new treatments, and carefully choosing which information to disclose in striving to be neutral. There are no easy answers to the many questions that will arise in this space, but better to start thinking about them now, before it is too late.
Agriculture in the 21st century is not as simple as it once was. With a population seven billion strong, a climate in crisis, and sustainability in farming practices on everyone's radar, figuring out how to feed the masses without destroying the Earth is a pressing concern.
In addition to low-emission cows and drone pollinators, there's a promising new solution on the table. How does "lab-grown insect meat" grab you?
Writing in Frontiers in Sustainable Food Systems, researchers at Tufts University say insects that are fed plants and genetically modified for maximum growth, nutrition, and flavor could be the best, greenest alternative to our current livestock farming practices. This lab-grown protein source could produce high volume, nutritious food without the massive resources required for traditional animal agriculture.
"Due to the environmental, public health, and animal welfare concerns associated with our current livestock system, it is vital to develop more sustainable food production methods," says lead author Natalie Rubio. Could insect meat be the key?
Next Up
New sustainable food production includes what's called "cellular agriculture," an emerging industry and field of study in which meat and dairy are produced via cells in a lab instead of whole animals. So far, scientists have primarily focused on bovine, porcine, and avian cells to create this "cultured meat."
But the Tufts scientists argue that insect cells may be better suited to lab-created meat protein than traditional farm animal cells.
"Compared to cultured mammalian, avian, and other vertebrate cells, insect cell cultures require fewer resources and less energy-intensive environmental control, as they have lower glucose requirements and can thrive in a wider range of temperature, pH, oxygen, and osmolarity conditions," reports Rubio.
"Alterations necessary for large-scale production are also simpler to achieve with insect cells, which are currently used for biomanufacturing of insecticides, drugs, and vaccines," she adds.
They still have some details to hash out, however, including how to make cultured insect meat more like the steak and chicken we're all familiar with.
"Despite this immense potential, cultured insect meat isn't ready for consumption," says Rubio. "Research is ongoing to master two key processes: controlling development of insect cells into muscle and fat, and combining these in 3D cultures with a meat-like texture." They are currently experimenting with mushroom-derived fiber to tackle the latter.
Open Questions
As the report points out, one thing that makes cellular agriculture an attractive alternative to high-density animal farming is that it doesn't require consumers to change their behaviors. People would still be able to eat meat—it would just come from a different source.
But the big question remains: How will lab-grown insect meat taste? Will the buggers really taste as good as burgers?
And, of course, there's the "ew" factor. Meat alternatives have proven to work for some people—Tofurky is still in business, after all—but it may be a hard sell to get the masses to jump on board with eating bugs. Consuming creepy crawlies sounds simply unpalatable to many, and the term "lab-grown, cellular insect meat" doesn't help much. Perhaps an entirely new nomenclature is in order.
Another question is whether folks will trust such scientifically created food. People already use the term "frankenfood" to refer to genetic modification -- even though the vast majority of the corn and soybeans planted in the U.S. today are genetically engineered, and other major crops with GM varieties include potatoes, apples, squash, and papayas. Still, combining GM technology with eating insects may be a hard sell.
However, we're all going to have to get used to trying new things if we want to leave a habitable home for our children. If a lab-grown bug burger can save the planet, maybe it's worth a shot.