New tools could catch disease outbreaks earlier - or predict them
Every year, the villages of the so-called ‘Nipah belt’—which stretches along Bangladesh’s western border with India—brace themselves for the latest outbreak. Since 1998, when Nipah virus—a deadly pathogen now most commonly reported in Bangladesh—first spilled over into humans, it has been a grim annual visitor to the people of this region.
With a 70 percent fatality rate, no vaccine, and no known treatments, Nipah virus has been dubbed ‘the worst disease no one has ever heard of.’ For now, outbreaks tend to be relatively contained because the virus is not very transmissible. It circulates throughout Asia in fruit-eating bats and tends to be passed only to people who consume contaminated date palm sap, a sweet drink harvested across Bangladesh.
But as SARS-CoV-2 has shown the world, this can quickly change.
“Nipah virus is among what virologists call ‘the Big 10,’ along with things like Lassa fever and Crimean Congo hemorrhagic fever,” says Noam Ross, a disease ecologist at New York-based non-profit EcoHealth Alliance. “These are pretty dangerous viruses from a lethality perspective, which don’t currently have the capacity to spread into broader human populations. But that can evolve, and you could very well see a variant emerge that has human-human transmission capability.”
That’s not an overstatement. Surveys suggest that mammals harbour about 40,000 viruses, with roughly a quarter capable of infecting humans. The vast majority never get a chance to do so because we don’t encounter them, but climate change can alter that. Recent studies have found that as animals relocate to new habitats due to shifting environmental conditions, the coming decades will bring around 300,000 first encounters between species which normally don’t interact, especially in tropical Africa and southeast Asia. All these interactions will make it far more likely for hitherto unknown viruses to cross paths with humans.
That’s why, for the last 16 years, EcoHealth Alliance has been running viral surveillance projects across Bangladesh. The goal is to understand why Nipah is so much more prevalent in the western part of the country than in the east, and to keep a watchful eye out for new Nipah strains as well as other dangerous pathogens like Ebola.
Until very recently, this kind of work was hampered by the limitations of viral surveillance technology. The PREDICT project—a $200 million initiative funded by the United States Agency for International Development that conducted surveillance across the Amazon Basin, the Congo Basin, and large parts of South and Southeast Asia—relied on nucleic acid assays, which let scientists search animal samples for the genetic material of viruses.
However, the project came under criticism for being highly inefficient. “That approach requires a big sampling effort, because of the rarity of individual infections,” says Ross. “Any particular animal may be infected for a couple of weeks, maybe once or twice in its lifetime. So if you sample thousands and thousands of animals, you'll eventually get one that has an Ebola virus infection right now.”
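Ross’s arithmetic can be made concrete with a quick probability sketch: if a fraction p of animals is actively infected at any given moment, the chance that n random samples catch at least one infection is 1 - (1 - p)^n. The prevalence value below is an assumed illustration, not a measured figure.

```python
import math

# How many animals must be sampled to have a 95% chance of catching at
# least one active infection, if only a fraction p is infected right now?
# P(at least one positive in n samples) = 1 - (1 - p)^n
def samples_needed(p, confidence=0.95):
    return math.ceil(math.log(1 - confidence) / math.log(1 - p))

# Assumed illustrative prevalence of 0.1% (one infected animal per 1,000)
print(samples_needed(0.001))  # roughly 3,000 animals
```

At one infection per thousand animals, catching a single active case reliably already demands thousands of samples, which is why PREDICT-style surveys were so labor-intensive.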
Ross explains that there is now far more interest in serological sampling—the scientific term for the process of drawing blood for antibody testing. By searching for the presence of antibodies in the blood of humans and animals, scientists have a greater chance of detecting viruses which started circulating recently.
Despite the controversy surrounding EcoHealth Alliance’s involvement in so-called gain-of-function research—experiments that study whether viruses might mutate into deadlier strains—the organization’s separate efforts to stay one step ahead of pathogen evolution are key to stopping the next pandemic.
“Having really cheap and fast surveillance is really important,” says Ross. “Particularly in a place where there's persistent, low level, moderate infections that potentially have the ability to develop into more epidemic or pandemic situations. It means there’s a pathway that something more dangerous can come through."
In Bangladesh, EcoHealth Alliance is attempting to do this using a newer serological technology known as a multiplex Luminex assay, which tests each sample for antibodies against a panel of many different viruses at once. This yields what Ross describes as a ‘footprint of information,’ which allows scientists to tell whether a sample reflects exposure to a known pathogen or to something completely different that needs to be investigated further.
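The ‘footprint’ idea can be illustrated with a toy matcher; the antigen names, reactivity values, and threshold below are invented for the sketch and are not EcoHealth Alliance’s actual panel or algorithm.

```python
# Toy sketch: each known virus leaves a characteristic reactivity profile
# across the antigen panel; a sample that matches nothing well is flagged
# for follow-up as potentially novel. All names/values are hypothetical.
KNOWN_PROFILES = {
    "Nipah": {"NiV-G": 0.9, "NiV-F": 0.8, "EBOV-GP": 0.0},
    "Ebola": {"NiV-G": 0.0, "NiV-F": 0.1, "EBOV-GP": 0.9},
}

def best_match(sample, profiles, threshold=0.8):
    """Return (virus, score) for the closest profile, or ('novel?', score)."""
    def cosine(p):
        dot = sum(sample[k] * p[k] for k in p)
        na = sum(sample[k] ** 2 for k in p) ** 0.5
        nb = sum(p[k] ** 2 for k in p) ** 0.5
        return dot / (na * nb) if na and nb else 0.0
    name, score = max(((n, cosine(p)) for n, p in profiles.items()),
                      key=lambda t: t[1])
    return (name, score) if score >= threshold else ("novel?", score)

print(best_match({"NiV-G": 0.85, "NiV-F": 0.7, "EBOV-GP": 0.05},
                 KNOWN_PROFILES))  # closely matches the Nipah profile
```

A sample whose reactivity pattern resembles no known profile would come back as “novel?”, the cue to investigate further.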
By using this technology to sample human and animal populations across the country, the team hopes to learn whether any novel Nipah variants, or related strains from the same viral family, are circulating—along with members of other deadly viral families, such as the one that includes Ebola.
This is just one of several novel tools being used for viral discovery in surveillance projects around the globe. Multiple research groups are taking PREDICT’s approach of looking for novel viruses in animals in various hotspots. They collect environmental DNA—mucus, faeces or shed skin left behind in soil, sediment or water—which can then be genetically sequenced.
Five years ago, this would have been painstaking work that required bringing collected samples back to the lab. Today, thanks to the vast sums spent on new technologies during COVID-19, researchers have portable sequencing tools they can take out into the field.
Christopher Jerde, a researcher at the UC Santa Barbara Marine Science Institute, points to the Oxford Nanopore MinION sequencer as one example. “I tried one of the early versions of it four years ago, and it was miserable,” he says. “But they’ve really improved, and what we’re going to be able to do in the next five to ten years will be amazing. Instead of having to carefully transport samples back to the lab, we're going to have cigar box-shaped sequencers that we take into the field, plug into a laptop, and do the whole sequencing of an organism.”
In the past, viral surveillance has had to be very targeted and focused on known families of viruses, potentially missing new, previously unknown zoonotic pathogens. Jerde says that the rise of portable sequencers will lead to what he describes as “true surveillance.”
“Before, this was just too complex,” he says. “It had to be very focused, for example, looking for SARS-type viruses. Now we’re able to say, ‘Tell us all the viruses that are here?’ And this will give us true surveillance – we’ll be able to see the diversity of all the pathogens which are in these spots and have an understanding of which ones are coming into the population and causing damage.”
But being able to discover more viruses also comes with certain challenges. Some scientists fear that the speed of viral discovery will soon outpace the human capacity to analyze them all and assess the threat that they pose to us.
“I think we're already there,” says Jason Ladner, assistant professor at Northern Arizona University’s Pathogen and Microbiome Institute. “If you look at all the papers on the expanding RNA virus sphere, there are all of these deposited partial or complete viral sequences in groups that we just don't know anything really about yet.” Bats, for example, carry a myriad of viruses, whose ability to infect human cells we understand very poorly.
Cultivating these viruses under laboratory conditions and testing them on organoids—miniature, simplified versions of organs created from stem cells—can help with these assessments, but it is slow and painstaking work. One hope is that in the future, machine learning could help automate this process. The new SpillOver Viral Risk Ranking platform aims to assess the risk level of a given virus based on 31 different metrics, while other computer models have tried to do the same based on the similarity of a virus’s genomic sequence to known zoonotic threats.
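As a rough illustration of how metric-based ranking works, here is a minimal weighted-score sketch; the metric names, weights, and scores below are invented and are not SpillOver’s actual 31 metrics.

```python
# Illustrative only: rank viruses by a weighted sum of normalized (0-1)
# risk metrics. Metrics and weights are hypothetical stand-ins.
WEIGHTS = {"known_human_infection": 3.0, "host_range": 2.0,
           "human_contact_frequency": 1.5, "genome_type_rna": 1.0}

def risk_score(virus):
    """Weighted sum of the virus's metric values (missing metrics count 0)."""
    return sum(WEIGHTS[m] * virus.get(m, 0.0) for m in WEIGHTS)

viruses = {
    "virus_A": {"known_human_infection": 1.0, "host_range": 0.8,
                "human_contact_frequency": 0.9, "genome_type_rna": 1.0},
    "virus_B": {"known_human_infection": 0.0, "host_range": 0.3,
                "human_contact_frequency": 0.2, "genome_type_rna": 1.0},
}
ranked = sorted(viruses, key=lambda v: risk_score(viruses[v]), reverse=True)
print(ranked)  # highest-risk virus first
```

The hard part, as the next paragraphs note, is not the arithmetic but choosing metrics and weights that actually predict spillover.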
However, Ladner says that these types of comparisons are still overly simplistic. For one thing, scientists are still only aware of a few hundred zoonotic viruses, which is a very limited data sample for accurately assessing a novel pathogen. Instead, he says that there is a need for virologists to develop models which can determine viral compatibility with human cells, based on genomic data.
“One thing which is really useful, but can be challenging to do, is understand the cell surface receptors that a given virus might use,” he says. “Understanding whether a virus is likely to be able to use proteins on the surface of human cells to gain entry can be very informative.”
As the Earth’s climate heats up, scientists also need to better model vector-borne diseases such as dengue, Zika, chikungunya, and yellow fever. Transmitted by Aedes mosquitoes, which thrive in humid climates, these diseases currently disproportionately affect people in low-income nations. But predictions suggest that as the planet warms and the insects find new homes, an estimated one billion people who currently don’t encounter them could be threatened by their bites by 2080. “When it comes to mosquito-borne diseases we have to worry about shifts in suitable habitat,” says Cat Lippi, a medical geography researcher at the University of Florida. “As climate patterns change on these big scales, we expect to see shifts in where people will be at risk for contracting these diseases.”
Public health practitioners and government decision-makers need tools to make climate-informed decisions about the evolving threat of different infectious diseases. Some projects are already underway. An ongoing collaboration between the Catalan Institution for Research and Advanced Studies and researchers in Brazil and Peru is utilizing drones and weather stations to collect data on how mosquitoes change their breeding patterns in response to climate shifts. This information will then be fed into computer algorithms to predict the impact of mosquito-borne illnesses on different regions.
Lippi says that similar models are urgently needed to predict how changing climate patterns affect respiratory, foodborne, waterborne and soilborne illnesses. The UK-based Wellcome Trust has allocated significant assets to fund such projects, which should allow scientists to monitor the impact of climate on a much broader range of infections. “There are a lot of different infectious agents that are sensitive to climate change that don't have these sorts of software tools being developed for them,” she says.
COVID-19’s havoc boosted funding for infectious disease research, but as its threats begin to fade from policymakers’ focus, the money may dry up. Meanwhile, scientists warn that another major infectious disease outbreak is inevitable, potentially within the next decade, so combing the planet for pathogens is vital. “Surveillance is ultimately a really boring thing that a lot of people don't want to put money into, until we have a wide scale pandemic,” Jerde says, but that vigilance is key to thwarting the next deadly horror. “It takes a lot of patience and perseverance to keep looking.”
This article originally appeared in One Health/One Planet, a single-issue magazine that explores how climate change and other environmental shifts are increasing vulnerabilities to infectious diseases by land and by sea. The magazine probes how scientists are making progress with leaders in other fields toward solutions that embrace diverse perspectives and the interconnectedness of all lifeforms and the planet.
New elevators could lift up our access to space
Story by Big Think
When people first started exploring space in the 1960s, it cost upwards of $80,000 (adjusted for inflation) to put a single pound of payload into low-Earth orbit.
A major reason for this high cost was the need to build a new, expensive rocket for every launch. That really started to change when SpaceX began making cheap, reusable rockets, and today, the company is ferrying customer payloads to LEO at a price of just $1,300 per pound.
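The two quoted prices put the scale of the drop in perspective:

```python
# Quick arithmetic on the launch prices quoted above (dollars per pound
# of payload to low-Earth orbit).
early_cost, spacex_cost = 80_000, 1_300
print(f"{early_cost / spacex_cost:.0f}x cheaper")               # ~62x
print(f"{1 - spacex_cost / early_cost:.1%} price reduction")    # 98.4%
```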
This is making space accessible to scientists, startups, and tourists who never could have afforded it previously, but the cheapest way to reach orbit might not be a rocket at all — it could be an elevator.
The space elevator
The seeds for a space elevator were first planted by Russian scientist Konstantin Tsiolkovsky in 1895, who, after visiting the 1,000-foot (305 m) Eiffel Tower, published a paper theorizing about the construction of a structure 22,000 miles (35,400 km) high.
This would provide access to geostationary orbit, an altitude where objects appear to remain fixed above Earth’s surface, but Tsiolkovsky conceded that no material could support the weight of such a tower.
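Tsiolkovsky’s 22,000-mile figure falls out of basic orbital mechanics: a satellite stays fixed over one spot when its orbital period matches Earth’s rotation, and that condition pins down a unique altitude.

```python
import math

# Solve GM / r^2 = (2*pi / T)^2 * r for the geostationary orbital radius,
# then subtract Earth's radius to get the altitude of the tower's top.
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
T = 86_164.1             # one sidereal day, s
R_EARTH = 6.371e6        # mean Earth radius, m

r = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (r - R_EARTH) / 1000
print(f"{altitude_km:,.0f} km")   # ~35,800 km, i.e. ~22,000 miles
```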
In 1959, soon after Sputnik, Russian engineer Yuri N. Artsutanov proposed a way around this issue: instead of building a space elevator from the ground up, start at the top. More specifically, he suggested placing a satellite in geostationary orbit and dropping a tether from it down to Earth’s equator. As the tether descended, the satellite would ascend. Once attached to Earth’s surface, the tether would be kept taut, thanks to a combination of gravitational and centrifugal forces.
We could then send electrically powered “climber” vehicles up and down the tether to deliver payloads to any Earth orbit. According to physicist Bradley Edwards, who researched the concept for NASA about 20 years ago, it’d cost $10 billion and take 15 years to build a space elevator, but once operational, the cost of sending a payload to any Earth orbit could be as low as $100 per pound.
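Taking Edwards’ numbers at face value, a back-of-envelope calculation shows how much payload it would take for the savings to repay the build cost (illustrative arithmetic only; it ignores operating costs and financing):

```python
# A $10 billion elevator charging $100/lb undercuts a $1,300/lb rocket
# by $1,200 per pound. How much payload amortizes the construction cost?
build_cost = 10e9                    # dollars
savings_per_lb = 1_300 - 100         # dollars saved per pound vs. rockets
breakeven_lb = build_cost / savings_per_lb
print(f"{breakeven_lb:,.0f} lb, about {breakeven_lb / 2204.6:,.0f} tonnes")
```

Roughly eight million pounds of payload, a few thousand tonnes, which is why proponents pitch the elevator as infrastructure for an era of routine, high-volume traffic to orbit.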
“Once you reduce the cost to almost a Fed-Ex kind of level, it opens the doors to lots of people, lots of countries, and lots of companies to get involved in space,” Edwards told Space.com in 2005.
In addition to the economic advantages, a space elevator would also be cleaner than using rockets — there’d be no burning of fuel, no harmful greenhouse emissions — and the new transport system wouldn’t contribute to the problem of space junk to the same degree that expendable rockets do.
So, why don’t we have one yet?
Tether troubles
Edwards wrote in his report for NASA that all of the technology needed to build a space elevator already existed except the material needed to build the tether, which needs to be light but also strong enough to withstand all the huge forces acting upon it.
The good news, according to the report, was that the perfect material — ultra-strong, ultra-tiny “nanotubes” of carbon — would be available in just two years.
“[S]teel is not strong enough, neither is Kevlar, carbon fiber, spider silk, or any other material other than carbon nanotubes,” wrote Edwards. “Fortunately for us, carbon nanotube research is extremely hot right now, and it is progressing quickly to commercial production.”

Unfortunately, he misjudged how hard it would be to synthesize carbon nanotubes — to date, no one has been able to grow one longer than 21 inches (53 cm).
Further research into the material revealed that it tends to fray under extreme stress, too, meaning even if we could manufacture carbon nanotubes at the lengths needed, they’d be at risk of snapping, not only destroying the space elevator, but threatening lives on Earth.
Looking ahead
Carbon nanotubes might have been the early frontrunner as the tether material for space elevators, but there are other options, including graphene, an essentially two-dimensional form of carbon that is already easier to scale up than nanotubes (though still not easy).
Contrary to Edwards’ report, Johns Hopkins University researchers Sean Sun and Dan Popescu say Kevlar fibers could work — we would just need to constantly repair the tether, the same way the human body constantly repairs its tendons.
“Using sensors and artificially intelligent software, it would be possible to model the whole tether mathematically so as to predict when, where, and how the fibers would break,” the researchers wrote in Aeon in 2018.
“When they did, speedy robotic climbers patrolling up and down the tether would replace them, adjusting the rate of maintenance and repair as needed — mimicking the sensitivity of biological processes,” they continued.

Astronomers from the University of Cambridge and Columbia University also think Kevlar could work for a space elevator — if we built it from the moon, rather than Earth.
They call their concept the Spaceline, and the idea is that a tether attached to the moon’s surface could extend toward Earth’s geostationary orbit, held taut by the pull of our planet’s gravity. We could then use rockets to deliver payloads — and potentially people — to solar-powered climber robots positioned at the end of this 200,000+ mile long tether. The bots could then travel up the line to the moon’s surface.
This wouldn’t eliminate the need for rockets to get into Earth’s orbit, but it would be a cheaper way to get to the moon. The forces acting on a lunar space elevator wouldn’t be as strong as one extending from Earth’s surface, either, according to the researchers, opening up more options for tether materials.
“[T]he necessary strength of the material is much lower than an Earth-based elevator — and thus it could be built from fibers that are already mass-produced … and relatively affordable,” they wrote in a paper shared on the preprint server arXiv.
Some Chinese researchers, meanwhile, aren’t giving up on the idea of using carbon nanotubes for a space elevator — in 2018, a team from Tsinghua University revealed that they’d developed nanotubes that they say are strong enough for a tether.
The researchers are still working on the issue of scaling up production, but in 2021, state-owned news outlet Xinhua released a video depicting an in-development concept, called “Sky Ladder,” that would consist of space elevators above Earth and the moon.
After riding up the Earth-based space elevator, a capsule would fly to a space station attached to the tether of the moon-based one. If the project could be pulled off — a huge if — China predicts Sky Ladder could cut the cost of sending people and goods to the moon by 96 percent.
The bottom line
In the 120 years since Tsiolkovsky looked at the Eiffel Tower and thought way bigger, tremendous progress has been made developing materials with the properties needed for a space elevator. At this point, it seems likely we could one day have a material that can be manufactured at the scale needed for a tether — but by the time that happens, the need for a space elevator may have evaporated.
Several aerospace companies are making progress with their own reusable rockets, and as those join the market with SpaceX, competition could cause launch prices to fall further.
California startup SpinLaunch, meanwhile, is developing a massive centrifuge to fling payloads into space, where much smaller rockets can propel them into orbit. If the company succeeds (another one of those big ifs), it says the system would slash the amount of fuel needed to reach orbit by 70 percent.
Even if SpinLaunch doesn’t get off the ground, several groups are developing environmentally friendly rocket fuels that produce far fewer (or no) harmful emissions. More work is needed to efficiently scale up their production, but overcoming that hurdle will likely be far easier than building a 22,000-mile (35,400-km) elevator to space.
New tech aims to make the ocean healthier for marine life
A defunct drydock basin arched by a rusting 19th-century steel bridge seems an incongruous place to conduct state-of-the-art climate science. But this placid and protected sliver of water connecting Brooklyn’s Navy Yard to the East River was just right for Garrett Boudinot to float a small dock topped with water carbon-sensing gear. And while his system right now looks like a trio of plastic boxes wired together, it aims to mitigate the growing problem of ocean acidification, caused by an overabundance of dissolved carbon dioxide.
Boudinot, a biogeochemist and founder of a carbon-management startup called Vycarb, is honing his method for measuring CO2 levels in water, as well as (at least temporarily) correcting their negative effects. It’s a challenge that’s been occupying numerous climate scientists as the ocean heats up, and as states like New York recognize that reducing emissions won’t be enough to reach their climate goals; they’ll have to figure out how to remove carbon, too.
To date, though, methods for measuring CO2 in water at scale have been either intensely expensive, requiring fancy sensors that pump CO2 through membranes; or prohibitively complicated, involving a series of lab-based analyses. And that’s led to a bottleneck in efforts to remove carbon as well.
But recently, Boudinot cracked part of the code for measurement and mitigation, at least on a small scale. While the rest of the industry sorts out larger intricacies like getting ocean carbon markets up and running and driving carbon removal at billion-ton scale in centralized infrastructure, his decentralized method could have important, more immediate implications.
Specifically, for shellfish hatcheries, which grow seafood for human consumption and for coastal restoration projects. Some of these incubators for oysters and clams and scallops are already feeling the negative effects of excess carbon in water, and Vycarb’s tech could improve outcomes for the larval- and juvenile-stage mollusks they’re raising. “We’re learning from these folks about what their needs are, so that we’re developing our system as a solution that’s relevant,” Boudinot says.
Ocean waters naturally absorb CO2 gas from the atmosphere. When CO2 accumulates faster than nature can dissipate it, it reacts with H2O molecules, forming carbonic acid, H2CO3, which makes the water column more acidic. On the West Coast, acidification occurs when deep, carbon dioxide-rich waters upwell onto the coast. This can wreak havoc on developing shellfish, inhibiting their shells from growing and leading to mass die-offs; this happened, disastrously, at Pacific Northwest oyster hatcheries in 2007.
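A minimal sketch of why dissolved CO2 acidifies water, using Henry’s law and the first dissociation of carbonic acid. This models pure water in equilibrium with today’s atmosphere; seawater is chemically buffered, so real ocean pH shifts are much smaller.

```python
import math

# CO2 + H2O <-> H2CO3 <-> H+ + HCO3-. Estimate the pH of pure water in
# equilibrium with atmospheric CO2 (illustrative freshwater constants).
K_H = 3.3e-2        # Henry's law constant for CO2, mol/(L*atm)
K_A1 = 4.45e-7      # first dissociation constant of carbonic acid
P_CO2 = 420e-6      # atmospheric CO2 partial pressure, atm (~420 ppm)

co2_aq = K_H * P_CO2                  # dissolved CO2, mol/L
h_plus = math.sqrt(K_A1 * co2_aq)     # [H+] from carbonic acid alone
ph = -math.log10(h_plus)
print(f"pH = {ph:.2f}")   # ~5.6, the classic pH of unpolluted rainwater
```

The same reaction chain, pushed further by excess CO2, is what drags coastal waters toward the acidity that harms shellfish larvae.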
This type of acidification will eventually come for the East Coast, too, says Ryan Wallace, assistant professor and graduate director of environmental studies and sciences at Long Island’s Adelphi University, who studies acidification. But at the moment, East Coast acidification has other sources: agricultural runoff, usually in the form of nitrogen, and human and animal waste entering coastal areas. These excess nutrient loads cause algae to grow, which isn’t a problem in and of itself, Wallace says; but when algae die, they’re consumed by bacteria, whose respiration in turn bumps up CO2 levels in water.
“Unfortunately, this is occurring at the bottom [of the water column], where shellfish organisms live and grow,” Wallace says. Acidification on the East Coast is minutely localized, occurring closest to where nutrients are being released, as well as seasonally; at least one local shellfish farm, on Fishers Island in the Long Island Sound, has contended with its effects.
The second Vycarb pilot, ready to be installed at the East Hampton shellfish hatchery.
Courtesy of Vycarb
Besides CO2, ocean water contains two other forms of dissolved carbon — carbonate (CO3²⁻) and bicarbonate (HCO3⁻) — at all times, at differing levels. At low pH (acidic), CO2 prevails; at medium pH, HCO3⁻ is the dominant form; at higher pH, CO3²⁻ dominates. Boudinot’s invention is the first real-time measurement for all three, he says. From the dock at the Navy Yard, his pilot system uses carefully calibrated but low-cost sensors to gauge the water’s pH and its corresponding levels of CO2. When it detects elevated levels of the greenhouse gas, the system mitigates it on the spot. It does this by adding a bicarbonate powder that’s a byproduct of agricultural limestone mining in nearby Pennsylvania. Because the bicarbonate powder is alkaline, it increases the water pH and reduces the acidity. “We drive a chemical reaction to increase the pH to convert greenhouse gas- and acid-causing CO2 into bicarbonate, which is HCO3,” Boudinot says. “And HCO3 is what shellfish and fish and lots of marine life prefers over CO2.”
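The pH-dependent balance Boudinot describes can be computed directly from the two dissociation constants of carbonic acid. The constants below are freshwater values at 25 °C (seawater constants differ), so treat this as a qualitative sketch.

```python
# Equilibrium "alpha fractions" of the three dissolved carbon forms at a
# given pH, from the two dissociation constants of carbonic acid.
K1, K2 = 4.45e-7, 4.69e-11   # freshwater values at 25 C, illustrative

def carbon_fractions(ph):
    """Return the fraction of dissolved carbon present as each species."""
    h = 10 ** -ph
    d = h * h + K1 * h + K1 * K2
    return {"CO2": h * h / d, "HCO3": K1 * h / d, "CO3": K1 * K2 / d}

for ph in (5.0, 8.1, 11.0):
    fractions = carbon_fractions(ph)
    print(ph, max(fractions, key=fractions.get))
    # acidic -> CO2 dominates, seawater-like -> HCO3, strongly basic -> CO3
```

Raising pH with an alkaline powder shifts the balance from CO2 toward bicarbonate, which is exactly the conversion Boudinot’s quote describes.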
This de-acidifying “buffering” is something shellfish operations already do to water, usually by adding soda ash (Na2CO3), which is also alkaline. Some hatcheries add soda ash constantly, just in case; some wait until acidification causes significant problems. Generally, detecting acidification takes time and effort for an overly busy shellfish farmer. “We’re out there daily, taking a look at the pH and figuring out how much we need to dose it,” explains John “Barley” Dunne, director of the East Hampton Shellfish Hatchery on Long Island. “If this is an automatic system…that would be much less labor intensive — one less thing to monitor when we have so many other things we need to monitor.”
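The automatic dosing Dunne is hoping for amounts to a simple control loop. The thresholds and dose size below are invented for illustration (this is not Vycarb’s actual logic), and the upper cutoff reflects that overly alkaline water can itself harm marine life.

```python
# Hypothetical monitoring loop: read pH each cycle, dose alkaline buffer
# when the water is too acidic, never dose when pH is already high.
LOW_PH, MAX_PH = 7.6, 8.6   # illustrative thresholds

def dose_decision(ph_reading, dose_ml=50):
    """Return the buffer dose (ml) for one monitoring cycle."""
    if ph_reading >= MAX_PH:
        return 0            # already alkaline: never add more buffer
    if ph_reading < LOW_PH:
        return dose_ml      # acidified: add a fixed alkaline dose
    return 0                # within the acceptable band: do nothing

readings = [7.9, 7.5, 7.7, 8.7]
print([dose_decision(r) for r in readings])  # doses only the acidic reading
```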
Across the Sound at the hatchery he runs, Dunne annually produces 30 million hard clams, 6 million oysters, and “if we’re lucky, some years we get a million bay scallops,” he says. These mollusks are destined for restoration projects around the town of East Hampton, where they’ll create habitat, filter water, and protect the coastline from sea level rise and storm surge. So far, Dunne’s hatchery has largely escaped the ill effects of acidification, although his bay scallops are having a finicky year and he’s checking to see if acidification might be part of the problem. But “I think it’s important to have these solutions ready-at-hand for when the time comes,” he says. That’s why he’s hosting a second, 70-liter Vycarb pilot starting this summer on a dock adjacent to his East Hampton operation; it will amp up to a 50,000-liter system in a few months.
Boudinot hopes this new pilot will act as a proof of concept for hatcheries up and down the East Coast. The area from Maine to Nova Scotia is experiencing the worst of Atlantic acidification, due in part to increased Arctic meltwater combining with Gulf of St. Lawrence freshwater; that decreases the saturation of calcium carbonate, making the water more acidic. Boudinot says his system should work to adjust low pH regardless of the cause or locale. The East Hampton system will eventually test, and buffer as necessary, the water that Dunne pumps from the Sound into 100-gallon land-based tanks, where larvae grow for two weeks before being transferred to an in-Sound nursery to plump up.
Dunne says this could have positive effects — not only on his hatchery but on wild shellfish populations, too, reducing at least one stressor their larvae experience (others include increasing water temperatures and decreased oxygen levels). “If it can buffer water over a large area, absolutely this will [benefit] natural spawns,” he says.
No one believes the Vycarb model — even if it proves capable of functioning at much greater scale — is the sole solution to acidification in the ocean. Wallace says new water treatment plants in New York City, which reduce nitrogen released into coastal waters, are an important part of the equation. And “certainly, some green infrastructure would help,” says Boudinot, like restoring coastal and tidal wetlands to help filter nutrient runoff.
In the meantime, Boudinot continues to collect data in advance of amping up his own operations. Still unknown is the effect of releasing huge amounts of alkalinity into the ocean. Boudinot says a pH of 9 or higher can be too harsh for marine life, plus it can also trigger a release of CO2 from the water back into the atmosphere. For a third pilot, on Governor’s Island in New York Harbor, Vycarb will install yet another system from which Boudinot’s team will frequently sample to analyze some of those and other impacts. “Let's really make sure that we know what the results are,” he says. “Let's have data to show, because in this carbon world, things behave very differently out in the real world versus on paper.”