The massive benefits of AI come with environmental and human costs. Can AI itself be part of the solution?
The recent explosion of generative artificial intelligence tools like ChatGPT and DALL-E has enabled anyone with internet access to harness AI’s power for enhanced productivity, creativity, and problem-solving. With their ever-improving capabilities and expanding user base, these tools have proved useful across disciplines, from the creative to the scientific.
But beneath the technological wonders of human-like conversation and creative expression lies a dirty secret—an alarming environmental and human cost. AI has an immense carbon footprint. Systems like ChatGPT take months to train in high-powered data centers, which demand huge amounts of electricity, much of which is still generated with fossil fuels, as well as water for cooling. “One of the reasons why OpenAI needs investments [to the tune of] $10 billion from Microsoft is because they need to pay for all of that computation,” says Kentaro Toyama, a computer scientist at the University of Michigan. There’s also an ecological toll from mining the rare minerals required for hardware and infrastructure. This environmental exploitation pollutes land, triggers natural disasters and causes large-scale human displacement. Finally, for the data labeling needed to train and correct AI algorithms, the Big Data industry employs cheap and exploitative labor, often from the Global South.
Generative AI tools are based on large language models (LLMs), the most well-known being various versions of GPT. LLMs can perform natural language processing tasks, including translating, summarizing and answering questions. They are built from artificial neural networks, an approach known as deep learning, a branch of machine learning. Inspired by the human brain, neural networks are made of millions of artificial neurons. “The basic principles of neural networks were known even in the 1950s and 1960s,” Toyama says, “but it’s only now, with the tremendous amount of compute power that we have, as well as huge amounts of data, that it’s become possible to train generative AI models.”
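For the technically curious, here is a toy sketch in Python of what an “artificial neuron” is and how a few of them form a tiny network. The weights below are invented for illustration; real language models learn vastly more parameters from enormous text corpora during the energy-intensive training runs described in this story.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum of inputs passed through a nonlinearity."""
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-activation))  # sigmoid "firing" strength between 0 and 1

# A miniature two-layer network: three hidden neurons feeding one output neuron.
# All weights here are made up; in a real model they are learned from data.
hidden_weights = [[0.5, -0.2], [0.8, 0.1], [-0.4, 0.9]]
hidden_biases = [0.0, 0.1, -0.1]
output_weights = [0.3, -0.6, 0.7]

def tiny_network(inputs):
    hidden = [neuron(inputs, w, b) for w, b in zip(hidden_weights, hidden_biases)]
    return neuron(hidden, output_weights, 0.0)

print(tiny_network([1.0, 0.5]))  # a single number between 0 and 1
```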
In recent months, much attention has gone to the transformative benefits of these technologies. But it’s important to consider that these remarkable advances may come at a price.
AI’s carbon footprint
In its latest annual report, 2023 Landscape: Confronting Tech Power, the AI Now Institute, an independent policy research organization focusing on the concentration of power in the tech industry, writes: “The constant push for scale in artificial intelligence has led Big Tech firms to develop hugely energy-intensive computational models that optimize for ‘accuracy’—through increasingly large datasets and computationally intensive model training—over more efficient and sustainable alternatives.”
Though there aren’t any official figures about the power consumption or emissions from data centers, experts estimate that they use one percent of global electricity—more than entire countries use. In 2019, Emma Strubell, then a graduate researcher at the University of Massachusetts Amherst, estimated that training a single LLM resulted in over 280,000 kg of CO2 emissions—the equivalent of driving almost 1.2 million km in a gas-powered car. A couple of years later, David Patterson, a computer scientist from the University of California Berkeley, and colleagues estimated GPT-3’s carbon footprint at over 550,000 kg of CO2. In 2022, the tech company Hugging Face estimated the carbon footprint of its own language model, BLOOM, at 25,000 kg of CO2 emissions. (BLOOM’s footprint is lower because Hugging Face uses renewable energy, but it doubled when other life-cycle processes like hardware manufacturing and use were added.)
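As a back-of-the-envelope check of that driving comparison: the article only cites the totals, so the per-kilometer emission factor below is an assumption, roughly what a typical gas-powered car emits.

```python
# Rough check of the driving equivalence cited above.
# The per-km factor is an assumed average for a gas-powered car, not from the article.
training_emissions_kg = 280_000        # Strubell's estimate for training one LLM
car_emissions_kg_per_km = 0.23         # assumed ~230 g CO2 per km

equivalent_km = training_emissions_kg / car_emissions_kg_per_km
print(f"{equivalent_km:,.0f} km")      # ~1.2 million km, matching the figure above
```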
Luckily, despite the growing size and number of data centers, their energy demands and emissions have not grown proportionately—thanks to renewable energy sources and more energy-efficient hardware.
But emissions don’t tell the full story.
AI’s hidden human cost
“If historical colonialism annexed territories, their resources, and the bodies that worked on them, data colonialism’s power grab is both simpler and deeper: the capture and control of human life itself through appropriating the data that can be extracted from it for profit.” So write Nick Couldry and Ulises Mejias, authors of the book The Costs of Connection.
Technologies we use daily inexorably gather our data. “Human experience, potentially every layer and aspect of it, is becoming the target of profitable extraction,” Couldry and Mejias write. This feeds data capitalism, the economic model built on the extraction and commodification of data. While we are dispossessed of our data, Big Tech commodifies it for its own benefit. The result is a consolidation of power structures that reinforce existing race, gender, class and other inequalities.
“The political economy around tech and tech companies, and the development in advances in AI contribute to massive displacement and pollution, and significantly changes the built environment,” says technologist and activist Yeshi Milner, who founded Data For Black Lives (D4BL) to create measurable change in Black people’s lives using data. The energy requirements, hardware manufacture and the cheap human labor behind AI systems disproportionately affect marginalized communities.
AI’s recent explosive growth spiked the demand for manual, behind-the-scenes tasks, creating an industry described by Mary Gray and Siddharth Suri as “ghost work” in their book. This invisible human workforce that lies behind the “magic” of AI is overworked and underpaid, and very often based in the Global South. For example, workers in Kenya who made less than $2 an hour were behind the mechanism that trained ChatGPT to properly handle content involving violence, hate speech and sexual abuse. And, according to an article in Analytics India Magazine, in some cases these workers may not have been paid at all, a case of wage theft. An exposé by the Washington Post describes “digital sweatshops” in the Philippines, where thousands of workers experience low wages, delays in payment, and wage theft by Remotasks, a platform owned by Scale AI, a $7 billion American startup. Rights groups and labor researchers have flagged Scale AI as one company that flouts basic labor standards for workers abroad.
It is possible to draw a parallel with chattel slavery—the most significant economic event that continues to shape the modern world—to see the business structures that allow for the massive exploitation of people, Milner says. Back then, people got chocolate, sugar, cotton; today, they get generative AI tools. “What’s invisible through distance—because [tech companies] also control what we see—is the massive exploitation,” Milner says.
“At Data for Black Lives, we are less concerned with whether AI will become human…[W]e’re more concerned with the growing power of AI to decide who’s human and who’s not,” Milner says. As a decision-making force, AI becomes a “justifying factor for policies, practices, rules that not just reinforce, but are currently turning the clock back generations on people’s civil and human rights.”
Nuria Oliver, a computer scientist, and co-founder and vice-president of the European Laboratory of Learning and Intelligent Systems (ELLIS), says that instead of focusing on the hypothetical existential risks of today’s AI, we should talk about its real, tangible risks.
“Because AI is a transverse discipline that you can apply to any field [from education, journalism, medicine, to transportation and energy], it has a transformative power…and an exponential impact,” she says.
AI’s accountability
“At the core of what we were arguing about data capitalism [is] a call to action to abolish Big Data,” says Milner. “Not to abolish data itself, but the power structures that concentrate [its] power in the hands of very few actors.”
A comprehensive AI Act, currently being negotiated in the European Parliament, aims to rein in Big Tech. It plans to introduce a rating of AI tools based on the harms they cause to humans, while remaining as technology-neutral as possible. The act sets standards for AI systems that are safe, transparent, traceable, non-discriminatory, and environmentally friendly, and that are overseen by people rather than by automation. The regulations also call for transparency about the content used to train generative AIs, particularly copyrighted data, and for disclosure when content is AI-generated. “This European regulation is setting the example for other regions and countries in the world,” Oliver says. But, she adds, such transparency is hard to achieve.
Google, for example, recently updated its privacy policy to say that anything on the public internet will be used as training data. “Obviously, technology companies have to respond to their economic interests, so their decisions are not necessarily going to be the best for society and for the environment,” Oliver says. “And that’s why we need strong research institutions and civil society institutions to push for actions.” ELLIS also advocates for data centers to be built in locations where the energy can be produced sustainably.
Ironically, AI plays an important role in mitigating its own harms—by plowing through mountains of data about weather changes, extreme weather events and human displacement. “The only way to make sense of this data is using machine learning methods,” Oliver says.
Milner believes that the best way to expose AI-caused systemic inequalities is through people's stories. “In these last five years, so much of our work [at D4BL] has been creating new datasets, new data tools, bringing the data to life. To show the harms but also to continue to reclaim it as a tool for social change and for political change.” This change, she adds, will depend on whose hands it is in.
Scientists make progress with growing organs for transplants
Story by Big Think
For over a century, scientists have dreamed of growing human organs sans humans. This technology could put an end to the scarcity of organs for transplants. But that’s just the tip of the iceberg. The capability to grow fully functional organs would revolutionize research. For example, scientists could observe mysterious biological processes, such as how human cells and organs develop a disease and respond (or fail to respond) to medication without involving human subjects.
Recently, a team of researchers from the University of Cambridge laid the foundations not just for growing functional organs but for creating functional synthetic embryos capable of developing a beating heart, gut, and brain. Their report was published in Nature.
The organoid revolution
In 1981, scientists discovered how to keep stem cells alive. This was a significant breakthrough, as stem cells have notoriously exacting requirements. Nevertheless, stem cells remained a relatively niche research area, mainly because scientists didn’t know how to coax the cells into turning into other cells.
Then, in 1987, scientists embedded isolated stem cells in a gelatinous protein mixture called Matrigel, which simulated the three-dimensional environment of animal tissue. The cells thrived, but they also did something remarkable: they created breast tissue capable of producing milk proteins. This was the first organoid — a clump of cells that behave and function like a real organ. The organoid revolution had begun, and it all started with a boob in Jello.
For the next 20 years, it was rare to find a scientist who identified as an “organoid researcher,” but there were many “stem cell researchers” who wanted to figure out how to turn stem cells into other cells. Eventually, they discovered the signals (called growth factors) that stem cells require to differentiate into other types of cells.
By the end of the 2000s, researchers began combining stem cells, Matrigel, and the newly characterized growth factors to create dozens of organoids, from liver organoids capable of producing the bile salts necessary for digesting fat to brain organoids with components that resemble eyes, the spinal cord, and arguably, the beginnings of sentience.
Synthetic embryos
Organoids possess an intrinsic flaw: they are organ-like. They share some characteristics with real organs, which makes them powerful tools for research. However, no one has found a way to create an organoid with all the characteristics and functions of a real organ. But Magdalena Żernicka-Goetz, a developmental biologist, may have laid the foundation for that discovery.
Żernicka-Goetz hypothesized that organoids fail to develop into fully functional organs because organs develop as a collective. Organoid research often uses embryonic stem cells, which are the cells from which the developing organism is created. However, there are two other types of stem cells in an early embryo: stem cells that become the placenta and those that become the yolk sac (where the embryo grows and gets its nutrients in early development). For a human embryo (and its organs) to develop successfully, there needs to be a “dialogue” between these three types of stem cells. In other words, Żernicka-Goetz suspected the best way to grow a functional organoid was to produce a synthetic embryoid.
As described in the aforementioned Nature paper, Żernicka-Goetz and her team mimicked the embryonic environment by mixing these three types of stem cells from mice. Amazingly, the stem cells self-organized into structures and progressed through the successive developmental stages until they had beating hearts and the foundations of the brain.
“Our mouse embryo model not only develops a brain, but also a beating heart [and] all the components that go on to make up the body,” said Żernicka-Goetz. “It’s just unbelievable that we’ve got this far. This has been the dream of our community for years and major focus of our work for a decade and finally we’ve done it.”
If the methods developed by Żernicka-Goetz’s team are successful with human stem cells, scientists someday could use them to guide the development of synthetic organs for patients awaiting transplants. It also opens the door to studying how embryos develop during pregnancy.
Scientists find enzymes in nature that could replace toxic chemicals
Some 900 miles off the coast of Portugal, nine major islands rise from the mid-Atlantic. Verdant and volcanic, the Azores archipelago hosts a wealth of biodiversity that keeps field research scientist Marlon Clark returning for more. “You’ve got this really interesting biogeography out there,” says Clark. “There’s real separation between the continents, but there’s this inter-island dispersal of plants and seeds and animals.”
It’s a visual paradise by any standard, but on a microscopic level, there’s even more to see. The Azores’ nutrient-rich volcanic rock — and its network of lagoons, cave systems, and thermal springs — is home to a vast array of microorganisms found in a variety of microclimates with different elevations and temperatures.
Clark works for Basecamp Research, a biotech company headquartered in London, and his job is to collect samples from ecosystems around the world. By extracting DNA from soil, water, plants, microbes and other organisms, Basecamp is building an extensive database of the Earth’s proteins. While DNA itself isn’t a protein, the information stored in DNA is used to create proteins, so extracting, sequencing, and annotating DNA allows for the discovery of unique protein sequences.
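To illustrate why sequenced DNA reveals protein sequences, here is a minimal Python sketch: every three DNA letters (a codon) encode one amino acid. The codon table is deliberately truncated and the example sequence is invented; it is not data from Basecamp.

```python
# Minimal illustration of how a DNA sequence implies a protein sequence.
# Only a handful of codons are shown; the full genetic code has 64 entries.
CODON_TABLE = {
    "ATG": "M",  # methionine (start)
    "GCT": "A",  # alanine
    "AAA": "K",  # lysine
    "GAA": "E",  # glutamate
    "TGA": "*",  # stop
}

def translate(dna: str) -> str:
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE.get(dna[i:i + 3], "X")  # "X" = codon not in our toy table
        if amino_acid == "*":
            break
        protein.append(amino_acid)
    return "".join(protein)

print(translate("ATGGCTAAAGAATGA"))  # -> "MAKE" (toy example sequence)
```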
Using what they’re finding in the middle of the Atlantic and beyond, Basecamp’s detailed database is constantly growing. The outputs could be essential for cleaning up the damage done by toxic chemicals and finding alternatives to these chemicals.
Catalysts for change
Proteins provide structure and function in all living organisms. Some of these functional proteins are enzymes, which quite literally make things happen.
“Enzymes are perfectly evolved catalysts,” says Ahir Pushpanath, a partnerships lead at Basecamp. ”Enzymes are essentially just a polymer, and polymers are made up of amino acids, which are nature’s building blocks.” He suggests thinking about it like Legos — if you have a bunch of Lego pieces and use them to build a structure that performs a function, “that’s basically how an enzyme works. In nature, these monuments have evolved to do life’s chemistry. If we didn’t have enzymes, we wouldn’t be alive.”
In our own bodies, enzymes catalyze everything from vision to digesting food to regrowing muscles, and these same types of enzymes are necessary in the pharmaceutical, agrochemical and fine chemical industries. But industrial conditions differ from those inside our bodies. So, when scientists need certain chemical reactions to create a particular product or substance, they make their own catalysts in their labs — generally through the use of petroleum and heavy metals.
These petrochemicals are effective and cost-efficient, but they’re wasteful and often hazardous. With growing concerns around sustainability and long-term public health, it’s essential to find alternatives to toxic chemicals. “Industrial chemistry is heavily polluting, especially the chemistry done in pharmaceutical drug development,” Pushpanath says. “Biocatalysis is providing advantages, both to make more complex drugs and to be more sustainable, reducing the pollution and toxicity of conventional chemistry.”
Basecamp is trying to replace lab-created catalysts with enzymes found in the wild. This concept is called biocatalysis, and in theory, all scientists have to do is find the right enzymes for their specific need. Yet, historically, researchers have struggled to find enzymes to replace petrochemicals. When they can’t identify a suitable match, they turn to what Pushpanath describes as “long, iterative, resource-intensive, directed evolution” in the laboratory to coax a protein into industrial adaptation. But the latest scientific advances have enabled these discoveries in nature instead.
[Image: Marlon Clark, a research scientist at Basecamp Research, looks for novel biochemistries in the Azores. Credit: Glen Gowers]
Enzyme hunters
Whether it’s Clark and a colleague setting off on an expedition, or a local, on-the-ground partner gathering and processing samples, there’s a lot to be learned from each collection. “Microbial genomes contain complete sets of information that define an organism — much like how letters are a code allowing us to form words, sentences, pages, and books that contain complex but digestible knowledge,” Clark says. He thinks of the environmental samples as biological libraries, filled with thousands of species, strains, and sequence variants. “It’s our job to glean genetic information from these samples.”
Basecamp researchers manage this feat by sequencing the DNA and then assembling the information into a comprehensible structure. “We’re building the ‘stories’ of the biota,” Clark says. The more varied the samples, the more valuable insights his team gains into the characteristics of different organisms and their interactions with the environment. Sequencing allows scientists to examine the order of nucleotides — the organic molecules that form DNA — to identify genetic makeups and find changes within genomes. The process used to be too expensive, but the cost of sequencing has dropped from $10,000 a decade ago to as low as $100. Notably, biocatalysis isn’t a new concept — there have been waves of interest in using natural enzymes in catalysis for over a century, Pushpanath says. “But the technology just wasn’t there to make it cost effective,” he explains. “Sequencing has been the biggest boon.”
AI is probably the second biggest boon.
“We can actually dream up new proteins using generative AI,” Pushpanath says, which means that biocatalysis now has real potential to scale.
Glen Gowers, the co-founder of Basecamp, compares the company’s AI approach to that of social networks and streaming services. Consider how these platforms suggest connecting with the friends of your friends, or how watching one comedy film from the 1990s leads to a suggestion of three more.
“They’re thinking about data as networks of relationships as opposed to lists of items,” says Gowers. “By doing the same, we’re able to link the metadata of the proteins — by their relationships to each other, the environments in which they’re found, the way those proteins might look similar in sequence and structure, their surrounding genome context — really, this just comes down to creating a searchable network of proteins.”
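A toy sketch of that idea in Python, using the networkx graph library: proteins become nodes, and edges record relationships such as sequence similarity or a shared genome neighborhood. The protein names, attributes, and relationships below are invented for illustration; Basecamp’s actual system is certainly far richer.

```python
import networkx as nx

# A "searchable network of proteins": nodes are protein records, edges are relationships.
graph = nx.Graph()
graph.add_node("protein_A", environment="thermal spring", ph=8.2)
graph.add_node("protein_B", environment="thermal spring", ph=8.0)
graph.add_node("protein_C", environment="lagoon sediment", ph=6.9)

graph.add_edge("protein_A", "protein_B", relation="sequence_similarity", score=0.87)
graph.add_edge("protein_A", "protein_C", relation="same_genome_neighborhood")

# A "friends of friends"-style query: proteins reachable through protein_B's neighbors,
# analogous to how a streaming service recommends items related to what you already like.
candidates = {n for nbr in graph.neighbors("protein_B") for n in graph.neighbors(nbr)}
print(candidates - {"protein_B"})  # -> {'protein_C'}
```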
[Image: On an Azores island, this volcanic opening may harbor organisms that can help scientists identify enzymes for biocatalysis to replace toxic chemicals. Credit: Emma Bolton]
Uwe Bornscheuer, professor at the Institute of Biochemistry at the University of Greifswald, and co-founder of Enzymicals, another biocatalysis company, says that the development of machine learning is a critical component of this work. “It’s a very hot topic, because the challenge in protein engineering is to predict which mutation at which position in the protein will make an enzyme suitable for certain applications,” Bornscheuer explains. These predictions are difficult for humans to make at all, let alone quickly. “It is clear that machine learning is a key technology.”
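To give a sense of the search space Bornscheuer describes, here is a hypothetical Python sketch that enumerates every single-point mutant of a short sequence and ranks them with a stand-in scoring function. The placeholder heuristic is not a real predictor; in practice it would be replaced by a trained machine learning model.

```python
# Every position in an enzyme can be swapped for any of 20 amino acids,
# so even single-point mutants form a large space that a model must rank.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def predicted_fitness(sequence: str) -> float:
    # Placeholder heuristic (fraction of hydrophobic residues); NOT a real predictor.
    return sum(sequence.count(aa) for aa in "AILMFWV") / len(sequence)

def best_single_mutants(wild_type: str, top_n: int = 3):
    scored = []
    for position, original in enumerate(wild_type):
        for aa in AMINO_ACIDS:
            if aa == original:
                continue
            mutant = wild_type[:position] + aa + wild_type[position + 1:]
            scored.append((predicted_fitness(mutant), f"{original}{position + 1}{aa}"))
    return sorted(scored, reverse=True)[:top_n]

print(best_single_mutants("MKTAYIAKQR"))  # toy 10-residue "enzyme"
```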
Benefiting from nature’s bounty
Biodiversity commonly refers to plants and animals, but the term extends to all life, including microbial life, and some regions of the world are more biodiverse than others. Building relationships with global partners is another key element of Basecamp’s success. Doing so in accordance with the access and benefit sharing principles set forth by the Nagoya Protocol—an international agreement that seeks to ensure the benefits of using genetic resources are distributed in a fair and equitable way—is part of the company’s ethos. “There’s a lot of potential for us, and there’s a lot of potential for our partners to have exactly the same impact in building and discovering commercially relevant proteins and biochemistries from nature,” Clark says.
Bornscheuer points out that Basecamp is not the first company of its kind. A former San Diego company called Diversa went public in 2000 with similar work. “At that time, the Nagoya Protocol was not around, but Diversa also wanted to ensure that if a certain enzyme or microorganism from Costa Rica, for example, were used in an industrial process, then people in Costa Rica would somehow profit from this.”
An eventual merger turned Diversa into Verenium Corporation, which is now a part of the chemical producer BASF, but it laid important groundwork for modern companies like Basecamp to continue to scale with today’s technologies.
“To collect natural diversity is the key to identifying new catalysts for use in new applications,” Bornscheuer says. “Natural diversity is immense, and over the past 20 years we have gained the advantages that sequencing is no longer a cost or time factor.”
This has allowed Basecamp to rapidly grow its database, outperforming the Universal Protein Resource, or UniProt, the public repository of protein sequences most commonly used by researchers. Basecamp’s database is three times larger, totaling about 900 million sequences. (UniProt isn’t compliant with the Nagoya Protocol because, as a public database, it doesn’t provide traceability of protein sequences. Some scientists, however, argue that Nagoya compliance hinders progress.)
With so much information available, Basecamp’s AI has been trained on “the true dictionary of protein sequence life,” Pushpanath says, which makes it possible to design sequences for particular applications. “Through deep learning approaches, we’re able to find protein sequences directly from our database, without the need for further laboratory-directed evolution.”
Recently, a major chemical company was searching for a specific transaminase — an enzyme that catalyzes a transfer of amino groups. “They had already spent a year-and-a-half and nearly two million dollars to evolve a public-database enzyme, and still had not reached their goal,” Pushpanath says. “We used our AI approaches on our novel database to yield 10 candidates within a week, which, when validated by the client, achieved the desired target even better than their best-evolved candidate.”
Basecamp’s other huge potential is in bioremediation, where natural enzymes can help to undo the damage caused by toxic chemicals. “Biocatalysis impacts both sides,” says Gowers. “It reduces the usage of chemicals to make products, and at the same time, where contamination sites do exist from chemical spills, enzymes are also there to kind of mop those up.”
So far, Basecamp's round-the-world sampling has covered 50 percent of the 14 major biomes, or regions of the planet that can be distinguished by their flora, fauna, and climate, as defined by the World Wildlife Fund. The other half remains to be catalogued — a key milestone for understanding our planet’s protein diversity, Pushpanath notes.
There’s still a long road ahead to fully replace petrochemicals with natural enzymes, but biocatalysis is on an upward trajectory. "Eventually, this work will reduce chemical processes,” Bornscheuer says. “We’ll have cleaner processes, more sustainable processes.”