Illustration by Tenzin Chosang

By Madeline Lee

These days, artificial intelligence tools like ChatGPT and Google Gemini are taking the world by storm thanks to their applications in industries like medicine, scientific research, and agriculture. AI also has numerous uses in the environmental and climate change mitigation sector, from charting methane emissions to assisting with precision agriculture. Despite these benefits, however, people should slow down and consider the potential negative impacts of such widely used AI technologies. In particular, AI data centers harm the environment through their enormous energy consumption, which drives up demand for fossil fuels and puts strain on nearby communities.

What exactly are AI data centers?
Have you ever wondered how large-scale generative AI tools are trained? The deep learning models behind popular tools like ChatGPT and DALL-E are trained and deployed in AI data centers. Inside these temperature-controlled buildings are massive amounts of computing infrastructure, including servers and data storage drives. Data centers are not used only for artificial intelligence: Amazon, for one, operates data centers around the world housing over 500,000 servers that power its cloud computing services. While the concept of a data center supporting computer-based services is nothing new, the rapid expansion of artificial intelligence in recent years has generated massive demand for new AI data centers. This is a serious problem, as building more and more AI data centers only exacerbates existing environmental issues.

Increased demand for fossil fuels

So what exactly are the negative environmental impacts of AI data centers? Not only is the number of data centers rapidly increasing, but a generative AI training workload also consumes roughly seven to eight times more energy than typical computing software. Researchers estimate that the electricity consumption of data centers will approach 1,050 terawatt-hours by 2026, an amount on par with the total electricity demand of countries like Japan and Russia. Although this figure includes data centers that are not strictly AI-based, much of the demand for new data centers is driven by the constant creation of new AI technologies, meaning that energy demand will keep climbing steeply in the near future.

The amount of energy required to power AI data centers is simply unsustainable. At AI companies' current rate of expansion, the bulk of the energy powering these centers comes from fossil fuels, which are extremely detrimental to the environment because of the greenhouse effect. The more fossil fuels we burn for electricity, the more greenhouse gases accumulate in the atmosphere, trapping heat radiating from Earth's surface and contributing to global warming. One research paper estimated that the training process for OpenAI's GPT-3 used 1,287 megawatt-hours of electricity, generating around 552 tons of carbon dioxide. This shockingly high number indicates the drastic environmental impact of the AI training taking place in these data centers.
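To make the scale of those numbers more concrete, here is a minimal back-of-envelope sketch (written in Python purely for illustration). The energy and emissions figures are the ones quoted above; the implied grid carbon intensity and the roughly 10 megawatt-hours-per-year household comparison are assumptions used only to put the estimate in perspective.

```python
# Back-of-envelope check of the GPT-3 training figures cited above.
# The energy (1,287 MWh) and emissions (552 t of CO2) are the estimates
# quoted in this article; everything derived below is illustrative only.

TRAINING_ENERGY_MWH = 1_287      # estimated electricity used to train GPT-3
TRAINING_EMISSIONS_T = 552       # estimated metric tons of CO2 produced

# Carbon intensity implied by those two numbers (an assumption, not a
# measured property of any particular power grid).
intensity_t_per_mwh = TRAINING_EMISSIONS_T / TRAINING_ENERGY_MWH
print(f"Implied grid intensity: {intensity_t_per_mwh:.3f} t CO2/MWh "
      f"(~{intensity_t_per_mwh * 1000:.0f} g CO2/kWh)")

# Rough household comparison, assuming an average home uses about
# 10 MWh of electricity per year (a round, assumed figure).
HOUSEHOLD_MWH_PER_YEAR = 10
homes = TRAINING_ENERGY_MWH / HOUSEHOLD_MWH_PER_YEAR
print(f"That training run used roughly the annual electricity of {homes:.0f} homes")
```

Under those assumptions, the implied carbon intensity works out to roughly 430 grams of CO2 per kilowatt-hour, consistent with a fossil-heavy electricity mix, and a single training run uses about as much electricity as well over a hundred homes do in a year.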
AI data centers also drive massive water usage. Data centers require large amounts of water during construction and for cooling electrical equipment. Even though nearly a quarter of the global population already lacks access to clean water, one estimate projected that data centers and other AI-related infrastructure are close to consuming more water than the entire country of Denmark.

Environmental strain on nearby communities

Lastly, AI data centers affect not only worldwide energy consumption but also the resource use of residential communities in their vicinity. The issue is complicated by the short-term benefits that AI data centers bring to small rural towns: oftentimes, the creation of a data center leads to a boom in jobs, as thousands of technicians and construction workers are needed to build and operate it. On the whole, however, the continued expansion of AI data centers has a negative environmental impact.

The immense amounts of electricity and water needed to power the centers often compete with the resource needs of agriculture-based communities. Tech companies are motivated to build AI data centers in rural agricultural areas for easy access to land and power grids, yet they end up straining local resource supplies. A single data center can consume as much water in a day as tens of thousands of households, putting strain on already-depleted water and electricity supplies in smaller, often low-income communities. For example, a data center in the Pacific Northwest currently draws power from dams on the Columbia River, but a climate change-driven reduction in snowpack (the accumulated mountain snow that melts and feeds the reservoirs behind the dams) is shrinking the dams' capacity to generate electricity. In the future, there may not be enough resources to power both communities' agricultural and household needs and the AI data centers.

Furthermore, AI data centers harm people's health by increasing air pollution, which disproportionately affects communities already experiencing the firsthand effects of climate change-related pollution. Many data centers rely on backup diesel generators that emit harmful fine particulate matter (PM2.5), nitrogen oxides, and sulfur dioxide, all of which are linked to increased asthma levels. A research study from the University of California, Riverside and Caltech found that by 2030, data centers may contribute to 600,000 cases of asthma-related health conditions, marking a potential public health crisis driven by the extreme polluting capabilities of these centers.