Generative AI, the remarkable technology behind the creation of language and image content, has gained immense popularity worldwide. That advance, however, comes at a significant environmental cost in the form of a substantial carbon footprint. Nonetheless, not all AI systems are equally detrimental to the environment.
AI chatbots and image generators rely on thousands of computers housed in data centers such as the Google facility in Oregon. The cutting-edge generative AI behind these tools has raised concerns about its impact on the planet.
As an AI researcher, I frequently contemplate the energy costs of developing artificial intelligence models. The more powerful the AI model, the more energy it consumes. This raises the question of what increasingly powerful generative AI models will mean for society’s future carbon footprint.
Image generated on USP.ai
The term “generative” refers to an AI algorithm’s ability to produce complex data; “discriminative” AI, by contrast, chooses among a fixed set of options and yields only a single outcome. For example, a discriminative AI system might decide whether to approve a loan application.
Generative AI is capable of generating highly complex outputs, such as sentences, paragraphs, images, or even short videos. Its applications range from smart speakers that generate audio responses to autocomplete systems suggesting search queries. More recently, generative AI has achieved the ability to produce human-like language and realistic photos.
QUANTIFYING THE ENERGY CONSUMPTION
Precisely estimating the energy cost of a single AI model is challenging, as it includes the energy used to manufacture the computing equipment, to develop the model itself, and to operate it once deployed. In 2019, researchers found that training a generative AI model called BERT, which has 110 million parameters, consumed about as much energy as one person taking a round-trip transcontinental flight. The number of parameters indicates the size of the model, and larger models are typically more capable. The much larger GPT-3, with 175 billion parameters, was estimated to have consumed 1,287 megawatt-hours of electricity during training and to have generated 552 tons of carbon dioxide equivalent, roughly the emissions of 123 gasoline-powered passenger vehicles driven for one year. Notably, these figures cover only the model’s development phase, before any user ever interacts with it.
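As a rough sanity check on that comparison, here is a minimal back-of-the-envelope sketch in Python. The per-vehicle figure of about 4.6 metric tons of CO2-equivalent per year is an assumption drawn from commonly cited averages, not a number given above:

```python
# Back-of-the-envelope check of the GPT-3 training comparison.
# Figures from the text: 1,287 MWh of electricity, 552 t CO2-equivalent.
# Assumed (not from the text): ~4.6 t CO2e per gasoline-powered car per year.

gpt3_training_emissions_t = 552        # metric tons CO2e, from the text
car_emissions_per_year_t = 4.6         # assumed average per passenger vehicle

equivalent_cars = gpt3_training_emissions_t / car_emissions_per_year_t
print(f"Roughly {equivalent_cars:.0f} car-years of driving")  # ~120, close to the 123 cited
```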
Image generated on USP.ai
Carbon emissions are not solely dictated by model size. The BLOOM model, which is similar in size to GPT-3 and developed by the BigScience project in France, has a significantly lower carbon footprint. It consumed 433 MWh of electricity, resulting in 30 tons of CO2eq emissions. Moreover, a study conducted by Google revealed that by employing a more efficient model architecture, utilizing greener data centers, and optimizing processors, the carbon footprint of a model of the same size can be reduced by a factor of 100 to 1,000.
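A simple way to see why size alone does not determine emissions is to compare the effective carbon intensity implied by these figures. The short sketch below just divides each model’s reported emissions by its reported electricity use; both numbers come from the text above:

```python
# Effective carbon intensity implied by the figures quoted above:
# tons of CO2e emitted per MWh of electricity consumed during training.

models = {
    # name: (electricity in MWh, emissions in t CO2e), both from the text
    "GPT-3": (1287, 552),
    "BLOOM": (433, 30),
}

for name, (mwh, tons) in models.items():
    intensity = tons / mwh  # t CO2e per MWh
    print(f"{name}: {intensity:.2f} t CO2e per MWh")

# GPT-3: ~0.43 t/MWh vs. BLOOM: ~0.07 t/MWh -- a roughly sixfold gap that
# reflects the electricity mix and data-center efficiency, not model size.
```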
While larger models do consume more energy when deployed, there is limited data on the carbon footprint of a single generative AI query. However, some industry experts estimate that it may be four to five times higher than that of a search engine query. As chatbots and image generators gain popularity, and as Google and Microsoft build AI language models into their search engines, the number of queries these systems receive each day could grow dramatically.
Image generated on USP.ai
AI SEARCH BOTS
Just a few years ago, models like BERT and GPT were mainly confined to research labs and saw limited use by the general public. That changed on November 30, 2022, when OpenAI released ChatGPT. As of March 2023, ChatGPT had amassed over 1.5 billion visits. Microsoft also integrated ChatGPT into its Bing search engine and made it accessible to all users on May 4, 2023. If chatbots become as popular as search engines, the energy costs of deploying these AI systems could escalate significantly. AI assistants, however, have many uses beyond search, such as writing content, solving math problems, and creating marketing campaigns.
Another challenge is that AI models need continual updates. ChatGPT, for instance, was trained on data only up to 2021, so it knows nothing of events after that. The carbon footprint of developing ChatGPT has not been publicly disclosed, but it is likely higher than that of GPT-3. If the model must be regularly retrained to keep its knowledge current, the energy costs would climb even higher.
Image generated on USP.ai
On the positive side, interacting with a chatbot can offer a more direct route to obtaining information compared to using a search engine. Rather than receiving a page filled with links, users can receive a direct answer akin to human interaction, assuming issues of accuracy are adequately addressed. The potential time saved in accessing information swiftly might counterbalance the increased energy usage in comparison to a search engine.
PATHS AHEAD
Predicting the future is challenging, but large generative AI models are clearly here to stay, and people will increasingly rely on them for information. For instance, students who currently turn to tutors, friends, or textbooks for help with math problems may well turn to chatbots instead. The same applies to specialized knowledge such as legal or medical advice.
While a single large AI model may not be detrimental to the environment, if thousands of companies develop slightly different AI bots for various purposes, each serving millions of users, the energy consumption could indeed become a concern. Further research is necessary to enhance the efficiency of generative AI.
Image generated on USP.ai
THE POSITIVE OUTLOOK
The encouraging news is that AI can run on renewable energy. By placing computations in regions with abundant green energy, or scheduling them for times of day when renewable energy is plentiful, emissions can be cut by a factor of 30 to 40 compared with running on grids dominated by fossil fuels.
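The arithmetic behind that factor is simple: for a fixed amount of computation, emissions scale with the carbon intensity of the electricity that powers it. Below is a minimal sketch; the two grid intensities are illustrative assumptions, not measured values from the text:

```python
# Emissions = energy consumed x carbon intensity of the electricity supply.
# The grid intensities below are illustrative assumptions, not measurements.

energy_mwh = 1_000                      # a fixed hypothetical training workload
fossil_grid_kg_per_kwh = 0.8            # assumed coal/gas-heavy grid
green_grid_kg_per_kwh = 0.02            # assumed hydro/wind/solar-rich grid

fossil_emissions_t = energy_mwh * 1_000 * fossil_grid_kg_per_kwh / 1_000
green_emissions_t = energy_mwh * 1_000 * green_grid_kg_per_kwh / 1_000

print(f"Fossil-heavy grid: {fossil_emissions_t:.0f} t CO2e")   # 800 t
print(f"Low-carbon grid:   {green_emissions_t:.0f} t CO2e")    # 20 t
print(f"Reduction factor:  {fossil_emissions_t / green_emissions_t:.0f}x")  # ~40x
```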
Furthermore, societal pressure can play a constructive role in urging companies and research labs to disclose the carbon footprints of their AI models, as some are already doing. In the future, consumers might even utilize this information to choose a “greener” chatbot option.