The report released by the U.N. Intergovernmental Panel on Climate Change on April 4, 2022, delivers a stark message: the fate of our planet hangs in the balance. U.N. Secretary-General Antonio Guterres underscored the urgency, warning that we are rapidly heading toward an unlivable world. According to the report, global temperatures are on track to rise by a devastating 3 degrees Celsius, double the agreed-upon limit of 1.5 degrees Celsius. Averting that outcome requires immediate, drastic action: cutting emissions by 43 percent within the next decade.
Interestingly, in the midst of this crisis, some researchers are exploring whether artificial intelligence (AI), a technology more often framed as a threat in its own right, can help tackle climate change. Could climate change be the catalyst that pushes AI to become more crisis-responsive, focused on solutions to large-scale hazards? Such technology could be precisely what we need at this critical juncture: emergency responses that differ sharply from the profit-driven, bias-amplifying, misinformation-spreading algorithms we are familiar with. By drawing on extensive input from the field and engaged networks of participants, this “climate AI” could reshape the tech ecosystem just as our physical ecosystems face their greatest risks.
This transformative approach is reflected in a report presented at the COP26 climate summit in November 2021 by the Global Partnership on Artificial Intelligence. Written by 15 co-authors, researchers and activists from 12 countries, the report argues that while we must remain vigilant against racial and gender biases and the tendency of big data to perpetuate inequalities, AI can still play a crucial role in prediction, mitigation, and adaptation. Ignoring this potential would be a grave mistake.
Timed to coincide with the COP26 report, a visualization platform called “This Climate Does Not Exist,” developed by a group of motivated young researchers, shows how a technology better known for harm can be repurposed to give the general public a personal, immersive, and unforgettable experience of climate change. Using the kind of machine learning model behind deepfakes, the fabricated, hyperrealistic videos in which visual and audio likenesses are swapped, the platform generates lifelike depictions of floods or wildfires for any given street address.
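Under the hood, this kind of system is an exercise in image-to-image translation: a generator network takes an ordinary street-level photo and renders a plausible flooded version of the same scene. The sketch below illustrates that idea in PyTorch; the weights file and input photo are hypothetical stand-ins, and the project’s actual model and training data are not reproduced here.

```python
# A sketch of the image-to-image idea, not the project's actual model.
# "flood_generator.pt" and "my_street.jpg" are hypothetical placeholders.
import torch
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize((256, 256)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5] * 3, std=[0.5] * 3),  # scale to [-1, 1]
])

street = Image.open("my_street.jpg").convert("RGB")
batch = preprocess(street).unsqueeze(0)            # shape: (1, 3, 256, 256)

generator = torch.jit.load("flood_generator.pt")   # a TorchScript generator
generator.eval()

with torch.no_grad():
    flooded = generator(batch)                     # same shape as the input

# Undo the [-1, 1] normalization and save the rendered scene.
flooded = (flooded.squeeze(0).clamp(-1, 1) + 1) / 2
transforms.ToPILImage()(flooded).save("my_street_flooded.jpg")
```

In a platform like this one, the street-level image would be fetched automatically for whatever address a visitor types in; the generation step itself looks much like the above.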
Lead researcher Sasha Luccioni, a postdoctoral fellow at the Quebec Artificial Intelligence Institute, highlights the fact that we have grown somewhat indifferent to climate disasters when they befall strangers. However, when we witness our own homes submerged in several feet of water, it becomes impossible to ignore the severity of the situation.
While the true impact of “This Climate Does Not Exist” remains to be seen, one thing is evident: AI has an astonishing ability to adapt and pivot to entirely new demands, especially in times of crisis. The most significant advances in this field often come from practitioners mobilized by the imminent threat of disaster. Initiatives at Microsoft and NASA are two notable examples, and both underscore that ambitious climate goals demand a kind of participatory democracy within AI: networks of local innovators who know their regions deeply and act on that knowledge with urgency.
At the center of Microsoft’s AI for Earth program is the construction of a planetary computer. First proposed in 2019 by the company’s chief environmental officer, Lucas Joppa, it is designed to function as a geospatial search engine that speeds up climate decision-making and helps head off environmental catastrophes. To do so, it aggregates data from sources including NASA, NOAA, and the European Space Agency, along with data gathered through partnerships such as the collaboration among the U.K. Met Office, the China Meteorological Administration, and the Institute of Atmospheric Physics within the Chinese Academy of Sciences.
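Microsoft publishes this aggregated archive behind a standard SpatioTemporal Asset Catalog (STAC) API, so a few lines of code can search petabytes of satellite data by place, time, and cloud cover. A minimal sketch, assuming the pystac-client and planetary-computer Python packages, with the Sentinel-2 collection and bounding box chosen purely for illustration:

```python
# A sketch of querying the Planetary Computer's STAC catalog (assumes the
# pystac-client and planetary-computer packages are installed).
import planetary_computer
import pystac_client

catalog = pystac_client.Client.open(
    "https://planetarycomputer.microsoft.com/api/stac/v1",
    modifier=planetary_computer.sign_inplace,  # signs asset URLs for download
)

# Illustrative search: Sentinel-2 scenes over a small area in June 2022,
# keeping only mostly cloud-free acquisitions.
search = catalog.search(
    collections=["sentinel-2-l2a"],
    bbox=[-122.35, 47.50, -122.20, 47.65],   # west, south, east, north
    datetime="2022-06-01/2022-06-30",
    query={"eo:cloud_cover": {"lt": 20}},
)

for item in search.items():
    print(item.id, item.properties["eo:cloud_cover"])
```

Each returned item points at the underlying imagery, so downstream analyses can pull exactly the scenes they need.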
Relying solely on state-of-the-art data sets, however, is not enough. Microsoft President Brad Smith made this point in a 2020 blog post: the planetary computer depends on active contributions from crowdsourced networks, including Microsoft grantees such as iNaturalist, a mobile app platform, now available in 37 languages, that lets amateur ecologists upload and share biodiversity information. Something this complex, Smith noted, is not something Microsoft can build alone; it depends on the work, and the demands, of its grantees. To genuinely help endangered ecosystems, AI needs engaged communities as much as it needs multinational datasets.
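Those crowdsourced records are themselves open to anyone through iNaturalist’s public API, which is part of what makes an app of amateur observations useful as scientific infrastructure. A minimal sketch, with the species and filter values chosen purely as examples:

```python
# A sketch of pulling community-verified observations from iNaturalist's
# public API; the species and parameters below are illustrative.
import requests

resp = requests.get(
    "https://api.inaturalist.org/v1/observations",
    params={
        "taxon_name": "Danaus plexippus",  # monarch butterfly
        "quality_grade": "research",       # community-verified records only
        "per_page": 5,
    },
    timeout=30,
)
resp.raise_for_status()

for obs in resp.json()["results"]:
    print(obs.get("observed_on"), obs.get("place_guess"), obs["taxon"]["name"])
```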
The newly announced Multi-Mission Algorithm and Analysis Platform, a collaboration between NASA and the European Space Agency, is built on the same principle. This open-source project will initially focus on data about forest biomass, which matters for monitoring climate change because forests absorb carbon as they grow and release it when they burn or decay. Because forest types vary widely from region to region and the data comes from diverse sources, including satellite instruments, the International Space Station, and airborne and ground campaigns, the role of AI is to merge these disparate data sets and establish interoperability among them.
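In practice, establishing that interoperability often begins with something as basic as putting two biomass estimates on a common grid so they can be compared at all. The sketch below illustrates the idea with xarray, using synthetic arrays as stand-ins for a coarse satellite product and a finer airborne survey:

```python
# A sketch of harmonizing two gridded biomass estimates onto a shared grid.
# The arrays are synthetic stand-ins for real data products.
import numpy as np
import xarray as xr

# Coarse "satellite" biomass map (tonnes per hectare) on a 0.5-degree grid.
coarse = xr.DataArray(
    np.random.default_rng(0).uniform(50, 300, (20, 20)),
    dims=("lat", "lon"),
    coords={"lat": np.linspace(-5, 5, 20), "lon": np.linspace(10, 20, 20)},
    name="biomass",
)

# Finer "airborne campaign" map over the same region on a 0.1-degree grid.
fine = xr.DataArray(
    np.random.default_rng(1).uniform(50, 300, (100, 100)),
    dims=("lat", "lon"),
    coords={"lat": np.linspace(-5, 5, 100), "lon": np.linspace(10, 20, 100)},
    name="biomass",
)

# Interpolate the coarse product onto the fine grid and compare the two.
coarse_on_fine = coarse.interp_like(fine)
difference = fine - coarse_on_fine
print(f"mean disagreement: {float(difference.mean()):.1f} t/ha")
```

Real harmonization also has to reconcile units, uncertainties, and acquisition dates; the gridding step, though, looks essentially like this.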
Instead of imposing a single, supposedly universal metric worldwide, the input protocol established by NASA and ESA lets research communities keep their diversity, working in their own languages and with their own methodologies. The cloud-based AI platform then integrates the data analytically and makes it accessible to everyone. This input-friendly architecture goes a step further by encouraging researchers to collaboratively develop algorithms and code tailored to specific project needs, which are then shared in an open-access code repository. The program is designed to minimize barriers to participation so that anyone interested in environmental data can contribute, turning the platform into a vehicle for on-the-ground vigilance and ingenuity.
Given that climate change impacts each location differently, the success of AI innovations relies on a broad range of inputs as an empirical foundation. This foundation serves to motivate, validate, and diversify efforts. This subset of artificial intelligence, which leverages an unprecedented crisis to foster solidarity, stands in stark contrast to social media algorithms, facial recognition software, and autonomous weapon systems. The U.N. report highlights the technological feasibility of charting a noncatastrophic future, and “climate AI” could play a vital role in achieving this objective. The responsibility lies with us to harness its potential to the fullest extent.