
AI Climate: A Decision Making Tool for Climate Resilience

Updated: Jun 9, 2022

USING AI TO IDENTIFY CLIMATE CHANGE HAZARDS IN THE CITIES OF THE GLOBAL SOUTH


The Accelerator Grant partnership between the Institute for International Urban Development (I2UD) and PJMF, sponsored by GDI Solutions, aims to improve the AI Climate Platform's capacity to use machine learning (ML) to identify socially vulnerable areas and to predict and map flooding hazards, landslides, and land values in Honduras.


By: Carlos Rufín, Alejandra Mortarini, Federico Bayle, and Alfredo Stein.


The AI Climate Platform is a land management and decision-making tool geared to Global South cities that lack local data. Its objective is to cut the time and expense involved in climate hazard mapping, assessment, and prediction. It also includes a community inclusion, validation, and mapping process to draw on local knowledge and to help local communities make sense of AI results and put them to use effectively, key needs in contexts where local governments have limited capacity to produce relevant data and address local hazards. AI Climate processes geospatial images and georeferenced datasets and derives analytics-ready layers of climate change impacts. Under PJMF's Accelerator Grant, we are improving and developing layers to identify flooding and landslide risks, informal urbanization, and land value differentials.
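To illustrate what an analytics-ready layer can look like in practice, the sketch below derives a terrain slope raster from a digital elevation model, a common input for landslide susceptibility mapping. It is a minimal example using open-source Python tools; the file names are placeholders, and this is an illustration rather than the platform's actual pipeline.

```python
# Illustrative sketch: derive a slope layer from a DEM with rasterio + numpy.
# "dem.tif" is a placeholder path; this is not the AI Climate pipeline itself.
import numpy as np
import rasterio

with rasterio.open("dem.tif") as src:
    elevation = src.read(1).astype("float64")
    profile = src.profile
    # Pixel size in the DEM's coordinate units (assumed meters, e.g. a UTM CRS).
    xres, yres = src.res

# Elevation gradients per pixel, scaled by pixel size to get rise over run.
dz_dy, dz_dx = np.gradient(elevation, yres, xres)
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Write the slope raster as a single-band GeoTIFF, ready for GIS analysis.
profile.update(dtype="float32", count=1)
with rasterio.open("slope.tif", "w", **profile) as dst:
    dst.write(slope_deg.astype("float32"), 1)
```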

Graphic 1: Tegucigalpa Area, Honduras

This is a synthesized graphic of the very first round of predictions. It shows a section of Tegucigalpa with informal settlement ground-truth annotations and prediction results. The image also shows flooding susceptibility results.



Data gathering is the most critical aspect of our Accelerator Grant project. AI Climate relies on many types of data, including satellite imagery, weather data, topographic information, and points-of-interest locations, originating from a wide variety of sources, particularly field sources, i.e., highly localized data. The higher the quality of the data, the more accurate the results of the ML process. Our experience with the previous AI Climate Platform proof of concept (PoC) showed us that gathering field data takes considerable time and guidance. Since we are working, by design, in countries with limited existing data, the main thrust of our effort since the award of the grant has been to optimize the gathering and organizing of field data.


We have done this by developing a field data guide and catalog that explain to our field partners the aim and uses of the data within the ML process, the desired project outcomes, the different kinds of data needed, the delivery timeframes, and any additional information that can help our partners understand ML needs and gather useful data. We have then engaged in several rounds of conversations with our field partners, Habitat for Humanity, GOAL, and the Honduran Institute of Earth Sciences (IHCIT), plus technical advisors Alfredo Stein and Carlos Rivas, to ensure mutual understanding and respond to questions about the required work. Our technical partner, Dymaxion Labs, has simultaneously been gathering data from global databases that could be used for the project, drawing on its knowledge and experience from the PoC. Lastly, we have searched for relevant data from sources beyond the field, particularly reports and analyses by organizations in other countries and by international organizations.


To mitigate the impact of any possible delays to our project timeline, Dymaxion Labs has, in parallel to these activities, started preparing to receive and process the data provided by our field partners. Not all partners have GIS domain expertise, so we need to format the incoming data as vector layers and images before we can use them to retrain our ML models.
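As a simplified illustration of this formatting step, the sketch below converts a spreadsheet of field-collected points into a georeferenced vector layer and rasterizes labeled settlement polygons into a mask aligned with a satellite tile. The file and column names are assumptions for the example, not our partners' actual deliverables.

```python
# Illustrative sketch of preparing field data for ML, using geopandas + rasterio.
# File and column names ("field_points.csv", "settlements.gpkg", etc.) are
# placeholders, not the project's actual data.
import geopandas as gpd
import pandas as pd
import rasterio
from rasterio.features import rasterize

# 1) Field points delivered as a CSV with lon/lat columns -> vector layer.
df = pd.read_csv("field_points.csv")  # assumed columns: name, lon, lat
points = gpd.GeoDataFrame(
    df, geometry=gpd.points_from_xy(df.lon, df.lat), crs="EPSG:4326"
)
points.to_file("field_points.gpkg", driver="GPKG")

# 2) Labeled settlement polygons -> binary mask aligned with a satellite tile,
#    suitable for retraining a segmentation model.
polygons = gpd.read_file("settlements.gpkg")
with rasterio.open("satellite_tile.tif") as src:
    polygons = polygons.to_crs(src.crs)
    mask = rasterize(
        [(geom, 1) for geom in polygons.geometry],
        out_shape=(src.height, src.width),
        transform=src.transform,
        fill=0,
        dtype="uint8",
    )
    profile = src.profile

profile.update(count=1, dtype="uint8")
with rasterio.open("settlement_mask.tif", "w", **profile) as dst:
    dst.write(mask, 1)
```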


Our methodology is based on supervised learning techniques, combining global and local data to calibrate the precision of the models against the different data sources and their acquisition costs. All software involved is open source, as a means to ensure that the AI Climate Platform is widely accessible to the large number of cities and communities in the Global South that lack the resources to carry out regular and comprehensive hazard identification. In particular, the programming language we use is Python, with Dymaxion Labs' open-source libraries satproc and unetseg for data processing and modeling. The deep learning framework is TensorFlow, and for GIS analysis and data collection we use QGIS.
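For readers unfamiliar with this kind of model, the sketch below outlines a minimal U-Net-style segmentation network in TensorFlow/Keras, the architecture family from which unetseg takes its name. It is a simplified illustration rather than the actual satproc/unetseg code, and the input size and single-class output are assumptions. In practice, such a model would be trained on image tiles paired with masks like the one produced in the previous step.

```python
# Minimal U-Net-style segmentation sketch in TensorFlow/Keras.
# A simplified illustration of the model family used for hazard mapping;
# not the actual satproc/unetseg implementation. Shapes are assumptions.
import tensorflow as tf
from tensorflow.keras import layers

def conv_block(x, filters):
    """Two 3x3 convolutions, the basic U-Net building block."""
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return x

def build_unet(input_shape=(256, 256, 3), num_classes=1):
    inputs = tf.keras.Input(shape=input_shape)

    # Encoder: downsample while extracting features.
    c1 = conv_block(inputs, 32)
    p1 = layers.MaxPooling2D()(c1)
    c2 = conv_block(p1, 64)
    p2 = layers.MaxPooling2D()(c2)

    # Bottleneck.
    b = conv_block(p2, 128)

    # Decoder: upsample and concatenate encoder features (skip connections).
    u2 = layers.Conv2DTranspose(64, 2, strides=2, padding="same")(b)
    c3 = conv_block(layers.Concatenate()([u2, c2]), 64)
    u1 = layers.Conv2DTranspose(32, 2, strides=2, padding="same")(c3)
    c4 = conv_block(layers.Concatenate()([u1, c1]), 32)

    # Per-pixel probability of the target class (e.g., informal settlement).
    outputs = layers.Conv2D(num_classes, 1, activation="sigmoid")(c4)
    return tf.keras.Model(inputs, outputs)

model = build_unet()
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.BinaryIoU(target_class_ids=[1])])
```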


For planning purposes, and specifically for preparing the tactical roadmap (an updated assessment of the critical activities, timelines, and deliverables required for successful project completion), the critical path of activities follows the timeline needed to obtain the desired results by the reporting deadlines established by the Accelerator Grant program. From these benchmarks, we worked backwards to estimate the critical tasks for results validation, ML model development, data preprocessing, and field data gathering. I2UD's senior project managers have played the most critical role in developing the roadmap, assembling a timeline based on the task completion needs communicated by the technical and field partners, and then negotiating with the partners the dates needed to meet the deadlines.


Graphic 2: Sula Valley Area, Honduras

Communication with local partners is critical. To better communicate, we have developed an informal settlement interactive map for the Sula Valley and for the Tegucigalpa Area. The image shows a snapshot of the interactive map with ground truth data (blue pins with settlement names) and the first round of predictions (red polygons), to facilitate expert and in-situ validation. This is a synthesized graphic of the very first round of predictions.
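As a rough sketch of how such a validation map can be assembled with open-source tools, the snippet below overlays ground-truth pins and prediction polygons on an interactive web map using the folium library. The file and column names are placeholders; this is not necessarily our actual implementation.

```python
# Illustrative sketch of an interactive validation map using folium.
# "ground_truth.geojson" and "predictions.geojson" are placeholder files.
import folium
import geopandas as gpd

gt = gpd.read_file("ground_truth.geojson")    # points with an assumed "name" column
preds = gpd.read_file("predictions.geojson")  # predicted settlement polygons

# Center the map on the ground-truth data.
center = [gt.geometry.y.mean(), gt.geometry.x.mean()]
m = folium.Map(location=center, zoom_start=12)

# Ground truth: blue pins labeled with settlement names.
for _, row in gt.iterrows():
    folium.Marker(
        [row.geometry.y, row.geometry.x],
        popup=str(row["name"]),
        icon=folium.Icon(color="blue"),
    ).add_to(m)

# First-round predictions: red polygons for expert and in-situ validation.
folium.GeoJson(
    preds,
    style_function=lambda feat: {"color": "red", "fillColor": "red",
                                 "fillOpacity": 0.3},
).add_to(m)

m.save("validation_map.html")
```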

For Dymaxion Labs, roadmap development required significant internal coordination as well. Its team is composed of project managers, data scientists, computer vision and remote sensing specialists, and Python developers; the data scientist and project manager roles were key to estimating the time needed to complete the ML model development and the interaction with the partners responsible for validating the results.


To improve the quality of our expected outcomes, we have sought to use both satellite images in the public domain and higher-resolution images typically available only from commercial sources. In view of the high cost of the latter, our approach has been to build relationships with commercial suppliers to request philanthropic use of a limited set of their images, given the non-profit nature of our organization and the project's objectives. When the commercial donor we had arranged for before the project failed to deliver the high-resolution images we sought, we developed an alternative plan. With our technical partners, we agreed on a minimum set of high-resolution images that we could use, gathered estimates of their cost, and requested permission from PJMF to reallocate some of our grant budget for this purpose. At the same time, we approached PJMF to seek their recommendations for other potential commercial partners. PJMF quickly identified another company, Planet, and facilitated contacts with the company so that we could understand their image donation program and formulate our request. This interaction resulted in Planet agreeing to include us in their donation program, and in the signing of a formal agreement in accordance with their protocols.


An essential part of our project is to incorporate local communities’ detailed knowledge of their own conditions, and to ensure that they can make sense of our findings and thus be able to use the findings for their own benefit. Since both I2UD and Dymaxion Labs are located far from the field, in this case Honduras in Central America, our approach has been to rely on local partners that are well established in that area, that produce high-quality work, and that understand and value the nature and objectives of AI Climate. With the help of one of our advisors, Alfredo Stein, we were able to identify three field partner organizations that met our criteria, and through our network of contacts, we approached them to seek their collaboration, with remuneration for their work supported by grant resources. Because of these organizations’ prior work, we were reassured that they had strong connections with local communities, and we incorporated the verification of ML outcomes (“groundtruthing”) with these communities in explicit agreements with each field partner.


Since the start of the grant period, the technical syncs with PJMF have been key to our successful management of the project in several critical ways. The first was helping us understand grant timelines and deliverables, particularly the tactical roadmap. Second, the technical syncs have been key to unlocking the potential of the Cloudera Data Platform (CDP) to host AI Climate algorithms, particularly as we faced a learning curve in setting up CDP for our purposes and transitioning to it from the platform used by Dymaxion Labs. Moreover, our discussions with PJMF during the syncs were not limited to access to computing resources, but also covered various aspects of our data collection strategy as well as deep learning model evaluation. Third, the sync meetings afforded us the chance to raise our concerns about obtaining high-resolution images and to seek PJMF support in this regard, as previously explained. Last but not least, feedback from PJMF staff during the syncs about the global vs. local aspects of our project was very helpful for rethinking the sustainability and scalability of the project and of AI Climate.








