Hard work pays off

Victor Barzana
Oct 4, 2020 · 7 min read
Hazard Busters logo created by Irina Ivanova

I haven’t posted in a while, but I can assure you I have been very busy. This weekend I had the fantastic opportunity to join the NASA International Space Apps hackathon. Trust me, this is not a normal hackathon: the International Space Apps Challenge is an international mass collaboration focused on space exploration that takes place over 48 hours in cities around the world. As soon as I saw this event was running this year in Ventspils, Latvia, there was no way in the world I would miss this opportunity.

The event embraces collaborative problem solving with the goal of producing relevant open-source solutions that address global needs, applicable to both life on Earth and life in space. This year there were over 25 challenges in four areas: Earth, Outer Space, Humans, and Robotics. NASA leads this global collaboration along with a number of government collaborators and over 100 local organizing teams across the globe.

If you are patient enough, in this article I will tell you about the team, the challenge, how it feels, the problems we faced, how we approached them with a really interesting solution, and, last but not least, how it all ended.

The team

The team-building phase is always hard: there are people from different backgrounds, but the thing is, we have to code something. No code = no win, unless you have a one-in-a-million idea. After searching around among the participants, we came up with a wonderful team of engineers:

  • Irina Ivanova (UI/UX designer): Irina has broad knowledge of software development; she can wireframe, create logos, and much more in the blink of an eye.
  • Victor Antonio Barzana Crespo (Senior Software Engineer): with years of experience in the IT field, I love taking on challenges, especially if they bring something good to humanity.
  • Mahabir Gupta (Data scientist & AI specialist): take a minute and look at this man’s CV. I was personally impressed by how much experience Mahabir has; he proved his knowledge to all of us and didn’t hesitate for a minute to come up with the solution our project needed.
  • Shenaha Sivakumar (Specialist in Geoinformatics): a map expert, someone with a clear understanding of what satellite data output looks like. BTW, she’s looking for a job; if you read this, go and give her a chance in Riga. Trust me, she’s good!
  • Shravan Koundinya Vutukuru (Researcher at the Mechanical Engineering Department, Renewable Energy Technologies): if you need help sending your rocket to space, this is your man.

So, we gave everyone a shovel and a helmet and jumped right into the challenge. Ah, and I almost forgot: we called our team “Hazard Busters”. All the data about our project submission and challenge can be found here: https://2020.spaceappschallenge.org/challenges/inform/automated-detection-hazards/teams/hazard-busters/project

The challenge

On Friday, October 2nd, 2020 we fired up our engines and picked the challenge that best matched our combined skill sets.

Automated Detection of Hazards

Countless phenomena such as floods, fires, and algae blooms routinely impact ecosystems, economies, and human safety. Your challenge is to use satellite data to create a machine learning model that detects a specific phenomenon and build an interface that not only displays the detected phenomenon, but also layers it alongside ancillary data to help researchers and decision-makers better understand its impacts and scope.

A few hours after we started, we already knew which data comes from where, which satellite gives you which data, and so on. But what we hadn’t figured out yet was what we were going to do with all this data. At this point about 16 hours of the challenge had passed, and we had only a vague idea of how we would help researchers or who our software was intended for.

We met with Linda to understand our challenge

Luckily, on Saturday afternoon we got to talk to one of the experts, Linda Gulbe (researcher and lecturer at Ventspils Augstskola). If you want to know anything about satellite data, you have to ask Linda. She explained to us, among other things, that a real-time satellite view is not quite there yet, and that when there are clouds, satellites are blind, among tons of other cool tips. Here is where our real journey started: how we would create the software, the AI algorithm, and the training model, and who all of this was intended for.

But what problem did we choose to solve?

Making the best use of NASA satellite data, our team implemented a solution that uses a CNN (convolutional neural network) deep learning model to detect or predict potential natural hazards. First, the model is trained with existing satellite images, and the prediction output is stored in a database that is synchronized with the frontend web application. This is an interactive app that pushes notifications about the areas they are interested in to researchers, insurance companies, landowners, and other parties, allowing them to make decisions about potential hazards.

Satellite resolution & evolution

The reason we went this way and not the real-time notification route is the table below. Picture, for instance, the best-case scenario, where 1 km is the minimum grid size: for us to find a red dot on the screen, or altered temperature levels, 1 km of land would already have to have burned. So rather than informing in real time, we would be informing in really-bad time.

Satellite resolution table (source: https://smap.jpl.nasa.gov/data/)

So, how did we address this challenge?

  • Researched the favourable conditions for hazards and downloaded related satellite data
  • Created a CNN deep learning model
  • Trained the model with satellite images
  • Validated the output against past fire-prone areas to check the accuracy, and tuned the parameters accordingly
  • Stored the output in GeoJSON format in a database accessible by the web application
  • Wireframed and built a web and mobile app with the Ionic Framework that allows users to interact with, modify, dismiss, or raise alerts about a fire outbreak
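The GeoJSON storage step can be made concrete with a small sketch. This is a minimal illustration, not the project’s actual code, of how a detected fire region could be wrapped as a GeoJSON Feature before being written to the database (the function name, coordinates, and confidence value are made up):

```python
import json

def hazard_to_geojson(hazard_type, confidence, polygon_coords):
    """Wrap a detected hazard region as a GeoJSON Feature.

    polygon_coords: list of (lon, lat) pairs; the first and last pair
    must match so the polygon ring is closed, per the GeoJSON spec.
    """
    return {
        "type": "Feature",
        "geometry": {
            "type": "Polygon",
            # GeoJSON uses [lon, lat] ordering and a list of rings
            "coordinates": [[list(p) for p in polygon_coords]],
        },
        "properties": {
            "hazard": hazard_type,
            "confidence": confidence,  # model output probability
        },
    }

# Hypothetical detection near Ventspils, Latvia
ring = [(21.56, 57.39), (21.60, 57.39), (21.60, 57.41),
        (21.56, 57.41), (21.56, 57.39)]
feature = hazard_to_geojson("fire", 0.87, ring)
print(json.dumps(feature))
```

Because the payload is plain JSON, the same object can be pushed to the database and rendered directly by any map library on the frontend.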

How We Developed This Project

We used a deep learning model with an object-detection algorithm to find hotspots in the satellite data. Moreover, we used computer vision and the OpenCV libraries for satellite image processing.

Once the output is generated, the data is stored as GeoJSON in the Firebase database, so any update is centrally triggered in real time in Firebase. Our server-less code then handles this with custom functions and notifies, via push notification, the clients that have the mobile app installed. For the learning model we used Python, Flask, Google Colab, TensorFlow, GDAL, and so on.
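As a rough illustration of the image-processing step, here is a simplified NumPy stand-in for what the OpenCV pipeline does in practice: threshold a brightness-temperature raster and return the pixel coordinates of anomalously hot cells. The grid values and the 330 K threshold are made-up assumptions, not the project’s real parameters:

```python
import numpy as np

def detect_hotspots(temperature, threshold_kelvin=330.0):
    """Return (row, col) indices of pixels hotter than the threshold.

    In the real pipeline this step uses OpenCV on satellite imagery;
    a plain NumPy mask stands in for it here.
    """
    rows, cols = np.nonzero(temperature > threshold_kelvin)
    return [(int(r), int(c)) for r, c in zip(rows, cols)]

# Toy 4x4 brightness-temperature grid (Kelvin) with two "burning" pixels
grid = np.full((4, 4), 300.0)
grid[1, 2] = 345.0
grid[3, 0] = 360.0
print(detect_hotspots(grid))  # [(1, 2), (3, 0)]
```

The pixel indices, combined with the raster’s georeferencing, are what ultimately become the coordinates in the GeoJSON output.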

For the consumer application of this project (a hybrid mobile app), we chose the following technologies:

  • React.js: Frontend JavaScript Framework
  • Ionic Framework, allowing the same source code to be served as a web application or deployed to the Google Play Store or the Apple App Store.
  • Google Firebase (Server-less) with real-time database and push notifications.
  • Google Maps API, although we will be moving to an OpenStreetMap server to save costs in the near future.
  • As the data communication format, we have chosen GeoJSON.
  • The user is able to authenticate either with username/password or with Google.

How We Used Space Agency Data in This Project

  • Landsat-8 and MODIS data were used to detect vegetation (boreal forest), moisture content, land use/land cover, and fire regions, using indices like NDVI, NDMI, SPI, and NBR.
  • We combined Landsat-8 data with CNES and CSA datasets.
  • With the ASTER DEM we obtained slope and elevation information.
  • Using the GOES-16 and GOES-17 satellites, the abnormal change in temperature near a burning region was identified, detecting the source of a fire within a few minutes of its outbreak.
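The indices mentioned above are simple band algebra. Here is a minimal NumPy sketch of NDVI and NBR using the standard Landsat-8 band roles (band 4 red, band 5 NIR, band 7 SWIR2); the reflectance values are made up for illustration:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def nbr(nir, swir2):
    """Normalized Burn Ratio: (NIR - SWIR2) / (NIR + SWIR2)."""
    return (nir - swir2) / (nir + swir2)

# Made-up surface-reflectance values for two pixels:
# healthy vegetation vs. recently burned ground.
nir   = np.array([0.40, 0.15])  # Landsat-8 band 5
red   = np.array([0.05, 0.12])  # Landsat-8 band 4
swir2 = np.array([0.10, 0.30])  # Landsat-8 band 7

print(ndvi(nir, red))   # high NDVI means dense vegetation, low means bare/burned
print(nbr(nir, swir2))  # NBR drops sharply (even goes negative) after a burn
```

Rasters of these index values, rather than raw bands, are typically what gets fed to the model as training input.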

In the future, we plan to add social indicators as well, which will help us better estimate the damage and losses incurred. When extending the system to detect other hazards, we plan to include additional satellite images that help us predict each particular hazard accurately.

Project Demo

  • To demonstrate our application, the frontend and backend coordinate with each other. In the frontend, the user logs in to the app and selects a certain region they want to monitor. The app then sends the coordinates to the backend so the analysis can take place, and the model tries to find hazards within that region or within the configured radius.
  • Currently, the application is under development; however, some of the source code of the training model and the AI side can be found in the GitHub repository.
  • Please see the detailed PowerPoint presentation with specific pictures, wireframe and other details here: https://docs.google.com/presentation/d/1TleRe3S3Pc545lq1g82-bnOumf26jqF67_92lfjc4ZY/edit?usp=sharing
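The region-monitoring part of the demo flow can be sketched as follows. The function names and the haversine-based filter are my own illustration of the idea, not the project’s actual backend code: given the coordinates the user sends plus a monitoring radius, keep only the detections inside that circle:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def hazards_in_radius(user_lat, user_lon, radius_km, hazards):
    """Filter hazard detections to those inside the monitored circle.

    hazards: list of dicts like {"lat": ..., "lon": ..., "type": ...}
    """
    return [h for h in hazards
            if haversine_km(user_lat, user_lon, h["lat"], h["lon"]) <= radius_km]

# Hypothetical detections; the user monitors 50 km around Ventspils (57.39 N, 21.56 E)
detections = [
    {"lat": 57.41, "lon": 21.60, "type": "fire"},   # a few km away: kept
    {"lat": 56.95, "lon": 24.11, "type": "flood"},  # near Riga, far outside: dropped
]
print(hazards_in_radius(57.39, 21.56, 50.0, detections))
```

Only the detections that survive this filter would trigger a push notification to that user’s device.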

Project Code

https://github.com/mahavir9008/HazardBusters; https://github.com/vbarzana/hazard-busters

🥳 Our team was selected as Global Nominees

  • Alongside another team, we were selected to participate in the global NASA competition.
  • That is the passageway from the local level straight to the global NASA competition.
  • And we even made some money. This was the best challenge I have been in so far, no doubt about it.

Conclusions

  • You may think of this as a funny scene where we delivered two projects at 10–30% completion. We reused things, copied code, and the app is buggy and not fully functional. But the truth is that in two days you barely get time to fill in a PowerPoint presentation; trust me, by the time you want to start writing the code, there’s just no time left.
  • With this project, I had the opportunity to meet and work together with amazing engineers.
  • You are not going to believe how many APIs NASA has. This weekend was a complete dive into a new world that I am not used to, AI, satellite data, NASA. I have enjoyed every bit of this challenge and I am looking forward to participating in any other upcoming events.
  • Never give up. We thought we had lost everything after nearly a full day wasted, but we came up with the solution on the second day, still managed to write some code, and were selected as Global Nominees.
