
⏱️ 2023 Solution Challenge: Golden Hour ⏱️

What is Golden Hour?

Natural disasters now occur about five times as often worldwide as they did 50 years ago, and each year more people are injured or killed by them.

One of the reasons for the high number of casualties is the lack of readily available information on proper first aid and disaster action tips.

To solve this problem, we created Golden Hour, an app that provides safety guides for first aid and disaster action tips.

The Meaning of Golden Hour

The golden hour is the first hour after a traumatic injury occurs, considered the most critical window for successful emergency treatment.

Download

Golden Hour v1.0.0

Features

Safety Guide

  • It provides safety guides in a slide webtoon format of images and text, showing how to respond to emergencies such as cardiac arrest (requiring CPR) and airway obstruction.
  • It shows the CPR compression point using ML Kit Pose Detection to assist with effective CPR.
  • It marks hemostasis points using TFLite object detection to assist with stopping bleeding.

Disaster Action Tip

  • Based on the received disaster alert message, it provides action tips for the current disaster situation in the same slide webtoon format of images and text.
  • Users can review their pre-registered emergency contacts and relief supplies through a checklist.

Emergency Report

  • It provides buttons to call the police, fire department, or civil authorities for a quick report.
  • It provides the user's current location so that it can be included in the report.
  • A question-and-answer format lets users quickly and easily fill in the necessary information and submit an emergency text report, as sketched below.
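
Below is a minimal Kotlin sketch of how such a report flow could be wired up on Android, assuming the standard dial and SMS intents plus FusedLocationProviderClient; the emergency number, message text, and helper names are illustrative placeholders rather than the actual GoldenHour code.

```kotlin
import android.annotation.SuppressLint
import android.content.Intent
import android.net.Uri
import androidx.appcompat.app.AppCompatActivity
import com.google.android.gms.location.LocationServices

class EmergencyReportSketch(private val activity: AppCompatActivity) {

    // Open the dialer pre-filled with an emergency number (the user still taps "call").
    fun dial(number: String) {
        activity.startActivity(Intent(Intent.ACTION_DIAL, Uri.parse("tel:$number")))
    }

    // Attach the last known location to a pre-filled SMS report.
    // Assumes location permission has already been granted elsewhere in the app.
    @SuppressLint("MissingPermission")
    fun sendTextReport(number: String, answers: String) {
        val client = LocationServices.getFusedLocationProviderClient(activity)
        client.lastLocation.addOnSuccessListener { location ->
            val body = buildString {
                append(answers) // text assembled from the question/answer screens
                location?.let { append("\nLocation: ${it.latitude}, ${it.longitude}") }
            }
            val intent = Intent(Intent.ACTION_SENDTO, Uri.parse("smsto:$number")).apply {
                putExtra("sms_body", body)
            }
            activity.startActivity(intent)
        }
    }
}
```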

Safety Map

  • It shows the location of nearby safety amenities on Google Maps based on the user's current location (see the sketch after this list).
  • Safety amenities provided include hospitals/emergency rooms, pharmacies, AEDs, fire extinguishers, shelters, and transitional housing.
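
A minimal sketch of how such amenities could be plotted with the Maps SDK, assuming a GoogleMap instance obtained from a SupportMapFragment; the Amenity class and how the list is loaded are assumptions for illustration.

```kotlin
import com.google.android.gms.maps.CameraUpdateFactory
import com.google.android.gms.maps.GoogleMap
import com.google.android.gms.maps.model.LatLng
import com.google.android.gms.maps.model.MarkerOptions

// Illustrative data model; the real app may load this from its backend.
data class Amenity(val name: String, val type: String, val lat: Double, val lng: Double)

fun showAmenities(map: GoogleMap, userLocation: LatLng, amenities: List<Amenity>) {
    // Center the camera on the user's current position.
    map.moveCamera(CameraUpdateFactory.newLatLngZoom(userLocation, 15f))

    // One marker per amenity (hospital/ER, pharmacy, AED, fire extinguisher, shelter, ...).
    amenities.forEach { amenity ->
        map.addMarker(
            MarkerOptions()
                .position(LatLng(amenity.lat, amenity.lng))
                .title(amenity.name)
                .snippet(amenity.type)
        )
    }
}
```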

Demo Video

Project Structure

(golden-hour architecture diagram)

Detailed implementations using Google technologies

Android

  • We made it possible for users signed in with a Google account to register, view, edit, and delete relief supplies and emergency contacts for disasters or emergencies.
  • We used Google Maps to quickly locate nearby hospitals, emergency rooms, pharmacies, and AEDs based on the user's current location.
  • Using ML Kit's Pose Detection model and CameraX, we made it possible to detect the upper body of a person who has collapsed from cardiac arrest and display the CPR compression point on screen. We then made the model measure the angle of the rescuer's arm and sound an alert whenever it drops below 120 degrees, helping CPR proceed with a more correct posture (see the sketch after this list).
  • Using a custom TFLite object detection model and CameraX, we detect bleeding or wounded areas and mark the center of each detected area as a hemostasis point.
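
The arm-angle check described above could look roughly like the Kotlin sketch below, assuming frames arrive from a CameraX ImageAnalysis pipeline as ML Kit InputImage objects; the left-arm landmark choice, the angle helper, and the onBadPosture callback are illustrative assumptions, not the exact GoldenHour implementation.

```kotlin
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.pose.PoseDetection
import com.google.mlkit.vision.pose.PoseLandmark
import com.google.mlkit.vision.pose.defaults.PoseDetectorOptions
import kotlin.math.abs
import kotlin.math.atan2

private val detector = PoseDetection.getClient(
    PoseDetectorOptions.Builder()
        .setDetectorMode(PoseDetectorOptions.STREAM_MODE) // live CameraX frames
        .build()
)

// Called for each analyzed frame; onBadPosture is a placeholder alert hook.
fun analyze(image: InputImage, onBadPosture: () -> Unit) {
    detector.process(image).addOnSuccessListener { pose ->
        val shoulder = pose.getPoseLandmark(PoseLandmark.LEFT_SHOULDER)
        val elbow = pose.getPoseLandmark(PoseLandmark.LEFT_ELBOW)
        val wrist = pose.getPoseLandmark(PoseLandmark.LEFT_WRIST)
        if (shoulder != null && elbow != null && wrist != null) {
            val angle = angleBetween(shoulder, elbow, wrist)
            if (angle < 120.0) onBadPosture() // e.g. play an alert sound
        }
    }
}

// Angle at the elbow formed by shoulder-elbow-wrist, in degrees.
private fun angleBetween(a: PoseLandmark, b: PoseLandmark, c: PoseLandmark): Double {
    val ab = atan2((a.position.y - b.position.y).toDouble(), (a.position.x - b.position.x).toDouble())
    val cb = atan2((c.position.y - b.position.y).toDouble(), (c.position.x - b.position.x).toDouble())
    var deg = Math.toDegrees(abs(ab - cb))
    if (deg > 180.0) deg = 360.0 - deg
    return deg
}
```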

BackEnd

  • To provide the backend to the client Android app in the form of a REST API, we used several services from Google Cloud Platform (GCP).

  • To create, read, update, and delete data, we used MySQL, a relational database engine. We hosted it on Cloud SQL, a fully managed relational database service provided by GCP, so that MySQL is accessible to the Spring project via Spring Data JPA.

  • To handle communication via the REST API, we used Spring, a Java web framework. We deployed it on Google Compute Engine, a virtual machine service provided by GCP, so the Android app can access the REST API through Retrofit (see the sketch after this list).

  • To store and serve the slide images for the first aid and safety guides, we used Google Cloud Storage, a managed service for storing unstructured data provided by GCP, and serve each slide image through a public URL.
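
A minimal Kotlin sketch of the Retrofit client on the Android side, assuming a Gson converter and coroutine (suspend) endpoints; the base URL, paths, and DTO fields are placeholders rather than the real GoldenHour API.

```kotlin
import retrofit2.Retrofit
import retrofit2.converter.gson.GsonConverterFactory
import retrofit2.http.Body
import retrofit2.http.GET
import retrofit2.http.POST

// Placeholder DTO; the real schema lives in the Back-End repository.
data class ReliefSupplyDto(val id: Long?, val name: String, val quantity: Int)

interface GoldenHourApi {
    @GET("supplies")
    suspend fun getSupplies(): List<ReliefSupplyDto>

    @POST("supplies")
    suspend fun addSupply(@Body supply: ReliefSupplyDto): ReliefSupplyDto
}

val api: GoldenHourApi = Retrofit.Builder()
    // Emulator loopback shown here; replace with the Compute Engine address in practice.
    .baseUrl("http://10.0.2.2:8080/api/")
    .addConverterFactory(GsonConverterFactory.create())
    .build()
    .create(GoldenHourApi::class.java)
```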

DL

  • To provide the DL functionality to the Android application, we used several models and services from Google technologies.
  • To train and serve the object detection model that recognizes a patient's bleeding wounds, we used Google's TensorFlow libraries, in particular TensorFlow Lite (TFLite) as the mobile library for deploying the model to the Android application.
  • To keep the wound detection model lightweight enough for the Android app, we also built a custom model based on MobileNet_V2 from TensorFlow Hub (see the sketch after this list).
  • To detect the patient's body landmarks precisely, we used the MediaPipe pose detector from ML Kit, which is inspired by the lightweight BlazeFace model used as a proxy person detector.
  • For more information, please see the Additional DL Documentation below.
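
A minimal Kotlin sketch of how the custom wound-detection model could be run on-device with the TFLite Task Library, taking the center of each detected bounding box as the hemostasis point; the model file name, score threshold, and helper function are placeholders, not the actual GoldenHour model interface.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import android.graphics.PointF
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.task.vision.detector.ObjectDetector

fun detectHemostasisPoints(context: Context, frame: Bitmap): List<PointF> {
    // Load the MobileNet_V2-based custom model bundled in assets (placeholder name).
    val detector = ObjectDetector.createFromFileAndOptions(
        context,
        "wound_detector.tflite",
        ObjectDetector.ObjectDetectorOptions.builder()
            .setScoreThreshold(0.5f)
            .setMaxResults(3)
            .build()
    )

    // Run detection on the camera frame and return the center of each detected wound area.
    return detector.detect(TensorImage.fromBitmap(frame)).map { detection ->
        val box = detection.boundingBox
        PointF(box.centerX(), box.centerY())
    }
}
```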

Team Member

Android · BackEnd · DL

백송희, 이하은, 이유석, 황재영

Repositories

Project · Android · Back-End · DL