
SmartSOS AI System

Abstract

SmartSOS is an AI-based emergency handling system that runs on an Arduino Nano and an Android mobile phone. In an emergency, the user can trigger an emergency message simply by waving a hand. The emergency message (SMS) sent to the registered contacts contains the user's contact number and location coordinates.

Purpose

Robbers and thieves can strike unexpectedly, catching you off guard even if you already have an emergency plan in place. During a robbery you may have no time to pick up your phone and call for help. Even though some emergency apps bundle all of these functions behind a single action, picking up the phone and opening the app still takes time and will draw the robber's attention.

SmartSOS AI System 

So what should you do during a robbery with the SmartSOS AI System? First, remain calm and simply raise both hands. Then wave one hand left and right continuously for more than 3 seconds. Waving a hand is unlikely to provoke the robber, because it looks as if you are simply following the robber's instructions. At the same time, the SmartSOS AI system is triggered immediately and the SOS alert message, including a tracked location map, is sent to the registered contacts.

This AI system was designed and embedded with the TensorFlow Lite TinyML framework. A continuous motion recognition system was built and implemented on the Arduino Nano. An X, Y, Z-axis dataset was collected, simulated, and processed with the TensorFlow Lite framework to predict and analyse the user's hand movement, so the SmartSOS alert is not triggered when the user is performing daily activities such as walking, eating, or working.
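As a minimal sketch, the continuous inference loop on the Nano 33 BLE Sense could look like the code below. It assumes the Arduino library exported by Edge Impulse for this project; the header name SmartSOS_inferencing.h, the class label "wavehand", the 0.80 confidence threshold, and the 3-second hold time are illustrative assumptions, not values taken from this post.

```cpp
// Hedged sketch: continuous motion classification with a 3-second hold
// before raising the SOS flag, so daily activities never trigger it.
#include <Arduino_LSM9DS1.h>        // on-board IMU of the Nano 33 BLE Sense
#include <SmartSOS_inferencing.h>   // Edge Impulse export (header name is project-specific)

static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE]; // one window of X,Y,Z samples
static unsigned long wave_started_ms = 0;                   // when a wave was first detected

void setup() {
  Serial.begin(115200);
  if (!IMU.begin()) {
    Serial.println("Failed to initialise IMU!");
    while (1);
  }
}

void loop() {
  // 1. Fill one inference window with accelerometer samples (X, Y, Z interleaved).
  //    The scaling must match whatever was used for the training data (raw g here).
  for (size_t i = 0; i < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; i += 3) {
    while (!IMU.accelerationAvailable()) {}
    IMU.readAcceleration(features[i], features[i + 1], features[i + 2]);
    delayMicroseconds((uint32_t)(EI_CLASSIFIER_INTERVAL_MS * 1000));
  }

  // 2. Run the TensorFlow Lite model exported by Edge Impulse.
  signal_t signal;
  if (numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal) != 0) {
    return;
  }
  ei_impulse_result_t result = { 0 };
  if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
    return;
  }

  // 3. Only raise the SOS flag if "wavehand" stays dominant for more than 3 s.
  float wave_score = 0.0f;
  for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
    if (strcmp(result.classification[i].label, "wavehand") == 0) {
      wave_score = result.classification[i].value;
    }
  }

  if (wave_score > 0.80f) {
    if (wave_started_ms == 0) wave_started_ms = millis();
    if (millis() - wave_started_ms > 3000) {
      Serial.println("SOS");   // picked up by the phone app (see BLE sketch below)
      wave_started_ms = 0;
    }
  } else {
    wave_started_ms = 0;       // reset as soon as the wave movement stops
  }
}
```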

Bill of Materials
i) Arduino Nano 33 BLE Sense.
ii) Android mobile phone, Android 10.
iii) Android Studio
iv) edgeimpulse.com
v) Arduino IDE
Hardware Prototype
i) 3.7 V, 110 mAh battery (A).
ii) Arduino Nano 33 BLE Sense (B).
iii) Android smartphone.
SmartSOS Application Interface

A) Main page: select to connect to the Nano.          B) Page for adding emergency contact(s).

 
C) Log message displayed once the SOS message is triggered.
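The Android application source code is linked below. Purely as an illustration of the Nano side of this connection, the sketch below shows a BLE service the board could expose so the app's main page can discover it and subscribe to SOS notifications; the UUIDs, the local name "SmartSOS", and the 1-byte flag format are assumptions, not details from this post.

```cpp
// Hedged sketch of a Nano-side BLE service for the "connect Nano" page.
#include <ArduinoBLE.h>

BLEService sosService("180A");                               // assumed service UUID
BLEByteCharacteristic sosFlag("2A57", BLERead | BLENotify);  // 1 = SOS triggered (assumed format)

void setup() {
  Serial.begin(115200);
  if (!BLE.begin()) {
    Serial.println("Failed to start BLE!");
    while (1);
  }
  BLE.setLocalName("SmartSOS");           // name shown in the app's device list
  BLE.setAdvertisedService(sosService);
  sosService.addCharacteristic(sosFlag);
  BLE.addService(sosService);
  sosFlag.writeValue(0);
  BLE.advertise();
}

void loop() {
  BLE.poll();                             // keep the BLE connection serviced
  // When the classifier confirms a >3 s wave (see the inference sketch above),
  // notify the phone, which then sends the SMS with the GPS location:
  // sosFlag.writeValue(1);
}
```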


Building a TinyML (TensorFlow Lite) Model
The Arduino Nano Sense connects to a computer via USB serial to collect the hand-movement dataset in different positions (X, Y, Z axes). The data was collected and simulated using Edge Impulse. The image below shows the behaviour of each collected raw dataset:
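As a rough sketch of this step, the data-collection firmware could simply stream raw accelerometer samples over USB serial so a tool such as the Edge Impulse data forwarder can capture them; the 115200 baud rate and the ~100 Hz sample rate below are assumptions, not values from this post.

```cpp
// Hedged sketch: stream raw X, Y, Z accelerometer samples over USB serial
// for dataset collection with Edge Impulse.
#include <Arduino_LSM9DS1.h>   // IMU on the Nano 33 BLE Sense

void setup() {
  Serial.begin(115200);
  while (!Serial);
  if (!IMU.begin()) {
    Serial.println("Failed to initialise IMU!");
    while (1);
  }
}

void loop() {
  float x, y, z;
  if (IMU.accelerationAvailable()) {
    IMU.readAcceleration(x, y, z);   // values in g
    Serial.print(x); Serial.print(',');
    Serial.print(y); Serial.print(',');
    Serial.println(z);
  }
  delay(10);                         // roughly 100 Hz
}
```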





The data below shows the training module output, where the training set is classified by a neural network.
Test log via the USB Serial Monitor

Video 1: Testing the SmartSOS AI System with different movements.
0-47 sec: Other movements do NOT trigger the SmartSOS AI System.
48-58 sec: The wave-hand movement triggers the SmartSOS AI System.

Video 2: SmartSOS AI System

Further Discussions
In the future, I hope this idea can be integrated into and fully utilised on an Android smartwatch (instead of the Arduino Nano Sense device) to work with Android phones.

Source Code
SmartSOS App Android Source Code: Link
SmartSOS AI Dataset Lib: Link
Arduino Nano Sense 33 Source Code: Link

Reference
Image and icon: Link
Adafruit BLE: Link

About
Heng Chang Song, a forward-thinking senior engineer with over 13 years of experience managing complex automotive-related products, who has collaborated with leading international car makers including Tesla, Geely, Great Wall, Nissan, and R. Bosch.
Email: changsong.heng@gmail.com
Linkedin.
Github. 
Other apps:
USB OTGChecker.

