Sahana Research and Action
humanitarian FOSS innovations - tackling disaster and emergency communication challenges
News from Sahana Research and Action
Updates on our activities and progress.
Since we started 1 year ago, what has unfolded
Sahana Research and Action (SaRA) is currently working on the research and development of an Interplanetary Alert Network (IAN) - https://github.com/waidyanatha/IAN/wiki . It aims to offer a decentralized and trusted platform for publishing and subscribing to alerts. We are thinking ahead to the Internet of Things (IoT) and the dilemmas IoT devices currently face with identity, trust, and so on. Like all our other work, this too is about emergency communication.
The idea originates from the concept of "alert authentication without aggregation". We first investigated blockchain technology, through the Hyperledger Fabric project, only to realize that it was not the right paradigm (i.e. language, model, and objectives). We are now convinced that the InterPlanetary File System (IPFS) and a presentation layer would suffice. The presentation layer needs a taxonomy and indexing so that one node can consume or offer filterable alerts to another node. The challenge is in developing a common taxonomy, perhaps one that adapts to the alerting ecology.
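To illustrate the presentation-layer idea, here is a minimal sketch of a node-local index that maps taxonomy terms to content identifiers, so another node could subscribe to a term and fetch matching alerts. The `AlertIndex` class, the taxonomy terms, and the simulated content IDs are all hypothetical illustrations, not part of the IAN codebase; real content identifiers would come from IPFS, whereas here they are simulated with SHA-256 digests.

```python
import hashlib
import json


def fake_cid(alert: dict) -> str:
    """Simulate an IPFS content identifier by hashing the alert payload."""
    payload = json.dumps(alert, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()[:16]


class AlertIndex:
    """Presentation-layer sketch: taxonomy terms -> content identifiers."""

    def __init__(self):
        self.index = {}

    def publish(self, alert: dict, terms: list) -> str:
        """Record an alert's content ID under each taxonomy term."""
        cid = fake_cid(alert)
        for term in terms:
            self.index.setdefault(term, set()).add(cid)
        return cid

    def subscribe(self, term: str) -> set:
        """Return the content IDs a peer node could fetch for this term."""
        return self.index.get(term, set())


idx = AlertIndex()
cid = idx.publish({"event": "Flood", "severity": "Severe"},
                  ["Met.Flood", "Severity.Severe"])
print(cid in idx.subscribe("Met.Flood"))  # True
```

Because the identifiers are derived from content, two nodes publishing the same alert would index the same ID, which is what makes filterable sharing of immutable alerts possible.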
SaRA has strong credentials in the Common Alerting Protocol (CAP) international warning standard. It was natural for us to investigate the possibility of formulating a taxonomy from CAP messages; especially those originated or relayed by the Alerting Authorities defined in the WMO Register. We have been collecting such messages in our GRAB database; they come from 80+ CAP feeds around the world. However, many of those published messages do not comply with the CAP standard, policies, and procedures. This makes it extremely difficult to heuristically develop a taxonomy; especially when we think beyond CAP feeds to include social media and other streams (including video).
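As a concrete example of the compliance problem, the sketch below checks a CAP 1.2 message for the mandatory `<alert>` sub-elements defined by the standard. The sample message and its values are invented for illustration; a real check would also validate element values and the `<info>` block.

```python
import xml.etree.ElementTree as ET

CAP_NS = "{urn:oasis:names:tc:emergency:cap:1.2}"
# Mandatory <alert> sub-elements per the CAP 1.2 standard.
REQUIRED = ["identifier", "sender", "sent", "status", "msgType", "scope"]


def missing_elements(cap_xml: str) -> list:
    """Return the names of required CAP elements absent from a message."""
    root = ET.fromstring(cap_xml)
    return [name for name in REQUIRED if root.find(CAP_NS + name) is None]


# A non-compliant sample (hypothetical feed): lacks status, msgType, scope.
sample = """<alert xmlns="urn:oasis:names:tc:emergency:cap:1.2">
  <identifier>TEST-001</identifier>
  <sender>alerts@example.org</sender>
  <sent>2019-01-01T00:00:00+00:00</sent>
</alert>"""

print(missing_elements(sample))  # ['status', 'msgType', 'scope']
```

Messages that fail even this shallow check cannot feed a heuristic taxonomy reliably, which is why cleaning the GRAB data matters.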
We need to apply NLP techniques to build our taxonomy. The current trend is to use ML techniques, such as those offered by Google's TensorFlow project. We have gone through the virtualization stage of prototyping to experiment with the GRAB data and the algorithms.
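One building block of such an NLP pipeline is measuring how similar two alert texts are, so that related alerts cluster under the same taxonomy node. The sketch below computes TF-IDF vectors and cosine similarity with the standard library only; the sample alert texts are invented, and the real pipeline (clustering, Bayesian NN, TensorFlow) would go well beyond this.

```python
import math
from collections import Counter


def tf_idf_vectors(docs):
    """Compute a TF-IDF vector (token -> weight dict) per document."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(t for doc in tokenized for t in set(doc))  # document frequency
    n = len(docs)
    vectors = []
    for doc in tokenized:
        tf = Counter(doc)
        vectors.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return vectors


def cosine(a, b):
    """Cosine similarity between two sparse vectors (dicts)."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0


alerts = [
    "severe flood warning river basin",
    "flood warning heavy rain river",
    "earthquake magnitude strong shaking",
]
vecs = tf_idf_vectors(alerts)
# The two flood alerts should score as more similar than flood vs earthquake.
print(cosine(vecs[0], vecs[1]) > cosine(vecs[0], vecs[2]))  # True
```

A clustering step (e.g. grouping alerts whose similarity exceeds a threshold) would then propose candidate taxonomy nodes from the groups.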
Current code is available at github.com/waidyanatha.
The project needs to complete the following activities to prepare a prototype for real-world testing:
- Develop, test, and validate the clustering, Bayesian NN, and NLP techniques for building an adaptive taxonomy (data scientist with Python skills)
- Integrate IPFS with SAMBRO to offer CAP XML files through a node's hash-tree folder structure for sharing immutable CAP XML files (Python programmer with SAMBRO expertise)
- Host SAMBRO/IPFS nodes on cloud services, with a single instance per node in a multi-node configuration (systems administrator)
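To make the second activity concrete, here is a sketch of a content-addressed folder structure for CAP XML files, simulated on local disk. The function names and the two-level fan-out layout are assumptions for illustration; a real deployment would pin the files to IPFS (e.g. via `ipfs add`) rather than hash them locally.

```python
import hashlib
import tempfile
from pathlib import Path


def store_cap_file(root: Path, cap_xml: bytes) -> Path:
    """Store a CAP XML file under a path derived from its content hash."""
    digest = hashlib.sha256(cap_xml).hexdigest()
    # Two-level fan-out (aa/bb/<hash>.xml) keeps directories manageable.
    path = root / digest[:2] / digest[2:4] / f"{digest}.xml"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_bytes(cap_xml)
    return path


def verify(path: Path) -> bool:
    """Immutability check: an unchanged file's hash must match its name."""
    return hashlib.sha256(path.read_bytes()).hexdigest() == path.stem


root = Path(tempfile.mkdtemp())
p = store_cap_file(root, b"<alert>...</alert>")
print(verify(p))  # True
```

Because the path is derived from the content, any tampering with a stored CAP file is detectable by re-hashing, which is the property IPFS provides natively.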
We're looking to connect with people passionate about using accessible technologies to help people prepare for, respond to, and recover from disasters. Connect with us!