Use case #1: Emergency/Disaster Recovery

Case study
Post-disaster in a container terminal
Demonstration
Simulation of a partial implementation in the Port of Koper (container fall and monitoring of a suspicious substance).
Scenarios
  1. Mapping: map a given area using ground robots and/or drones equipped with cameras and lidars
  2. Victim detection and injury assessment: detect victims and assess their injuries in the post-disaster area using ground robots and/or drones
  3. Risk prediction: predict possible risks in a post-disaster area using ground robots, drones, and sensor nodes
  4. Device deployment and liquid sampling: physically deploy sensor nodes or take liquid samples using ground robots
  5. Network and device monitoring: monitor the network connectivity for the IoT devices (robots, sensor nodes, drones) deployed in the post-disaster area
Goal

The goal of this use case is to enhance situational awareness for first responders (e.g., firefighters) and to collect data to prioritize rescue operations.

Description

Offer timely support to rescue teams by:

  • Deploying network infrastructure for data collection and establishing contact with potential victims.
  • Mapping the area and locating victims and risks.
  • Providing dynamic monitoring to assess damages and victims’ injuries.
  • Handling the data generated by heterogeneous IoT devices.

Technical constraints

1. Device Management
  • Application functionalities can be pre-deployed on the devices or at the edge
  • Bootstrapping and self-configuration
  • Add and remove devices on the fly (a device-registry sketch follows this list)
  • Support hardware heterogeneity and guarantee self-healing of software
2. Software component orchestration
  • Dynamic placement of components based on service requirements and resource availability (a placement sketch follows this list)
  • Performance monitoring at the various levels of the continuum for dynamic component redeployment
3. Low latency communication
  • Low-delay communication networks to/from the disaster area
  • Support for mobility conditions and possible disconnections (a store-and-forward sketch follows this list)
4. Dynamic multi-robot mapping and fleet management
  • Coordination, monitoring and optimization of task allocation for mobile robots that work together (through their VO); a simple allocation sketch follows this list
5. Computer vision for information extraction
  • AI and computer vision for position detection from image and video data (a detection sketch follows this list)
6. Smart data filtering/aggregation/compression
  • Large amounts of data are generated by sensors, robots and cameras in the intervention area
  • Some data can be filtered, other data can be downsampled or aggregated before being sent to the edge/cloud
  • Smart policies should be defined (high degree of data heterogeneity); a per-stream policy sketch follows this list
7. Robots and sensor nodes interaction
  • Enable direct communication among robots and sensor nodes of the WSN by defining interaction guidelines (a message-format sketch follows this list)
8. Monitoring dashboard
  • GUI to monitor the post-disaster operation from the operation base (data visualization and control commands)
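
Device management (constraint 1): a minimal sketch of a device registry that supports adding and removing devices on the fly and evicting devices that stop sending heartbeats. The device descriptor fields, IDs and the timeout value are illustrative assumptions, not a defined interface of the platform.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Device:
    device_id: str
    kind: str                                   # e.g. "ground_robot", "drone", "sensor_node"
    capabilities: set = field(default_factory=set)
    last_heartbeat: float = field(default_factory=time.time)

class DeviceRegistry:
    """In-memory registry: devices can join (bootstrap), leave, or be evicted
    when they stop sending heartbeats (a crude form of self-healing)."""

    def __init__(self, heartbeat_timeout: float = 30.0):
        self.devices: dict[str, Device] = {}
        self.heartbeat_timeout = heartbeat_timeout

    def register(self, device: Device) -> None:
        self.devices[device.device_id] = device        # add on the fly

    def deregister(self, device_id: str) -> None:
        self.devices.pop(device_id, None)              # remove on the fly

    def heartbeat(self, device_id: str) -> None:
        if device_id in self.devices:
            self.devices[device_id].last_heartbeat = time.time()

    def evict_stale(self) -> list:
        now = time.time()
        stale = [d for d, dev in self.devices.items()
                 if now - dev.last_heartbeat > self.heartbeat_timeout]
        for device_id in stale:
            self.deregister(device_id)
        return stale

registry = DeviceRegistry()
registry.register(Device("robot-1", "ground_robot", {"lidar", "camera"}))
registry.register(Device("node-7", "sensor_node", {"gas", "temperature"}))
registry.heartbeat("robot-1")
print(sorted(registry.devices))
```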
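
Software component orchestration (constraint 2): a sketch of dynamic placement as a greedy first-fit match of per-component resource requirements against remaining node capacity. Node names, component names and the resource figures are illustrative only; a real orchestrator would also react to the monitoring data mentioned above and redeploy components.

```python
nodes = {                      # remaining capacity (CPU cores, memory MB); illustrative
    "robot-edge": {"cpu": 2,  "mem": 2048},
    "port-edge":  {"cpu": 8,  "mem": 16384},
    "cloud":      {"cpu": 64, "mem": 262144},
}

components = [                 # per-component requirements; illustrative
    {"name": "mapping-slam",     "cpu": 4, "mem": 8192, "prefer": "port-edge"},
    {"name": "victim-detection", "cpu": 2, "mem": 4096, "prefer": "port-edge"},
    {"name": "data-aggregator",  "cpu": 1, "mem": 1024, "prefer": "robot-edge"},
]

def place(components, nodes):
    placement = {}
    # place the most CPU-hungry components first
    for comp in sorted(components, key=lambda c: c["cpu"], reverse=True):
        # try the preferred node first, then any node with enough capacity
        candidates = [comp["prefer"]] + [n for n in nodes if n != comp["prefer"]]
        for node in candidates:
            cap = nodes[node]
            if cap["cpu"] >= comp["cpu"] and cap["mem"] >= comp["mem"]:
                cap["cpu"] -= comp["cpu"]
                cap["mem"] -= comp["mem"]
                placement[comp["name"]] = node
                break
        else:
            placement[comp["name"]] = None   # no fit: trigger redeployment / scaling
    return placement

print(place(components, nodes))
```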
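
Low-latency communication (constraint 3): a sketch of one way to tolerate the disconnections caused by mobility, namely buffering outgoing readings locally and flushing them once the uplink returns. The `send_fn` callback and the sample payload are placeholders, not an API of the platform.

```python
from collections import deque

class StoreAndForward:
    """Buffer outgoing readings while the link to the edge is down and
    flush them (oldest first) when connectivity returns."""

    def __init__(self, send_fn, max_buffered=1000):
        self.send_fn = send_fn                     # placeholder uplink function
        self.buffer = deque(maxlen=max_buffered)   # drop oldest readings under pressure
        self.connected = False

    def publish(self, reading):
        if self.connected:
            self.send_fn(reading)
        else:
            self.buffer.append(reading)

    def on_reconnect(self):
        self.connected = True
        while self.buffer:
            self.send_fn(self.buffer.popleft())

    def on_disconnect(self):
        self.connected = False

uplink = StoreAndForward(send_fn=lambda r: print("sent", r))
uplink.publish({"robot": "robot-1", "co_ppm": 12})   # buffered while offline
uplink.on_reconnect()                                # flushes the buffer
```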
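
Multi-robot task allocation (constraint 4): a sketch of a greedy allocation that repeatedly assigns the closest unassigned mapping task to the nearest robot. In the actual use case this coordination would go through the robots' VOs and use richer cost models; the robot names and coordinates are illustrative.

```python
import math

def allocate_tasks(robots, tasks):
    """Greedy allocation: repeatedly pick the robot/task pair with the
    smallest travel distance. robots: {name: (x, y)}, tasks: {id: (x, y)}."""
    assignment = {name: [] for name in robots}
    position = dict(robots)
    remaining = dict(tasks)
    while remaining:
        robot, task = min(
            ((r, t) for r in robots for t in remaining),
            key=lambda rt: math.dist(position[rt[0]], remaining[rt[1]]),
        )
        assignment[robot].append(task)
        position[robot] = remaining.pop(task)   # robot moves to the task site
    return assignment

robots = {"ugv-1": (0.0, 0.0), "drone-1": (50.0, 10.0)}
tasks = {"cell-A": (5.0, 2.0), "cell-B": (48.0, 12.0), "cell-C": (10.0, 30.0)}
print(allocate_tasks(robots, tasks))
```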
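
Computer vision for information extraction (constraint 5): a minimal sketch using OpenCV's off-the-shelf HOG pedestrian detector as a stand-in for the project's actual victim-detection models; the image path is a placeholder.

```python
import cv2  # OpenCV; the HOG people detector below is only a stand-in
            # for the project's actual AI/computer-vision models

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

frame = cv2.imread("terminal_frame.jpg")          # placeholder image path
boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))

for (x, y, w, h), score in zip(boxes, weights):
    # each box is a candidate person position to report to the rescue team
    print("possible person at", (x, y, w, h), "confidence", score)
```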
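
Smart data filtering/aggregation (constraint 6): a sketch of per-stream policies, where some streams are forwarded as-is, others are downsampled, and others are aggregated into windowed means before being sent to the edge/cloud. The stream names, window sizes and downsampling factors are illustrative assumptions.

```python
from statistics import mean

# Per-stream policies (illustrative): gas alarms are forwarded raw,
# temperature is downsampled, lidar point counts are aggregated.
POLICIES = {
    "gas":          {"action": "forward"},
    "temperature":  {"action": "downsample", "keep_every": 10},
    "lidar_points": {"action": "aggregate",  "window": 100},
}

def apply_policy(stream, samples):
    policy = POLICIES.get(stream, {"action": "forward"})
    if policy["action"] == "downsample":
        return samples[::policy["keep_every"]]
    if policy["action"] == "aggregate":
        w = policy["window"]
        return [mean(samples[i:i + w]) for i in range(0, len(samples), w)]
    return samples                           # "forward": send everything as-is

temperature = [20.0 + 0.01 * i for i in range(1000)]
print(len(apply_policy("temperature", temperature)))   # 100 samples instead of 1000
```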
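
Robot/sensor-node interaction (constraint 7): a sketch of a simple JSON message format that a robot and a WSN node could exchange directly; the field names and message types are assumptions for illustration, not a standardized schema of the project.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SensorMessage:
    """Illustrative message exchanged directly between a robot and a WSN
    sensor node; field names are an assumption, not a defined standard."""
    sender_id: str
    receiver_id: str
    msg_type: str        # e.g. "reading", "deploy_ack", "sample_request"
    payload: dict
    timestamp: float

def encode(msg: SensorMessage) -> bytes:
    return json.dumps(asdict(msg)).encode("utf-8")

def decode(raw: bytes) -> SensorMessage:
    return SensorMessage(**json.loads(raw.decode("utf-8")))

request = SensorMessage("ugv-1", "node-7", "sample_request",
                        {"substance": "unknown_liquid", "volume_ml": 5},
                        time.time())
wire = encode(request)
print(decode(wire).msg_type)
```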