
# ADA Standard Operating Procedures

These Standard Operating Procedures (SOP) are meant to guide 510 in using ADA during emergencies.

## Should I run ADA?

- There must be a clear request by the IFRC SIMS/IM Coordinator (in case of international response) or by the National Society (NS) counterpart, because we want this analysis to be useful and used.
- The relevant agencies (Copernicus, UNOSAT, or others) must not already be producing damage assessments for this emergency, or there must be a clear gap in the spatial coverage of those assessments, because we don't want to duplicate efforts.
- ...

## Can I run ADA?

- High-resolution (<0.6 m/pixel) optical satellite imagery of the affected area, both pre- and post-disaster, must be available.
- The pre- and post-disaster imagery must spatially overlap, since ADA needs both images for each building.
- There should be building information (polygons) already available from OpenStreetMap, Microsoft, or Google.
  - If not, buildings can be detected automatically using ABD, but the quality of the results will strongly depend on building density, being lower in densely built-up areas.

> [!TIP]
> You can check the extent and overlap of images from Maxar Open Data using [opengeos/maxar-open-data](https://github.com/opengeos/maxar-open-data).

## How do I run ADA?

### Prerequisites

1. Access to the BitWarden collection *Damage Assessment*
2. Access to the resource group `510Global-ADA` with role *Contributor* or higher
3. QGIS installed locally

### 1. Get the imagery

The first step is to load the satellite imagery into the `operations` container of the data lake storage account `adadatalakestorage`.

Log in to the VM `510ada-NC8asT4` using SSH or Bastion (credentials in BitWarden).

Mount the container on the directory `data` with

```sh
sudo mkdir -p /mnt/resource/blobfusetmp
sudo blobfuse data --tmp-path=/mnt/resource/blobfusetmp --config-file=blobfuse/fuse_connection_adadatalake.cfg -o attr_timeout=240 -o entry_timeout=240 -o negative_timeout=120 -o allow_other
```
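If the mount worked, the container's contents should now be visible under `data` (a quick sanity check; this assumes you ran the mount command from your home directory):

```sh
ls ~/data
```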

If you received a link to the images (e.g. from another agency), simply download and extract them in `data` (see the extraction sketch below).

```sh
mkdir ~/data/<event-name>
cd ~/data/<event-name>
wget <link-to-my-images>
```
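The exact extraction command depends on the archive format you received; a sketch for the two most common formats (the file names are placeholders):

```sh
# extract the downloaded archive
unzip images.zip          # if you received a .zip archive
tar -xzf images.tar.gz    # if you received a .tar.gz archive
```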

If you already have the images on your local machine, simply upload them to the data lake storage, in the `operations` container.

> [!TIP]
> Use Azure Storage Explorer to upload and organize images if you're not familiar with the command line.
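If you do prefer the command line, the upload can also be done with the Azure CLI; a minimal sketch, assuming you are logged in with `az login` and have write access to the storage account:

```sh
# upload a local folder of images to the operations container
az storage blob upload-batch \
  --account-name adadatalakestorage \
  --destination operations \
  --destination-path <event-name> \
  --source ./<event-name> \
  --auth-mode login
```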

> [!CAUTION]
> Make sure that the images are divided into two sub-folders called `pre-event` and `post-event`.

If you first need to download the images from Maxar:

1. Go to Maxar Geospatial Platform (MGP) Xpress
2. Log in (credentials are in BitWarden)
3. Browse to the relevant event
4. Download the images one by one

**DEPRECATED**: for old events (pre-2023) there is a script that lists all images from the old Maxar Open Data Portal. To use it:

1. Go to the Maxar Open Data Portal
2. Browse to the relevant event
3. Copy the name of the event from the URL (e.g. "typhoon-mangkhut" from https://www.maxar.com/open-data/typhoon-mangkhut)
4. Download the images with

```sh
load-images --disaster <event-name> --dest ~/data/<event-name>
```

### 2. Check the imagery

Verify that the imagery is

- optical (RGB), or multi-spectral including optical bands
- high-resolution (<0.6 m/pixel)
- cloud-free

and that building damage is visible. This can be done locally by downloading the images and visualizing them with QGIS; a command-line check of bands and resolution is sketched below.
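Band count and pixel size can also be checked on the VM with GDAL's `gdalinfo`, if GDAL is installed (the image name is a placeholder):

```sh
# reports the raster size, pixel resolution and bands of an image
gdalinfo ~/data/<event-name>/pre-event/<image-name>.tif
```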

### 3. Get building footprint

The next step is to get a vector file (`.geojson`) with the buildings in the affected area.

- Check if OpenStreetMap (OSM) buildings are good enough; if so, download them for each image by running

  ```sh
  get-osm-buildings --raster ~/data/<event-name>/pre-event/<image-name>.tif
  ```
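Step 4 below expects a single `buildings.geojson`; if the previous command produced one building file per image, merge them first. A minimal sketch using GDAL's `ogrmerge.py` (the input file names are assumptions, adapt them to the actual output):

```sh
# merge per-image building files into a single GeoJSON layer
ogrmerge.py -single -f GeoJSON -o buildings.geojson <buildings-1>.geojson <buildings-2>.geojson
```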

### 4. Run ADA

- [OPTIONAL] Copy the images from the data lake storage to the VM (processing is faster locally)

  ```sh
  cp -r ~/data/<event-name> ~/<event-name>
  ```

- Prepare the data for caladrius (the damage classification model)

  ```sh
  cd ~/<event-name>
  prepare-data --data . --buildings buildings.geojson --dest caladrius
  ```

- Run caladrius

  ```sh
  conda activate cal
  CUDA_VISIBLE_DEVICES="0" python ~/caladrius/caladrius/run.py --run-name run --data-path caladrius --model-type attentive --model-path ~/data/caladrius_att_effnet4_v1.pkl --checkpoint-path caladrius/runs --batch-size 2 --number-of-workers 4 --classification-loss-type f1 --output-type classification --inference
  ```

- Prepare the final vector file with the building polygons and caladrius' predictions

  ```sh
  conda activate base
  final-layer --builds buildings.geojson --damage caladrius/runs/run-input_size_32-learning_rate_0.001-batch_size_2/predictions/run-split_inference-epoch_001-model_attentive-predictions.txt --out buildings-predictions.geojson
  ```

- Copy the result back to the data lake, then download it locally and visualize it with QGIS (see the download sketch below)

  ```sh
  cp buildings-predictions.geojson ~/data/<event-name>
  ```
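Downloading the result to your local machine can be done with Azure Storage Explorer or with the Azure CLI; a minimal sketch, assuming the file sits under `<event-name>` in the `operations` container and you are logged in with `az login`:

```sh
# download the final layer from the operations container
az storage blob download \
  --account-name adadatalakestorage \
  --container-name operations \
  --name <event-name>/buildings-predictions.geojson \
  --file buildings-predictions.geojson \
  --auth-mode login
```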

### 5. Interpret and communicate about ADA

- This is an AI-based assessment: it is not perfect, but it is fast.
- The expected accuracy of ADA on new disasters is around 60-70%: this means that roughly 3 out of 5 buildings should be correctly assessed. Make sure to communicate this clearly to your end users before sharing the results.
- To verify, take 20/50/100 random buildings and check that at least 15/30/60 of them were correctly assessed (see the sampling sketch at the end of this section).
  - If this is the case, manually correct the most obvious mistakes (if any) and proceed to create a map.
  - If this is not the case, it is probably connected with the quality of the images: go back to step #2 and repeat.
- Create a map and share it; please re-use the disclaimers and style from previous examples:
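For the random check above, one way to draw a sample is GDAL's `ogr2ogr` with the SQLite dialect; a minimal sketch, assuming GDAL is installed and that the layer name matches the file name:

```sh
# draw 50 random buildings from the predictions for manual checking
ogr2ogr -f GeoJSON -dialect SQLite \
  -sql 'SELECT * FROM "buildings-predictions" ORDER BY RANDOM() LIMIT 50' \
  sample-50.geojson buildings-predictions.geojson
```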