Merge branch 'refs/heads/devel' into performances-seen-unseen
niconoe committed Jul 25, 2024
2 parents e84d387 + 1150342 commit 1a9a616
Showing 20 changed files with 303 additions and 245 deletions.
5 changes: 3 additions & 2 deletions CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -1,6 +1,7 @@
# Current (unreleased)
# v1.7.5 (2024-07-25)

- Fix a compatibility issue with Windows platform (data import script). Thanks @sronveaux!
- Fix a compatibility issue with Windows platform (data import script). Thanks, @sronveaux!
- Major improvements under the hood to map performance (Thanks for the suggestion, @sronveaux and @silenius!)

# v1.7.4 (2024-05-24)

2 changes: 1 addition & 1 deletion Dockerfile
@@ -8,7 +8,7 @@ ENV PYTHONUNBUFFERED=1 \
PIP_DEFAULT_TIMEOUT=100 \
PIP_DISABLE_PIP_VERSION_CHECK=1 \
PIP_NO_CACHE_DIR=1 \
POETRY_VERSION=1.4.0 \
POETRY_VERSION=1.8.3 \
NVM_VERSION=0.39.1 \
NVM_DIR=/root/.nvm \
NODE_VERSION=19.7.0 \
6 changes: 3 additions & 3 deletions INSTALL.md
@@ -44,14 +44,14 @@ properly configured.
### Prerequisites

- Make sure [Docker](https://docs.docker.com/get-docker/) is installed on your system
- Identify the latest release of GBIF Alert on GitHub at https://github.com/riparias/gbif-alert/tags (currently [v1.7.4](https://github.com/riparias/gbif-alert/releases/tag/v1.7.4))
- Identify the latest release of GBIF Alert on GitHub at https://github.com/riparias/gbif-alert/tags (currently [v1.7.5](https://github.com/riparias/gbif-alert/releases/tag/v1.7.5))

### Installation steps

- Create a new directory on your system, e.g. `invasive-fishes-nz` following the example above.
- Go to the `docker-compose.yml` file from the latest release of GBIF Alert on GitHub: at the moment https://github.com/riparias/gbif-alert/blob/v1.7.4/docker-compose.yml (note that the URL contains the version number).
- Go to the `docker-compose.yml` file from the latest release of GBIF Alert on GitHub: at the moment https://github.com/riparias/gbif-alert/blob/v1.7.5/docker-compose.yml (note that the URL contains the version number).
- Save the file in the directory you have just created.
- Go to the `local_settings_docker.template.py` file from the latest release of GBIF Alert on GitHub: at the moment https://github.com/riparias/gbif-alert/blob/v1.7.4/djangoproject/local_settings_docker.template.py.
- Go to the `local_settings_docker.template.py` file from the latest release of GBIF Alert on GitHub: at the moment https://github.com/riparias/gbif-alert/blob/v1.7.5/djangoproject/local_settings_docker.template.py.
- Save the file in the directory you have just created.
- Rename this file to `local_settings_docker.py`.
- Open a terminal, navigate to the `invasive-fishes-nz` directory and run the following command: `docker-compose up`.
16 changes: 15 additions & 1 deletion README.md
@@ -1,3 +1,16 @@
TODO:
- to continue:
  - new table in DB for the not-seen observations, now populated, so we can:
    - adjust all the seen/not-seen code to make sure it uses the new model
    - add the user preference to choose which observations should be marked as seen automatically (delay)
    - add code to mark as seen all the observations that are older than the delay (run each day after the data import?)
  - test if data migration performance is good enough for prod purposes!!
  - (at the end): drop the old ObservationView table
  - (clone the existing site/db to test everything before real deployment?)
- confirm with Damiano that all observations are considered as seen when a user creates an account
- New NL translations?


# GBIF Alert

<!-- badges: start -->
@@ -27,4 +40,5 @@ See [INSTALL.md](INSTALL.md) for more information.
## GBIF Alert instances in the wild

- LIFE RIPARIAS Early Alert: [production](https://alert.riparias.be) / [development](https://dev-alert.riparias.be) (Targets riparian invasive species in Belgium)
- [GBIF Alert demo instance](https://gbif-alert-demo.thebinaryforest.net/) (Always in sync with the `devel` branch of this repository)
- [GBIF Alert demo instance](https://gbif-alert-demo.thebinaryforest.net/) (Always in sync with the `devel` branch of this repository)
- The Belgian Biodiversity Platform uses GBIF Alert under the hood as an API for the ManaIAS project.
8 changes: 6 additions & 2 deletions assets/ts/components/ObservationsMap.vue
@@ -63,14 +63,18 @@ export default defineComponent({
baseLayerName: String,
dataLayerOpacity: Number,
areasToShow: {
type: Array as PropType<Array<number>>, // Array of area ids
type: Array as PropType<Array<Number>>, // Array of area ids
default: [],
},
layerSwitchZoomLevel: {
// At which zoom level do we switch from the aggregated hexagons to the "individual observation" layer
type: Number,
default: 13,
},
zoomLevelMinMaxQuery: {
type: Number,
required: true,
},
},
data: function () {
return {
@@ -242,7 +246,7 @@ export default defineComponent({
if (this.aggregatedDataLayer) {
this.map.removeLayer(this.aggregatedDataLayer as VectorTileLayer<Feature>);
}
this.loadOccMinMax(this.initialPosition.initialZoom, this.filters);
this.loadOccMinMax(this.zoomLevelMinMaxQuery, this.filters);
this.aggregatedDataLayer = this.createAggregatedDataLayer();
this.map.addLayer(this.aggregatedDataLayer as VectorTileLayer<Feature>);
}
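The new `zoomLevelMinMaxQuery` prop decouples the min/max count query from the map's initial zoom: the component still switches between the aggregated hexagon layer and the individual-observation layer at `layerSwitchZoomLevel`, but now asks the backend for min/max occurrence counts at one fixed zoom level. A minimal Python sketch of the switching rule described in the diff's comment (the real code is TypeScript/OpenLayers, and the exact boundary comparison is an assumption):

```python
def layer_for_zoom(zoom: float, layer_switch_zoom_level: int = 13) -> str:
    """Sketch of the layer-switching rule: below the threshold, show the
    aggregated hexagons; at or above it, show individual observations.
    The default threshold (13) matches the prop's default in the diff."""
    if zoom >= layer_switch_zoom_level:
        return "individual"
    return "aggregated"
```

With the default threshold, `layer_for_zoom(8)` picks the aggregated layer and `layer_for_zoom(13)` the individual one, regardless of which zoom level the min/max query uses.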
1 change: 1 addition & 0 deletions assets/ts/components/OuterObservationsMap.vue
@@ -44,6 +44,7 @@
:base-layer-name="mapBaseLayer"
:data-layer-opacity="dataLayerOpacity"
:areas-to-show="filters.areaIds"
:zoom-level-min-max-query="frontendConfig.zoomLevelMinMaxQuery"
></Observations-Map>
</div>
</template>
1 change: 1 addition & 0 deletions assets/ts/interfaces.ts
@@ -101,6 +101,7 @@ export interface FrontEndConfig {
authenticatedUser: boolean;
userId?: number; // Only set if authenticatedUser is true
mainMapConfig: MapConfig;
zoomLevelMinMaxQuery: number;
}

// Keep in sync with Models.Observation.as_dict()
89 changes: 58 additions & 31 deletions dashboard/management/commands/import_observations.py
@@ -1,5 +1,6 @@
import argparse
import datetime
import logging
import os
import tempfile
import time
@@ -18,6 +19,9 @@

from dashboard.management.commands.helpers import get_dataset_name_from_gbif_api
from dashboard.models import Species, Observation, DataImport, Dataset
from dashboard.views.helpers import (
create_or_refresh_materialized_views,
)

BULK_CREATE_CHUNK_SIZE = 10000

@@ -199,6 +203,9 @@ def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.transaction_was_successful = False

def log_with_time(self, message: str):
self.stdout.write(f"{time.ctime()}: {message}")

def _import_all_observations_from_dwca(
self,
dwca: DwCAReader,
@@ -227,7 +234,7 @@ def _import_all_observations_from_dwca(
self.stdout.write("x", ending="")

if index % BULK_CREATE_CHUNK_SIZE == 0:
self.stdout.write(f"{time.ctime()}: Bulk size reached...")
self.log_with_time("Bulk size reached...")
self.batch_insert_observations(observations_to_insert)
observations_to_insert = []

@@ -237,9 +244,9 @@ def _import_all_observations_from_dwca(
return skipped_observations_counter

def batch_insert_observations(self, observations_to_insert: list[Observation]):
self.stdout.write(f"{time.ctime()}: Bulk creation")
self.log_with_time("Bulk creation")
inserted_observations = Observation.objects.bulk_create(observations_to_insert)
self.stdout.write(f"{time.ctime()}: Migrating linked entities")
self.log_with_time("Migrating linked entities")
for obs in inserted_observations:
obs.migrate_linked_entities()

@@ -254,21 +261,28 @@ def flag_transaction_as_successful(self):
self.transaction_was_successful = True

def handle(self, *args, **options) -> None:
self.stdout.write(f"{time.ctime()}: (Re)importing all observations")
# Allow the verbosity option for our custom logging
# (see https://reinout.vanrees.org/weblog/2017/03/08/logging-verbosity-managment-commands.html)
verbosity = int(options["verbosity"])
root_logger = logging.getLogger("")
if verbosity > 1:
root_logger.setLevel(logging.DEBUG)

self.log_with_time("(Re)importing all observations")

# 1. Data preparation / download
gbif_predicate = None
if options["source_dwca"]:
self.stdout.write(f"{time.ctime()}: Using a user-provided DWCA file")
self.log_with_time("Using a user-provided DWCA file")
source_data_path = options["source_dwca"].name
else:
self.stdout.write(
f"{time.ctime()}: No DWCA file provided, we'll generate and get a new GBIF download"
self.log_with_time(
"No DWCA file provided, we'll generate and get a new GBIF download"
)

self.stdout.write(
f"{time.ctime()}: Triggering a GBIF download and waiting for it - this can be long..."
self.log_with_time(
"Triggering a GBIF download and waiting for it - this can be long..."
)

tmp_file = tempfile.NamedTemporaryFile(delete=False)
source_data_path = tmp_file.name
tmp_file.close()
@@ -283,11 +297,10 @@ def handle(self, *args, **options) -> None:
password=settings.GBIF_ALERT["GBIF_DOWNLOAD_CONFIG"]["PASSWORD"],
output_path=source_data_path,
)
self.stdout.write(f"{time.ctime()}: Observations downloaded")
self.log_with_time("Observations downloaded")

self.stdout.write(
f"{time.ctime()}: We now have a (locally accessible) source dwca, real import is starting. We'll use a transaction and put "
"the website in maintenance mode"
self.log_with_time(
"We now have a (locally accessible) source dwca, real import is starting. We'll use a transaction and put the website in maintenance mode"
)

set_maintenance_mode(True)
@@ -298,12 +311,12 @@ def handle(self, *args, **options) -> None:
current_data_import = DataImport.objects.create(
start=timezone.now(), gbif_predicate=gbif_predicate
)
self.stdout.write(
f"{time.ctime()}: Created a new DataImport object: #{current_data_import.pk}"
self.log_with_time(
f"Created a new DataImport object: #{current_data_import.pk}"
)

# 3. Pre-import all the datasets (better here than in a loop that goes over each observation)
self.stdout.write(f"{time.ctime()}: Pre-importing all datasets")
self.log_with_time("Pre-importing all datasets")
# 3.1 Get all the dataset keys / names from the DwCA
datasets_referenced_in_dwca = dict()
with DwCAReader(source_data_path) as dwca:
@@ -344,7 +357,7 @@ def handle(self, *args, **options) -> None:
current_data_import.set_gbif_download_id(
extract_gbif_download_id_from_dwca(dwca)
)
self.stdout.write(f"{time.ctime()}: Importing all rows")
self.log_with_time("Importing all rows")
current_data_import.skipped_observations_counter = (
self._import_all_observations_from_dwca(
dwca,
@@ -354,41 +367,55 @@ def handle(self, *args, **options) -> None:
)
)

self.stdout.write(
f"{time.ctime()}: All observations imported, now deleting observations linked to previous data imports..."
self.log_with_time(
"All observations imported, now deleting observations linked to previous data imports..."
)

# 6. Remove previous observations
Observation.objects.exclude(data_import=current_data_import).delete()
self.log_with_time("Previous observations deleted")

self.log_with_time(
"We'll now create or refresh the materialized views. This can take a while."
)

# 7. Remove unused Dataset entries (and edit related alerts)
# 7. Create or refresh the materialized view (for the map)
create_or_refresh_materialized_views(
zoom_levels=[settings.ZOOM_LEVEL_FOR_MIN_MAX_QUERY]
)

# 8. Remove unused Dataset entries (and edit related alerts)
for dataset in Dataset.objects.all():
if dataset.observation_set.count() == 0:
self.stdout.write(
f"{time.ctime()}: Deleting (no longer used) dataset {dataset}"
)
self.log_with_time(f"Deleting (no longer used) dataset {dataset}")

alerts_referencing_dataset = dataset.alert_set.all()
if alerts_referencing_dataset.count() > 0:
for alert in alerts_referencing_dataset:
self.stdout.write(
f"{time.ctime()}: We'll first need to un-reference this dataset from alert #{alert}"
self.log_with_time(
f"We'll first need to un-reference this dataset from alert #{alert}"
)
alert.datasets.remove(dataset)

dataset.delete()

# 6. Finalize the DataImport object
self.stdout.write(f"{time.ctime()}: Updating the DataImport object")
# 9. Finalize the DataImport object
self.log_with_time("Updating the DataImport object")

current_data_import.complete()
if options["source_dwca"] is None:
self.log_with_time("Deleting the (temporary) source DWCA file")
os.unlink(source_data_path)
self.stdout.write("Done.")
self.log_with_time("Committing the transaction")

self.stdout.write(f"{time.ctime()}: Leaving maintenance mode.")
self.log_with_time("Transaction committed")
self.log_with_time("Leaving maintenance mode.")
set_maintenance_mode(False)

self.stdout.write(f"{time.ctime()}: Sending email report")
self.log_with_time("Sending email report")
if self.transaction_was_successful:
send_successful_import_email()
else:
send_error_import_email()

self.log_with_time("Import observations process successfully completed")
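The main refactoring in this file replaces the repeated `self.stdout.write(f"{time.ctime()}: ...")` calls with a `log_with_time` helper, and maps Django's standard `--verbosity` option onto the root logger's level. A minimal self-contained sketch of that pattern (the `TimestampedLogger` class and its `write` parameter are illustrative stand-ins, not names from the codebase):

```python
import logging
import time


class TimestampedLogger:
    """Sketch of the command's logging pattern: every progress message is
    prefixed with the current time, and --verbosity > 1 switches the root
    logger to DEBUG, as in the diff."""

    def __init__(self, write=print):
        self.write = write  # stand-in for Django's self.stdout.write

    def log_with_time(self, message: str) -> str:
        line = f"{time.ctime()}: {message}"
        self.write(line)
        return line

    def configure_verbosity(self, verbosity: int) -> None:
        # Mirrors the diff: higher verbosity enables DEBUG-level logging
        if verbosity > 1:
            logging.getLogger("").setLevel(logging.DEBUG)


logger = TimestampedLogger(write=lambda s: None)
out = logger.log_with_time("Bulk size reached...")
```

Centralizing the timestamp in one helper keeps the many progress messages consistent and makes it trivial to later redirect them to a real logger instead of stdout.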
1 change: 1 addition & 0 deletions dashboard/templatetags/gbif-alert_extras.py
@@ -108,6 +108,7 @@ def js_config_object(context):
).replace("1", "{id}"),
},
"mainMapConfig": settings.GBIF_ALERT["MAIN_MAP_CONFIG"],
"zoomLevelMinMaxQuery": settings.ZOOM_LEVEL_FOR_MIN_MAX_QUERY,
}
if context.request.user.is_authenticated:
conf["userId"] = context.request.user.pk
2 changes: 2 additions & 0 deletions dashboard/tests/selenium/test_integration.py
@@ -27,6 +27,7 @@
ObservationView,
Alert,
)
from dashboard.views.helpers import create_or_refresh_all_materialized_views


def _get_webdriver() -> WebDriver:
@@ -142,6 +143,7 @@ def setUp(self):
)

alert.species.add(self.first_species)
create_or_refresh_all_materialized_views()


class SeleniumAlertTests(SeleniumTestsCommon):