This recipe is a step-by-step guide on how to deploy the IMI Data Catalogue in Docker.
For a more general introduction to data catalogues, their elements and data models, see the data catalogue recipe. This recipe is intended as a set of step-by-step instructions to deploy via Docker the IMI Data Catalogue developed at the Luxembourg Centre for Systems Biomedicine. The overall purpose of the data catalogue is to host dataset-level metadata for a wide range of IMI projects. Datasets are FAIRified and searchable by a range of facets. The catalogue is not intended to hold the actual data, although it provides links to where the data is hosted, together with information on any access restrictions.
The following need to be installed on the machine the deployment is run on:

- Git
- Docker
- docker-compose
Check out the code to your local machine by running the following command in a terminal:
```shell
$ git clone git@github.com:FAIRplus/imi-data-catalogue.git
```
Using docker-compose, it is possible to easily manage all the components (Solr and web server) required to run the application.
Unless otherwise specified, all the following commands should be run in a terminal from the base directory of the data catalogue code.
Prompts such as `(local)` and `(web container)` indicate the context in which each command should be executed.
First, generate the certificates that will be used to enable HTTPS in the reverse proxy. To do so, execute:

```shell
$ cd docker/nginx/
$ ./generate_keys.sh
```
Please note that if you run this command outside the `nginx` directory, the certificate and key will be generated in the wrong location.
This command relies on OpenSSL. If you don't plan to use HTTPS, or just want to see the demo running, you can skip this step. Do not use these self-signed certificates in production - they would make the HTTPS connection unsafe!
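If you prefer to create the certificate and key by hand rather than via the script, a typical self-signed pair can be generated with OpenSSL directly. This is a sketch: the filenames and subject below are illustrative assumptions, so check `generate_keys.sh` for the names the nginx configuration actually expects.

```shell
# Create a self-signed certificate/key pair valid for one year.
# Filenames and subject are illustrative; the nginx config may expect
# different names - see generate_keys.sh.
openssl req -x509 -nodes -newkey rsa:2048 -days 365 \
  -keyout nginx.key -out nginx.crt \
  -subj "/CN=localhost"
```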
Return to the root directory, then copy the settings template:

```shell
$ cd ../..
$ cp datacatalog/settings.py.template datacatalog/settings.py
```
Edit the `settings.py` file to add a random string of characters as the `SECRET_KEY` attribute. For maximum security, use the following Python snippet to generate this key:

```python
import os

# 24 random bytes to use as the SECRET_KEY
os.urandom(24)
```
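Note that `os.urandom(24)` returns raw bytes. For a key that pastes cleanly into `settings.py` as a plain string, the standard-library `secrets` module (an equivalent alternative, not part of the original instructions) can render the same amount of randomness as hex:

```python
import secrets

# 24 random bytes rendered as 48 hexadecimal characters,
# suitable for pasting into settings.py as SECRET_KEY
key = secrets.token_hex(24)
print(key)
```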
Build and start the docker containers by running:
```shell
(local) $ docker-compose up --build
```
That will create:

- a container with the `datacatalog` web application
- a container for Solr, with a volume so that the data will be persistent between runs
In a new terminal, create the two Solr cores:

```shell
(local) $ docker-compose exec solr solr create_core -c datacatalog
(local) $ docker-compose exec solr solr create_core -c datacatalog_test
```
Then, still in the second terminal, put Solr data into the cores:
```shell
(local) $ docker-compose exec web /bin/bash
(web container) $ python manage.py init_index
(web container) $ python manage.py import_entities Json dataset
```
To exit the web container, press `CTRL+D` or type `exit` in the terminal.
Note that most browsers display a warning or block access when self-signed certificates are used.
The Docker container keeps the application in the state it was in when built. Therefore, if you change any files in the project, the container has to be rebuilt for the changes to appear in the application:
```shell
$ docker-compose up --build
```
If you wanted to delete Solr data, you'd need to run:
```shell
$ docker-compose down --volumes
```
This will remove any persisted data - you must rerun `solr create_core` (see step 4 in the previous section) to recreate the Solr cores.
The datasets are all defined in the file `tests/data/records.json`. This file can be modified to add, delete, or modify datasets. After saving the file, rebuild and restart docker-compose.
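For bulk changes, the file can also be edited programmatically. The sketch below assumes `records.json` contains a JSON array of dataset objects; the field names used here are hypothetical, so adapt them to the actual schema of your file:

```python
import json

def add_dataset(records, title, link):
    """Append a minimal dataset entry (hypothetical field names)."""
    records.append({"title": title, "link": link})
    return records

# In practice you would json.load() tests/data/records.json;
# an in-memory list stands in for it here.
records = [{"title": "Existing dataset", "link": "https://example.org/data"}]
records = add_dataset(records, "New dataset", "https://example.org/new")
print(json.dumps(records, indent=2))
```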
First, stop all the containers by pressing `CTRL+D` in the terminal running docker-compose.
Then rebuild and restart the containers:
```shell
$ docker-compose up --build
```
Finally, reindex the datasets using:
```shell
(local) $ docker-compose exec web /bin/bash
(web container) $ python manage.py import_entities Json dataset
```
To exit the web container, press `CTRL+D` or type `exit` in the terminal.
In some cases, you might not want Solr and Nginx to run (for example, if there are multiple instances of the Data Catalog running). In that case, simply use:
```shell
(local) $ docker build . -t "data-catalog"
(local) $ docker run --name data-catalog --entrypoint "gunicorn" -p 5000:5000 -t data-catalog -t 600 -w 2 datacatalog:app --bind 0.0.0.0:5000
```
If you would prefer not to use Docker, and instead compile and run the data catalogue manually, please follow the instructions in the README file.
| Name | Affiliation | ORCID | Role |
|------|-------------|-------|------|
| Danielle Welter | LCSB, University of Luxembourg | 0000-0003-1058-2668 | Writing - Original Draft |
| Valentin Grouès | LCSB, University of Luxembourg | 0000-0001-6501-0806 | Writing - Original Draft |
| Wei Gu | LCSB, University of Luxembourg | 0000-0003-3951-6680 | Writing - Review |
| Venkata Satagopam | LCSB, University of Luxembourg | 0000-0002-6532-5880 | Writing - Review |
| Philippe Rocca-Serra | University of Oxford, Data Readiness Group | 0000-0001-9853-5668 | Writing - Review |
This page is released under the Creative Commons Attribution (CC BY) 4.0 license.