Deploying a data catalogue


Recipe metadata

identifier: UC9.1

version: v0.1

Difficulty level

Reading Time: 20 minutes

Recipe Type: Hands-on

Executable Code: No

Intended Audience: Software Developers, Data Managers, System Administrators


Main Objectives

This recipe is a step-by-step guide on how to deploy the IMI Data Catalogue in Docker.

Introduction

For a more general introduction to data catalogues, their elements and data models, see the data catalogue recipe. This recipe is intended as a set of step-by-step instructions to deploy, via Docker, the IMI Data Catalogue developed at the Luxembourg Centre for Systems Biomedicine. The overall purpose of the data catalogue is to host dataset-level metadata for a wide range of IMI projects. Datasets are FAIRified and searchable by a range of facets. The catalogue is not intended to hold the actual data, although it provides links to where the data is hosted, together with information on any access restrictions.

Requirements

The following need to be installed on the machine the deployment is run on:

• Git

• Docker and Docker Compose

• OpenSSL (only needed to generate the self-signed certificates for HTTPS)

Ingredients

Check out the code to your local machine by running the following command in a terminal:

$ git clone git@github.com:FAIRplus/imi-data-catalogue.git

Thanks to docker-compose, it is possible to easily manage all the components (Solr and the web server) required to run the application.
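The authoritative docker-compose.yml ships with the repository; as an illustration of the shape of such a file (service names, image versions and volume names here are assumptions, not the project's real configuration), it might look like:

```yaml
# Hypothetical sketch of a docker-compose file for this kind of stack;
# the real file in the repository is authoritative.
version: "3"
services:
  web:
    build: .                    # builds the datacatalog web application image
    ports:
      - "5000:5000"
  solr:
    image: solr:8
    volumes:
      - solr_data:/var/solr     # persists index data between runs
volumes:
  solr_data:
```

A named volume like solr_data is what makes the Solr index survive container restarts, and what `docker-compose down --volumes` (see the maintenance section) deletes.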

Step-by-step guide:

Unless otherwise specified, all the following commands should be run in a terminal from the base directory of the data catalogue code.

1. Building

(local) and (web container) indicate the context in which each command is executed.

  1. First, generate the certificates that will be used to enable HTTPS in the reverse proxy. To do so, execute:

    $ cd docker/nginx/
    $ ./generate_keys.sh
    

    :warning: Please note that if you run this command outside the nginx directory, the certificate and key will be generated in the wrong location.

    This command relies on OpenSSL. If you don't plan to use HTTPS or just want to see the demo running, you can skip this step.

    :warning: without the certificates, any HTTPS connection will be unsafe!

  2. Return to the root directory, then copy datacatalog/settings.py.template to datacatalog/settings.py:

    $ cd ../..
    $ cp datacatalog/settings.py.template datacatalog/settings.py
    
  3. Edit the settings.py file to add a random string of characters to the SECRET_KEY attribute. For maximum security, generate this key in Python:

    import os
    os.urandom(24)
    
  4. Build and start the docker containers by running:

    (local) $ docker-compose up --build
    

    That will create:

    • a container with the datacatalog web application

    • a container for Solr

    :thumbsup: the data will be persistent between runs.

  5. In a new terminal, to create Solr cores, do:

    (local) $ docker-compose exec solr solr create_core -c datacatalog
    (local) $ docker-compose exec solr solr create_core -c datacatalog_test
    
  6. Then, still in the second terminal, put Solr data into the cores:

    (local) $ docker-compose exec web /bin/bash
    (web container) $ python manage.py init_index 
    (web container) $ python manage.py import_entities Json dataset
    

    :bell: to leave the container shell, press CTRL+D or type exit in the terminal

  7. The web application should now be available with the loaded data via http://localhost, or via https://localhost over an SSL connection.

    :warning: most browsers display a warning or block self-signed certificates.
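The os.urandom(24) call in step 3 returns raw bytes; if you prefer a printable string for SECRET_KEY, the standard-library secrets module offers an equivalent. This is a generic sketch, not part of the catalogue's own code:

```python
# Generate a printable random secret suitable for a SECRET_KEY setting.
# secrets.token_hex(n) draws n cryptographically strong random bytes and
# returns them as a hex string of length 2*n.
import secrets

key = secrets.token_hex(24)
print(key)  # 48 hex characters, ready to paste into settings.py
```

Either form is fine; the important point is that the key is long, random and kept out of version control.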

2. Maintenance of docker-compose

A Docker container keeps the application in the state it was in when built. Therefore, if you change any files in the project, the container has to be rebuilt in order to see the changes in the application:

$ docker-compose up --build

If you want to delete the Solr data, run:

$ docker-compose down --volumes

This will remove any persisted data; you must rerun solr create_core (see step 5 in the previous section) to recreate the Solr cores.

3. Modifying the datasets

The datasets are all defined in the file tests/data/records.json. This file can be modified to add, delete and modify datasets. After saving the file, rebuild and restart docker-compose.
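The exact schema of records.json is defined by the project; assuming it is a JSON array of dataset objects, a modification could be scripted along these lines (the helper name and the "title" field are illustrative, not part of the catalogue's real code or schema):

```python
import json

def add_dataset(path, entry):
    """Append a dataset entry to a JSON-array records file and save it.

    Returns the new number of records. Hypothetical helper for
    illustration only; check the real schema in records.json first.
    """
    with open(path) as f:
        records = json.load(f)
    records.append(entry)
    with open(path, "w") as f:
        json.dump(records, f, indent=2)
    return len(records)
```

For example, add_dataset("tests/data/records.json", {"title": "My new dataset"}) would append one entry; the reindexing step below then makes it searchable.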

First, stop all the containers by pressing CTRL+C in the terminal in which docker-compose is running.

Then rebuild and restart the containers:

$ docker-compose up --build

Finally, reindex the datasets using:

(local) $ docker-compose exec web /bin/bash
(web container) $ python manage.py import_entities Json dataset

:bell: to leave the container shell, press CTRL+D or type exit in the terminal


Single Docker deployment

In some cases, you might not want Solr and Nginx to run (for example, if there are multiple instances of the Data Catalog already running). In that case, simply use:

(local) $ docker build . -t "data-catalog"
(local) $ docker run --name data-catalog --entrypoint "gunicorn" -p 5000:5000 -t data-catalog -t 600 -w 2 datacatalog:app --bind 0.0.0.0:5000

Manual deployment

If you would prefer not to use Docker and instead compile and run the data catalogue manually, please follow the instructions in the README file.


Conclusion:

This recipe provides a step-by-step guide to deploying the IMI data catalogue, developed at the University of Luxembourg as part of the IMI FAIRplus project, to a local system.

What should I read next?


Authors:

Name Affiliation ORCID CRediT role
Danielle Welter LCSB, University of Luxembourg 0000-0003-1058-2668 Writing - Original Draft
Valentin Grouès LCSB, University of Luxembourg 0000-0001-6501-0806 Writing - Original Draft
Wei Gu LCSB, University of Luxembourg 0000-0003-3951-6680 Writing - Review
Venkata Satagopam LCSB, University of Luxembourg 0000-0002-6532-5880 Writing - Review
Philippe Rocca-Serra University of Oxford, Data Readiness Group 0000-0001-9853-5668 Writing - Review

License:

This page is released under the Creative Commons Attribution 4.0 (CC BY 4.0) license.