Merge pull request #231 from elixir-luxembourg/prerelease
Prerelease to master
vildead authored Jun 1, 2021
2 parents dbd7116 + 9f9ac5d commit 0f60f83
Showing 104 changed files with 3,067 additions and 1,509 deletions.
3 changes: 2 additions & 1 deletion .travis.yml
@@ -12,9 +12,10 @@ before_install:
- cp elixir_daisy/settings_compose_ci.py elixir_daisy/settings_compose.py

install:
- echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin
- docker-compose build
- docker-compose up -d
- sleep 60

script:
- docker-compose exec web python setup.py pytest
42 changes: 27 additions & 15 deletions DEPLOYMENT.md
@@ -8,7 +8,7 @@ sudo yum install python36-devel openldap-devel nginx
sudo yum group install "Development Tools"

wget https://bootstrap.pypa.io/get-pip.py
sudo python36 get-pip.py
sudo python3.6 get-pip.py
```


@@ -22,7 +22,7 @@ sudo useradd daisy
sudo usermod -a -G users daisy
sudo su - daisy
mkdir config log
git clone git@github.com:elixir-luxembourg/daisy.git
git clone https://github.com/elixir-luxembourg/daisy.git
exit
sudo /usr/local/bin/pip install -e /home/daisy/daisy
sudo /usr/local/bin/pip install gunicorn
@@ -69,6 +69,10 @@ You need to configure the Solr core 'daisy'. To do so you need to create 'schema.xml'
```bash
sudo cp /home/daisy/daisy/docker/solr/schema.xml /var/solr/data/daisy/conf/
sudo cp /home/daisy/daisy/docker/solr/solrconfig.xml /var/solr/data/daisy/conf/
```

Grant ownership of the `/var/solr` folder to the solr user and adjust its permissions:
```bash
sudo chown -R solr:users /var/solr
sudo chmod -R 775 /var/solr
```
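Once the schema files are in place and the permissions are fixed, Solr has to pick up the new core configuration. A minimal sketch, assuming Solr was installed as a systemd service named `solr` (adjust the unit name to your installation):
```bash
# Restart Solr so the 'daisy' core reloads schema.xml and solrconfig.xml
sudo systemctl restart solr
```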
@@ -246,8 +250,6 @@ sudo systemctl start celery_beat

### Install database server

Documentation from: https://www.postgresql.org/download/linux/redhat/

```bash
sudo yum install https://download.postgresql.org/pub/repos/yum/10/redhat/rhel-7-x86_64/pgdg-centos10-10-2.noarch.rpm
sudo yum install postgresql10
@@ -257,6 +259,8 @@ sudo systemctl enable postgresql-10
sudo systemctl start postgresql-10
```

In case the installation fails, follow the steps in the [official documentation](https://www.postgresql.org/download/linux/redhat/) to install PostgreSQL 10 on your platform.

### Create database and roles


@@ -306,6 +310,13 @@ cp /home/daisy/daisy/elixir_daisy/settings_local.template.py /home/daisy/daisy
vi /home/daisy/daisy/elixir_daisy/settings_local.py
```

<span style="color:red;">Change the SECRET_KEY variable:</span>

```python
# SECURITY WARNING: change the secret key used in production and keep it secret!
SECRET_KEY = '<your-new-secret-key>'
```
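One way to generate a suitable value, assuming Django is already installed for `python3.6`, is to use Django's built-in key generator and paste the output into `settings_local.py`:

```bash
# Print a freshly generated secret key
python3.6 -c "from django.core.management.utils import get_random_secret_key; print(get_random_secret_key())"
```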

Put the following database configuration into the 'settings_local.py' file.

```
@@ -452,18 +463,19 @@ To do this, run the following.
```bash
sudo su - daisy
cd /home/daisy/daisy
python36 manage.py collectstatic
python36 manage.py migrate
python36 manage.py build_solr_schema -c /var/solr/data/daisy/conf -r daisy
python3.6 manage.py collectstatic
python3.6 manage.py migrate
python3.6 manage.py build_solr_schema -c /var/solr/data/daisy/conf -r daisy
cd /home/daisy/daisy/core/fixtures/
wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/edda.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hpo.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hdo.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hgnc.json
python36 manage.py load_initial_data
cd /home/daisy/daisy
python3.6 manage.py load_initial_data
```
The `load_initial_data` command needs several minutes to complete.
DAISY ships with a demo data loader that creates example records of Projects, Datasets and Users. If you want to deploy the DAISY demo data, run:
```bash
python36 manage.py load_demo_data
python3.6 manage.py load_demo_data
```
The above command will create an 'admin' account and other users such as 'alice.white', 'john.doe' and 'jane.doe'. The password for all of them is 'demo'.
@@ -472,13 +484,13 @@ The above command will create an 'admin' and other users such as 'alice.white',
If you do not want to load the demo data and prefer to work with your own definitions, you still need to create a superuser for the application, with which you can log in and create other users as well as records. To create a superuser, run the following and respond to the prompts.

```bash
python36 manage.py createsuperuser
python3.6 manage.py createsuperuser
```

Trigger a reindex with:

```bash
python36 manage.py rebuild_index
python3.6 manage.py rebuild_index
```

# Validate the installation
@@ -559,7 +571,7 @@ As daisy user:
```bash
cd /home/daisy/daisy
python36 manage.py migrate && python36 manage.py build_solr_schema -c /var/solr/data/daisy/conf/ -r daisy && yes | python36 manage.py clear_index && yes "yes" | python36 manage.py collectstatic;
python3.6 manage.py migrate && python3.6 manage.py build_solr_schema -c /var/solr/data/daisy/conf/ -r daisy && yes | python3.6 manage.py clear_index && yes "yes" | python3.6 manage.py collectstatic;
```
@@ -575,7 +587,7 @@ cd /home/daisy/daisy/core/fixtures/
wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/edda.json -O edda.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hpo.json -O hpo.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hdo.json -O hdo.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hgnc.json -O hgnc.json
cd /home/daisy/daisy
python36 manage.py load_initial_data
python3.6 manage.py load_initial_data
```
**IMPORTANT NOTE:** This step can take several minutes to complete.
@@ -586,15 +598,15 @@ python36 manage.py load_initial_data
If LDAP was used to import users, they have to be imported again.
As daisy user:
```bash
python36 manage.py import_users
python3.6 manage.py import_users
```
6) Rebuild Solr search index.
As daisy user:
```bash
cd /home/daisy/daisy
python36 manage.py rebuild_index
python3.6 manage.py rebuild_index
```
7) Restart services.
27 changes: 15 additions & 12 deletions README.md
@@ -93,21 +93,19 @@ You are encouraged to try Daisy for yourself using our [DEMO deployment](https:/

#### Importing

In addition to loading of initial data, DAISY database can be populated by importing Project and Dataset records from JSON files.
The commands for import are given below: </br>
In addition to loading the initial data, the DAISY database can be populated by importing Project, Dataset and Partner records from JSON files using the commands `import_projects`, `import_datasets` and `import_partners` respectively.
Each import command accepts a single JSON file (flag `-f`):

```bash
docker-compose exec web python manage.py import_projects -f ${PROJECTS_JSON}
docker-compose exec web python manage.py <COMMAND> -f ${PATH_TO_JSON_FILE}
```
where ${PROJECTS_JSON} is the path to a json file containing the projects definitions.
See file daisy/data/projects.json as an example.
where ${PATH_TO_JSON_FILE} is the path to a JSON file containing the record definitions.
See file daisy/data/demo/projects.json as an example.


Alternatively, you can specify a directory containing multiple JSON files to be imported with the `-d` flag:
```bash
docker-compose exec web python manage.py import_datasets -d ${DATASETS_FOLDER}
docker-compose exec web python manage.py <COMMAND> -d ${PATH_TO_DIR}
```
where ${DATASETS_FOLDER} is the path to a folder containing datasets and data declarations definitions.
See folder daisy/data/datasets as an example.
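As a concrete illustration (the single-file path comes from the demo fixtures mentioned above, the directory path is a hypothetical example, and both are resolved relative to the project root inside the `web` container, so they may need adjusting):

```bash
# Import one projects file shipped with the demo data
docker-compose exec web python manage.py import_projects -f data/demo/projects.json

# Import every JSON file found in a directory (illustrative path)
docker-compose exec web python manage.py import_datasets -d data/demo/datasets/
```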

#### Exporting

@@ -206,16 +204,21 @@ To be completed.
./manage.py import_users
```

### Import projects from external system
### Import projects, datasets or partners from external system
Single file mode:
```bash
./manage.py import_projects -f path/to/json_file.json
```

### Import datasets from external system
Batch mode:
```bash
./manage.py import_datasets -d path/to/folder_with_json
./manage.py import_projects -d path/to/dir/with/json/files/
```

Available commands: `import_projects`, `import_datasets`, `import_partners`.

In case of problems, add the `--verbose` flag to the command and take a look inside `./log/daisy.log`.
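For instance, a failing import can be re-run with verbose output and the log inspected afterwards (the file name is illustrative):

```bash
./manage.py import_datasets -f path/to/datasets.json --verbose
tail -n 100 ./log/daisy.log
```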

### Install js and css dependencies

```bash
8 changes: 8 additions & 0 deletions conftest.py
@@ -179,6 +179,14 @@ def storage_resources():
CommandLoadInitialData.create_storage_resources()


@pytest.fixture
def data_types():
"""
Create data types
"""
CommandLoadInitialData.create_datatypes()


@pytest.fixture
def partners():
"""
12 changes: 11 additions & 1 deletion core/exceptions.py
@@ -1,7 +1,7 @@
class DaisyError(Exception):
"""Base error class for the module."""

default_message = "Unspecified daisy error"
default_message = "Unspecified DAISY error"

def __init__(self, msg=None):
self._msg = msg or self.default_message
@@ -37,3 +37,13 @@ class FixtureImportError(DaisyError):
def __init__(self, data, msg=None):
self.data = data
super().__init__(msg=msg)


class JSONSchemaValidationError(DaisyError):
"""Error when validating data to be imported."""

default_message = "Error when validating data against JSON schema:' {data}'"

def __init__(self, data, msg=None):
self.data = data
super().__init__(msg=msg)
3 changes: 3 additions & 0 deletions core/fixtures/contact-types.json
@@ -1,4 +1,7 @@
[
{
"name": "Researcher"
},
{
"name": "Principal_Investigator"
},
23 changes: 23 additions & 0 deletions core/fixtures/data-log-types.json
@@ -0,0 +1,23 @@
[
{
"name": "Transfer"
},
{
"name": "Receipt"
},
{
"name": "Deletion"
},
{
"name": "Selective Deletion"
},
{
"name": "Archival"
},
{
"name": "DAC Decision"
},
{
"name": "Other"
}
]