# Run your own inventaire in a docker environment
## Requirements
- [docker-compose](https://docs.docker.com/compose/gettingstarted/) up and ready
- git
## Install
```
git clone https://github.com/inventaire/inventaire-docker.git
```
Go to the folder: `cd inventaire-docker`

Then clone the two repositories inventaire needs to run:

- `inventaire`, the core application server -> [setup](https://github.com/inventaire/inventaire#installation)
- `entities-search-engine`, for querying entities -> [go to repo](https://github.com/inventaire/entities-search-engine)
```
git clone https://github.com/inventaire/inventaire.git
git clone https://github.com/inventaire/entities-search-engine.git
```
Create the empty folders the Docker volumes will use for data dumps and backups, matching the volumes declared in `docker-compose.yml`. Example: `mkdir data couch-test couch es`
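As a sketch, those folders can also be created idempotently with `mkdir -p`, which is safe to re-run (the folder names are the ones from the example above; the scratch directory is only for this demo):

```shell
#!/bin/sh
# Create the volume folders in a scratch directory for this demo;
# in the real setup, run the mkdir inside inventaire-docker/.
cd "$(mktemp -d)"
mkdir -p data couch-test couch es
ls -d data couch-test couch es
```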
Start the magic and build everything at once!
```
docker-compose up --build
```
## Useful commands

- `docker-compose up`: start the containers if already built
- `docker-compose down`: stop and remove the active containers
- `docker rm $(docker ps -a -q)`: delete stopped containers
- `docker rmi $(docker images -q -f dangling=true)`: delete untagged (dangling) images

Check out the [official documentation](https://docs.docker.com/compose/).
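The cleanup commands above can be combined into a small helper. This is only a sketch: the `cleanup` function and its `--force` flag are assumptions, not part of inventaire-docker, and by default it does a dry run that prints the commands instead of deleting anything.

```shell
#!/bin/sh
# Hypothetical cleanup helper (dry run by default; --force to execute).
cleanup() {
  if [ "$1" = "--force" ]; then
    run() { sh -c "$1"; }   # actually execute the command
  else
    run() { echo "$1"; }    # dry run: print the command only
  fi
  run 'docker-compose down'
  run 'docker rm $(docker ps -a -q)'                    # stopped containers
  run 'docker rmi $(docker images -q -f dangling=true)' # dangling images
}

cleanup   # prints the three commands without touching docker
```

Run `cleanup --force` once the printed commands look right.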
## Load wikidata into elasticsearch
Make sure Elasticsearch's field mapping limit is above what the entities-search-engine import needs, by raising the limit:
```
docker-compose exec entities-search-engine curl -XPUT http://elasticsearch:9200/wikidata/_settings -H 'Content-Type: application/json' -d '{"index.mapping.total_fields.limit": 20000}'
```
Start the containers: `docker-compose up`
```
claim=P31:Q5
type=humans
docker-compose exec entities-search-engine ./bin/dump_wikidata_subset $claim $type
```
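To import several subsets, the call above can be wrapped in a small helper. This is a sketch: `import_subset` is a hypothetical wrapper, only the `P31:Q5`/`humans` pair comes from the example above, and the real command is left commented out so the logic can be tried without docker.

```shell
#!/bin/sh
# Hypothetical wrapper around dump_wikidata_subset (sketch only).
import_subset() {
  claim=$1
  type=$2
  echo "importing $type ($claim)"
  # Real call, once the containers are up:
  # docker-compose exec entities-search-engine ./bin/dump_wikidata_subset "$claim" "$type"
}

import_subset P31:Q5 humans
```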
[More info on importing some wikidata items](https://github.com/inventaire/inventaire-deploy/install_entities_search_engine)

More docs: [wikidata filtered dump import](https://github.com/inventaire/entities-search-engine/blob/master/docs/wikidata_filtered_dump_import.md)
## Fixtures
In case you would like to play with out-of-the-box data.
Run the API tests to populate the test databases (see the Tests section):
```
docker-compose -f docker-compose.yml -f docker-compose.test.yml exec inventaire npm run test-api
```
Replicate the `*-tests` databases' documents into the `*` databases:
```
docker-compose exec inventaire npm run replicate-tests-db
```
## Tests
Start the services with the test environment using [multiple compose files](https://docs.docker.com/compose/extends/#understanding-multiple-compose-files):
```
docker-compose -f docker-compose.yml -f docker-compose.test.yml up
```
Execute the test script:

`docker-compose exec inventaire npm run test-api`

or run the test command directly:
`docker-compose exec inventaire ./node_modules/.bin/mocha --compilers coffee:coffee-script/register --timeout 20000 /opt/inventaire/path/to/test/file`
Tip: create a symbolic link between the inventaire folder and the Docker working directory at `/opt/` on your machine, so you can autocomplete the path of the test file to execute:
`sudo ln ~/path/to/inventaire-docker/inventaire /opt -s`
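As a sandboxed illustration of the tip above (temp directories stand in for your home folder and `/opt`, so no sudo is needed; the folder names are assumptions):

```shell
#!/bin/sh
# Demo of the symlink tip in a scratch directory instead of /opt.
root=$(mktemp -d)
mkdir -p "$root/inventaire-docker/inventaire" "$root/opt"
# Same shape as: sudo ln ~/path/to/inventaire-docker/inventaire /opt -s
ln -s "$root/inventaire-docker/inventaire" "$root/opt/inventaire"
readlink "$root/opt/inventaire"
```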