Edit idetect/docker.env to add the appropriate environment variables.
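The exact variables depend on your deployment, but the file uses the usual KEY=value format. A purely hypothetical sketch of the format (these names are illustrative, not the project's real settings):
# hypothetical variable names for illustration only
SCRAPER_API_KEY=replace-me
MODEL_PATH=replace-me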
Exporting the UID before building is necessary so that the user everything runs as inside the docker container matches the user on the host machine. Without this, there will be permissions problems, such as processes failing because they cannot write to the .jupyter or .newspaper_scraper directories. This could also be avoided by not volume-mounting the code into the containers, which would be an option in production, but having to rebuild the images on every change during development would be a real drag.
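For context, the pattern that makes the exported UID matter is docker-compose's variable substitution: the compose file can reference ${UID} for the build args or the runtime user. A minimal sketch of that pattern (not the project's actual docker-compose.yml; the service and argument names are assumptions):
services:
  workers:
    build:
      context: .
      args:
        UID: "${UID}"   # substituted from the exported shell variable
    user: "${UID}"      # run as the host user inside the container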
We start the workers so that the setup.py script runs:
export UID
docker-compose build
docker-compose up workers
Next, run the update script to create the fact API tables. Assuming the localdb docker container is running and you have psql installed on your host machine, you can use:
psql -U postgres -h localhost -p 5433 idetect < source/data/update.sql
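If psql is not installed on the host, an alternative is to run psql inside the localdb container instead (the container name here is an assumption based on the idetect_*_1 naming used elsewhere in this document):
docker exec -i idetect_localdb_1 psql -U postgres idetect < source/data/update.sql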
Start LocalDB, Workers, Flask App, Jupyter:
docker-compose up
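If you prefer to run the stack in the background, the standard detached form works, and docker-compose logs can follow the output:
docker-compose up -d
docker-compose logs -f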
Rebuild after changing requirements.txt:
docker-compose build
Just start LocalDB (e.g. for running unit tests in an IDE):
docker-compose up localdb
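Before pointing IDE tests at it, you can confirm the database is reachable and the tables exist, using the same connection settings as above:
psql -U postgres -h localhost -p 5433 idetect -c '\dt'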
Run unit tests in docker:
docker-compose up unittests
To manage individual workers, open a shell in the workers container and use supervisorctl:
docker exec -it idetect_workers_1 bash
supervisorctl status # see what's running
supervisorctl stop all # stop all workers
supervisorctl start classifier:classifier-00 # start a single classifier
supervisorctl start extractor:* # start all extractors
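Names like classifier:classifier-00 come from supervisord's numprocs grouping. A minimal sketch of the kind of supervisord config that produces them (the commands and process counts are placeholders, not the project's actual settings):
[program:classifier]
; placeholder command, not the real entry point
command=python3 run_classifier.py
process_name=%(program_name)s-%(process_num)02d
numprocs=2

[program:extractor]
; placeholder command, not the real entry point
command=python3 run_extractor.py
process_name=%(program_name)s-%(process_num)02d
numprocs=2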
Logs for the workers are available on the host machine in idetect/logs/workers or inside the docker container at /var/log/workers.
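For example, to follow a worker log from the host (the exact filenames depend on the supervisor configuration, so check the directory first):
ls idetect/logs/workers
tail -f idetect/logs/workers/<logfile>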
To see logs for the Jupyter notebooks container:
docker logs idetect_notebooks_1