- Easy submission of spike sorting jobs, locally or to the cloud
- Keep track of your sorting jobs
Set ENV variables:

```bash
export AWS_DEFAULT_REGION=
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
export AWS_BATCH_JOB_QUEUE=
export AWS_BATCH_JOB_DEFINITION=
export DANDI_API_KEY=
```
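Before submitting cloud jobs, it can help to fail fast if any of these variables are unset. A minimal sketch (the `missing_env_vars` helper is hypothetical, not part of the app):

```python
import os

# Environment variables the app expects for AWS Batch and DANDI access
REQUIRED_VARS = [
    "AWS_DEFAULT_REGION",
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
    "AWS_BATCH_JOB_QUEUE",
    "AWS_BATCH_JOB_DEFINITION",
    "DANDI_API_KEY",
]

def missing_env_vars(required=REQUIRED_VARS):
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

if __name__ == "__main__":
    missing = missing_env_vars()
    if missing:
        raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
```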
Running with Docker Compose, pulling images from GitHub Packages:

```bash
docker compose up
```
Running with Docker Compose, building images locally (for dev, with hot reload):

```bash
docker compose -f docker-compose-dev.yml up
```
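For orientation, a dev compose file of this shape typically declares the app's four services and mounts the source directories for hot reload. This is a hypothetical sketch only; the repository's actual `docker-compose-dev.yml` (service names, images, ports) may differ:

```yaml
# Hypothetical sketch; see the repository's docker-compose-dev.yml
# for the real service definitions.
services:
  rest:
    build: ./rest          # FastAPI app
    volumes:
      - ./rest:/app        # mount source for hot reload
  frontend:
    build: ./frontend      # React app
    volumes:
      - ./frontend:/app
  db:
    image: postgres        # Postgres database
  worker:
    build: ./worker        # sorter container with a Flask app
```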
If you made any changes to `requirements.txt`, `package.json`, or `Dockerfile`, you should stop the containers and run again with an extra `--build` flag:

```bash
docker compose down
docker compose -f docker-compose-dev.yml up --build
```
Run the REST API standalone (dev):

```bash
cd rest
python main.py
```
Run the frontend standalone (dev):

```bash
cd frontend
yarn start
```
The app is composed of four components:

- `rest` - the REST API, which is a FastAPI app
- `frontend` - the frontend, which is a React app
- `db` - the database, which is a Postgres database
- `worker` - the worker, which is a sorter container with a Flask app
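To illustrate how the `db` component might track sorting jobs, here is a hedged sketch of a job record. The field names and status values are hypothetical, not the app's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical status set; the app's real job states may differ
VALID_STATUSES = {"pending", "running", "success", "failed"}

@dataclass
class SortingJob:
    """A single spike sorting job, as the db component might store it."""
    job_id: str
    sorter: str                      # e.g. "kilosort3" (illustrative)
    status: str = "pending"
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    def transition(self, new_status: str) -> None:
        """Move the job to a new status, rejecting unknown states."""
        if new_status not in VALID_STATUSES:
            raise ValueError(f"Unknown status: {new_status}")
        self.status = new_status
```

A record like this is what "keep track of your sorting jobs" implies: each submission gets an id, a sorter name, and a status that the worker updates as the job progresses.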
Build and push the worker image:

```bash
DOCKER_BUILDKIT=1 docker build -t ghcr.io/catalystneuro/si-sorting-worker:latest -f Dockerfile.combined .
docker push ghcr.io/catalystneuro/si-sorting-worker:latest
```