
Contributing to pyro-api

Everything you need to know to contribute efficiently to the project!

Whatever the way you wish to contribute to the project, please respect the code of conduct.

Data model

The core feature of the back end is interacting with the metadata tables. For the service to be useful for wildfire detection, multiple tables/object types are introduced and described as follows:

Access-related tables

  • Users: stores the hashed credentials and access level for users.
  • Cameras: stores the camera metadata.
  • Organizations: scope the access to the API.

Core workflow tables

  • Detection: association of a picture and a camera.

Client-related tables

  • Webhook: stores the webhook URLs.

The UML is versioned at scripts/dbdiagram.txt and the UML diagram is available on DBDiagram.
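
To give a rough idea of what a core table looks like, here is a minimal sketch of a detection model in SQLAlchemy. The column names below are purely illustrative assumptions; the authoritative schema is the one versioned in scripts/dbdiagram.txt.

from datetime import datetime

from sqlalchemy import Column, DateTime, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Detection(Base):
    """Illustrative only: a detection ties a picture to the camera that produced it."""

    __tablename__ = "detections"

    id = Column(Integer, primary_key=True)
    # the camera that detected the event
    camera_id = Column(Integer, ForeignKey("cameras.id"), nullable=False)
    # where the picture is stored in the S3 bucket
    bucket_key = Column(String, nullable=False)
    # when the detection was created
    created_at = Column(DateTime, default=datetime.utcnow, nullable=False)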

What is the full detection workflow through the API?

The API has been designed to provide, for each wildfire detection, the alert metadata:

  • timestamp
  • the picture that was used for detection
  • the camera that detected the event

With the previously described tables, here are all the steps to send a wildfire alert (see the sketch after this list):

  • Prerequisites (ask the instance administrator): register a user
  • Register a camera: declare your camera on the API, using your new user credentials.
  • Create a camera token: create a non-user token for the camera to access the API.
  • Create a detection: using the camera credentials, upload the image content and the detection metadata.
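
For illustration, the whole workflow could look like the following Python sketch using requests. The endpoint paths, payload fields, and response shapes below are assumptions made for this example; refer to the interactive documentation of your API instance for the actual routes.

import requests

API_URL = "https://api.mydomain.com"  # your deployed instance (BACKEND_HOST)

# 1. Log in with the user credentials provided by the instance administrator
#    (endpoint path and payload are hypothetical)
resp = requests.post(
    f"{API_URL}/login/creds",
    data={"username": "my_user", "password": "my_password"},
    timeout=10,
)
headers = {"Authorization": f"Bearer {resp.json()['access_token']}"}

# 2. Register a camera (fields are illustrative)
camera = requests.post(
    f"{API_URL}/cameras",
    json={"name": "serre-de-barre-01", "lat": 44.7, "lon": 4.5, "elevation": 1582, "angle_of_view": 87.0},
    headers=headers,
    timeout=10,
).json()

# 3. Create a non-user token for that camera
cam_token = requests.post(
    f"{API_URL}/cameras/{camera['id']}/token",
    headers=headers,
    timeout=10,
).json()["access_token"]

# 4. Upload the image content and the detection metadata with the camera credentials
with open("detection.jpg", "rb") as f:
    requests.post(
        f"{API_URL}/detections",
        data={"azimuth": 124.5},
        files={"file": ("detection.jpg", f, "image/jpeg")},
        headers={"Authorization": f"Bearer {cam_token}"},
        timeout=10,
    )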

Codebase structure

Continuous Integration

This project uses the following integrations to ensure proper codebase maintenance:

  • GitHub Workflow - runs jobs for package build and coverage
  • Codacy - analyzes commits for code quality
  • Codecov - reports back coverage results
  • Sentry - automatically reports errors back to us
  • PostgreSQL - storing and interacting with the metadata database
  • S3 storage - the file system for media storage (not necessarily AWS, but requires S3 interface)
  • PostHog - product analytics
  • Prometheus - scrapes API metrics
  • Traefik - the reverse proxy and load balancer

As a contributor, you will only have to ensure coverage of your code by adding appropriate unit tests.

Feedback

Feature requests & bug report

Whether you encountered a problem or have a feature suggestion, your input is valuable and can be referenced by other contributors in their developments. For this purpose, we advise you to use GitHub issues.

First, check whether the topic wasn't already covered in an open / closed issue. If not, feel free to open a new one! When doing so, use issue templates whenever possible and provide enough information for other contributors to jump in.

Questions

If you are wondering how to do something with Pyro-API, or have a more general question, you should consider checking out GitHub Discussions. See it as a Q&A forum, or the Pyro-API-specific StackOverflow!

Developer setup

Prerequisites

Configure your fork

1 - Fork this repository by clicking on the "Fork" button at the top right of the page. This will create a copy of the project under your GitHub account (cf. Fork a repo).

2 - Clone your fork to your local disk and set the upstream to this repo

git clone git@github.com:<YOUR_GITHUB_ACCOUNT>/pyro-api.git
cd pyro-api
git remote add upstream https://github.com/pyronear/pyro-api.git
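
Later on, you can keep your fork in sync with the main repository and start each feature branch from an up-to-date base (assuming the default branch is main):

git fetch upstream
git checkout -b a-short-description upstream/main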

Install the dependencies

Let's install the different libraries:

poetry export -f requirements.txt --without-hashes --with quality --output requirements.txt
pip install -r requirements.txt
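
If you prefer to keep these dependencies isolated from your system Python, you can create and activate a virtual environment before running the commands above:

python -m venv .venv
source .venv/bin/activate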

Pre-commit hooks

Let's make your life easier by formatting & fixing lint on each commit:

pre-commit install
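
If you want to apply the hooks to the whole codebase right away (rather than only to your next commit), you can also run them manually:

pre-commit run --all-files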

Environment configuration

In order to run the project, you will need to specify some information, which can be done using a .env file. Copy the default environment variables from .env.example:

cp .env.example .env

This file contains all the information to run the project.

Values you have to replace

None :)

Values you can edit freely

  • POSTGRES_DB: a name for the PostgreSQL database that will be created
  • POSTGRES_USER: a login for the PostgreSQL database
  • POSTGRES_PASSWORD: a password for the PostgreSQL database
  • SUPERADMIN_LOGIN: the login of the initial admin user
  • SUPERADMIN_PWD: the password of the initial admin user
  • SUPERADMIN_ORG: the organization of the initial admin user

Other optional values

  • JWT_SECRET: if set, tokens can be reused between sessions. All instances sharing the same secret key can use the same token.
  • SENTRY_DSN: the DSN for your Sentry project, which monitors back-end errors and reports them back.
  • SERVER_NAME: the server tag that will be used to report events to Sentry.
  • POSTHOG_HOST: the host for PostHog.
  • POSTHOG_KEY: the project API key for PostHog.
  • SUPPORT_EMAIL: the email used for support of your API.
  • DEBUG: if set to false, silences debug logs.
  • S3_ACCESS_KEY: the public key to access the S3 storage service
  • S3_SECRET_KEY: the private key to access the resource.
  • S3_REGION: the geographic region where your S3 bucket is located
  • S3_ENDPOINT_URL: the S3 endpoint URL provided by your cloud provider
  • S3_PROXY_URL: the URL of the proxy used to hide the real S3 URL; leave it empty ("") to disable the proxy
  • TELEGRAM_TOKEN: the token of your Telegram bot

Production-only values

  • ACME_EMAIL: the email linked to your certificate for HTTPS
  • BACKEND_HOST: the subdomain where your users will access your API (e.g. "api.mydomain.com")
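
Putting it together, a local development .env could look like the following (every value below is a placeholder, not a real credential):

POSTGRES_DB=pyroapi_dev
POSTGRES_USER=pyroapi
POSTGRES_PASSWORD=a-strong-password
SUPERADMIN_LOGIN=superadmin
SUPERADMIN_PWD=another-strong-password
SUPERADMIN_ORG=pyronear
JWT_SECRET=change-me-to-a-random-string
SUPPORT_EMAIL=support@mydomain.com
DEBUG=true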

Developing your feature

Commits

  • Code: make sure to provide docstrings for your Python code. In doing so, please follow the Google style so it eases the documentation process later (see the example after this list).
  • Commit message: please follow the Udacity guide
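
For reference, a Google-style docstring looks like this (the function itself is just an illustration):

def resize_image(img: bytes, max_size: int = 1024) -> bytes:
    """Resize an image so that its largest side does not exceed a given size.

    Args:
        img: raw bytes of the input image
        max_size: maximum number of pixels of the largest side

    Returns:
        The raw bytes of the resized image.
    """
    ...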

Tests

In order to run the same unit tests as the CI workflows, you can run the unit tests locally:

make test

This will run the full suite of core API unit tests. However, if you're trying to run some specific unit tests, you can do as follows:

make run-dev
docker-compose exec -T backend pytest tests/routes/test_XYZ.py
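
If you only want to run a subset of the tests within a file, pytest's -k filter works as well (the expression below is just an example):

docker-compose exec -T backend pytest tests/routes/test_XYZ.py -k "create"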

Code quality

To run all quality checks together:

make quality

The previous command won't modify anything in your codebase. Some fixes (import ordering and code formatting) can be done automatically using the following command:

make style

Local deployment

To run the API locally, the easiest way is with Docker. Launch this command in the project directory:

make run-dev

To enable a smoother development experience, we use LocalStack to create a local S3 bucket. NOTE: please check the LocalStack documentation to understand how to create buckets or to add/delete objects.
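
For instance, assuming LocalStack is reachable on its default edge port 4566, you can create and inspect a bucket with the AWS CLI (the bucket name is illustrative):

aws --endpoint-url=http://localhost:4566 s3 mb s3://my-local-bucket
aws --endpoint-url=http://localhost:4566 s3 ls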

Submit your modifications

Push your latest modifications to your remote branch:

git push -u origin a-short-description

Then open a Pull Request from your fork's branch. Follow the instructions of the Pull Request template and then click on "Create a pull request".

Database

Schema evolution

See the Alembic guide to create a revision and run it locally.
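
As a reminder, a typical Alembic cycle consists of generating a revision and applying it (run these from the folder containing the project's Alembic configuration):

alembic revision --autogenerate -m "describe your schema change"
alembic upgrade head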

Postgres upgrade

With your current PG version, we first make a data extract:

make run
docker compose exec -it db pg_dumpall -U mybdsuperuserpyro > my_local_dump.sql
./scripts/pg_extract.sh my_local_dump.sql pyro_api_prod >> upgrade_dump.sql

We stop the container and remove the volume to prevent it from repopulating the new database:

make stop
docker volume rm pyro-api_postgres_data

Now update the Postgres version in your Docker setup. We then run the DB only (to prevent the backend from initializing it) and restore the data:

docker compose up db -d
cat upgrade_dump.sql | docker compose exec -T db psql -U mybdsuperuserpyro
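
Once the dump has been restored, you can check that the databases are back before restarting the full stack:

docker compose exec -T db psql -U mybdsuperuserpyro -c "\l"
make run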