Merge pull request #98 from NIAEFEUP/develop
Merge develop into main
tomaspalma authored Aug 24, 2024
2 parents 281c458 + 73c906f commit 2e85bb4
Showing 33 changed files with 843 additions and 678 deletions.
57 changes: 0 additions & 57 deletions .github/workflows/ci.yml

This file was deleted.

22 changes: 22 additions & 0 deletions .github/workflows/niployments.yaml
@@ -0,0 +1,22 @@
name: Deploy

on:
push:
branches:
- main
- develop

jobs:
build:
runs-on: ubuntu-latest

steps:
- name: Upload to NIployments registry
uses: NIAEFEUP/[email protected]
with:
docker_dockerfile: Dockerfile
docker_context: ./django
docker_target: prod
NIPLOYMENTS_REGISTRY_URL: ${{ vars.NIPLOYMENTS_REGISTRY_URL }}
NIPLOYMENTS_REGISTRY_USERNAME: ${{ vars.NIPLOYMENTS_REGISTRY_USERNAME }}
NIPLOYMENTS_REGISTRY_PASSWORD: ${{ secrets.NIPLOYMENTS_REGISTRY_PASSWORD }}
9 changes: 4 additions & 5 deletions .gitignore
@@ -3,20 +3,19 @@ logs
*.log

# sigarra internal data cannot be committed
mysql/sql/01_dump_mysql.sql
postgres/sql/01_dump_postgres.sql

# dotenv environment variables file
.env

# mysql data
mysql/data/*
mysql/sql/01_data.sql
# postgres data
postgres/data/*
postgres/sql/01_data.sql
**__pycache__

# django
django/**/migrations/**
django/university/models.py
django/statistics.sql

# celery
django/celerybeat-schedule
12 changes: 6 additions & 6 deletions Makefile
@@ -1,6 +1,6 @@
.PHONY: all clean

MYSQL_DATA = ./mysql/sql
POSTGRES_DATA = ./postgres/sql

all: clean_database
@echo [EXECUTING] ./scripts/$(EXEC)
@@ -11,10 +11,10 @@ download: clean_fetcher clean_database
@-mkdir ./fetcher/data
@echo [DOWNLOADING] data from the source...
@docker-compose run fetcher python ./update_data/download.py
@echo [REMOVING] data from mysql...
@-rm $(MYSQL_DATA)/01_data.sql
@echo [REMOVING] data from postgres...
@-rm $(POSTGRES_DATA)/01_data.sql
@echo [MOVING] data from fetcher to sql...
@mv ./fetcher/data/* ./mysql/sql
@mv ./fetcher/data/* ./postgres/sql

upload:
@echo [UPLOADING] data...
@@ -27,5 +27,5 @@ clean_fetcher:
@-rm -r ./fetcher/data

clean_database:
@echo [CLEANING] Removing folder mysql/data...
@-rm -r ./mysql/data/
@echo [CLEANING] Removing database data...
@-docker volume rm tts_postgres_data || true
71 changes: 54 additions & 17 deletions README.md
@@ -1,45 +1,82 @@
# TTS - backend
The backend for timetable selector.
# TTS - Backend

The backend for the timetable selector, a platform that aims to help students better choose their class schedules by letting them see and play with all possible combinations.

Made with ❤️ by NIAEFEUP.

## Installation
### Prerequisites
- `docker`
- `docker-compose`
- `docker compose`

### Installing docker
To install docker, take a look at the [official website](https://www.docker.com/) and follow the [`Get docker`](https://docs.docker.com/get-docker/) section. If you're using Windows, make sure you have [`wsl`](https://docs.microsoft.com/en-us/windows/wsl/install) installed.

In case you're using linux, after installing docker check the [`Manage Docker as a non-root user`](https://docs.docker.com/engine/install/linux-postinstall/), so you can use docker without the `sudo` command.
In case you're using Linux, after installing docker check [`Manage Docker as a non-root user`](https://docs.docker.com/engine/install/linux-postinstall/) so you can use docker without the `sudo` command; this involves creating a user group for docker.

## Data

The data is available the NIAEFEUP drive (Only for NIAEFEUP members):
The data is available at the NIAEFEUP drive (only for NIAEFEUP members):

https://drive.google.com/drive/folders/1hyiwPwwPWhbAPeJm03c0MAo1HTF6s_zK?usp=sharing

- The ```00_schema_mysql.sql``` corresponds to the schema for the most recent data.

- Copy the ```01_data.sql``` and ```00_schema_mysql.sql``` of year and semester you desire to the ```mysql/sql``` folder.

- The ```00_schema_postgres.sql``` corresponds to the schema for the most recent data.

- Copy the ```01_data.sql``` and ```00_schema_postgres.sql``` of the year and semester you desire to the ```postgres/sql``` folder.

## Usage

### Development environment
You can start developing by building the local server with docker:

#### Building the container

After you have installed docker, go to the folder where you cloned this repository and run:

```bash
docker-compose build .
docker compose build
```

This will build the docker container for the backend.

In case you have __already built the server before and want to repopulate the database__, make sure you run

```bash
sudo make clean
```

We need to clean the database in order to repopulate it: the postgres container only runs the `sql` files present in the `postgres/sql` folder when the database is empty. This is why we need to issue `sudo make clean` for the insert SQL queries to be run.
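Concretely, the repopulation flow is just two commands (the volume name comes from the Makefile's `clean_database` target):

```bash
sudo make clean    # removes the tts_postgres_data docker volume
docker compose up  # fresh volume, so postgres re-runs postgres/sql/*.sql on first boot
```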

#### Running the container

Before running docker, you have to create a `.env` file with the required environment variables for the backend to work.

```bash
cp .env.dev .env
```

In case you have __already built the server before and want to build it again__, be sure to delete the folder in `mysql/data`. You can do this by running `sudo rm -r mysql/data/`. To make your life easier, you can simply run the `build_dev.sh` script: `sudo ./build_dev.sh`.
> The sudo permission is necessary to delete the `mysql/data` folder.
And then you need to set the correct desired values in the `.env` file.

*The `.env` file is not committed to the repository, in order to prevent sensitive information from being leaked. This way, the file with default, non-sensitive values (`.env.dev`) serves as a template, while the real file with sensitive values is listed in `.gitignore` so it is never accidentally uploaded to `github`.*
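The template-then-override idea is the same one `django/tasks.py` uses with `dotenv_values`; here is a minimal self-contained sketch (the hand-rolled parser stands in for `python-dotenv`, purely for illustration):

```python
import os

def parse_env(text):
    """Parse KEY=VALUE lines, skipping blanks and comments.

    A minimal stand-in for python-dotenv's dotenv_values, used here
    only to keep the snippet self-contained.
    """
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values

# Defaults from the template, overridden by anything already set in the
# real environment -- the same precedence tasks.py applies.
template = "POSTGRES_DB=tts\nPOSTGRES_PASSWORD=root\n"
config = {**parse_env(template), **os.environ}
```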

```bash
docker-compose up
docker compose up
```
#### Some django caveats after running the container

- The first time you run this docker container, or after you clean the database, you will need to wait for some time (5-10 minutes) until the database is populated. It is normal to see django giving a `115` error, since the database is busy populating itself and not yet ready to answer connection requests.

- Sometimes, on the first execution of this command, django will start giving a `2` error. If that happens, you need to shut the container down with `docker compose down` and then bring it up with `docker compose up` again.

As with the build, the run command can also be executed with the `run_dev.sh` script: `./run_dev.sh`.

#### Accessing the development database

> __WARNING__: it's likely that the first execution of `docker-compose up` after building won't work, since django doesn't wait for the database to be populated before executing. If that's your case, execute it again.
We are currently using `pgadmin`, and you can access it as follows:

1. Go to `localhost:4000`

2. On the login screen, the credentials are as follows:

- Email: [email protected]
- Password: admin

This is fine, since this is only a development environment.
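After logging in, pgadmin still needs a database server registered by hand. Judging from the values in `django/.env.dev`, a development connection would look like the following (the host being the compose service name is an assumption, since the compose file is not shown in this diff):

```
Host: db            (POSTGRES_HOST, the compose service name)
Port: 5432          (POSTGRES_PORT)
Maintenance DB: tts (POSTGRES_DB)
Username: root      (POSTGRES_USER)
Password: root      (POSTGRES_PASSWORD)
```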
15 changes: 10 additions & 5 deletions django/.env.dev
@@ -1,8 +1,13 @@
DEBUG=0
SECRET_KEY=foo

MYSQL_DATABASE=tts
MYSQL_PASSWORD=root
MYSQL_USER=root
MYSQL_HOST=db
MYSQL_PORT=3306
POSTGRES_DB=tts
POSTGRES_USER=root
POSTGRES_PASSWORD=root
POSTGRES_HOST=db
POSTGRES_PORT=5432

TTS_REDIS_HOST=tts_redis
TTS_REDIS_PORT=6379
TTS_REDIS_USERNAME=
TTS_REDIS_PASSWORD=
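For reference, a Django `DATABASES` entry consuming these variables might look like the sketch below. The actual `settings.py` is not part of this diff, so the helper name and the defaults are assumptions; only the `django.db.backends.postgresql` engine string is standard Django:

```python
import os

def postgres_settings(env):
    """Build a Django DATABASES 'default' entry from POSTGRES_* variables.

    Hypothetical helper; defaults mirror django/.env.dev.
    """
    return {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": env.get("POSTGRES_DB", "tts"),
        "USER": env.get("POSTGRES_USER", "root"),
        "PASSWORD": env.get("POSTGRES_PASSWORD", ""),
        "HOST": env.get("POSTGRES_HOST", "db"),
        "PORT": env.get("POSTGRES_PORT", "5432"),
    }

DATABASES = {"default": postgres_settings(os.environ)}
```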
30 changes: 21 additions & 9 deletions django/Dockerfile
@@ -1,26 +1,38 @@
FROM python:3.8-slim-buster
# deps
FROM python:3.8-slim-buster AS deps

WORKDIR /usr/src/django/

# Gets the output from django in real time.
ENV PYTHONUNBUFFERED 1
ENV STATISTICS_NAME tts_be
ENV STATISTICS_PASS batata_frita_123
ENV PYTHONUNBUFFERED=1

# Copy requirements
COPY ./requirements.txt ./requirements.txt

# Dependencies for mysqlclient
# Dependencies for building the requirements
RUN apt-get update
RUN apt-get -y install build-essential default-libmysqlclient-dev
RUN apt-get -y install build-essential

# Install mysql command to wait for the database initialization
RUN apt -y install default-mysql-client
# Install postgres dependencies (pgsql client and development files)
COPY ./etc/pgdg.sh /tmp/pgdg.sh
RUN /tmp/pgdg.sh

RUN apt -y install libpq-dev postgresql-client-16
RUN apt -y clean && rm -rf /var/lib/apt/lists/*

# Install the requirements
RUN pip install -r requirements.txt

EXPOSE 8000

COPY ./entrypoint.sh ./entrypoint.sh
ENTRYPOINT ["sh", "/usr/src/django/entrypoint.sh"]
ENTRYPOINT ["/usr/src/django/entrypoint.sh"]

# prod
FROM deps AS prod

COPY tts_be/ ./tts_be
COPY university/ ./university
COPY manage.py tasks.py ./

CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
14 changes: 6 additions & 8 deletions django/entrypoint.sh
100644 → 100755
@@ -1,24 +1,22 @@
#!bin/sh
#!/bin/sh

# WARNING: The script will not work if formatted with CRLF.

# Configure the shell behaviour.
set -e
if [[ ${DEBUG} == 1 ]]
if [[ "${DEBUG}" == 1 ]]
then set -x
fi

# Get parameters.
database_host="$1" # The database host; the container name should be provided.
shift
cmd="$@"

# Waits for mysql initialization.
until mysql -h "$database_host" -u ${MYSQL_USER} -p${MYSQL_PASSWORD} ${MYSQL_DATABASE} -e 'select 1'; do
>&2 echo "MySQL is unavailable - sleeping"
# Waits for PostgreSQL initialization.
until PGPASSWORD="${POSTGRES_PASSWORD}" psql -h "${POSTGRES_HOST}" -U "${POSTGRES_USER}" "${POSTGRES_DB}" -c 'select 1'; do
>&2 echo "PostgreSQL is unavailable - sleeping"
sleep 4
done
>&2 echo "Mysql is up - executing command"
>&2 echo "PostgreSQL is up - executing command"

# Migrate the Django.
python manage.py inspectdb > university/models.py
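The `until ... sleep` loop in `entrypoint.sh` is the usual retry-until-ready pattern. A Python sketch for illustration, with the readiness probe injected (standing in for the `psql -c 'select 1'` check):

```python
import time

def wait_for(probe, attempts=10, delay=0.0):
    """Retry `probe` until it returns True or the attempts run out.

    Mirrors the entrypoint's `until psql ...; do sleep 4; done` loop,
    but with the readiness check injected so it can be exercised
    without a database.
    """
    for _ in range(attempts):
        if probe():
            return True
        time.sleep(delay)
    return False

# A fake database that only becomes ready on the third check.
checks = iter([False, False, True])
ready = wait_for(lambda: next(checks))
```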
15 changes: 15 additions & 0 deletions django/etc/pgdg.sh
@@ -0,0 +1,15 @@
#!/bin/sh

# Source: https://www.postgresql.org/download/linux/ubuntu/

# Import the repository signing key:
apt install -y curl ca-certificates postgresql-common lsb-release

install -d /usr/share/postgresql-common/pgdg
curl -o /usr/share/postgresql-common/pgdg/apt.postgresql.org.asc --fail https://www.postgresql.org/media/keys/ACCC4CF8.asc

# Create the repository configuration file:
sh -c 'echo "deb [signed-by=/usr/share/postgresql-common/pgdg/apt.postgresql.org.asc] https://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'

# Update the package lists:
apt update
3 changes: 2 additions & 1 deletion django/requirements.txt
@@ -5,6 +5,7 @@ django-cors-headers==3.10.1
djangorestframework==3.11.0
pytz==2021.3
sqlparse==0.4.2
mysqlclient==1.4.6
psycopg2==2.9.9
celery==5.2.7
redis==3.5.3
python-dotenv==1.0.1
32 changes: 16 additions & 16 deletions django/tasks.py
@@ -1,23 +1,23 @@
from celery import Celery
from celery.schedules import crontab
import os
from dotenv import dotenv_values

app = Celery('tasks', broker="redis://tts-be-redis_service-1:6379")
CONFIG={
**dotenv_values(".env"), # load variables
**os.environ, # override loaded values with environment variables
}

username_password_str = ''
if os.getenv('TTS_REDIS_USERNAME') != '' and os.getenv('TTS_REDIS_PASSWORD') != '':
username_password_str = f"{os.getenv('TTS_REDIS_USERNAME')}:{os.getenv('TTS_REDIS_PASSWORD')}@"

app = Celery('tasks', broker=f"redis://{username_password_str}{os.getenv('TTS_REDIS_HOST')}:{os.getenv('TTS_REDIS_PORT')}")

# Gets called after celery sets up. Creates a worker that runs the dump_statistics function at midnight and noon everyday
@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
sender.add_periodic_task(
crontab(minute='0', hour='0, 12'),
dump_statistics.s(),
name='dump statistics'
)
#@app.on_after_configure.connect
#def setup_periodic_tasks(sender, **kwargs):
# sender.add_periodic_task()



@app.task
def dump_statistics():
command = "mysqldump -P {} -h db -u {} -p{} {} statistics > statistics.sql".format(
os.environ["MYSQL_PORT"],
os.environ["MYSQL_USER"],
os.environ["MYSQL_PASSWORD"],
os.environ["MYSQL_DATABASE"])
os.system(command)
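The broker-URL assembly in the new `tasks.py` above can be isolated into a small helper; a sketch of the same rule (this function is not code from the commit):

```python
def redis_broker_url(host, port, username="", password=""):
    """Build a redis:// broker URL, adding user:pass@ only when both
    credentials are non-empty -- the same rule tasks.py applies."""
    auth = f"{username}:{password}@" if username and password else ""
    return f"redis://{auth}{host}:{port}"
```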
