Merge pull request #239 from OCHA-DAP/dev
dev into prod for 0.7.2
danmihaila authored Oct 22, 2024
2 parents 9f52e09 + 28ac13b commit 69f5eb4
Showing 6 changed files with 108 additions and 4 deletions.
54 changes: 54 additions & 0 deletions DEPLOYMENT-PROCESS.md
@@ -0,0 +1,54 @@
# HAPI COMPONENTS
-----------------
1) HAPI-SCHEMA release 0.8.17

2) HAPI-PIPELINES plays the dev role: `main` branch (code) -> `db-export` branch (daily CSVs; depends on HAPI-SCHEMA release 0.8.17)

3) HAPI-PIPELINES-PROD plays the prod role: `main` branch (code) <- PRs from HAPI-PIPELINES
-> `db-export` branch (daily CSVs; depends on HAPI-SCHEMA release 0.8.17)

4) HWA 0.0.7 depends on HAPI-SCHEMA
-> imports `db-export` (daily CSVs; depends on HAPI-SCHEMA release 0.8.17) from HAPI-PIPELINES-PROD / HAPI-PIPELINES -> postgres

5) HAPI API 0.7.0 depends on HAPI-SCHEMA release 0.8.17 and serves data from postgres

6) HAPI SMOKE TESTS depend on the endpoint schema, which in turn depends on HAPI-SCHEMA


# PROCESS:
----------
1) HAPI-SCHEMA has a new release, e.g. 0.9.0 (deletes a column from an existing table, adds a new table)
2) HAPI-PIPELINES uses HAPI-SCHEMA 0.9.0 -> produces 0.9.0 CSVs
3) [Ticket] HAPI API to use new 0.9.0 HAPI-SCHEMA
4) [Ticket] HWA to use new 0.9.0 HAPI-SCHEMA. Special care needs to be taken when creating **Alembic revisions** (see the sketch after this list):
   - *Enum changes* are not automatically detected by Alembic
   - `nullable=False` constraints can pose problems with existing data
   - changes to primary keys can pose problems with existing data

**NOTE**: steps 3 and 4 can be implemented in parallel
5) Deploy on dev server:
   - Deploy [dev] HWA
   - Run in parallel:
     - Run HWA ingest
     - Deploy [dev] HAPI
   - Manual testing
6) [Ticket] HAPI SMOKE TESTS update
7) Test that HAPI SMOKE TESTS run successfully on dev server
8) HWA new version 0.0.8
   - make sure to check for enum changes when creating the Alembic patch
9) HAPI API new version 0.8.0
10) PR from HAPI-PIPELINES to HAPI-PIPELINES-PROD -> generates new 0.9.0 CSVs
11) Deploy on prod server:
    - Deploy [0.0.8] HWA
    - Run in parallel:
      - Run HWA ingest
      - Deploy [0.8.0] HAPI
    - Manual testing
    - Run smoke tests on prod
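
The Alembic caveats in step 4 are easy to get wrong, so here is a hedged sketch of what such a revision can look like. This is illustration only, not part of this commit: the table, column and enum names (`conflict_event`, `provider_code`, `event_type`) are hypothetical, not taken from the HAPI schema.

```python
"""Hypothetical Alembic revision illustrating the enum and nullable=False caveats."""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic (placeholder values)
revision = 'abc123'
down_revision = 'def456'
branch_labels = None
depends_on = None


def upgrade():
    # Enum changes must be written by hand: autogenerate will not emit this.
    # ALTER TYPE ... ADD VALUE cannot run inside a transaction block on older
    # Postgres versions, hence the autocommit block.
    with op.get_context().autocommit_block():
        op.execute("ALTER TYPE event_type ADD VALUE IF NOT EXISTS 'displacement'")

    # A new nullable=False column needs a server_default (or an explicit
    # backfill) so that rows already in the table do not violate the constraint.
    op.add_column(
        'conflict_event',
        sa.Column('provider_code', sa.String(32), nullable=False, server_default=''),
    )


def downgrade():
    op.drop_column('conflict_event', 'provider_code')
    # Postgres cannot drop a value from an enum type; the enum change is one-way.
```

Primary-key changes are harder still; they typically require a hand-written data migration rather than a pure schema revision.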


# IDEAS FOR A BRIGHTER FUTURE:
----------------------------
- HWA to create a new postgres schema for each new HAPI SCHEMA release
- Stop using postgres, use something that supports faceting out of the box (solr, elastic, mongo)
1 change: 1 addition & 0 deletions README.md
@@ -7,6 +7,7 @@ This project is currently in development and the outputs are not yet public exce
* HAPI documentation
* HAPI endpoint root
* [readthedocs](https://hdx-hapi.readthedocs.io/en/latest/)
* HAPI [deployment process](/DEPLOYMENT-PROCESS.md)

# Related repositories

10 changes: 10 additions & 0 deletions docker/hapi_run
@@ -19,6 +19,16 @@ else
envsubst < /srv/hapi/docker/unit.json.tpl > /var/lib/unit/conf.json
fi

# LOGGING_CONF_FILE needs to be set before substituting app.conf.tpl
export LOGGING_CONF_FILE=/srv/logging.conf

# regenerate logging.conf
export LOG_LEVEL="${LOG_LEVEL:-INFO}"
[ ! -z "${LOG_LEVEL_CONSOLE}" ] || export LOG_LEVEL_CONSOLE=${LOG_LEVEL}
[ ! -z "${LOG_LEVEL_JSON}" ] || export LOG_LEVEL_JSON=${LOG_LEVEL}
[ ! -z "${LOG_LEVEL_TXT}" ] || export LOG_LEVEL_TXT=${LOG_LEVEL}
envsubst < /srv/hapi/docker/logging.conf.tpl > /srv/logging.conf

chmod 600 /var/lib/unit/conf.json
chown -R unit /var/log/hapi

39 changes: 39 additions & 0 deletions docker/logging.conf.tpl
@@ -0,0 +1,39 @@
[loggers]
keys=root

[handlers]
keys=consoleHandler, fileHandler, jsonFileHandler

[formatters]
keys=simpleFormatter, jsonFormatter

[logger_root]
level=${LOG_LEVEL}
handlers=consoleHandler, fileHandler, jsonFileHandler


[handler_consoleHandler]
class=StreamHandler
level=${LOG_LEVEL_CONSOLE}
formatter=simpleFormatter
args=(sys.stdout,)

[handler_fileHandler]
class = FileHandler
args = ('/var/log/hapi/hapi.log','a')
level = ${LOG_LEVEL_TXT}
formatter = simpleFormatter

[handler_jsonFileHandler]
class = FileHandler
args = ('/var/log/hapi/hapi-json.log','a')
level = ${LOG_LEVEL_JSON}
formatter = jsonFormatter

[formatter_simpleFormatter]
format=[%(process)d - %(thread)d] %(asctime)s %(levelname)-5.5s [%(name)s:%(lineno)d] %(message)s
datefmt=

[formatter_jsonFormatter]
format = %(process)d %(thread)d %(asctime)s %(levelname) %(threadName)s %(name)s %(lineno)d %(message)s %(funcName)s
class = pythonjsonlogger.jsonlogger.JsonFormatter
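
One note on the `jsonFormatter` section above: `pythonjsonlogger` only extracts the field names from the format string and emits them as keys of the JSON record, which is why `%(levelname)` works without a trailing conversion character. A minimal standalone sketch of that behaviour (illustration only, not part of this commit):

```python
# Standalone sketch of the jsonFormatter configured in logging.conf.tpl above.
import logging

from pythonjsonlogger import jsonlogger

handler = logging.StreamHandler()
# Same format string as the template: python-json-logger parses out the field
# names, so each %(field) becomes a key in the emitted JSON record.
handler.setFormatter(
    jsonlogger.JsonFormatter(
        '%(process)d %(thread)d %(asctime)s %(levelname) %(threadName)s '
        '%(name)s %(lineno)d %(message)s %(funcName)s'
    )
)
logger = logging.getLogger('demo')
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info('logging configured')
# -> {"process": 1234, "thread": 1400..., "levelname": "INFO", "message": "logging configured", ...}
```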
2 changes: 1 addition & 1 deletion hdx_hapi/endpoints/util/version.py
@@ -1,2 +1,2 @@
api_version = '0.7.1'
api_version = '0.7.2'
hapi_sqlalchemy_schema_version = '0.9.0'
6 changes: 3 additions & 3 deletions main.py
@@ -1,7 +1,8 @@
import logging
import logging.config
import os

logging.config.fileConfig('logging.conf')
logging.config.fileConfig(os.getenv('LOGGING_CONF_FILE','logging.conf'))

import uvicorn # noqa
from fastapi import FastAPI, Request # noqa
@@ -153,6 +154,5 @@ def home():
async def resp_validation_exception_handler(request: Request, exc: ResponseValidationError):
return await response_validation_error_handler(request, exc)


if __name__ == '__main__':
uvicorn.run(app, host='0.0.0.0', port=8844, log_config='logging.conf')
uvicorn.run(app, host='0.0.0.0', port=8844, log_config=os.getenv('LOGGING_CONF_FILE','logging.conf'))
