CI for lingo with ECS #106

Merged 15 commits on Oct 31, 2024
1 change: 1 addition & 0 deletions .github/actions/build-and-test-branch/action.yml
@@ -33,6 +33,7 @@ runs:

- name: Ensure frontend configuration files exist
run: |
export ARCHES_DJANGO_DEBUG=True
python manage.py check
shell: bash

81 changes: 81 additions & 0 deletions .github/workflows/build-ci-container.yml
@@ -0,0 +1,81 @@
name: Build Container Image

on:
push:
branches:
- "deploy"
- "test/*"
repository_dispatch:
  types:
    - deploy_project
jobs:
build:
name: Build Docker Image and Push
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
path: arches-lingo

- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-west-1

- name: Login to Amazon ECR
id: login-ecr
uses: aws-actions/amazon-ecr-login@v2

- name: Build, tag, and push image to Amazon ECR
id: build-image
env:
ECR_REGISTRY: ${{ steps.login-ecr.outputs.registry }}
ECR_REPOSITORY: lingo-repository
IMAGE_TAG: ${{ github.sha }}
run: |
# Build a docker container and
# push it to ECR so that it can
# be deployed to ECS.
docker build -t $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG -f arches-lingo/docker/production/Dockerfile ./arches-lingo
docker push $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG
echo "image=$ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG" >> "$GITHUB_OUTPUT"

- name: Fill in the new image ID in the Amazon ECS task definition for service
id: task-def
uses: aws-actions/amazon-ecs-render-task-definition@v1
with:
task-definition: arches-lingo/docker/deploy/task-definition.json
container-name: arches
image: ${{ steps.build-image.outputs.image }}

- name: Fill in the new image ID in the Amazon ECS task definition to reset db
id: task-def-run-reset
uses: aws-actions/amazon-ecs-render-task-definition@v1
with:
task-definition: arches-lingo/docker/deploy/task-definition-reset-database.json
container-name: arches
image: ${{ steps.build-image.outputs.image }}

- name: Deploy Reset Amazon ECS task definition to reset db
uses: aws-actions/amazon-ecs-deploy-task-definition@v1
id: task-def-deploy-reset
with:
task-definition: ${{ steps.task-def-run-reset.outputs.task-definition }}
cluster: lingo-cluster

- name: Reset database
id: run-reset-task
run: |
# Run the reset-database task once on
# Fargate, using the task definition
# deployed in the previous step.
aws ecs run-task --cluster lingo-cluster --task-definition ${{ steps.task-def-deploy-reset.outputs.task-definition-arn }} --count 1 --launch-type FARGATE --network-configuration "awsvpcConfiguration={subnets=['subnet-000e3d777b0f3b605','subnet-0a424c54d72c1d54f'],securityGroups=['sg-014ef1f241f91407a']}"
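The `aws ecs run-task` call above can be hard to read as a one-liner. A hedged sketch of the same request as the keyword arguments boto3's ECS client takes (the subnet and security-group IDs are the placeholders from the workflow, not values to reuse):

```python
def ecs_run_task_kwargs(task_definition_arn: str) -> dict:
    """Build the run_task arguments matching the CLI invocation in the workflow."""
    return {
        "cluster": "lingo-cluster",
        "taskDefinition": task_definition_arn,
        "count": 1,
        "launchType": "FARGATE",
        "networkConfiguration": {
            "awsvpcConfiguration": {
                "subnets": ["subnet-000e3d777b0f3b605", "subnet-0a424c54d72c1d54f"],
                "securityGroups": ["sg-014ef1f241f91407a"],
            }
        },
    }

# With boto3 this would be passed as:
#   boto3.client("ecs").run_task(**ecs_run_task_kwargs(arn))
```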

- name: Deploy Amazon ECS task definition to arches service
uses: aws-actions/amazon-ecs-deploy-task-definition@v1
with:
task-definition: ${{ steps.task-def.outputs.task-definition }}
service: lingo-arches-service
cluster: lingo-cluster
1 change: 1 addition & 0 deletions arches_lingo/hosts.py
@@ -4,4 +4,5 @@
host_patterns = patterns(
"",
host(re.sub(r"_", r"-", r"arches_lingo"), "arches_lingo.urls", name="arches_lingo"),
host(r"arches", "arches_references.urls", name="arches"),
)
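The `re.sub` in the host pattern exists because Django app labels use underscores while hostnames cannot. A minimal sketch of that conversion (the helper name is illustrative, not part of the codebase):

```python
import re

def app_host_pattern(app_name: str) -> str:
    # Swap underscores for hyphens so the app label is a valid hostname.
    return re.sub(r"_", r"-", app_name)

print(app_host_pattern("arches_lingo"))  # arches-lingo
```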
164 changes: 135 additions & 29 deletions arches_lingo/settings.py
@@ -9,17 +9,126 @@
import inspect
import semantic_version
from datetime import datetime, timedelta
from django.core.exceptions import ImproperlyConfigured
from django.utils.translation import gettext_lazy as _
from django.utils.crypto import get_random_string

try:
from arches.settings import *
except ImportError:
pass


def get_env_variable(var_name):
msg = "Set the %s environment variable"
try:
return os.environ[var_name]
except KeyError:
error_msg = msg % var_name
raise ImproperlyConfigured(error_msg)


def get_optional_env_variable(var_name, default=None) -> str:
try:
return os.environ[var_name]
except KeyError:
return default
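The two helpers split required settings (raise on absence) from optional ones (fall back to a default). One pitfall worth noting: environment variables are always strings, so boolean flags must be compared against a literal rather than passed through `bool()`, which treats any non-empty string as true. A self-contained sketch:

```python
import os
from typing import Optional

def get_optional_env_variable(var_name: str, default: Optional[str] = None) -> Optional[str]:
    # Mirror of the settings helper: fall back instead of raising.
    try:
        return os.environ[var_name]
    except KeyError:
        return default

os.environ["ARCHES_DJANGO_DEBUG"] = "False"
# bool() on a non-empty string is True -- even for the string "False":
assert bool(os.environ["ARCHES_DJANGO_DEBUG"]) is True
# Comparing against a literal gives the intended result:
assert (get_optional_env_variable("ARCHES_DJANGO_DEBUG", "False") == "True") is False
```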


APP_NAME = "arches_lingo"
APP_VERSION = semantic_version.Version(major=0, minor=0, patch=0)
SECRETS_MODE = get_optional_env_variable("ARCHES_SECRETS_MODE", "ENV")

APP_ROOT = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))

# environment variable names for postgres are built-ins for the pg client, do not modify
DB_NAME = get_optional_env_variable("PGDATABASE", APP_NAME)
DB_USER = get_optional_env_variable("PGUSER", "postgres")
DB_PASSWORD = get_optional_env_variable("PGPASSWORD", "postgis")
DB_HOST = get_optional_env_variable("PGHOST", "localhost")
DB_PORT = get_optional_env_variable("PGPORT", "5432")

ES_USER = get_optional_env_variable("ARCHES_ESUSER", "elastic")
ES_PASSWORD = get_optional_env_variable("ARCHES_ESPASSWORD", "E1asticSearchforArche5")
ES_HOST = get_optional_env_variable("ARCHES_ESHOST", "localhost")
ES_PORT = int(get_optional_env_variable("ARCHES_ESPORT", "9200"))
WEBPACK_DEVELOPMENT_SERVER_PORT = int(
get_optional_env_variable("ARCHES_WEBPACKDEVELOPMENTSERVERPORT", "8022")
)
ES_PROTOCOL = get_optional_env_variable("ARCHES_ESPROTOCOL", "http")
ES_VALIDATE_CERT = get_optional_env_variable("ARCHES_ESVALIDATE", "True") == "True"
DEBUG = get_optional_env_variable("ARCHES_DJANGO_DEBUG", "False") == "True"
KIBANA_URL = get_optional_env_variable("ARCHES_KIBANA_URL", "http://localhost:5601/")
KIBANA_CONFIG_BASEPATH = get_optional_env_variable(
"ARCHES_KIBANACONFIGBASEPATH", "kibana"
)
RESOURCE_IMPORT_LOG = get_optional_env_variable(
"ARCHES_RESOURCEIMPORTLOG", os.path.join(APP_ROOT, "logs", "resource_import.log")
)
ARCHES_LOG_PATH = get_optional_env_variable(
"ARCHES_LOGPATH", os.path.join(ROOT_DIR, "arches.log")
)

STORAGE_BACKEND = get_optional_env_variable(
"ARCHES_STORAGEBACKEND", "django.core.files.storage.FileSystemStorage"
)

if STORAGE_BACKEND == "storages.backends.s3.S3Storage":
import psutil

STORAGE_OPTIONS = {
"bucket_name": get_env_variable("ARCHES_S3BUCKETNAME"),
"file_overwrite": get_optional_env_variable("ARCHES_S3FILEOVERWRITE", "True")
== "True",
"signature_version": get_optional_env_variable(
"ARCHES_S3SIGNATUREVERSION", "s3v4"
),
"region": get_optional_env_variable("ARCHES_S3REGION", "us-west-1"),
"max_memory_size": get_optional_env_variable(
"ARCHES_S3MAXMEMORY", str(psutil.virtual_memory().available * 0.5)
),
}
else:
STORAGE_OPTIONS = {}

STORAGES = {
"default": {
"BACKEND": STORAGE_BACKEND,
"OPTIONS": STORAGE_OPTIONS,
},
"staticfiles": {
"BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
},
}

if SECRETS_MODE == "AWS":
try:
import boto3
import json

AWS_REGION = get_optional_env_variable("ARCHES_AWS_REGION", "us-west-1")
ES_SECRET_ID = get_env_variable("ARCHES_ES_SECRET_ID")
DB_SECRET_ID = get_env_variable("ARCHES_DB_SECRET_ID")
client = boto3.client("secretsmanager", region_name=AWS_REGION)
es_secret = json.loads(
client.get_secret_value(SecretId=ES_SECRET_ID)["SecretString"]
)
db_secret = json.loads(
client.get_secret_value(SecretId=DB_SECRET_ID)["SecretString"]
)
DB_NAME = APP_NAME
DB_USER = db_secret["username"]
DB_PASSWORD = db_secret["password"]
DB_HOST = db_secret["host"]
DB_PORT = db_secret["port"]
ES_USER = es_secret["user"]
ES_PASSWORD = es_secret["password"]
ES_HOST = es_secret["host"]
except (ModuleNotFoundError, ImportError):
pass




WEBPACK_LOADER = {
"DEFAULT": {
@@ -53,19 +53,20 @@
FILENAME_GENERATOR = "arches.app.utils.storage_filename_generator.generate_filename"
UPLOADED_FILES_DIR = "uploadedfiles"

chars = "abcdefghijklmnopqrstuvwxyz0123456789!@#$%^&*(-_=+)"
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = "--+c7*txnosqv=flep00qp+=t-xhrj%f4==r8w*n_7pm@mi%)7"

# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
SECRET_KEY = get_optional_env_variable(
"ARCHES_SECRET_KEY", "django-insecure-" + get_random_string(50, chars)
)
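The fallback key above uses Django's `get_random_string(50, chars)`; the `django-insecure-` prefix flags keys that were not supplied via `ARCHES_SECRET_KEY`. A stdlib-only sketch of the same idea (the function name is illustrative):

```python
import secrets

chars = "abcdefghijklmnopqrstuvwxyz0123456789!@#$%^&*(-_=+)"

def fallback_secret_key(length: int = 50) -> str:
    # Development-only fallback; production should always set ARCHES_SECRET_KEY.
    return "django-insecure-" + "".join(secrets.choice(chars) for _ in range(length))

key = fallback_secret_key()
assert key.startswith("django-insecure-")
```

Note the fallback is regenerated on every process start, which invalidates sessions and signed cookies between restarts, so it is only suitable for development.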

ROOT_URLCONF = "arches_lingo.urls"

ELASTICSEARCH_HOSTS = [{"scheme": ES_PROTOCOL, "host": ES_HOST, "port": ES_PORT}]
# Modify this line as needed for your project to connect to elasticsearch with a password that you generate
ELASTICSEARCH_CONNECTION_OPTIONS = {
"request_timeout": 30,
"verify_certs": False,
"basic_auth": ("elastic", "E1asticSearchforArche5"),
"verify_certs": ES_VALIDATE_CERT,
"basic_auth": (ES_USER, ES_PASSWORD),
}

# If you need to connect to Elasticsearch via an API key instead of username/password, use the syntax below:
@@ -81,44 +81,40 @@
# Or Kibana: https://www.elastic.co/guide/en/kibana/current/api-keys.html

# a prefix to append to all elasticsearch indexes, note: must be lower case
ELASTICSEARCH_PREFIX = "arches_lingo"
ELASTICSEARCH_PREFIX = get_optional_env_variable("ARCHES_ES_INDEX_PREFIX", APP_NAME)

ELASTICSEARCH_CUSTOM_INDEXES = []
# [{
# 'module': 'arches_lingo.search_indexes.sample_index.SampleIndex',
# 'name': 'my_new_custom_index', <-- follow ES index naming rules
# 'should_update_asynchronously': False <-- denotes if asynchronously updating the index would affect custom functionality within the project.
# }]

KIBANA_URL = "http://localhost:5601/"
KIBANA_CONFIG_BASEPATH = "kibana" # must match Kibana config.yml setting (server.basePath) but without the leading slash,
# also make sure to set server.rewriteBasePath: true

LOAD_DEFAULT_ONTOLOGY = False
LOAD_PACKAGE_ONTOLOGIES = True

# This is the namespace to use for export of data (for RDF/XML for example)
# It must point to the url where you host your site
# Make sure to use a trailing slash
ARCHES_NAMESPACE_FOR_DATA_EXPORT = "http://localhost:8000/"
PUBLIC_SERVER_ADDRESS = get_optional_env_variable(
"ARCHES_PUBLIC_SERVER_ADDRESS", "http://localhost:8000/"
)
ARCHES_NAMESPACE_FOR_DATA_EXPORT = get_optional_env_variable(
"ARCHES_NAMESPACE_FOR_DATA_EXPORT", PUBLIC_SERVER_ADDRESS
)

DATABASES = {
"default": {
"ATOMIC_REQUESTS": False,
"AUTOCOMMIT": True,
"CONN_MAX_AGE": 0,
"ENGINE": "django.contrib.gis.db.backends.postgis",
"HOST": "localhost",
"NAME": "arches_lingo",
"OPTIONS": {
"options": "-c cursor_tuple_fraction=1",
},
"PASSWORD": "postgis",
"PORT": "5432",
"HOST": DB_HOST,
"NAME": DB_NAME,
"PASSWORD": DB_PASSWORD,
"PORT": DB_PORT,
"POSTGIS_TEMPLATE": "template_postgis",
"TEST": {"CHARSET": None, "COLLATION": None, "MIRROR": None, "NAME": None},
"TIME_ZONE": None,
"USER": "postgres",
"USER": DB_USER,
}
}

@@ -171,13 +171,13 @@
# "silk.middleware.SilkyMiddleware",
]

MIDDLEWARE.insert( # this must resolve to first MIDDLEWARE entry
MIDDLEWARE.insert(
0, "django_hosts.middleware.HostsRequestMiddleware"
)
) # this must resolve to first MIDDLEWARE entry

MIDDLEWARE.append( # this must resolve last MIDDLEWARE entry
MIDDLEWARE.append(
"django_hosts.middleware.HostsResponseMiddleware"
)
) # this must resolve last MIDDLEWARE entry
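The ordering constraints in the comments above matter because django-hosts must see the request before any other middleware and the response after all of them. A minimal sketch of the invariant:

```python
# Illustrative list; the real MIDDLEWARE comes from arches.settings.
MIDDLEWARE = ["django.middleware.security.SecurityMiddleware"]

MIDDLEWARE.insert(0, "django_hosts.middleware.HostsRequestMiddleware")
MIDDLEWARE.append("django_hosts.middleware.HostsResponseMiddleware")

assert MIDDLEWARE[0] == "django_hosts.middleware.HostsRequestMiddleware"
assert MIDDLEWARE[-1] == "django_hosts.middleware.HostsResponseMiddleware"
```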

STATICFILES_DIRS = build_staticfiles_dirs(app_root=APP_ROOT)

@@ -186,7 +186,7 @@
app_root=APP_ROOT,
)

ALLOWED_HOSTS = []
ALLOWED_HOSTS = get_optional_env_variable("ARCHES_ALLOWED_HOSTS", "*").split(",")
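The `ALLOWED_HOSTS` line parses a comma-separated environment value into a list, defaulting to `["*"]` when unset. A sketch of that parsing (the helper name is illustrative):

```python
import os

def parse_allowed_hosts(default: str = "*") -> list:
    # Comma-separated env value -> list; unset -> ["*"] (allow all, dev only).
    return os.environ.get("ARCHES_ALLOWED_HOSTS", default).split(",")

os.environ["ARCHES_ALLOWED_HOSTS"] = "lingo.example.org,localhost"
print(parse_allowed_hosts())  # ['lingo.example.org', 'localhost']
```

One caveat: a trailing comma or stray whitespace in the variable produces entries like `""` or `" localhost"` that Django will not match, so the value must be exact.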

SYSTEM_SETTINGS_LOCAL_PATH = os.path.join(
APP_ROOT, "system_settings", "System_Settings.json"
@@ -260,7 +260,7 @@
# For more info on configuring your cache: https://docs.djangoproject.com/en/2.2/topics/cache/
CACHES = {
"default": {
"BACKEND": "django.core.cache.backends.dummy.DummyCache",
"BACKEND": "django.core.cache.backends.locmem.LocMemCache",
},
"lingo": {
"BACKEND": "django.core.cache.backends.locmem.LocMemCache",
24 changes: 24 additions & 0 deletions docker/ca-shpo-online-supervisor.conf
@@ -0,0 +1,24 @@
[unix_http_server]
file=/tmp/supervisor.sock ; path to your socket file
chmod=0770 ; socket file mode (octal)

[supervisord]
logfile=/var/log/supervisor/supervisord.log ; supervisord log file
logfile_maxbytes=50MB ; maximum size of logfile before rotation
logfile_backups=10 ; number of backed up logfiles
loglevel=info ; info, debug, warn, trace
pidfile=/var/run/supervisord.pid ; pidfile location
nodaemon=false ; run supervisord as a daemon
minfds=1024 ; number of startup file descriptors
minprocs=200 ; number of process descriptors
user=root ; defaults to whichever user runs supervisor
childlogdir=/var/log/supervisor/ ; where child log files will live

[rpcinterface:supervisor]
supervisor.rpcinterface_factory = supervisor.rpcinterface:make_main_rpcinterface

[supervisorctl]
serverurl=unix:///tmp/supervisor.sock ; use the unix:// scheme for unix sockets.

[include]
files=./conf.d/arches-lingo-celeryd.conf ./conf.d/arches-lingo-celerybeat.conf
22 changes: 22 additions & 0 deletions docker/conf.d/ca-shpo-online-celerybeat.conf
@@ -0,0 +1,22 @@
; ================================
; celery beat supervisor
; ================================

[program:celerybeat]
command=python3 -m celery -A arches_vgm.celery beat --loglevel=INFO
directory=/web_root/arches-vgm

user=root
numprocs=1
stdout_logfile=/var/log/celery/beat.log
stderr_logfile=/var/log/celery/beat.log
autostart=true
autorestart=true
startsecs=10

; Causes supervisor to send the termination signal (SIGTERM) to the whole process group.
stopasgroup=true

; if rabbitmq is supervised, set its priority higher
; so it starts first
priority=999