A web app that fetches metrics from REST endpoints, built on Celery's distributed task queue.
Why Artemis? She was the Greek goddess of hunting, and this app hunts for metrics across a cluster of nodes.
### Design considerations:
- Flask to run the web app
- Celery with Redis as the message broker
- Alternative: Python's multiprocessing could be used for parallelism in place of Celery, but I chose Celery for the clear separation between the web app and the task workers. Why Celery? Message passing makes it possible to distribute load across multiple machines, whereas multiprocessing only gives parallelism on a single machine, making Celery the more production-quality choice. (A wiring sketch follows this list.)
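
A minimal sketch of the Celery/Redis wiring this design implies. The module name `fetch` matches the worker command under "Running the app", but the broker URL (Redis on its default `localhost:6379`) is an assumption:

```python
# tasks/fetch.py -- minimal Celery app wired to a Redis broker (a sketch;
# assumes Redis is running on its default localhost:6379).
from celery import Celery

app = Celery("fetch", broker="redis://localhost:6379/0")

@app.task
def fetch_metrics(endpoint):
    """Fetch one node's metrics; the full flow is sketched under
    "How does it work" below."""
```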
### Installation:
- `pip install -r requirements.txt`
- Install Redis and keep the default config; if a different port is needed, change the setting in `settings.py` (see the sketch below)
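
`settings.py` isn't shown in this README, so the names below are hypothetical, but the Redis setting there might look like:

```python
# settings.py -- hypothetical shape of the Redis setting; change REDIS_PORT
# here if your Redis server is not on the default 6379.
REDIS_HOST = "localhost"
REDIS_PORT = 6379
BROKER_URL = "redis://{}:{}/0".format(REDIS_HOST, REDIS_PORT)
```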
### Running the app:
- Start the Flask web app: `python artemis.py runserver`
- Start Redis: `./src/redis-server`
- Start Celery (from `artemis/tasks`): `celery -A fetch worker --loglevel=debug`
- Start accepting requests (via POST)! Example: `curl -d "node=localhost:5000&node=127.0.0.1:5000&url=metric" http://localhost:5000/fetch`
### Input explanation:
- `node`: one or more entries, e.g. `node=localhost:5000 node=127.0.0.1:5000`
- `url`: exactly one pattern, applied to all the nodes, e.g. `url=metric`
### How does it work:
- A POST request is received, a task is added to the Celery queue, and all further processing is asynchronous
- artemis constructs each metrics endpoint by concatenating the two inputs (e.g. `http://localhost:5000/metric`)
- The Celery task fetches the JSON data from `/metric` (using `requests.get`)
- The JSON is written to a flat file for now (this could later be written to a DB instead); see the sketch after this list
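
Putting those steps together, a minimal sketch of the flow. Names such as `fetch_metrics` and `metrics.log` are hypothetical; the real code lives in `artemis.py` and `tasks/fetch.py`:

```python
# Sketch of the request flow: Flask queues one Celery task per node,
# each task GETs the JSON metrics and appends them to a flat file.
import json

import requests
from celery import Celery
from flask import Flask, request

flask_app = Flask(__name__)
celery_app = Celery("fetch", broker="redis://localhost:6379/0")

@flask_app.route("/fetch", methods=["POST"])
def fetch():
    nodes = request.form.getlist("node")  # one or more node=host:port values
    url = request.form["url"]             # single URL pattern for all nodes
    for node in nodes:
        # Queue one asynchronous task per node; the HTTP response returns
        # immediately while the workers do the fetching.
        fetch_metrics.delay("http://{}/{}".format(node, url))
    return "queued {} task(s)\n".format(len(nodes))

@celery_app.task
def fetch_metrics(endpoint):
    # GET the JSON payload from the constructed metrics endpoint...
    data = requests.get(endpoint).json()
    # ...and append it to a flat file (a DB could replace this later).
    with open("metrics.log", "a") as f:
        f.write(json.dumps(data) + "\n")
```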
### Enhancements:
- Error handling and retries still need to be implemented (see the retry sketch after this list)
- Simple benchmarking to measure how many nodes 8 Celery processes and 1 Redis instance can handle
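
One possible shape for the retry enhancement, using Celery's built-in `self.retry`. This is a sketch of future behavior, not what the app currently does:

```python
# Sketch: retry a failed fetch up to 3 times, 5 seconds apart,
# using Celery's built-in retry mechanism.
import requests
from celery import Celery

app = Celery("fetch", broker="redis://localhost:6379/0")

@app.task(bind=True, max_retries=3, default_retry_delay=5)
def fetch_metrics(self, endpoint):
    try:
        return requests.get(endpoint, timeout=10).json()
    except requests.RequestException as exc:
        # Re-queue the task; Celery waits default_retry_delay seconds
        # between attempts and gives up after max_retries.
        raise self.retry(exc=exc)
```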