# Artemis

A web app to fetch metrics via REST endpoints, built on Celery's distributed task queue.

Why Artemis? She was the Greek goddess of hunting, and this app hunts for metrics across a cluster of nodes.

### 1. Design considerations

1. Flask to run the web app.
2. Celery with Redis as the message broker.
   - Alternative: Python's multiprocessing could be used in place of Celery for parallelism, but Celery was chosen for the clear separation it gives between the web app and the task workers. Why Celery? Its message passing also makes it possible to distribute load across multiple machines, whereas multiprocessing only gives parallelism on a single machine. Celery is production quality as well.
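
As a minimal sketch, the Celery app might be wired to Redis along these lines. Only the module name `fetch` comes from the worker command in section 3; the file path and broker settings are assumptions:

```python
# tasks/fetch.py -- hypothetical sketch; only the module name "fetch" is
# taken from the worker command in section 3, the rest is an assumption.
from celery import Celery

# Redis on its default host and port serves as the message broker
# (and, optionally, as the result backend).
app = Celery(
    "fetch",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/0",
)
```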
### 2. Installation

1. `pip install -r requirements.txt`
2. Install Redis and keep the default config. If a different port is needed, change the setting in `settings.py`.
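
This README doesn't show `settings.py`, so the following is purely a hypothetical sketch of the kind of setting it might hold:

```python
# settings.py -- hypothetical sketch; the real file may use different names.
REDIS_HOST = "localhost"
REDIS_PORT = 6379  # change this if Redis runs on a non-default port
BROKER_URL = "redis://{0}:{1}/0".format(REDIS_HOST, REDIS_PORT)
```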
### 3. Running the app

1. Flask web app: `python artemis.py runserver`
2. Start Redis: `./src/redis-server`
3. Start Celery (from `artemis/tasks`): `celery -A fetch worker --loglevel=debug`
4. Start accepting requests (via POST)!

   Example: `curl -d "node=localhost:5000&node=127.0.0.1:5000&url=metric" http://localhost:5000/fetch`

### 4. Input explanation

- `node` should appear one or more times, e.g.

      node=localhost:5000
      node=127.0.0.1:5000

- `url` should appear exactly once; the single pattern is applied to all the nodes, e.g.

      url=metric
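
A minimal sketch of how the `/fetch` endpoint could parse these inputs with Flask and fan them out to Celery. The route comes from the curl example above; the handler body and the `fetch_metrics` task are assumptions (a matching task sketch appears in the next section):

```python
# Hypothetical sketch of the /fetch handler, not the repo's actual code.
from flask import Flask, request

from tasks.fetch import fetch_metrics  # hypothetical task, sketched in section 5

app = Flask(__name__)

@app.route("/fetch", methods=["POST"])
def fetch():
    nodes = request.form.getlist("node")  # repeated "node" field -> list of nodes
    url = request.form["url"]             # single URL pattern shared by all nodes
    for node in nodes:
        fetch_metrics.delay(node, url)    # enqueue one asynchronous task per node
    return "queued {0} tasks\n".format(len(nodes))
```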
    
### 5. How does it work

1. A POST request is received, a task is added to the Celery queue, and all further processing happens asynchronously.
2. Artemis constructs each metrics endpoint by concatenating the two inputs:
   1. `http://localhost:5000/metric`
   2. `http://127.0.0.1:5000/metric`
3. The Celery task gets the JSON data from `/metric` (using `requests.get`).
4. The JSON is written to a flat file for now (this can be changed to write to a DB).
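
Putting steps 2–4 together, the task could look roughly like this. A hypothetical sketch only: the task name, output filename, and URL scheme are assumptions, not the repository's actual code:

```python
# tasks/fetch.py -- hypothetical sketch continuing the Celery app from section 1.
import json

import requests
from celery import Celery

app = Celery("fetch", broker="redis://localhost:6379/0")

@app.task
def fetch_metrics(node, url):
    # Step 2: build the endpoint by concatenating the two inputs,
    # e.g. http://localhost:5000/metric
    endpoint = "http://{0}/{1}".format(node, url)
    # Step 3: fetch the JSON data from the node.
    data = requests.get(endpoint).json()
    # Step 4: append the JSON to a flat file (could become a DB write later).
    with open("metrics.json", "a") as f:
        f.write(json.dumps(data) + "\n")
```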
### 6. Enhancements

1. Error handling and retries still need to be implemented.
2. Simple benchmarking to measure how many total nodes 8 Celery processes and 1 Redis instance can handle.
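
For the first item, Celery has built-in retry support; a hedged sketch of how the task above might adopt it (the retry count and delay are arbitrary example values):

```python
# Hypothetical retry-enabled variant of the fetch task; limits are example values.
import requests
from celery import Celery

app = Celery("fetch", broker="redis://localhost:6379/0")

@app.task(bind=True, max_retries=3, default_retry_delay=5)
def fetch_metrics(self, node, url):
    endpoint = "http://{0}/{1}".format(node, url)
    try:
        return requests.get(endpoint, timeout=10).json()
    except requests.RequestException as exc:
        # Re-enqueue this task, up to max_retries times, 5 seconds apart.
        raise self.retry(exc=exc)
```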
### 7. Screenshot: Redis, Celery and Flask
