# Globant: Code Challenge

Implement an ETL process that extracts data from a free fake API, transforms it, and automatically loads it into BigQuery, orchestrated with Airflow. The project applies Agile methodologies, automated testing, and version-control tooling.
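The three pipeline stages can be sketched as plain Python callables that an Airflow DAG would wire together. A minimal sketch follows; the endpoint URL, the field names, and the load stub are illustrative assumptions, not taken from this repository:

```python
import json
from urllib.request import urlopen  # stdlib HTTP client

# Assumed endpoint -- JSONPlaceholder is a common "free fake API";
# the actual DAG may target a different one.
FAKE_API_URL = "https://jsonplaceholder.typicode.com/users"

def extract(url=FAKE_API_URL):
    """Pull raw JSON records from the fake API."""
    with urlopen(url) as resp:
        return json.load(resp)

def transform(records):
    """Project each record onto the (assumed) target schema and normalize email case."""
    return [
        {"id": r["id"], "name": r["name"], "email": r["email"].lower()}
        for r in records
    ]

def load(rows):
    """Stub for the BigQuery load step; the real DAG would use the
    google-cloud-bigquery client here."""
    print(f"would load {len(rows)} rows into BigQuery")
```

In the DAG, each stage would typically become its own task so Airflow can retry a failed extract or load independently of the others.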

## Table of Contents

- Installation
- Usage
- Metrics
- Features
- Troubleshooting
- Contact

## Installation

Install all requirements into a Python environment using pip.
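A typical setup, assuming the dependency list lives in a `requirements.txt` at the repository root (the environment name is illustrative):

```shell
# Create and activate an isolated environment.
python3 -m venv .venv
. .venv/bin/activate
# Install the pinned dependencies.
pip install -r requirements.txt
```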

## Usage

Create a folder named dags inside the Airflow home directory and place etl_dag.py and its dependencies there, so the scheduler can discover the DAG.
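Concretely, assuming Airflow's defaults (`AIRFLOW_HOME` falls back to `~/airflow`, and DAG files are discovered in its `dags/` subfolder):

```shell
# Airflow discovers DAG files in $AIRFLOW_HOME/dags.
export AIRFLOW_HOME="${AIRFLOW_HOME:-$HOME/airflow}"
mkdir -p "$AIRFLOW_HOME/dags"
# Then copy the DAG and its dependency modules into place, e.g.:
# cp etl_dag.py "$AIRFLOW_HOME/dags/"
```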

Initialize the Airflow metadata database:

```shell
airflow db init
```

Start the Airflow webserver:

```shell
airflow webserver --port 8080
```

Open http://localhost:8080 in a browser to access the Airflow UI.

Start the Airflow scheduler:

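Airflow's standard CLI command starts the scheduler; run it in a separate terminal from the webserver and leave it running:

```shell
airflow scheduler
```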

## Metrics


## Features

## Troubleshooting

## Contact

Cristian MB [email protected]