Job Overview
We are seeking an experienced data engineer to lead the design, development, and maintenance of scalable data pipelines and workflows.
Responsibilities
* Develop, schedule, and monitor data pipelines using Apache Airflow.
* Collaborate with data analysts, data scientists, and engineering teams to optimize ETL processes for performance, reliability, and scalability.
* Maintain and improve data architecture to ensure data integrity and security.
* Troubleshoot and resolve pipeline failures and performance bottlenecks.
Requirements
* Bachelor's degree in Computer Science, Statistics, Engineering, Economics, or a related field.
* Curiosity, a collaborative approach, an entrepreneurial mindset, and fluency in English.
What We Offer
* Meal allowance.
* Transportation allowance.
* Free office space.
* Gym membership.
* Health insurance.
* Bi-monthly team meetings.
* Annual performance evaluations with opportunities for promotion.