Job Summary:
We are seeking a skilled professional to design, build, and maintain scalable data pipelines and workflows that support analytics and data-driven products.
This role is ideal for candidates with hands-on experience in Apache Airflow who want to join our data team. Key responsibilities include:
* Develop, schedule, and monitor data pipelines using Apache Airflow (see the illustrative sketch after this list)
* Collaborate with cross-functional teams to ensure reliable data delivery
* Optimize ETL processes for performance, reliability, and scalability
* Maintain and improve data architecture, ensuring data integrity and security
* Troubleshoot and resolve pipeline failures and performance bottlenecks
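To give candidates a concrete sense of this work, below is a minimal sketch of a daily ETL pipeline defined as an Airflow DAG. The DAG name, schedule, retry settings, and the extract/transform/load callables are illustrative assumptions, not details of our actual pipelines.

```python
# Minimal sketch of a daily ETL DAG (Airflow 2.x style); all names and
# settings here are hypothetical examples, not our production configuration.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system (assumption).
    print("extracting raw data")


def transform():
    # Placeholder: clean and reshape the extracted records (assumption).
    print("transforming data")


def load():
    # Placeholder: write the transformed records to the warehouse (assumption).
    print("loading data into the warehouse")


with DAG(
    dag_id="example_daily_etl",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # run once per day
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Failures and retries surface in the Airflow UI, where they are monitored.
    extract_task >> transform_task >> load_task
```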
The ideal candidate is fluent in English and holds a degree in Computer Science, Statistics, Engineering, Economics, or a related field.
We offer a free office model, meal and transportation allowances, a gym pass, insurance, bi-monthly meetings, and semi-annual evaluations.
We prioritize diversity, inclusion, and empathy, offering a healthy work environment where all voices are heard and valued.