About Our Team
We strive to create a work environment that fosters empathy and mutual respect.
Our goal is to build an inclusive culture where every voice is heard and valued, regardless of ethnicity, color, gender, sexual orientation, gender identity, disability, or religious belief.
Key Responsibilities:
* Design, develop, and maintain scalable data pipelines and workflows to support analytics and data-driven products.
* Develop, schedule, and monitor data pipelines using Apache Airflow (see the illustrative sketch after this list).
* Collaborate with data analysts, scientists, and engineering teams to ensure reliable data delivery.
* Optimize ETL processes for performance, reliability, and scalability.
* Maintain and improve data architecture, ensuring data integrity and security.
* Troubleshoot and resolve pipeline failures and performance bottlenecks.
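To give a concrete sense of the Airflow work described above, here is a minimal sketch of a daily ETL pipeline defined as an Airflow DAG, assuming Airflow 2.x. The DAG id, schedule, and task functions are illustrative placeholders, not part of this role's actual codebase.

```python
# Illustrative only: a minimal daily ETL DAG. Names and schedule are
# hypothetical examples, not taken from the team's real pipelines.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw records from a source system (placeholder)."""
    ...


def transform():
    """Clean and reshape the extracted data (placeholder)."""
    ...


def load():
    """Write the transformed data to the warehouse (placeholder)."""
    ...


with DAG(
    dag_id="example_daily_etl",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # run once per day
    catchup=False,                   # don't backfill missed runs
) as dag:
    # Chain the tasks so they run in order: extract -> transform -> load.
    (
        PythonOperator(task_id="extract", python_callable=extract)
        >> PythonOperator(task_id="transform", python_callable=transform)
        >> PythonOperator(task_id="load", python_callable=load)
    )
```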
Requirements:
* A degree in Computer Science, Statistics, Engineering, Economics, or a related field.
* Curiosity: You are always looking for innovative solutions.
* Collaboration: Knowledge sharing is essential to you.
* Entrepreneurship: You take initiative, bringing solutions and fresh ideas.
* Fluent English: You are comfortable communicating in English.