About This Role
We foster a dynamic work environment where all voices are valued and respected, and we strive to build a diverse, inclusive space grounded in empathy.
Responsibilities
* Develop, schedule, and monitor data pipelines using Apache Airflow.
* Collaborate with data analysts, data scientists, and engineering teams to ensure reliable data delivery.
* Optimize ETL processes for performance, reliability, and scalability.
* Maintain and improve the data architecture to ensure data integrity and security.
* Troubleshoot pipeline failures and performance bottlenecks.
Requirements
* A degree in Computer Science, Statistics, Engineering, Economics, or a related field.
* Curiosity and a drive to find innovative solutions.
* A collaborative mindset, with a commitment to sharing knowledge within the team.
* An entrepreneurial spirit, bringing new ideas to the table.
* Fluency in English for effective communication in our international environment.