About the Role:
* Design scalable data pipelines with Apache Airflow to deliver timely business insights.
* Collaborate with cross-functional teams to ensure reliable data delivery.
* Optimize extract, transform, load (ETL) processes for performance, scalability, and reliability.
* Maintain and improve the data architecture to ensure data integrity and security.
* Troubleshoot pipeline failures and performance bottlenecks to minimize downtime.
Requirements:
* A degree in Computer Science, Statistics, Engineering, Economics, or a related field, preferably with a strong academic record.
* Proven problem-solving skills and a genuine interest in finding innovative solutions.
* Strong collaboration and communication skills for effective teamwork and information exchange.
* Creative thinking and the ability to contribute forward-looking ideas that support business growth.
* Excellent English skills for articulating complex concepts clearly.
Bonus Points:
* Industry-recognized certifications in data engineering or analytics.
* Participation in online forums or contributions to open-source projects that demonstrate expertise and interest in data topics.