Data Engineer
Design and Implement Scalable Data Pipelines
We're seeking a skilled professional to join our data team as a Data Engineer. The ideal candidate will have hands-on experience with Apache Airflow and a proven track record of designing, building, and maintaining scalable data pipelines and workflows that support analytics and data-driven products.
Responsibilities
* Develop, schedule, and monitor data pipelines using Apache Airflow.
* Collaborate with data analysts, scientists, and engineering teams to ensure reliable data delivery.
* Optimize ETL processes for performance, reliability, and scalability.
* Maintain and improve data architecture, ensuring data integrity and security.
* Troubleshoot and resolve pipeline failures and performance bottlenecks.
Requirements
* A degree in Computer Science, Statistics, Engineering, Economics, or a related field.
* Curiosity: You are always looking for innovative solutions for your clients.
* Collaboration: Knowledge sharing is essential to you, and you actively contribute to spreading information within the organization.
* Entrepreneurship: You bring solutions, new ideas, and innovation.
* Fluent English: You are comfortable communicating in English, as it is essential for working in an international environment.
Artefact fosters a culture built on empathy and diversity.