Job Title: Data Pipeline Specialist
We are seeking a skilled professional with hands-on experience in designing, building, and maintaining scalable data pipelines using Apache Airflow.
About the Role:
* You will collaborate with cross-functional teams to ensure reliable data delivery and to optimize ETL processes for performance and scalability.
* You will also maintain and improve our data architecture, ensuring data integrity and security.
Key Responsibilities:
1. Design and build complex data pipelines with Apache Airflow, integrating data from multiple sources into a unified system (a minimal example DAG follows this list).
2. Collaborate with stakeholders to identify business requirements and implement solutions that meet those needs.
3. Work closely with data analysts and data scientists to develop data visualizations and insights that inform business decisions.
4. Analyze and troubleshoot pipeline failures and performance bottlenecks, implementing corrective actions to improve overall efficiency.
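To give candidates a concrete sense of the tooling, here is a minimal sketch of the kind of Airflow DAG this role involves. It assumes Airflow 2.x; the `dag_id`, task names, and placeholder callables are all hypothetical, not an actual pipeline from our stack.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from an upstream source.
    ...

def transform():
    # Placeholder: clean and reshape the extracted records.
    ...

def load():
    # Placeholder: write the transformed records to the warehouse.
    ...

with DAG(
    dag_id="example_daily_etl",       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,                    # do not backfill past runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Classic extract -> transform -> load ordering.
    extract_task >> transform_task >> load_task
```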
Requirements:
* 3+ years of experience in designing and developing data pipelines, preferably with Apache Airflow.
* Strong understanding of data modeling, data warehousing, and data visualization.
* Excellent problem-solving skills, with the ability to analyze complex data sets and identify patterns.
* Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams.
Bonus Points:
* Experience with cloud platforms such as AWS or GCP.
* Knowledge of machine learning algorithms and their application in data analysis.
* Familiarity with data governance frameworks and regulations, such as GDPR and HIPAA.