Job Opportunity: Data Engineer
We are looking for a skilled Data Engineer to join our data engineering team. In this role, you will be responsible for developing, scheduling, and monitoring data pipelines using Apache Airflow.
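To give a flavor of the day-to-day pipeline work, here is a minimal sketch of an Airflow DAG using the TaskFlow API (Airflow 2.4+ assumed). The task names and data are purely illustrative, not a description of our actual stack.

```python
# Minimal illustrative sketch of an Airflow DAG (Airflow 2.4+ TaskFlow API).
# All names and values below are hypothetical examples.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[dict]:
        # Pull raw records from a source system (stubbed here).
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Apply a simple business rule to each record.
        return [{**r, "value": r["value"] * 2} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # In a real pipeline this would write to a warehouse table.
        print(f"Loaded {len(records)} records")

    # Wiring the tasks this way lets Airflow infer the dependency graph.
    load(transform(extract()))


example_etl()
```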
Main Responsibilities:
* Design and implement efficient data pipelines that meet business requirements.
* Collaborate with cross-functional teams, including data analysts, data scientists, and engineers, to ensure reliable data delivery.
* Analyze and optimize ETL processes for performance, reliability, and scalability.
* Develop and maintain data architecture, ensuring data integrity and security.
* Troubleshoot and resolve pipeline failures and performance bottlenecks.
Required Skills:
* Bachelor's degree in Computer Science, Statistics, Engineering, Economics, or a related field.
* Strong programming skills in languages such as Python, Java, or Scala.
* Experience with Apache Airflow, data warehouses, and big data technologies.
* Excellent collaboration and communication skills.
* Ability to work in a fast-paced environment and prioritize tasks effectively.
What We Offer:
* Competitive salary and benefits package.
* Ongoing training and development opportunities.
* Opportunities for career growth and advancement.
* A dynamic and supportive work environment.
* Professional certifications and awards.
About Us:
* We are a leading technology company dedicated to delivering innovative solutions.
* We value diversity, equity, and inclusion in the workplace.
* We offer flexible working arrangements and remote work options.
* We provide access to cutting-edge technology and tools.
* We foster a culture of continuous learning and improvement.