Unlock Scalable Data Solutions
We're seeking a skilled data professional to design, build, and maintain high-performance data pipelines.
The ideal candidate will partner closely with data analysts, data scientists, and engineering teams so that data arrives reliably where it is needed.
Responsibilities include:
* Developing, scheduling, and monitoring data pipelines using Apache Airflow
* Collaborating with data analysts, scientists, and engineering teams to ensure reliable data delivery
* Optimizing ETL processes for performance, reliability, and scalability
* Maintaining and improving data architecture, ensuring data integrity and security
* Troubleshooting and resolving pipeline failures and performance bottlenecks
Our team values innovation, collaboration, and entrepreneurship.
Key Skills and Qualifications:
* Strong understanding of data architecture, ETL processes, and Apache Airflow
* Excellent problem-solving skills and the ability to troubleshoot complex issues
* Collaborative mindset and strong communication skills
* Experience with data visualization tools and techniques
* Degree in Computer Science, Statistics, Engineering, Economics, or a related field
* Fluency in English
What We Offer:
* Challenging projects with a dynamic, collaborative, and innovative team
* Professional development and growth opportunities
* Competitive compensation and comprehensive benefits