In this data engineering role, you will design and develop large-scale data pipelines that support analytics and artificial intelligence (AI) initiatives. These pipelines ingest and process data from a variety of sources to deliver clean, reliable datasets to downstream systems.
You will implement and optimize ETL/ELT workflows with a focus on performance, reliability, and cost efficiency, and design data models and curated layers that serve reporting, analytics, and AI use cases.
You will collaborate closely with cross-functional teams, including analytics, AI/data science, and product, to ensure data requirements are met and service level agreements (SLAs) are upheld. You will also strengthen AI and advanced analytics capabilities by providing well-structured, high-quality datasets.

Key responsibilities:
* Design and implement scalable data pipelines
* Optimize ETL/ELT workflows
* Develop and maintain data models
* Collaborate with cross-functional teams
* Provide reliable datasets for AI and analytics