About the Role
The ideal candidate will have expertise in designing and implementing data architectures on GCP, using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
Key Responsibilities:
* Design scalable, high-performance ETL/ELT pipelines that meet business needs.
* Collaborate with data scientists, analysts, and software engineers to support advanced analytics and machine learning use cases.
* Ensure data quality, integrity, and security throughout the entire data lifecycle.
* Develop and maintain data models aligned with business objectives.
* Automate data ingestion, transformation, and delivery processes.
* Monitor and optimize cost and performance of GCP resources.
Preferred Qualifications:
* Hands-on experience with cloud cost and performance optimization.
* GCP certifications and knowledge of Kubernetes (GKE) and APIs on GCP.
* Prior experience with Data Mesh and distributed architectures.