Cloud Data Engineer Role
The ideal candidate will have expertise in designing and implementing scalable data architectures using GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and Composer.
Key Qualifications:
* Proficiency in SQL, Oracle Database, and PostgreSQL.
* Knowledge of orchestration using Cloud Composer (Airflow).
* Hands-on experience with CI/CD applied to data pipelines (Git, Terraform).
* Expertise in cloud cost and performance optimization.
* GCP certifications.
* Understanding of Kubernetes (GKE) and APIs on GCP.
* Experience with Machine Learning pipelines (Vertex AI, AI Platform).
* Previous involvement with Data Mesh and distributed architectures.
Key Responsibilities:
* Designing and implementing scalable data architectures using GCP services.
* Developing and optimizing high-performance ETL/ELT pipelines.
* Ensuring data quality, integrity, and security end-to-end.
* Creating and maintaining data models aligned with business needs.
* Collaborating with data scientists, analysts, and software engineers to support advanced analytics and machine learning use cases.
* Automating ingestion, transformation, and data delivery processes.
* Monitoring and optimizing the cost and performance of GCP resources.
* Implementing best practices for DataOps and Data Governance.
This is an excellent opportunity to work with a talented team that shares your passion for innovative technologies and solutions. If you are a motivated, detail-oriented professional who thrives in a fast-paced environment, we encourage you to apply.