GCP Data Engineer Opportunity

We are seeking an experienced GCP Data Engineer to join our team. This role involves designing and implementing scalable, high-performance data architectures on GCP using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and Composer, among others. The ideal candidate should have knowledge of orchestration with Cloud Composer (Airflow) and hands-on experience with CI/CD applied to data pipelines (Git, Terraform).
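To make the orchestration expectation concrete, the sketch below shows the extract → transform → load dependency chain that a Cloud Composer (Airflow) DAG typically expresses, written here as plain Python so it is self-contained. All function and field names are illustrative assumptions; a real Composer pipeline would define these steps as Airflow tasks and load into BigQuery rather than an in-memory dict.

```python
# Toy illustration of an ETL dependency chain (extract >> transform >> load),
# the pattern a Cloud Composer (Airflow) DAG orchestrates. Names are
# hypothetical; no GCP services are actually called.

def extract():
    # Stand-in for rows arriving from Pub/Sub or Cloud Storage.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "4.0"}]

def transform(rows):
    # Cast types and derive a column, as a Dataflow step might.
    return [
        {**r, "amount": float(r["amount"]),
         "amount_cents": int(float(r["amount"]) * 100)}
        for r in rows
    ]

def load(rows, warehouse):
    # Stand-in for a BigQuery load job: append rows to a named table.
    warehouse.setdefault("sales", []).extend(rows)
    return len(rows)

def run_pipeline(warehouse):
    # Tasks run in dependency order, mirroring extract >> transform >> load.
    return load(transform(extract()), warehouse)

warehouse = {}
loaded = run_pipeline(warehouse)  # loaded == 2
```

In Composer, each of these functions would map to one task, and the framework (not the call nesting used here) enforces the ordering, retries, and scheduling.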
Additionally, the candidate should have experience with cloud cost and performance optimization, GCP certifications, and knowledge of Kubernetes (GKE) and APIs on GCP. Previous involvement with Data Mesh and distributed architectures is also desirable. This role requires proficiency in SQL, Oracle Database, and PostgreSQL, as well as experience with ETL/ELT pipeline development and optimization.

The candidate will be responsible for developing and maintaining data models aligned with business needs, collaborating with data scientists, analysts, and software engineers to support advanced analytics and machine learning use cases. Furthermore, the candidate will automate ingestion, transformation, and data delivery processes; monitor and optimize the cost and performance of GCP resources; and implement best practices for DataOps and Data Governance.

Responsibilities:
- Design and implement scalable, high-performance data architectures on GCP.
- Develop and optimize ETL/ELT pipelines.
- Maintain data quality, integrity, and security.
- Create and maintain data models aligned with business needs.
- Collaborate with data scientists, analysts, and software engineers.
- Automate ingestion, transformation, and data delivery processes.
- Monitor and optimize cost and performance of GCP resources.

Benefits:
- Opportunity to work with a cutting-edge technology stack.
- Chance to develop and maintain complex data architectures.
- Collaborative environment with experienced professionals.
- Professional growth and development opportunities.

Requirements:
- 4+ years of experience in data engineering or a related field.
- Proficiency in GCP services and technologies.
- Experience with ETL/ELT pipeline development and optimization.
- Knowledge of cloud cost and performance optimization.
- Experience with Kubernetes (GKE) and APIs on GCP.
- Expertise in Data Mesh and distributed architectures.
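The "maintain data quality, integrity, and security" responsibility can be sketched as a simple row-level validation gate. This is a minimal, hypothetical example in plain Python; the field names and rules (required fields, duplicate keys) are illustrative assumptions, not part of any specific GCP service.

```python
# Hypothetical row-level data-quality gate: split incoming rows into
# valid/invalid based on simple integrity rules (required fields present,
# primary key not duplicated). Rules and field names are illustrative.

def quality_check(rows, required_fields=("id", "amount")):
    """Return (valid, invalid) partitions of rows with reasons attached."""
    valid, invalid = [], []
    seen_ids = set()
    for row in rows:
        missing = [f for f in required_fields if row.get(f) is None]
        duplicate = row.get("id") in seen_ids
        if missing or duplicate:
            # Keep the reason alongside the row for quarantine/alerting.
            invalid.append({"row": row, "missing": missing, "duplicate": duplicate})
        else:
            seen_ids.add(row["id"])
            valid.append(row)
    return valid, invalid

rows = [
    {"id": 1, "amount": 10.5},
    {"id": 1, "amount": 3.0},   # duplicate id
    {"id": 2, "amount": None},  # missing amount
]
valid, invalid = quality_check(rows)  # 1 valid row, 2 quarantined
```

In a production pipeline the same gate pattern would typically run as a pipeline step before the load, with invalid rows routed to a quarantine table for inspection rather than silently dropped.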