About the Role
We are seeking a highly skilled Data Engineer to join our team. The successful candidate will have expertise in designing and implementing data architectures on Google Cloud Platform (GCP) using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Composer.
Responsibilities
* Design and implement scalable, high-performance ETL/ELT pipelines.
* Ensure data quality, integrity, and security end-to-end.
* Create and maintain data models aligned with business needs.
* Collaborate with data scientists, analysts, and software engineers to support advanced analytics and machine learning use cases.
* Automate data ingestion, transformation, and delivery processes.
* Monitor and optimize cost and performance of GCP resources.
Requirements
* Proficiency in BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub.
* Experience with SQL, Oracle Database, and PostgreSQL.
* Knowledge of orchestration using Cloud Composer (Airflow).
* Hands-on experience with CI/CD applied to data pipelines (Git, Terraform).
* Experience with cloud cost and performance optimization.
* GCP certifications (e.g., Professional Data Engineer).
* Knowledge of Kubernetes (GKE) and APIs on GCP.
* Experience with Machine Learning pipelines (Vertex AI, AI Platform).
* Prior experience with Data Mesh and distributed architectures.
* Understanding of Data Lake layers (e.g., raw, curated, and consumption).
* Knowledge of batch and streaming processing.
* Experience with data modeling (relational, dimensional, and NoSQL).
What We Offer
* Professional development and continuous growth of your skills.
* Opportunities to work outside Brazil.
* A collaborative, diverse, and innovative work environment.