GCP Data Engineer
Transform your career with one of the biggest IT services companies in the world. Here you can unlock your potential and grow professionally.
We are looking for a talented GCP Data Engineer to join our team of innovators. As a key player, you will be responsible for designing and implementing data architectures on GCP using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and more.
Your primary duties will include:
* Developing and optimizing scalable, high-performance ETL/ELT pipelines.
* Ensuring data quality, integrity, and security end-to-end.
* Creating and maintaining data models aligned with business needs.
* Collaborating with data scientists, analysts, and software engineers to support advanced analytics and machine learning use cases.
What do we offer?
As a member of our team, you can expect:
* A collaborative, diverse, and innovative environment that encourages teamwork.
* Professional development and constant evolution of your skills, always in line with your interests.
* Opportunities to work outside Brazil.
The ideal candidate will have experience with:
* BigQuery, Dataflow, AlloyDB, and Apigee (focus on GCP).
* Cloud cost and performance optimization.
* GCP certifications.
* Kubernetes (GKE) and APIs on GCP.
* Machine learning pipelines (Vertex AI).
If you're passionate about data engineering and want to take your career to the next level, apply now and join our team of experts!