Job Opportunity
We are seeking a highly skilled Data Engineer.
The ideal candidate will have expertise in designing and implementing data architectures on GCP using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and Cloud Composer.
* Proficiency in BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub
* Experience with SQL, Oracle Database, and PostgreSQL
* Knowledge of orchestration using Cloud Composer (Airflow)
* Hands-on experience with CI/CD applied to data pipelines (Git, Terraform)
* Experience with cloud cost and performance optimization
* GCP certifications (e.g., Professional Data Engineer)
* Knowledge of Kubernetes (GKE) and APIs on GCP
* Experience with Machine Learning pipelines (Vertex AI, AI Platform)
* Previous involvement with Data Mesh and distributed architectures
* Understanding of layered Data Lake architectures (e.g., raw, curated, and consumption layers)
* Knowledge of batch and streaming processing
* Experience with data modeling (relational, dimensional, and NoSQL)
The selected candidate will design scalable, high-performance ETL/ELT pipelines and ensure end-to-end data quality, integrity, and security. They will also create and maintain data models aligned with business needs, and collaborate with data scientists, analysts, and software engineers to support advanced analytics and machine learning use cases.