Job Opportunity
The ideal candidate will possess in-depth knowledge of Google Cloud Platform (GCP) services, including BigQuery, Dataflow, Cloud Storage, and Pub/Sub. Proficiency in SQL, Oracle Database, and PostgreSQL is also essential, along with expertise in orchestration using Cloud Composer (Airflow) and hands-on experience with CI/CD applied to data pipelines (Git, Terraform).
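To give a concrete picture of the orchestration work mentioned above, the sketch below shows a minimal Cloud Composer (Airflow) DAG that loads a daily file from Cloud Storage into BigQuery. It is illustrative only; the bucket, dataset, and table names are hypothetical placeholders, not details of the role.

```python
# Minimal illustrative DAG: load a daily CSV export from Cloud Storage into BigQuery.
# Bucket, dataset, and table names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_gcs_to_bigquery_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_sales = GCSToBigQueryOperator(
        task_id="load_sales_csv",
        bucket="example-raw-data",                     # hypothetical bucket
        source_objects=["sales/{{ ds }}/export.csv"],  # one file per execution date
        destination_project_dataset_table="example_project.analytics.sales",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```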
Key Responsibilities:
* Design and implement efficient data architectures on GCP
* Develop and optimize scalable ETL/ELT pipelines
* Ensure end-to-end data quality, integrity, and security
* Create and maintain data models aligned with business objectives
* Collaborate with data scientists, analysts, and software engineers to support advanced analytics and machine learning initiatives
Requirements:
* Proficient in BigQuery, Dataflow, Cloud Storage, and Pub/Sub (see the illustrative pipeline sketch after this list)
* Expertise in SQL, Oracle Database, and PostgreSQL
* Experience in orchestration using Cloud Composer (Airflow)
* Hands-on experience with CI/CD applied to data pipelines (Git, Terraform)
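As a rough illustration of the Dataflow and Pub/Sub side of these requirements, here is a minimal Apache Beam (Python) streaming pipeline that reads JSON events from a Pub/Sub subscription and appends them to a BigQuery table. The subscription, table, and schema names are assumptions made for the example, not details of the actual role.

```python
# Minimal illustrative Beam pipeline (runnable on Dataflow): Pub/Sub JSON events -> BigQuery.
# Subscription, table, and schema below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Pass --runner=DataflowRunner, --project, --region, etc. on the command line.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub"
            )
            | "ParseJson" >> beam.Map(lambda raw: json.loads(raw.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:analytics.events",
                schema="event_id:STRING,user_id:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```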