We're seeking a skilled professional to spearhead our data engineering efforts.
About the Role
As a Senior Data Engineer, you will be responsible for designing and implementing data architectures on Google Cloud Platform (GCP) using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Composer.
Your primary goal will be to develop and optimize scalable, high-performance ETL/ELT pipelines while ensuring data quality, integrity, and security end-to-end. You will also create and maintain data models aligned with business needs, collaborating with data scientists, analysts, and software engineers to support advanced analytics and machine learning use cases.
Key Skills and Qualifications
* Proficiency in BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub
* Experience with SQL, Oracle Database, and PostgreSQL
* Knowledge of orchestration using Cloud Composer (Airflow)
* Hands-on experience with CI/CD applied to data pipelines (Git, Terraform)
* Experience with cloud cost and performance optimization
Desirable Skills
* GCP certifications
* Knowledge of Kubernetes (GKE) and APIs on GCP
* Experience with Machine Learning pipelines (Vertex AI, AI Platform)
* Previous involvement with Data Mesh and distributed architectures
* Understanding of Data Lake layers
About the Ideal Candidate
The ideal candidate has a strong foundation in data engineering principles and practices, hands-on experience with GCP services, and familiarity with cloud-based data processing and storage solutions. A degree in Computer Science or a related field is preferred but not required.
What We Offer
In this role, you will have the opportunity to work on challenging projects, collaborate with experienced professionals, and contribute to the growth and development of our organization.