Cloud Data Solutions Architect
We are looking for a skilled Cloud Data Solutions Architect to lead the design and implementation of data architectures on Google Cloud Platform (GCP), using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
Key Responsibilities:
1. Design and implement scalable, high-performance ETL/ELT pipelines using GCP services (an illustrative pipeline sketch follows this list).
2. Develop and optimize data models aligned with business needs, ensuring data quality, integrity, and security.
3. Collaborate with data scientists, analysts, and software engineers to support advanced analytics and machine learning use cases.
4. Automate data ingestion, transformation, and delivery processes.
5. Monitor and optimize cost and performance of GCP resources.
6. Implement best practices for DataOps and Data Governance.
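For illustration only, the sketch below shows the kind of pipeline this role involves, using the Dataflow (Apache Beam) Python SDK to stream JSON events from Pub/Sub into BigQuery. The project, subscription, and table names are placeholder assumptions, and a real pipeline would add schema handling, validation, and error routing.

    # Minimal sketch only: stream JSON events from Pub/Sub into BigQuery.
    # The project, subscription, and table identifiers are hypothetical.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def run():
        options = PipelineOptions(streaming=True)  # run in streaming mode
        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    subscription="projects/example-project/subscriptions/events-sub")
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:analytics.events",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )


    if __name__ == "__main__":
        run()

The same Beam programming model covers both batch and streaming execution, which is why the role calls for experience with both modes.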
The ideal candidate will have experience with cloud-based data platforms, proficiency in programming languages such as Python or Java, and a solid grasp of core data engineering concepts. They should also work effectively in a team and communicate clearly with stakeholders.
Required Skills and Qualifications:
* Proficiency in BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub.
* Experience with SQL, Oracle Database, and PostgreSQL.
* Knowledge of orchestration using Cloud Composer (Airflow); a minimal DAG sketch follows this list.
* Hands-on experience with CI/CD applied to data pipelines (Git, Terraform).
* Experience with cloud cost and performance optimization.
* GCP certifications (e.g., Professional Data Engineer or Professional Cloud Architect).
* Knowledge of Kubernetes (GKE) and APIs on GCP.
* Experience with Machine Learning pipelines (Vertex AI, AI Platform).
* Previous involvement with Data Mesh and distributed architectures.
* Understanding of data lake layered architectures (e.g., raw, curated, and consumption zones).
* Knowledge of batch and streaming processing.
* Experience with data modeling (relational, dimensional, and NoSQL).
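As a brief illustration of the orchestration requirement above, the sketch below shows a minimal Cloud Composer (Airflow) DAG that schedules a daily BigQuery job. The DAG id, schedule, and SQL are placeholder assumptions, not a prescribed setup.

    # Minimal sketch only: a daily Cloud Composer (Airflow) DAG that runs a
    # BigQuery query job. DAG id, schedule, and SQL are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="daily_sales_refresh",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        refresh_summary = BigQueryInsertJobOperator(
            task_id="refresh_sales_summary",
            configuration={
                "query": {
                    # Placeholder SQL; a real DAG would reference versioned SQL files.
                    "query": "SELECT CURRENT_DATE() AS run_date",
                    "useLegacySql": False,
                }
            },
        )

In practice, DAG definitions like this one would live in Git and be deployed through the CI/CD and Terraform workflow mentioned above.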
By joining our organization, you will have the opportunity to work on cutting-edge projects, collaborate with a talented team, and develop your skills in cloud data solutions.