Data Engineer Role
We are seeking a skilled Data Engineer to design and implement data architectures on Google Cloud Platform (GCP) using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and Composer.
* Develop scalable, high-performance ETL/ELT (Extract, Transform, Load / Extract, Load, Transform) pipelines.
* Maintain data quality, integrity, and security across the entire pipeline.
* Create data models aligned with business objectives.
You will collaborate with data scientists, analysts, and software engineers to support advanced analytics and machine learning use cases.
-----------------------------------
About this Role
This is an exciting opportunity to work on complex data engineering projects and contribute to the growth of our organization.
Key Responsibilities:
* Design and develop data pipelines using GCP services.
* Collaborate with cross-functional teams to meet business needs.
* Optimize data processing and storage solutions.
-----------------------------------
What You'll Need
To be successful in this role, you will need:
Strong technical skills:
* Proficiency in programming languages such as Python, Java, or C++.
* Experience with GCP services including BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and Composer.
* Knowledge of data modeling and data warehousing concepts.
In addition to your technical expertise, you should possess excellent problem-solving skills, strong communication abilities, and a passion for innovation and continuous learning.