GCP Data Engineer Role
Transform your career with GCP data engineering at a leading IT services company.
Requirements:
* Proficiency in BigQuery, Dataflow, Cloud Storage, and Pub/Sub required.
* Experience with SQL, Oracle Database, and PostgreSQL essential.
* Knowledge of orchestration using Cloud Composer (Airflow) necessary.
* Hands-on experience with CI/CD applied to data pipelines (Git, Terraform) desired.
Key Responsibilities:
1. Data Architecture Design: Implement scalable data architectures on GCP using services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Composer.
2. ETL/ELT Pipeline Development: Develop and optimize high-performance ETL/ELT pipelines.
3. Data Quality and Integrity: Ensure end-to-end data quality, integrity, and security.
4. Data Modeling: Create and maintain data models aligned with business needs.
5. Collaboration: Work with data scientists, analysts, and software engineers to support advanced analytics and machine learning use cases.
6. Process Automation: Automate ingestion, transformation, and data delivery processes.
7. Resource Optimization: Monitor and optimize cost and performance of GCP resources.
8. Data Governance: Implement best practices for DataOps and Data Governance.
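As a flavor of the ETL/ELT work described above, here is a minimal, tool-agnostic Python sketch of an ingest → transform → deliver step. It uses only the standard library and toy data; in a real pipeline at this scale, ingestion would read from Cloud Storage or Pub/Sub, the transform would run on Dataflow or in BigQuery SQL, and delivery would load into a BigQuery table, orchestrated by Cloud Composer. All names and data below are illustrative only.

```python
import csv
import io

# Toy raw input standing in for a file landed in Cloud Storage.
RAW = """order_id,amount_brl,status
1001, 250.00 ,paid
1002,,pending
1003, 99.90 ,paid
"""

def extract(raw: str) -> list[dict]:
    """Ingest: parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Clean whitespace, drop rows missing an amount, keep only paid orders."""
    out = []
    for r in rows:
        amount = (r["amount_brl"] or "").strip()
        if not amount or r["status"].strip() != "paid":
            continue  # basic data-quality filtering (responsibility 3)
        out.append({"order_id": int(r["order_id"]), "amount_brl": float(amount)})
    return out

def load(rows: list[dict]) -> str:
    """Deliver: return a summary line in place of a BigQuery load job."""
    total = sum(r["amount_brl"] for r in rows)
    return f"{len(rows)} rows loaded, total {total:.2f} BRL"

print(load(transform(extract(RAW))))
```

The three-function split mirrors how such steps are typically kept independently testable before being wired into a DAG or Dataflow job.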
What We Offer:
* Professional development opportunities.
* International work opportunities.
* A collaborative and innovative work environment.
TCS Benefits - Brazil:
* Health insurance and dental plan.
* Life insurance.
* Transportation vouchers and meal/food voucher.
* Childcare assistance.
* Gympass.
* TCS Cares and partnership with SESC.
* Reimbursement of certifications and free TCS Learning Portal.
* International experience opportunity.
* Discount partnerships with universities and language schools.
* Bring Your Buddy program.
* TCS Gems recognition for performance.
* Xcelerate mentoring career platform.