Unlock Your Potential as a Data Engineer
We are seeking an exceptional Data Engineer to join our team. In this role, you will be responsible for designing and implementing scalable data processing systems using AWS services such as S3, Lambda, Glue, Athena, Redshift, and DynamoDB.
Responsibilities:
* Design and implement an AWS serverless data lake architecture that efficiently handles large volumes of data and supports various data processing workflows;
* Develop data ingestion pipelines and data integration processes, ensuring the smooth and reliable transfer of data from various sources into the data lake;
* Implement data transformation and data enrichment processes using AWS Lambda, Glue, or similar serverless technologies to ensure data quality and consistency;
* Collaborate with data scientists and analysts to understand their data requirements and design appropriate data models and schemas in the data lake;
* Optimize data storage and retrieval mechanisms, leveraging AWS services such as S3, Athena, Redshift, or DynamoDB, to provide high-performance access to the data;
* Monitor and troubleshoot the data lake infrastructure, identifying and resolving performance bottlenecks, data processing errors, and other issues;
* Continuously evaluate new AWS services and technologies to enhance the data lake architecture, improve data processing efficiency, and drive innovation;
* Mentor and provide technical guidance to junior data engineers, fostering their growth and ensuring adherence to best practices;
* Collaborate with cross-functional teams to understand business requirements, prioritize tasks, and deliver high-quality solutions within defined timelines.
Requirements:
* 5+ years of experience working as a Data Engineer with a strong focus on AWS technologies and serverless architectures;
* Experience working with AWS services such as S3, Lambda, Glue, Athena, Redshift, and DynamoDB;
* Proven expertise in designing and implementing AWS serverless architectures for large-scale data processing and storage;
* Strong programming skills in languages like Python, Java, or Scala, along with experience using SQL for data manipulation and querying;
* Familiarity with data modeling techniques and data warehousing concepts, including star and snowflake schemas;
* Solid understanding of data security, access control, and compliance requirements in a data-driven environment;
* Experience with data visualization tools (e.g., Tableau, Power BI) and the ability to collaborate with analysts and data scientists to deliver actionable insights;
* Strong problem-solving and analytical skills, with a detail-oriented approach to ensure data accuracy and integrity;
* Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
Benefits:
* Continuous professional development, with your skills evolving in line with your interests;
* Opportunities to work outside Brazil;
* A collaborative, diverse, and innovative environment that encourages teamwork;
* TCS Benefits – Brazil: health insurance, dental plan, life insurance, transportation vouchers, meal/food voucher, childcare assistance, and Gympass;
* TCS Cares – a free 0800 helpline providing psychological assistance 24 hours a day, plus legal, social, and financial assistance to associates;
* Partnership with SESC;
* Reimbursement of certifications;
* Free TCS Learning Portal – online courses and live training;
* International experience opportunity;
* Discount partnerships with universities and language schools;
* Bring Your Buddy – by referring people, you become eligible for a bonus for each hire;
* TCS Gems – recognition for performance;
* Xcelerate – a free mentoring career platform.
What We Offer:
* An inclusive culture that values diversity, equity, and inclusion;
* A dynamic and challenging work environment that encourages growth and innovation;
* The opportunity to work with cutting-edge technologies and collaborate with a talented team of professionals;
* The chance to make a real impact on our organization's success and contribute to shaping the future of data engineering.