Data Architect (Arquiteto de Dados)

This role is ideal for professionals with strong expertise in AWS data services, data architecture, and large-scale data processing.

Location: 100% Remote
Language Requirement: Fluent English
Industry Experience: Financial Services preferred

Key Responsibilities
- Design and implement end-to-end data architectures on AWS aligned with enterprise and regulatory standards.
- Develop, optimize, and maintain ETL/ELT pipelines using AWS Glue and Apache Spark.
- Architect and manage data warehousing solutions using Amazon Redshift.
- Ensure data quality, reliability, performance, and governance across data platforms.
- Collaborate with data engineers, analysts, and business teams to translate data requirements into technical solutions.
- Implement best practices for security, monitoring, scalability, and cost efficiency in AWS environments.
- Troubleshoot and resolve complex data processing and performance issues.
- Stay current with AWS services and emerging data engineering best practices.

Required Experience
- 5 years of experience in data engineering and cloud architecture, with strong expertise in AWS environments.
- Hands-on experience with AWS Glue, Apache Spark, and Amazon Redshift.
- Strong knowledge of data lake and data warehouse architectures in AWS.
- Experience designing and implementing scalable ETL/ELT pipelines for large volumes of structured and semi-structured data.
- Proficiency in SQL and Python for data processing and automation.
- Solid understanding of AWS security, IAM, networking, monitoring, and cost optimization.
- Experience working with highly sensitive data in regulated environments, ideally within Financial Services.
- Fluent English (mandatory).
- Strong analytical, problem-solving, and communication skills.

If you are passionate about building modern, scalable data platforms on AWS, we would love to hear from you. Feel free to apply or share with your network!

Send your resume to vagas@ethosolucoes.com