The company:
We are a global custom software development and IT project staffing organization, operating worldwide and navigating technology and business challenges with expertise. Our U.S. headquarters are in San Diego, California, and we have strategically located development centers in Brazil, Mexico, Peru, and Uruguay. Our Latin America headquarters are in Porto Alegre, Brazil, with a branch in São Paulo. For over two decades, we have been recognized in the global software and innovation market for delivering projects with added value, crafted by agile teams of skilled professionals. Check out our international page at: www.ntconsultcorp.com.

The project:
We are seeking a Data Engineer to develop and maintain AWS-based data integration applications. Core responsibilities include data modeling, integration, data quality assurance, and making data readily available for reporting and dashboard creation. This position will also maintain and enhance the existing data applications and integrations. In this role, you will troubleshoot production issues, optimize data pipelines, and implement improvements that ensure reliability.

Responsibilities:
- Troubleshoot and resolve issues related to data integrity and system availability.
- Develop and maintain reports using BI tools such as Tableau and QuickSight.
- Collaborate on the development, implementation, and improvement of frameworks that support the automated generation of reports and dashboards on AWS, using tools such as Redshift, PostgreSQL, RDS, Glue, Lambda, DynamoDB, Step Functions, QuickSight, etc.
- Assist with data integration tasks between legacy and new systems.
- Communicate with business stakeholders to understand reporting requirements.
- Stay informed about current data engineering technologies and best practices.

Requirements:
- 5 years of experience as a Data Engineer.
- Proficiency with BI tools such as QuickSight and Tableau.
- Advanced Python skills for data engineering (core OOP principles, code modularization, data wrangling, database connections).
- Strong knowledge of AWS services such as S3, Glue, Lambda, Step Functions, EMR, and Redshift.
- Knowledge of different types of data ingestion patterns.
- Knowledge of both ETL and ELT.
- Ability to model transactional and analytical data layers.
- Solid knowledge of transactional databases (mainly PostgreSQL) and data warehouses.

Nice to have:
- Azure cloud services, SSIS, SQL Server.