🚀 We’re Hiring: Senior Data Engineer / Tech Lead (AWS Lakehouse)
💼 Experience: 8+ years
🌎 Location: Brazil or Mexico (Remote)
🗣️ Language: Fluent English is mandatory; resumes must be submitted in English
📅 Start Date: Immediate joiners preferred; candidates able to join within a week will also be considered
⚠️ Note: Only candidates currently residing in Brazil or Mexico will be considered
🌐 Experience: Prior international work exposure is required
We are looking for a Senior Data Engineer / Tech Lead to act as the technical anchor for a squad of talented data engineers working on a modern AWS Lakehouse platform.
You’ll collaborate closely with architects, senior leaders, and domain experts while owning the end-to-end engineering execution for your squad’s workstream.
🔧 What You’ll Do
💻 Hands-On Engineering
- Build and optimize AWS Glue pipelines and transformation logic
- Design scalable data models using Apache Iceberg / Delta Lake
- Ensure best practices in partitioning, schema evolution, and data quality
- Develop consumption layer pipelines for analytical workloads (Athena)
- Apply strong engineering practices: testing, documentation, modularity, and performance
👨‍💻 Technical Leadership
- Mentor and guide junior engineers through pair programming & code reviews
- Act as the go-to technical expert for your squad
- Identify risks early and propose effective solutions
🤝 Delivery & Collaboration
- Break down requirements into well-defined engineering tasks
- Manage dependencies, remove blockers, and ensure smooth delivery
- Contribute to technical discussions and implementation strategies
✅ Required Skills
- 5+ years of hands-on experience in data engineering
- Strong hands-on experience with AWS Glue (ETL development & troubleshooting)
- Solid knowledge of AWS Data Stack (S3, Athena, etc.)
- Experience with Apache Iceberg / Delta Lake
- Strong understanding of data modeling (dimensional/lakehouse)
- Experience mentoring or guiding junior engineers
- Excellent problem-solving and communication skills
⭐ Nice to Have
- Experience with Apache Spark (Glue / EMR)
- Exposure to streaming technologies (Kafka, Flink, Kinesis)
- Familiarity with CI/CD for data pipelines
- Knowledge of AI-assisted coding tools (Copilot, CodeWhisperer)
- AWS Certifications (Data Engineer / Solutions Architect)