Are you in Brazil, Argentina, or Colombia? Join us as we actively recruit in these locations, offering a comfortable remote environment. Submit your CV in English, and we'll get back to you!

We invite a Junior Data Engineer to join our dynamic team supporting a major enterprise client in modernizing their data platform. In this role, you'll assist in migrating and transforming legacy data pipelines to a modern cloud environment. You'll work closely with senior engineers, architects, DevOps, QA, and product stakeholders, gaining hands-on data engineering experience and contributing to reliable, scalable data solutions.
Overview
What's in it for you:
Join a supportive delivery team built on collaboration, transparency, and mutual respect
Get hands-on exposure to a high-impact, real-world data platform transformation project
Grow your skills with modern technologies like GCP, Snowflake, Apache Iceberg, dbt, Airflow, Dataflow, and BigQuery
Qualifications
Required:
1.5+ years of experience in data engineering, data analytics, or software development
Basic understanding of data warehouse concepts and ETL pipelines
Good knowledge of SQL and willingness to learn Snowflake or similar data storage technologies
Basic experience with Python for scripting or simple ETL tasks
Experience with GCP platforms (BigQuery, GCS, Airflow, Dataflow, Dataproc, Pub/Sub)
Understanding of version control (Git) and eagerness to learn CI/CD and IaC tools
Degree in Computer Science, Data Engineering, or related field, or equivalent practical experience
Strong communication and collaboration skills
Upper-Intermediate English level
Desirable:
Basic exposure to streaming data pipelines and event-driven architectures
Familiarity with basic scripting and containerization tools (Bash, Docker)
Basic understanding of data lakehouse concepts (Iceberg tables)
Awareness of data transformation tools like dbt
Familiarity with AI-assisted tools like GitHub Copilot
Key responsibilities
Assist in reviewing and analyzing existing ETL solutions for migration to the new architecture
Support the migration of batch and streaming data pipelines to the GCP Landing Zone
Help build and maintain data transformations with dbt, supporting ELT pipelines in Snowflake
Help with data jobs refactoring and mapping
Assist in setting up and maintaining monitoring and alerting for data pipelines
Contribute to migrating historical data to Iceberg tables with guidance from senior engineers
Collaborate with senior engineers and stakeholders to understand requirements and implement solutions
Participate in code reviews, team discussions, and technical planning to develop your skills
What's working at Dev.Pro like?
Dev.Pro is a global company that has been building software since 2011, valuing fairness, high standards, openness, and inclusivity
We are 99.9% remote — you can work from anywhere in the world
Get 30 paid days off per year to use for vacations, holidays, or personal time
5 paid sick days, up to 60 days of medical leave, and up to 6 paid days off per year for major family events
Partially covered health insurance after the probation period, plus a wellness bonus after 6 months
We pay in U.S. dollars and cover all approved overtime
Join English lessons and Dev.Pro University programs, and take part in online activities and team-building events
Next steps
Submit a CV in English → Intro call with a Recruiter → Internal interview → Client interview → Offer
Find out more
How we work
LinkedIn Page
Our website
IG Page