Salary:
Payment in USD per hour. Advanced/fluent conversational English is mandatory.
Job Description:
We are seeking a Data Engineer to create data pipelines and move data from multiple sources to the cloud. This involves building a delta lake, development, unit testing, deployment, and production support/defect fixing. The Lead will also handle tasks such as technical mentoring and developing continuous integration and deployment solutions.
What You'll Do:
* Create data pipelines to move data to the cloud
* Work with customers to understand requirements and complete pipeline development; connect daily with customer stakeholders
* Identify issues in the UAT/production phases, and validate and correct data
* Stay abreast of industry trends and best practices
* Develop, test, and calibrate techniques that can be reused and applied to software development projects
* Work on different underlying frameworks and add enhancements to them
* Coordinate with customer architects and provide suggestions
Expertise You'll Bring:
* Hands-on experience in SQL, Python, AWS services, and Snowflake
* Data engineering experience is a must; experience with Python libraries and data structures
* Good knowledge of data warehousing concepts, data modeling, and handling large-scale data
* Good knowledge of Git and CI/CD
* Hands-on experience in data pipeline creation
* Customer-focused, adaptable to change, able to work with teams and multi-task
* Must be a proven performer and team player who enjoys challenging assignments in a high-energy, fast-growing, start-up workplace
* Must be a self-starter who can work well with minimal guidance and in a fluid environment
* Act as a technical mentor, guiding and supporting junior team members
* Good to have: Airflow and DBT knowledge
* Should be able to lead a team