        Job Title:
Data Pipeline Architect
Description:
We are seeking a highly skilled Data Pipeline Architect to design and maintain large-scale data pipelines on AWS, ensuring high performance, quality, and security. You will collaborate with data scientists and analysts to integrate data from multiple sources and support AI/ML initiatives.
Key Responsibilities:
 * Design and optimize ETL pipelines using AWS Glue.
 * Work with AWS S3, Glue, and SageMaker for data and AI workflows.
 * Develop solutions in Python and SQL.
 * Integrate data from Salesforce and external APIs.
 * Maintain data governance documentation and best practices.
Requirements:
 * Proven experience in data pipeline engineering with AWS.
 * Strong Python and SQL skills.
 * Experience with ETL, data modeling, and pipeline optimization.
 * Advanced English language skills for international collaboration.
About the Role:
This is an excellent opportunity for a talented professional to join our team and contribute to the development of our data architecture.