Job Title: Data Pipeline Architect
We are seeking an experienced Data Pipeline Architect to design and develop scalable data pipelines on AWS. This role requires collaboration with cross-functional teams, including data scientists and analysts, to integrate data from multiple sources and support AI/ML initiatives.
 * The ideal candidate will have a strong background in data engineering, experience with AWS Glue, S3, and SageMaker, and proficiency in Python and SQL.
 * Familiarity with Salesforce and APIs is also required.
 * Experience with ETL, data modeling, and pipeline optimization is essential.
Key Responsibilities:
 * Design and implement data pipelines using AWS Glue, S3, and SageMaker.
 * Collaborate with data scientists and analysts to integrate data from multiple sources.
 * Develop solutions in Python and SQL.
 * Integrate data from Salesforce and APIs.
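For context on the day-to-day work, here is a minimal sketch of the kind of lightweight orchestration these responsibilities involve: checking an S3 prefix for newly landed data and triggering an AWS Glue ETL job via boto3. The bucket, prefix, and job names are illustrative placeholders, not references to our actual environment.

```python
"""Illustrative sketch only: poll an S3 prefix and, if new objects exist,
start an AWS Glue ETL job. All resource names below are placeholders."""
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

BUCKET = "example-raw-data-bucket"   # placeholder bucket name
PREFIX = "salesforce/exports/"       # placeholder S3 prefix
GLUE_JOB = "example-etl-job"         # placeholder Glue job name


def new_objects_exist(bucket: str, prefix: str) -> bool:
    """Return True if the prefix contains at least one object."""
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix, MaxKeys=1)
    return resp.get("KeyCount", 0) > 0


def trigger_etl(job_name: str) -> str:
    """Start the Glue job and return its run id."""
    run = glue.start_job_run(JobName=job_name)
    return run["JobRunId"]


if __name__ == "__main__":
    if new_objects_exist(BUCKET, PREFIX):
        print("Started Glue run:", trigger_etl(GLUE_JOB))
    else:
        print("No new data found; nothing to do.")
```

In practice this check-and-trigger step would typically sit behind a scheduler or event trigger rather than run ad hoc; the sketch is only meant to convey the scope of the role.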
Requirements:
 * Proven experience in data engineering with AWS.
 * Strong Python and SQL skills.
 * Experience with ETL, data modeling, and pipeline optimization.
 * Advanced English proficiency, as the role involves international collaboration.
We comply with global data protection laws: candidate data is kept confidential and is not shared with third parties.
Technical Stack:
 * AWS (S3, Glue, SageMaker)
 * Python, SQL
 * Salesforce, APIs