We are seeking a highly skilled professional to lead the development of scalable data pipelines on AWS.
Key Responsibilities:
 1. Design and Optimize Data Pipelines
 2. Collaborate with Cross-Functional Teams
 3. Develop Solutions in Python and SQL
 4. Ensure Data Quality and Security
Tech Stack:
 * AWS (S3, Glue, SageMaker)
 * Python, SQL
 * Salesforce, APIs
Our ideal candidate has expertise in designing and maintaining large-scale data architectures with an emphasis on performance, quality, and security, and works closely with data scientists, analysts, and other stakeholders to deliver robust data solutions that meet business needs.
Day to day, they will build, optimize, and maintain ETL pipelines, collaborate with cross-functional teams to ensure seamless data integration, and develop solutions in Python and SQL, while upholding data governance and documentation practices to maintain quality and security standards.
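For illustration only, the kind of Python-and-SQL pipeline step described above might look like the following minimal sketch. It uses Python's built-in sqlite3 as a stand-in for a real warehouse, and the table and column names (raw_events, user_id, amount) are hypothetical, not part of this posting.

```python
# Minimal illustrative ETL step: load raw rows, transform with SQL,
# and apply a simple data-quality check before returning results.
# sqlite3 stands in for a real data store; names are hypothetical.
import sqlite3


def run_etl(raw_rows):
    """Load raw records, aggregate with SQL, and gate on data quality."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO raw_events VALUES (?, ?)", raw_rows)
    # Transform: total per user, dropping null or negative amounts
    cleaned = conn.execute(
        """
        SELECT user_id, SUM(amount) AS total
        FROM raw_events
        WHERE amount IS NOT NULL AND amount >= 0
        GROUP BY user_id
        ORDER BY user_id
        """
    ).fetchall()
    conn.close()
    # Data-quality gate: every aggregated total must be non-negative
    assert all(total >= 0 for _, total in cleaned)
    return cleaned


rows = [(1, 10.0), (1, 5.0), (2, -3.0), (2, 7.5), (3, None)]
print(run_etl(rows))  # → [(1, 15.0), (2, 7.5)]
```

In a production pipeline the same pattern would typically read from S3 and run inside an AWS Glue job, with the quality gate replaced by a fuller validation step.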