We are seeking a seasoned data engineering professional to spearhead the design and implementation of scalable data pipelines on AWS. These pipelines must be high-performance, secure, and reliable, integrating data from multiple sources and supporting AI/ML initiatives across various projects.
You will collaborate closely with data scientists and analysts to ensure seamless integration and data quality throughout the pipeline.
Key Responsibilities:
* Develop and optimize ETL pipelines using AWS Glue.
* Work with AWS S3, Glue, and SageMaker for data and AI workflows.
* Design and implement data models in Python and SQL.
* Integrate data from Salesforce and APIs.
* Ensure data governance, documentation, and adherence to best practices throughout the pipeline.
Requirements:
* Proven experience in data engineering with AWS technologies.
* Strong skills in Python, SQL, and AWS.
* Experience with ETL, data modeling, and pipeline optimization.
* Advanced English language proficiency for international collaboration.
Avenue Code prioritizes data protection and adheres to global regulations such as GDPR, CCPA, and CPRA. Candidate data shared with Avenue Code remains confidential and is not shared with third parties.