Data Engineer - MLOps Specialist
We are seeking an experienced Data Engineer to join a dynamic team building advanced financial technology products. This remote role offers hands-on work with state-of-the-art machine learning and cloud infrastructure in a fast-paced, growth-oriented environment.
Key Responsibilities:
* Data Pipeline Design and Development: Design, develop, and maintain scalable data pipelines using Python, Airflow, and PySpark to process large volumes of financial transaction data.
* MLOps Infrastructure Implementation: Implement and optimize MLOps infrastructure on AWS to automate the full machine learning lifecycle from development to production.
* ML Model Deployment: Build and maintain deployment pipelines for ML models using SageMaker and other AWS services.
* Collaboration and Communication: Collaborate with data scientists and business stakeholders to implement machine learning solutions for fraud detection, risk assessment, and financial forecasting.
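By way of illustration, the fraud-detection pipelines described above typically apply rule-based transforms to transaction records before features reach an ML model. The sketch below shows the flavor of such a step in plain Python; all record fields, names, and the threshold are hypothetical, not part of this role's actual codebase:

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    txn_id: str
    amount: float
    currency: str

def flag_high_value(txns, threshold=10_000.0):
    """Return IDs of transactions at or above the threshold --
    the kind of screening rule a fraud-detection pipeline might
    apply before handing candidate records to an ML model."""
    return [t.txn_id for t in txns if t.amount >= threshold]

sample = [
    Transaction("t1", 250.0, "USD"),
    Transaction("t2", 12_500.0, "USD"),
]
print(flag_high_value(sample))  # -> ['t2']
```

In production this logic would run at scale as a PySpark transform inside an Airflow-orchestrated pipeline rather than over in-memory lists.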
Requirements:
* 3-5 years of experience in Data Engineering with a focus on MLOps in production environments.
* Strong proficiency in Python programming and data processing frameworks (PySpark).
* Experience with workflow orchestration tools, particularly Airflow.
* Hands-on experience with the AWS stack, especially SageMaker, Lambda, S3, and related services.