Key Responsibilities:
In this role, you will work with state-of-the-art machine learning and cloud infrastructure in a fast-paced, growth-oriented environment.
* You will design, develop, and maintain scalable data pipelines using Python, Airflow, and PySpark to process large volumes of financial transaction data.
* You will implement and optimize MLOps infrastructure on AWS to automate the full machine learning lifecycle from development to production.
* You will build and maintain deployment pipelines for ML models using SageMaker and other AWS services.
* You will collaborate with data scientists and business stakeholders to implement machine learning solutions for fraud detection, risk assessment, and financial forecasting.
* You will ensure data quality, reliability, and security across all data engineering workloads.
* You will optimize data architecture to improve performance, scalability, and cost-efficiency.
* You will implement monitoring and alerting systems to detect when production ML models deviate from expected performance.