About the Role
We are seeking a highly skilled Data Engineer with strong MLOps expertise and machine learning modeling experience in the financial domain. This remote role offers the opportunity to work with state-of-the-art machine learning and cloud infrastructure in a fast-paced, growth-oriented environment.
The ideal candidate will have 3-5 years of experience in Data Engineering with a focus on MLOps in production environments, strong proficiency in Python programming and data processing frameworks (PySpark), experience with workflow orchestration tools (particularly Airflow), and hands-on experience with the AWS stack, especially SageMaker, Lambda, S3, and other relevant services.
Key Responsibilities:
* Design, develop, and maintain scalable data pipelines using Python, Airflow, and PySpark to process large volumes of financial transaction data.
* Implement and optimize MLOps infrastructure on AWS to automate the full machine learning lifecycle from development to production.
* Build and maintain deployment pipelines for ML models using SageMaker and other AWS services.
* Collaborate with data scientists and business stakeholders to implement machine learning solutions for fraud detection, risk assessment, and financial forecasting.
* Ensure data quality, reliability, and security across all data engineering workloads.
* Optimize data architecture to improve performance, scalability, and cost-efficiency.
* Implement monitoring and alerting systems to ensure production ML models perform as expected.
Familiarity with containerization (Docker) and CI/CD pipelines is a plus. Excellent problem-solving skills and the ability to thrive in a fast-paced fintech environment are essential.
About You
To be successful in this role, you should possess the following qualifications and skills:
* Strong proficiency in Python programming and data processing frameworks (PySpark).
* Experience with workflow orchestration tools, particularly Airflow.
* Hands-on experience with the AWS stack, especially SageMaker, Lambda, S3, and other relevant services.
* Working knowledge of machine learning model deployment and monitoring in production.
* Experience with data modeling and database systems (SQL and NoSQL).
* Familiarity with containerization (Docker) and CI/CD pipelines.
What We Offer
In this role, you will have the opportunity to work on advanced financial technology projects, collaborate with experienced professionals, and continuously learn and grow in your career.