Financial Data Engineering Role Overview
">
We are seeking an experienced Data Engineer with strong MLOps expertise and machine learning modeling experience in the financial domain to join a dynamic team working on advanced financial technology projects. This remote role offers the opportunity to work with state-of-the-art machine learning and cloud infrastructure in a fast-paced, growth-oriented environment.
">
About the Job
">
We are looking for a skilled Data Engineer to design, develop, and maintain scalable data pipelines using Python, Airflow, and PySpark to process large volumes of financial transaction data. The ideal candidate will have hands-on experience with the AWS stack, particularly SageMaker, Lambda, S3, and related services.
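As a rough illustration of the kind of pipeline work involved, here is a minimal sketch of an Airflow DAG that submits a PySpark job to aggregate daily transaction data. The DAG name, file paths, and connection ID are hypothetical, and the sketch assumes the Apache Spark provider package for Airflow and an Airflow 2.4+ scheduler.

```python
# Minimal sketch of a daily Airflow DAG that runs a PySpark aggregation job.
# All DAG IDs, file paths, and connection IDs below are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_transaction_aggregation",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                        # assumes Airflow 2.4+ scheduling syntax
    catchup=False,
) as dag:
    # Submit a PySpark application that aggregates raw transaction records
    # landed in S3 into daily summaries for downstream analytics and models.
    aggregate_transactions = SparkSubmitOperator(
        task_id="aggregate_transactions",
        application="s3://example-bucket/jobs/aggregate_transactions.py",  # hypothetical script
        conn_id="spark_default",
        application_args=["--date", "{{ ds }}"],  # Airflow injects the logical run date
    )
```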
">
Key Responsibilities
">
* Design, develop, and maintain robust data pipelines to support payment processing systems, fraud detection algorithms, and financial analytics solutions.
* Implement and optimize MLOps infrastructure on AWS to automate the full machine learning lifecycle, from development through production deployment (see the sketch following this list).
* Collaborate with data scientists and business stakeholders to implement machine learning solutions for fraud detection, risk assessment, and financial forecasting.
* Ensure data quality, reliability, and security across all data engineering workloads.
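As a hedged sketch of what automating part of the ML lifecycle on AWS can look like, the snippet below launches a SageMaker training job with boto3. The job name, container image, IAM role ARN, and S3 locations are placeholder assumptions, not details taken from this posting.

```python
# Hedged sketch: launching a SageMaker training job from a pipeline step via boto3.
# The job name, image URI, IAM role ARN, and S3 paths are placeholder assumptions.
import boto3

sagemaker = boto3.client("sagemaker", region_name="us-east-1")

sagemaker.create_training_job(
    TrainingJobName="fraud-model-train-2024-01-01",  # hypothetical, typically date-stamped
    AlgorithmSpecification={
        "TrainingImage": "123456789012.dkr.ecr.us-east-1.amazonaws.com/fraud-train:latest",  # placeholder image
        "TrainingInputMode": "File",
    },
    RoleArn="arn:aws:iam::123456789012:role/SageMakerExecutionRole",  # placeholder role
    InputDataConfig=[
        {
            "ChannelName": "train",
            "DataSource": {
                "S3DataSource": {
                    "S3DataType": "S3Prefix",
                    "S3Uri": "s3://example-bucket/features/train/",  # placeholder training data
                    "S3DataDistributionType": "FullyReplicated",
                }
            },
        }
    ],
    OutputDataConfig={"S3OutputPath": "s3://example-bucket/models/"},  # placeholder output location
    ResourceConfig={
        "InstanceType": "ml.m5.xlarge",
        "InstanceCount": 1,
        "VolumeSizeInGB": 50,
    },
    StoppingCondition={"MaxRuntimeInSeconds": 3600},
)
```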
">
Requirements and Qualifications
">
* 3-5 years of experience in Data Engineering with a focus on MLOps in production environments.
* Strong proficiency in Python programming and data processing frameworks (PySpark).
* Hands-on experience with workflow orchestration tools, particularly Airflow.
* Familiarity with containerization (Docker) and CI/CD pipelines.
* Excellent problem-solving skills and ability to work in a fast-paced fintech environment.
">
Growth and Benefits
">
We offer a competitive benefits package, including opportunities for professional growth and development, a dynamic work environment, and the chance to make a meaningful impact in financial technology.