We are seeking a highly skilled Python Data Engineer to join our growing data and analytics team. This role is ideal for someone who loves building scalable data pipelines, operationalizing machine learning workflows, and partnering closely with data scientists to bring models into production.
The successful candidate will design, develop, and maintain data infrastructure that powers AI-driven insights across the organization, including data models and pipelines that run through Snowflake. This is a fully remote position working with cross-functional product, engineering, and analytics teams.
Key Responsibilities:
* Build, optimize, and maintain ETL/ELT pipelines using Python, modern data engineering frameworks, and Snowflake as a central data warehouse.
* Architect and manage data workflows, ensuring accuracy, scalability, and reliability.
* Work closely with data scientists to deploy, monitor, and tune machine learning models.
* Develop feature engineering pipelines, preprocessing workflows, and model-serving APIs.
* Integrate data from various sources (APIs, databases, cloud storage, streaming platforms).
* Implement MLOps best practices including versioning, CI/CD for ML, and automated retraining workflows.
* Optimize data storage, compute usage, and performance within Snowflake and cloud-native tools (AWS, GCP, or Azure).
Required Skills & Qualifications:
* 3–7+ years of experience as a Data Engineer, Python Engineer, or similar backend/data role.
* Strong proficiency in Python, including building production-grade data pipelines.
* Experience with Snowflake, including data modeling, Snowpipe, tasks, streams, stored procedures, and performance optimization.
* Experience with AI/ML workflows: feature engineering, inference pipelines, or deploying models.
* Proficiency in SQL and relational databases (PostgreSQL, MySQL, SQL Server).
* Hands-on experience with at least one cloud platform (AWS, GCP, or Azure).
* Experience using data orchestration tools like Airflow, Prefect, or Dagster.
* Familiarity with MLOps tools such as MLflow, Kubeflow, SageMaker, Vertex AI, or similar.
Benefits:
This fully remote role offers opportunities for growth and development in a fast-paced, innovative environment, as part of a collaborative, dynamic team working on cutting-edge projects.