Big Data Architect
As a Big Data Architect, you will play a pivotal role in designing and implementing large-scale data systems. You will build, optimize, and maintain ETL/ELT pipelines using Python, modern data engineering frameworks, and Snowflake as the central data warehouse.
Key Responsibilities:
* Design and develop scalable data architectures that meet business needs.
* Work closely with data scientists to deploy, monitor, and tune machine learning models.
* Develop feature engineering pipelines, preprocessing workflows, and model-serving APIs.
* Integrate data from various sources (APIs, databases, cloud storage, streaming platforms).
Required Skills & Experience:
* 3–7+ years of experience as a Data Engineer, Python Engineer, or in a similar backend/data role.
* Strong proficiency in Python, including building production-grade data pipelines.
* Experience with Snowflake, including data modeling, Snowpipe, tasks, streams, stored procedures, and performance optimization.