Data Innovator
As a Data Innovator, you will play a pivotal role in designing and implementing the scalable data pipelines that drive business growth.
Key Responsibilities:
* Design and build efficient ETL/ELT pipelines using Python and modern data engineering frameworks to optimize data processing and minimize latency.
* Collaborate with cross-functional teams to architect and manage data workflows that ensure accuracy, scalability, and reliability.
* Develop feature engineering pipelines, preprocessing workflows, and model-serving APIs to support the deployment of machine learning models.
Requirements:
* Proficiency in Python and experience with modern data engineering frameworks such as Apache Beam or Apache Spark.
* Strong understanding of data warehousing concepts and experience with Snowflake as a central data warehouse.
* Ability to design and implement scalable data pipelines that meet business requirements.
* Excellent communication and collaboration skills to work effectively with data scientists and other stakeholders.
What We Offer:
* A dynamic and innovative work environment that fosters collaboration and creativity.
* Opportunities for professional growth and development in a rapidly evolving field.
* A competitive compensation package that recognizes your skills and contributions.