Job Description:
We are seeking a skilled data engineer to enhance and scale our data transformation and modeling layer. The ideal candidate will focus on building robust, maintainable pipelines using dbt, Snowflake, and Airflow to support analytics and downstream applications.
You will work closely with data, analytics, and software engineering teams to create scalable data models, improve pipeline orchestration, and ensure high-quality data delivery.
Responsibilities:
* Design, implement, and optimize data pipelines that extract, transform, and load data from multiple sources into Snowflake, using Airflow and AWS services.
* Build modular, well-documented dbt models with strong test coverage for business reporting, lifecycle marketing, and experimentation.
* Partner with analytics and business stakeholders to define source-to-target transformations and implement them in dbt.
* Maintain and improve our orchestration layer (Airflow/Astronomer) to ensure reliability and efficient dependency management.
* Collaborate on data modeling best practices, including dimensional modeling, naming conventions, and versioning strategies.