Unlock Your Potential as a Data Engineering Expert
We are seeking an experienced Data Engineer. As a key member of our data engineering team, you will play a crucial role in designing, building, and maintaining large-scale data systems.
Your primary focus will be developing and implementing robust data pipelines using Fivetran and Python, and staging and enriching data in BigQuery so that downstream workflows have consistent, trusted dimensions and metrics.
You will work closely with cross-functional teams to deliver high-quality data products that power analytics and decision-making, collaborating with data scientists, product managers, and business stakeholders to understand their needs and build solutions that meet them.
Beyond technical expertise, we value clear communication and the ability to collaborate effectively across technical and non-technical teams.
-----------------------------------
Key Responsibilities:
* Data Ingestion and Pipeline Development: Design and implement efficient data pipelines using Fivetran and Python to ingest raw data from diverse source systems.
* Data Staging and Enrichment: Stage and enrich data in BigQuery to provide consistent, trusted dimensions and metrics for downstream workflows.
* Workflow Management: Design, maintain, and improve workflows that ensure reliable and consistent data creation, proactively addressing data quality issues and optimizing for performance and cost.
* Data Modeling and Democratization: Develop LookML Views and Models to democratize access to data products and enable self-service analytics in Looker.
* Ad Hoc Reporting and Support: Deliver ad hoc SQL reports and support business users with timely insights.
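To give a flavor of the staging-and-enrichment work described above, here is a minimal, self-contained Python sketch. All field names, region codes, and the `enrich_orders` helper are hypothetical illustrations; in practice this logic would run against data landed by Fivetran into BigQuery.

```python
from datetime import datetime, timezone

def enrich_orders(raw_rows):
    """Enrich raw order records with a trusted revenue metric and a
    normalized region dimension (all field names are hypothetical)."""
    # Canonical mapping from source-system codes to a trusted dimension
    region_map = {"us-e": "North America", "us-w": "North America", "eu": "Europe"}
    enriched = []
    for row in raw_rows:
        enriched.append({
            "order_id": row["order_id"],
            # Consistent metric: net revenue in dollars, derived from cent amounts
            "net_revenue_usd": (row["gross_cents"] - row["refund_cents"]) / 100,
            # Trusted dimension: unknown codes fall back to an explicit bucket
            "region": region_map.get(row["region_code"], "Unknown"),
            # Load timestamp for auditing downstream workflows
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return enriched

raw = [{"order_id": 1, "gross_cents": 2599, "refund_cents": 99, "region_code": "us-e"}]
result = enrich_orders(raw)
print(result[0]["net_revenue_usd"], result[0]["region"])  # 25.0 North America
```

The same pattern (derive consistent metrics, map raw codes to canonical dimensions, stamp load metadata) is what the staging layer in BigQuery provides at scale.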
-----------------------------------
Requirements:
* Proven Experience: A track record of building and managing data products in modern cloud environments.
* Technical Expertise: Strong proficiency in Python for data ingestion and workflow development, and hands-on expertise with BigQuery, dbt, Airflow, and Looker.
* Data Quality and Best Practices: Solid understanding of data modeling, pipeline design, and data quality best practices.
* Communication and Collaboration: Excellent communication skills and a track record of effective collaboration across technical and non-technical teams.