Job Title: Senior Data Engineer (GCP BigQuery & Snowflake)
Location: Remote
Job Summary
We are seeking a highly experienced Senior Data Engineer to design, build, and optimize scalable data pipelines across cloud platforms. The ideal candidate will have strong expertise in GCP (BigQuery), Snowflake, and Python, along with a solid foundation in data warehousing concepts and modern DevOps practices.
Key Responsibilities
Support and maintain existing Snowflake workflows and data pipelines
Operate, monitor, and stabilize migrated BigQuery workflows
Integrate international business logic into global data pipelines to ensure scalability and standardization
Design, build, and maintain ETL/ELT pipelines across hybrid environments (AWS, GCP, Snowflake)
Implement data quality checks, validation logic, and reconciliation processes
Optimize pipeline performance, compute usage, storage, and query costs (particularly in BigQuery)
Develop and maintain documentation for all data pipelines and workflows
Collaborate with cross-functional teams to support data-driven decision-making
Follow and contribute to modern DevOps practices, including CI/CD and version control
Required Skills & Qualifications
Bachelor’s Degree in Computer Science or a related discipline
12–15 years of overall IT experience
7–9 years of hands-on data engineering experience
Technical Skills
Strong experience with:
GCP BigQuery
Snowflake
Python & SQL
Apache Airflow / Cloud Composer