Job Title: Senior Data Engineer – GCP & Snowflake (BigQuery, Python, Airflow)
Location: Mexico / Brazil / Costa Rica (Remote)
Type: Contract / FTE
Summary:
We are looking for a Senior Data Engineer with strong experience in BigQuery, Snowflake, Python, and Airflow/Composer to support, stabilize, and scale data pipelines. The role includes managing Snowflake workflows, operating migrated BigQuery pipelines, and integrating international business logic into global data platforms to achieve scalability and standardization across AWS, GCP, and Snowflake environments.
Key Skills:
BigQuery, Snowflake, Python, SQL, Airflow/Composer, ETL/ELT, GitHub, CI/CD, Data Warehousing (Star Schema, Dimensional Modeling, OLTP vs OLAP)
Responsibilities:
- Support existing Snowflake workflows
- Operate and stabilize migrated BigQuery pipelines
- Integrate international/Paramount logic into global pipelines
- Design and maintain ETL/ELT pipelines across AWS, GCP, and Snowflake
- Implement data quality checks and reconciliation
- Optimize performance and cost (BigQuery/Snowflake)
- Manage workflows and CI/CD
- Maintain documentation
Experience:
Bachelor’s degree in Computer Science or a related field; 12–15 years of overall IT experience; 7–9 years in data engineering
Nice to Have:
Adobe Analytics; Experience with global data standardization; Strong DevOps practices
Apply today or share your profile with us.