Job Title: Specialist – Software Engineering (Data Engineering)
Experience Required:

Job Summary
We are seeking a Specialist – Software Engineering with strong expertise in data engineering, cloud platforms, and ETL/ELT pipeline development. The ideal candidate will play a key role in the migration and expansion of Adobe-based data pipelines, the integration of Orion datasets, and the creation of new event collection pipelines. This role requires strong experience in Python, SQL, Google Cloud Platform (GCP), BigQuery, Airflow/Composer, and modern data warehousing concepts.

Key Responsibilities
- Lead the migration and expansion of Adobe-based data pipelines
- Integrate Orion datasets into existing enterprise data platforms
- Design and develop new pipelines for Orion event collection
- Build, maintain, and optimize ETL/ELT pipelines across hybrid cloud environments (AWS, GCP, Snowflake)
- Develop robust data transformation processes using Python and SQL
- Implement and maintain data quality checks, validation rules, and reconciliation processes
- Optimize pipeline runtime, compute usage, storage tiering, and BigQuery query performance and cost
- Maintain clear technical documentation for data pipelines and workflows
- Collaborate with cross-functional teams to support data architecture initiatives
- Support CI/CD deployment processes and source control best practices

Required Technical Skills

Cloud & Data Platforms:
- Google Cloud Platform (GCP)
- BigQuery
- Cloud Composer / Apache Airflow
- AWS
- Snowflake

Programming:
- Python
- SQL

Version Control / DevOps:
- GitHub
- CI/CD pipelines

Data Concepts:
- Data warehousing
- Star schema and dimensional modeling
- OLTP vs. OLAP
- Data lifecycle management
- Data governance principles

Preferred Qualifications
- Experience with enterprise-scale data migration projects
- Strong troubleshooting and performance tuning skills
- Experience with hybrid cloud data environments
- Ability to work independently and mentor junior engineers

How to Apply / Contact
Email:
WhatsApp: +1 (404) 940-4414