Data Engineer

Location: Brazil (Remote)
Duration: Long Term
Type: PJ

Key Responsibilities
- Design, build, and optimize ETL/ELT data pipelines using Python and SQL
- Lead the migration of Adobe-based pipelines to GCP/BigQuery
- Develop event-driven ingestion pipelines for large-scale data processing
- Integrate and transform datasets (including media/streaming data) into data warehouses
- Manage and orchestrate workflows using Apache Airflow / Cloud Composer
- Ensure data quality, validation, and reconciliation across pipelines
- Optimize BigQuery performance, storage, and cost efficiency
- Work across hybrid cloud environments (GCP, AWS, Snowflake)
- Collaborate with cross-functional teams to deliver scalable data solutions
- Maintain clear technical documentation and best practices

Required Skills & Experience
- 7–9 years of hands-on experience in data engineering
- Strong expertise in:
  - Python (data pipelines, scripting, transformation)
  - SQL (advanced queries, performance tuning)
  - Apache Airflow (DAGs, scheduling, orchestration)
  - Google Cloud Platform (GCP) & BigQuery
  - Cloud Composer (managed Airflow)
- Experience with CI/CD pipelines, GitHub, and automation workflows
- Strong understanding of:
  - Data warehousing (star schema, dimensional modeling)
  - ETL/ELT architecture
  - Data quality frameworks and validation techniques

Nice to Have
- Experience with Adobe Analytics / Adobe Experience Platform
- Exposure to AWS and Snowflake in multi-cloud environments
- Hands-on experience with BigQuery cost optimization
- Knowledge of media, streaming, or event-based data pipelines