Data Engineer
Location: Brazil (Remote)
Duration: Long Term
Type: PJ (Pessoa Jurídica contract)
Key Responsibilities
* Design, build, and optimize ETL/ELT data pipelines using Python and SQL
* Lead the migration of Adobe-based pipelines to GCP/BigQuery
* Develop event-driven ingestion pipelines for large-scale data processing
* Integrate and transform datasets (including media/streaming data) into data warehouses
* Manage and orchestrate workflows using Apache Airflow / Cloud Composer
* Ensure data quality through validation and reconciliation across pipelines
* Optimize BigQuery performance, storage, and cost efficiency
* Work across multi-cloud environments (GCP, AWS, Snowflake)
* Collaborate with cross-functional teams to deliver scalable data solutions
* Maintain clear technical documentation and best practices
Required Skills & Experience
* 7–9 years of hands-on experience in data engineering
* Strong expertise in:
  * Python (data pipelines, scripting, transformation)
  * SQL (advanced queries, performance tuning)
  * Apache Airflow (DAGs, scheduling, orchestration)
  * Google Cloud Platform (GCP) & BigQuery
  * Cloud Composer (managed Airflow)
* Experience with CI/CD pipelines, GitHub, and automation workflows
* Strong understanding of:
  * Data warehousing (star schema, dimensional modeling)
  * ETL/ELT architecture
  * Data quality frameworks and validation techniques
Nice to Have
* Experience with Adobe Analytics / Adobe Experience Platform
* Exposure to AWS and Snowflake in multi-cloud environments
* Hands-on experience with BigQuery cost optimization
* Knowledge of media, streaming, or event-based data pipelines