Job Title: Specialist – Software Engineering (Data Engineering)
Location: Remote - Brazil
Employment Type: Contract
Job Summary:
We are looking for a Specialist – Software Engineering with strong data engineering expertise to support the migration, integration, and expansion of Adobe-based data pipelines. The ideal candidate will have deep experience in Python, SQL, GCP (BigQuery), and Airflow/Composer, along with a solid understanding of data warehousing concepts and modern ETL/ELT practices.
Key Responsibilities:
* Design, build, and maintain scalable ETL/ELT pipelines across hybrid environments (GCP, AWS, Snowflake).
* Lead migration and expansion of Adobe-based pipelines and integrate Orion datasets into existing data platforms.
* Develop new pipelines for Orion event data collection.
* Optimize BigQuery performance, including query tuning, cost optimization, and storage strategies.
* Implement data quality checks, validation logic, and reconciliation processes.
* Work with Airflow/Composer for workflow orchestration and scheduling.
* Manage code using GitHub and implement CI/CD pipelines.
* Create and maintain clear documentation for data pipelines and workflows.
Required Skills & Qualifications:
* Bachelor’s degree in Computer Science or a related field
* 12–15 years of overall IT experience with 7–9 years in data engineering
* Strong hands-on experience with Python and SQL
* Expertise in GCP (BigQuery), Airflow/Composer
* Experience with GitHub and CI/CD practices
* Strong understanding of data warehousing concepts (star schemas, dimensional modeling, OLTP vs. OLAP)
Nice to Have:
* Experience with Adobe data pipelines
* Exposure to AWS and Snowflake
* Experience working in large-scale, enterprise data environments