Sr. Data Architect

Caicó
buscojobs Brasil
Posted 16 September
Description

Overview

We are looking for a Senior Data Engineer to design and maintain scalable data pipelines on AWS, ensuring performance, quality, and security. You will collaborate with data scientists and analysts to integrate data from multiple sources and support AI/ML initiatives.


Key Responsibilities

* Build and optimize ETL pipelines with AWS Glue.
* Work with AWS S3, Glue, and SageMaker for data and AI workflows.
* Develop solutions in Python and SQL.
* Integrate data from Salesforce and APIs.
* Ensure data governance, documentation, and best practices.


Requirements

* Proven experience in data engineering with AWS.
* Experience with ETL, data modeling, and pipeline optimization.
* Advanced English (international collaboration).


Company and Privacy

Avenue Code reinforces its commitment to privacy and to all the principles guaranteed by global data protection laws such as GDPR, LGPD, CCPA, and CPRA. Candidate data shared with Avenue Code will be kept confidential, will not be transmitted to unrelated third parties, and will not be used for purposes other than applications for open positions. As a consultancy, Avenue Code may share your information with its clients and with other companies in the CompassUol Group to which Avenue Code's consultants are allocated to perform their services.


What You’ll Do

* Design performant data pipelines for ingesting and transforming complex datasets into usable data products.
* Build enterprise-grade batch and real-time data processing pipelines on AWS, with a focus on serverless architectures.
* Design and implement automated ELT processes to integrate disparate datasets.
* Collaborate across teams to ingest, extract, and process data using Python, SQL, REST, and GraphQL APIs.
* Transform clickstream and CRM data into meaningful metrics and segments for visualization.
* Create automated QA and reliability checks to ensure data integrity.
* Define and maintain CI/CD and deployment pipelines for data infrastructure.
* Containerize and deploy solutions using Docker and AWS ECS.
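The transform-and-QA pattern described above can be sketched in miniature with Python and SQL. This is an illustrative sketch only: the table name, schema, and metric are hypothetical, and SQLite stands in for a real warehouse.

```python
import sqlite3

# Load raw clickstream events, derive a per-user metric with SQL,
# and run a simple automated data-quality check (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clickstream (user_id TEXT, page TEXT, ts INTEGER)")

# Sample raw events; in a real pipeline these would be ingested from S3 or APIs.
events = [
    ("u1", "/home", 100),
    ("u1", "/pricing", 160),
    ("u2", "/home", 105),
]
conn.executemany("INSERT INTO clickstream VALUES (?, ?, ?)", events)

# Transform: page views per user, a typical "usable data product" metric.
metrics = conn.execute(
    "SELECT user_id, COUNT(*) AS page_views "
    "FROM clickstream GROUP BY user_id ORDER BY user_id"
).fetchall()

# Automated QA check: no NULL user_ids should survive ingestion.
null_rows = conn.execute(
    "SELECT COUNT(*) FROM clickstream WHERE user_id IS NULL"
).fetchone()[0]
assert null_rows == 0, "data integrity check failed: NULL user_id"

print(metrics)  # [('u1', 2), ('u2', 1)]
```

In production the same shape holds: SQL does the aggregation close to the data, and the QA assertion becomes a pipeline gate rather than an inline check.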


What You’ll Bring

* Bachelor’s degree in Computer Science, Software Engineering, or a related field; additional training in statistics, mathematics, or machine learning is a strong plus.
* 5+ years of experience building scalable and reliable data pipelines and data products in a cloud environment (AWS preferred).
* Deep understanding of ELT processes and data modeling best practices.
* Strong programming skills in Python or a similar scripting language.
* Advanced SQL skills, with experience in relational database design.
* Familiarity with large behavioral datasets (e.g., Adobe, GA4 clickstream data).
* Excellent problem-solving abilities and attention to data accuracy and detail.
* Proven ability to manage and prioritize multiple initiatives with minimal supervision.


Nice to Have

* Experience with data transformation tools (e.g., dbt, the Data Build Tool).
* Docker containerization and orchestration experience.
* API design or integration for data pipelines.
* Linux or Mac development environment experience.
* Experience with data QA frameworks or observability tools (e.g., Great Expectations).


What We Offer

* 100% Remote Work
* WFH allowance: Monthly payment for remote working.
* Career Growth: Career development program with 360º feedback.
* Training: Time allocated for tech training, English classes, books, conferences, and events.
* Mentoring Program: Mentoring opportunities available.
* Zartis Wellbeing Hub: Access to specialist sessions and resources.
* Multicultural working environment with online team-building activities.


About The Role (Kake/Cargo)

We are seeking experienced Data Engineers to develop and deliver robust, cost-efficient data products that power analytics, reporting and decision-making across two distinct brands. The project involves ingesting data with tools like Fivetran, processing in BigQuery, building LookML models, and delivering Looker dashboards. You will work in a modern cloud environment (GCP preferred) with a focus on data quality, performance, and cost efficiency. You will collaborate with cross-functional teams and support business users with timely insights.
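To make the modeling step concrete, a minimal LookML view for one of the brands might look like the sketch below. The view name, table, and fields are hypothetical; this is only an illustration of the kind of model that feeds a Looker dashboard.

```lookml
view: orders {
  sql_table_name: analytics.orders ;;

  dimension: brand {
    type: string
    sql: ${TABLE}.brand ;;
  }

  measure: total_revenue {
    type: sum
    sql: ${TABLE}.revenue ;;
  }
}
```

Dashboards then slice the `total_revenue` measure by the `brand` dimension, keeping the metric definition in one governed place.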

Why join: remote-first, global community, strong emphasis on growth, learning, and social good.
