Back‑End / Platform Engineer (Contract)
We are looking for an experienced Back‑End / Platform Engineer to design and build scalable, data‑driven APIs and backend services.
Key Responsibilities
Design, develop, and maintain data‑driven APIs using GraphQL and REST.
Build and optimize backend services that interact with relational and NoSQL databases.
Design and manage GraphQL schemas and resolvers, and optimize their performance.
Collaborate with data, product, and application teams to define data access patterns.
Implement API security, versioning, and governance best practices.
Deploy and manage services using Docker and Kubernetes.
Work with cloud platforms (AWS or OCI) to build scalable and reliable systems.
Monitor, troubleshoot, and optimize application and database performance.
Participate in architecture discussions and mentor junior engineers.
Required Skills & Experience
6+ years of experience in software engineering or backend development.
Strong hands‑on experience with API development (GraphQL & REST).
Solid understanding of database design and optimization (SQL & NoSQL).
Experience with cloud platforms (AWS or Oracle Cloud Infrastructure – OCI).
Hands‑on experience with containers and orchestration (Docker, Kubernetes).
Proficiency in at least one backend language: Node.js (preferred), Java (Spring Boot), or Python.
Strong problem‑solving and communication skills.
Senior Data Engineer – MLOps (Remote – Brazil)
About the Role
We’re looking for a Senior Data Engineer with strong MLOps expertise to join a US‑based company focused on sustainable commerce and data‑driven operations. This 100% remote, full‑time opportunity is exclusive to candidates located in Brazil.
What You’ll Do
Design, develop, and maintain robust, scalable data infrastructure across real‑time and batch workloads.
Build and support ML pipelines for model training, deployment, and monitoring.
Collaborate cross‑functionally with data scientists, engineers, and product teams.
Develop APIs and services for data ingestion, transformation, and querying.
Ensure the reliability of ML systems through strong observability and operational tools.
Contribute to architectural decisions and mentor team members.
Requirements
5+ years as a Data Engineer or MLOps Engineer.
Strong experience with Python, Java, or Scala.
Hands‑on with GCP (preferred), AWS, or Azure.
Experience with BigQuery, ML frameworks (TensorFlow, PyTorch), and container orchestration (Docker, Kubernetes).
Familiarity with Apache Kafka, Spark, or similar tools.
Experience with ETL, CI/CD, Git, and monitoring pipelines.
Strong communication skills and fluency in English (written & spoken) are mandatory.
Bachelor’s or Master’s degree in Computer Science or related fields.
What We Offer
Top‑tier hourly rate paid in USD.
Long‑term contract opportunity.
Fully remote work – collaborate with international teams from the comfort of your home.
A high‑impact role within a data‑driven, mission‑oriented company.
Senior Data Engineer – Data Migration (Signify Technology)
Contract role until the end of 2026 (with likely extension). The focus is on delivering the final phase of a business‑critical GCP → AWS data migration.
Key Responsibilities
Deliver the final 30% of a complex data migration (GCP → AWS).
Work with and adapt an existing Scala‑based Spark codebase.
Translate pipelines for AWS compatibility and performance.
Support Airflow (Python) orchestration.
Ensure robust testing, validation, and monitoring across pipelines.
What We’re Looking For
Proven experience with data pipeline migrations (GCP/AWS).
Solid understanding of testing, validation, and data quality.
Comfortable working with high‑impact, business‑critical datasets.
SQL & Snowflake Engineer (Encora)
As an SQL & Snowflake Engineer, you will play a key role in the full development lifecycle, translating complex data structures into actionable technical insights with a focus on Snowflake and SQL queries.
Responsibilities and Duties
Digest requirements from the engineering team and specify the implementation of complex SQL queries.
Analyze and map underlying data schemas within Snowflake to support application logic.
Optimize SQL performance for large‑scale data retrieval and reporting.
Follow Agile processes and participate actively in data exploration phases.
Participate in technical discussions, data quality reviews, and schema design sessions.
Required Skills
Strong experience with Snowflake (architecture, warehousing, and data sharing).
Strong proficiency in advanced SQL (CTEs, Window Functions, Stored Procedures).
Familiarity with ETL/ELT processes and data ingestion tools.
Solid experience with version control for database scripts (Git, Liquibase, or dbt).
Highly Desirable Skills
Experience with Python or scripting for data manipulation.
Familiarity with handling semi‑structured data (JSON, Parquet) in Snowflake.
Senior Data Engineer (Oracle/ODI) – Data Warehouse & ETL Specialist
We are looking for a Senior Data Engineer with strong expertise in Oracle and ODI (Oracle Data Integrator) to support enterprise‑grade data initiatives.
Key Responsibilities
Design and develop ETL/ELT processes using ODI.
Build and maintain scalable Data Warehouse solutions.
Develop complex SQL and PL/SQL logic (procedures, packages, functions).
Implement data quality, reconciliation, and auditing frameworks.
Develop and manage ODI Load Plans and workflows.
Troubleshoot and debug data pipelines and mappings.
Translate business requirements into technical specifications.
Required Qualifications
6+ years of experience in Data Engineering / ETL development.
3+ years of hands‑on experience with Oracle Data Integrator (ODI).
Strong expertise in Oracle SQL and PL/SQL.
Proven experience in Data Warehouse implementations (end-to-end).
Deep understanding of ETL/ELT concepts and data modeling.
Experience with Load Plans and scheduling.
Experience with error handling and data reconciliation.
Strong analytical and problem‑solving skills.
Ability to work independently in fast‑paced environments.
Fluent or advanced English communication skills.
Nice to Have
Experience in financial services industry.
Exposure to modern data platforms (Databricks, Snowflake, etc.).
Cloud experience (AWS, Azure, or GCP).
What We Offer
Opportunity to work on high‑impact enterprise data projects.
Collaborative and technically strong team environment.
Exposure to both legacy and modern data architectures.
Career growth aligned with evolving data technologies.
Data Engineer – Salesforce Integration (HCLTech)
We are seeking a Data Engineer with Salesforce integration experience and strong ETL expertise in Azure‑based environments.
Responsibilities
Extract legacy data from SQL Server using Azure Synapse / Azure Data Factory.
Load, validate, and reconcile data into Salesforce platforms.
Conduct unit testing and support UAT.
Manage work items, defects, and priorities in Jira.
Contribute to reusable migration processes and playbooks.
Develop and implement source‑to‑target mappings aligned to Salesforce data models.
Build transformations using Databricks.
Required Skills
Data Mapping experience.
Strong ETL expertise.
Experience with Azure services and Salesforce integration.
Preferred/Desired Skills
Experience with Data Transformation using Databricks.
Knowledge of best practices for Salesforce data loads.
Equal Opportunity Employer
We are an equal opportunity employer committed to providing equal employment opportunities to all applicants and employees without regard to race, religion, sex, color, age, national origin, pregnancy, sexual orientation, disability or genetic information, or any other protected classification, in accordance with federal, state and/or local laws.
Principal Data Engineer (Boldin)
The Principal Data Engineer is a senior technical authority responsible for defining Boldin’s data architecture, setting long‑term technical strategy, and tackling the most complex data engineering challenges.
Key Responsibilities
Define and evolve long‑term data architecture and vision.
Design resilient and scalable data platform and pipelines.
Set standards for data modeling, reliability, observability, and governance.
Lead complex, high‑risk technical initiatives and migrations.
Influence tool selection and technology adoption across the data stack.
Partner with leadership to align data strategy and business goals.
Enable analytics, ML, and product use cases.
Maintain uptime and data freshness SLAs.
Optimize cost and performance in data systems.
Maintain full documentation for production‑grade assets.
Qualifications
Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent experience).
10+ years of experience in data engineering or related disciplines.
Proficient in SQL, Python, or related languages.
Strong experience with data warehouses, data lakes, and distributed systems.
Strong experience with modern data stack (Athena, BigQuery, Glue, Spark, Dataproc, Kafka, Flink, dbt, Kestra, Fivetran, etc.).
Proven ability to build and maintain production‑grade ELT/ETL pipelines.
Experience with workflow orchestration (e.g., Airflow, Dagster, Prefect, Cloud Composer).
Experience implementing data quality and observability frameworks.
Performance and cost optimization in cloud warehouses.
Good spoken English.
Product & Business Partnership
Experience supporting product analytics and experimentation.
Ability to translate business requirements into scalable data models.
Strong ownership and accountability for SLAs.
Nice to Have
Experience working with Kubernetes.
Experience structuring data for ML or AI use cases.
Familiarity with Amplitude or product event pipelines.
Experience in a high‑growth SaaS or fintech environment.
Influencing technical direction without direct managerial authority.
What We Offer
Collaborative and innovative work environment.
Flexible PTO and health and wellness package.
Budget for English lessons.
Senior Data Engineer – Oil & Gas ERP Client
This senior‑level position focuses on designing, building, and maintaining scalable data pipelines for ERP accounting software used by over 1,700 customers.
Responsibilities
Build and maintain scalable data pipelines that power core product functionality, customer reporting, and internal analytics.
Take end‑to‑end ownership of data flows, from ingestion to delivery with a focus on quality and reliability.
Create and maintain production‑grade database views and transformation logic.
Develop, optimize, and manage SQL view scripts and transformations.
Design, configure, and maintain database architectures, distributed systems, and storage solutions.
Ensure data integrity by implementing validation processes.
Monitor system performance, troubleshoot issues, and optimize data queries and workflows.
Build and support data integrations with internal services and third‑party APIs.
Research and recommend innovative approaches for project execution.
Document procedures and provide status reports to stakeholders.
Required Experience
Bachelor’s degree in Computer Science, Information Systems, Finance, Accounting, or related field.
Excellent English verbal and written communication skills.
5+ years of data engineering experience.
3+ years of SQL and Python experience.
Knowledge of database architecture and management.
Experience working with Cloud Platforms (Azure, AWS, GCP, etc.).
Ability to read code and convert programming logic from one language to another.
Familiarity with APIs.
Preferred Experience
Experience with data analytics tools (Power BI, QuickSight, Looker, Sisense).
Experience with Apache Airflow.
Experience with AI coding tools and/or AI assistants.
Background in Math, Stats, Machine Learning.
Additional Information
15 days of paid time off, 1 floating day, 3 sick days, and designated national holidays.
Start: ASAP.
Velozient – Senior Data Engineer
Velozient is a privately held nearshore software development company providing outsourced development resources to North American companies.
Qualifications
5+ years of work experience.
Competence in industry-accepted professional standards.
Exceptional soft‑skills, communication ability, and interpersonal aptitude.
Comfort with structured corporate culture.
Up‑to‑date knowledge of the industry.