Job Title: Data Warehouse Architect
A leading company in the gaming industry is seeking a hands-on Data Architect with data warehouse engineering expertise to spearhead the design and implementation of a new data warehouse instance for a major product line.
Responsibilities:
* Design and deploy a new Databricks Lakehouse instance tailored to the client's product-level data needs.
* Architect and implement robust data ingestion pipelines using Spark (PySpark/Scala) and Delta Lake.
* Integrate AWS-native services (S3, Glue, Athena, Redshift, Lambda) with Databricks for optimized performance and scalability.
* Define data models, optimize query performance, and establish warehouse governance best practices.
* Collaborate cross-functionally with product teams, data scientists, and DevOps to streamline data workflows.
* Maintain CI/CD for data pipelines (preferably using DBX) with GitOps and Infrastructure-as-Code practices.
* Monitor data jobs and resolve performance bottlenecks or failures across environments.
Required Skills and Qualifications:
* End-to-end setup of Databricks workspaces and Unity Catalog.
* Expertise in Delta Lake internals, file compaction, and schema enforcement.
* Advanced PySpark/SQL skills for ETL and transformations.
* Familiarity with AWS-native integration and data warehousing & modeling principles.
Preferred Qualifications:
* AWS certification is a plus.