Big Data Quality Expert
We're looking for a highly skilled professional to rebuild and maintain our data quality scorecards using SAP ECC data in Databricks. You'll be responsible for designing profiling logic, building rule-based data quality checks in PySpark, generating field-level and row-level results, and publishing business-facing scorecards in Power BI.
Responsibilities:
* Rebuild and optimize existing data quality scorecards
* Develop custom profiling logic (nulls, distincts, pattern checks); see the profiling sketch after this list
* Create curated datasets for Power BI scorecards
* Establish reusable data quality rule templates and standardized development patterns
* Work with SAP ECC data models and support junior developers on rule logic and development standards
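To give a sense of the profiling logic involved, here is a minimal PySpark sketch that computes null counts, distinct counts, and a simple pattern check and collects the results as field-level metrics. The table name (`raw_sap_ecc.kna1`), column names (KUNNR, NAME1, LAND1), and the 10-digit pattern are illustrative assumptions, not our actual schema or rules.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical SAP ECC customer master extract; replace with the real source table.
df = spark.read.table("raw_sap_ecc.kna1")
total_rows = df.count()

# Field-level profile: null count and distinct count for a few illustrative columns.
metrics = []
for col_name in ["KUNNR", "NAME1", "LAND1"]:
    agg_row = df.agg(
        F.count(F.when(F.col(col_name).isNull(), 1)).alias("null_count"),
        F.countDistinct(col_name).alias("distinct_count"),
    ).first()
    metrics.append((col_name, total_rows, agg_row["null_count"], agg_row["distinct_count"]))

profile_df = spark.createDataFrame(
    metrics, ["field", "row_count", "null_count", "distinct_count"]
)
profile_df.show()

# Simple pattern check: customer numbers assumed to be 10 digits for this example.
pattern_failures = df.filter(~F.col("KUNNR").rlike(r"^\d{10}$")).count()
print(f"KUNNR rows failing the 10-digit pattern: {pattern_failures}")
```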
Qualifications:
* Strong experience with Databricks engineering (PySpark, SQL, Delta Lake)
* Hands-on experience building data quality rules, frameworks, or scorecards
* Proficiency in profiling large datasets and implementing metadata-driven data quality logic (see the rule-template sketch after this list)
* Excellent communication skills and ability to mentor, review code, and explain concepts clearly
* Familiarity with SAP ECC tables and key fields is preferred
* Experience with Unity Catalog or Purview is a nice-to-have
* Exposure to Lakehouse Monitoring or DQX accelerators is a bonus
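For a sense of what metadata-driven data quality logic can look like, the sketch below applies a small set of rule records to a DataFrame, producing row-level pass/fail flags and a field-level pass-rate summary suitable for a scorecard. The rule definitions, check types, and table name are illustrative assumptions; in practice the rule metadata might live in a Delta config table rather than in code.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Illustrative rule metadata: each record names a column and a check type.
rules = [
    {"rule_id": "DQ001", "column": "KUNNR", "check": "not_null"},
    {"rule_id": "DQ002", "column": "LAND1", "check": "in_set", "values": ["US", "DE", "GB"]},
    {"rule_id": "DQ003", "column": "KUNNR", "check": "matches", "pattern": r"^\d{10}$"},
]

def rule_condition(rule):
    """Translate one rule record into a boolean Column (True = pass)."""
    col = F.col(rule["column"])
    if rule["check"] == "not_null":
        return col.isNotNull()
    if rule["check"] == "in_set":
        return col.isin(rule["values"])
    if rule["check"] == "matches":
        return col.isNotNull() & col.rlike(rule["pattern"])
    raise ValueError(f"Unknown check type: {rule['check']}")

# Hypothetical source table; replace with the real extract.
df = spark.read.table("raw_sap_ecc.kna1")

# Row-level results: one boolean flag per rule appended to each record.
for rule in rules:
    df = df.withColumn(rule["rule_id"], rule_condition(rule))

# Field-level summary: pass rate per rule, ready to feed a Power BI scorecard.
summary = df.select(
    [F.avg(F.col(r["rule_id"]).cast("int")).alias(r["rule_id"]) for r in rules]
)
summary.show()
```

Keeping rule definitions as data rather than hard-coded checks is what lets a single template serve many tables and keeps the scorecard outputs consistent across domains.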