Job Title: Data Quality Engineer

Description
Rebuild and optimize data quality scorecards in Databricks to ensure high-quality data across the organization.

Responsibilities
- Design, develop, and implement profiling logic to identify data inconsistencies
- Build PySpark-based data quality rules and metrics to measure data accuracy
- Collaborate with teams to create curated datasets for Power BI scorecards
- Establish reusable data quality rule templates and standardized development patterns
- Work closely with SAP ECC data models to ensure seamless integration
- Support and mentor junior developers on rule logic and development standards

Qualifications
- Strong experience in Databricks engineering (PySpark, SQL, Delta Lake)
- Hands-on experience building data quality rules, frameworks, or scorecards
- Proficiency in profiling large datasets and implementing metadata-driven DQ logic
- Excellent communication skills in English
- Familiarity with SAP ECC tables and key fields (preferred)
- Experience with Unity Catalog or Purview (nice to have)
- Exposure to Lakehouse Monitoring or DQX accelerators (bonus)

Benefits
As a Data Quality Engineer, you will join a team that is passionate about delivering high-quality data solutions. You will work with cutting-edge technologies and collaborate with experienced professionals. If you are looking for a challenging role that will help you grow as a professional, apply today!
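To give a concrete sense of the data quality rules this role involves, here is a minimal, Spark-free Python sketch of a completeness (null-rate) check. All names here (null_rate, passes_rule, the sample material/plant records) are hypothetical illustrations, not from the posting; in Databricks the same logic would typically be expressed with PySpark DataFrame aggregations over Delta tables.

```python
# Hypothetical sketch of a completeness (null-rate) data quality rule.
# In PySpark this would be an aggregation such as counting rows where
# a column isNull() and dividing by the total row count.

def null_rate(records, field):
    """Fraction of records where `field` is missing or None."""
    if not records:
        return 0.0
    missing = sum(1 for r in records if r.get(field) is None)
    return missing / len(records)

def passes_rule(records, field, max_null_rate=0.05):
    """Simple pass/fail DQ rule: null rate must not exceed the threshold."""
    return null_rate(records, field) <= max_null_rate

# Sample rows standing in for an SAP ECC-style material table (hypothetical)
rows = [
    {"material_id": "M-001", "plant": "DE01"},
    {"material_id": None,    "plant": "DE02"},
    {"material_id": "M-003", "plant": None},
    {"material_id": "M-004", "plant": "DE01"},
]

print(null_rate(rows, "material_id"))   # 0.25
print(passes_rule(rows, "material_id")) # False at the 5% default threshold
```

A metadata-driven framework would typically read the field name and threshold from a rules table rather than hard-coding them, so the same check can be reused across datasets.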