Business Intelligence Developer Position
We are looking for a skilled Business Intelligence Developer to join our team. In this role, you will be central to designing and implementing data quality scorecards built on SAP ECC data available in the Lakehouse. Key focus areas include:
* Design and validate profiling logic to ensure high-quality data.
* Build rule-based data quality checks in PySpark to identify and resolve data inconsistencies.
* Generate field-level and row-level results to provide actionable insights.
* Publish business-facing scorecards in Power BI to facilitate data-driven decision-making.
Responsibilities:
* Rebuild Data Quality scorecards in Databricks using PySpark.
* Develop profiling logic covering null checks, distinct-value counts, and pattern checks (an illustrative sketch follows this list).
* Build PySpark-based Data Quality rules and row/column-level metrics to ensure data accuracy.
* Create curated DQ datasets for Power BI scorecards.
* Establish reusable DQ rule templates and standardized development patterns.
* Work with SAP ECC data models to ensure seamless integration.
* Support and mentor junior developers on rule logic and development standards.
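For illustration only, the sketch below shows the kind of PySpark profiling logic this role involves. The table name (`lakehouse.curated.sap_ecc_customers`) and the SAP-style columns (`KUNNR`, `PSTLZ`) are hypothetical placeholders, not a description of the actual data model.

```python
# Minimal illustrative sketch of null / distinct / pattern profiling in PySpark.
# Table and column names are hypothetical examples only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical curated SAP ECC customer table in the Lakehouse
df = spark.table("lakehouse.curated.sap_ecc_customers")

profile = df.agg(
    F.count(F.lit(1)).alias("row_count"),
    # Null check: rows missing a customer number
    F.sum(F.col("KUNNR").isNull().cast("int")).alias("kunnr_null_count"),
    # Distinct check: unique customer numbers vs. total rows
    F.countDistinct("KUNNR").alias("kunnr_distinct_count"),
    # Pattern check: postal codes expected to be exactly five digits
    F.sum((~F.col("PSTLZ").rlike(r"^\d{5}$")).cast("int")).alias("pstlz_pattern_fail_count"),
)

profile.show()
```

In practice, results like these would feed the curated DQ datasets that the Power BI scorecards read from.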
Requirements:
* Strong Databricks engineering experience with PySpark, SQL, and Delta Lake.
* Hands-on experience building Data Quality rules, frameworks, or scorecards.
* Experience in profiling large datasets and implementing metadata-driven DQ logic.
* Ability to mentor, review code, and explain concepts clearly.
* Excellent communication skills in English.
* Familiarity with SAP ECC tables and key fields (preferred).
* Experience with Unity Catalog or Purview (nice to have).
* Exposure to Lakehouse Monitoring or DQX accelerators (bonus).