Responsibilities:
1. Design, build, and maintain robust data pipelines and data models using tools such as Python, Databricks, Azure Data Factory, Snowflake, Microsoft Fabric, and MS SQL Server to ensure accurate, reliable, and scalable data flows from source to dashboard. Work may also involve other modern data platforms and cloud technologies as needed.
2. Create dynamic, interactive Power BI dashboards and reports tailored to meet business needs. Ensure visualizations are user-friendly, visually compelling, and deliver meaningful insights for decision-makers.
3. Ingest, transform, and model large datasets efficiently. Optimize ETL/ELT workflows, SQL queries, and Power BI data models to deliver high-performance analytics solutions.
4. Partner closely with business users and data consumers to gather requirements, define KPIs, and translate business objectives into technical data pipelines and analytical models.
5. Continuously monitor and improve data pipeline and dashboard performance. Implement best practices in data validation, quality checks, and system optimization to ensure fast, reliable reporting.
6. Maintain comprehensive documentation for data pipelines, models, dashboards, and processes to support transparency, reproducibility, and cross-team collaboration.
7. Stay current with the latest developments in data engineering, BI tools, and cloud technologies. Proactively recommend and implement improvements to enhance efficiency and analytical capabilities.
Job Requirements:
Education:
8. Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
Experience:
10. 5+ years of hands-on experience working across the data lifecycle, including data engineering, modeling, and business intelligence development, whether in a single full-stack role or through multiple complementary positions.
11. The ability to showcase previous work through a GitHub repository, portfolio of reports/dashboards, or other demonstrable project examples is highly valued.
Technical Proficiency:
12. Proven experience designing, developing, and maintaining end-to-end data solutions across both data engineering and business intelligence domains.
13. Strong proficiency in Python for data processing, automation, and integration.
14. Hands-on experience with Azure Data Factory (ADF), Databricks, and Microsoft Fabric for building and orchestrating ETL/ELT pipelines.
15. Solid understanding of data warehousing and data modeling principles (e.g., star/snowflake schemas, dimensional modeling).
16. Proficiency in SQL and experience with MS SQL Server, Snowflake, PostgreSQL, and Oracle PL/SQL.
17. Exposure to NoSQL databases such as MongoDB and big data ecosystems (e.g., Spark, Delta Lake, Azure Data Lake, Hadoop, or similar distributed data technologies).
18. Familiarity with data integration tools and API-based data ingestion is a plus.
Business Intelligence & Visualization:
19. Advanced expertise in Power BI, including Power Query (M language), DAX, and data model design.
20. Ability to create interactive, visually compelling, and performance-optimized dashboards for business stakeholders.