Role Overview
We are looking for an experienced Azure Data Engineer to build and manage scalable data pipelines integrating SAP data (via BDC processes) into Microsoft Fabric. This role will focus on enabling seamless data movement, transformation, and analytics across enterprise systems. The ideal candidate will have strong expertise in Azure data services, SAP data extraction, and modern data platform architecture, with hands-on experience in building production-grade pipelines.
Core Duties
Deliver reliable and efficient SAP-to-Fabric data pipelines
Ensure high data accuracy and consistency
Collaborate closely with SAP and analytics teams
Key Responsibilities
Design and implement end-to-end data pipelines to extract data from SAP systems (BDC and related interfaces) and ingest it into Microsoft Fabric
Develop and maintain ETL/ELT workflows using Azure Data Factory / Fabric Data Pipelines
Transform and model data within Fabric Lakehouse / Data Warehouse for analytics use cases
Work closely with SAP teams to understand BDC jobs, data structures, and extraction mechanisms
Ensure data quality, validation, and reconciliation between SAP and target systems
Optimize pipelines for performance, scalability, and cost efficiency
Implement monitoring, alerting, and logging frameworks
Support data governance, security, and compliance standards
Collaborate with analytics and BI teams on data consumption and reporting requirements
Mandatory Skills
Strong experience with Azure Data Engineering stack
Azure Data Factory / Fabric Pipelines
Azure Data Lake Storage (ADLS Gen2)
Microsoft Fabric (Lakehouse, Warehouse)
Highly Desired Skills
Hands-on experience with SAP data extraction, including:
SAP BDC (Batch Data Communication)
Understanding of SAP tables and data flow
Proficiency in SQL and Python / PySpark
Experience in data transformation, ingestion, and orchestration
Strong understanding of data modeling (dimensional modeling preferred)
Good to Have / Preferred Skills
Experience with Microsoft Fabric end-to-end capabilities
Exposure to Delta Lake / Spark-based processing
Familiarity with Power BI for downstream consumption
Experience with CI/CD (Azure DevOps) and deployment pipelines
Experience with real-time / near real-time data ingestion
Knowledge of data governance tools and practices
Azure certifications (e.g., DP-203 or a Microsoft Fabric certification)
Education
Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
…