Job Location: Kolkata/Bengaluru/Hyderabad
Responsibilities:
· Designing, building, and maintaining scalable data pipelines and workflows.
· Developing and optimizing data ingestion, transformation, and integration processes.
· Working with Azure Data Factory and related technologies to build and orchestrate data workflows.
· Ensuring data quality and integrity through robust validation and monitoring systems.
· Supporting data modeling efforts, including relational and dimensional modeling.
· Building reporting and monitoring solutions to ensure efficient and accurate data operations.
· Processing, logging, and correlating data from multiple sources to support business needs.
Requirements:
· Bachelor’s degree in Computer Science, Engineering, or a related field.
· Proficiency in SQL and relational database design.
· Strong scripting skills for data/file manipulation (e.g., Python or similar languages).
· Experience with Azure Data Factory, SQL Server, or similar tools for data integration and transformation.
· Hands-on experience with scalable data systems, including batch and real-time processing.
· Strong understanding of ETL pipelines, data modeling, and Business Intelligence systems.
· Solid knowledge of event processing and workflow orchestration.
· Ability to write high-quality, maintainable code and follow best practices in software engineering.
· Excellent problem-solving and debugging skills.
· A team player with strong communication skills and a drive for results.
Nice to Have:
· Experience with cloud platforms, particularly Azure or AWS.
· Advanced knowledge of SQL and relational database performance optimization.
· Familiarity with tools like Azure Data Lake, Power BI, or similar analytics platforms.
…