Company: Weekday AI
Location: Bengaluru
Job Description:
This role is for one of Weekday's clients.
Min Experience: 5 years
Location: Bengaluru
JobType: full-time
We are looking for a highly skilled and driven Lead Data Engineer to join our growing data team. In this role, you will be responsible for designing, building, and optimizing scalable data pipelines and architectures that power critical business insights and applications. You will play a key role in shaping the organization’s data strategy while mentoring a team of engineers and collaborating closely with cross-functional stakeholders.
Requirements
Key Responsibilities:
- Design, develop, and maintain robust, scalable, and high-performance data pipelines using modern data engineering practices.
- Lead the end-to-end ETL process, including data ingestion, transformation, validation, and delivery across multiple data sources.
- Architect and manage data warehousing solutions to support analytics, reporting, and business intelligence use cases.
- Leverage AWS services such as S3, Redshift, Glue, Lambda, and EMR to build efficient and cost-effective data infrastructure.
- Ensure data quality, integrity, and reliability through automated testing, monitoring, and alerting mechanisms.
- Optimize data workflows for performance, scalability, and cost-efficiency.
- Collaborate with data analysts, data scientists, and product teams to understand data requirements and deliver actionable datasets.
- Establish best practices for data governance, security, and compliance.
- Mentor and guide junior data engineers, conducting code reviews and promoting engineering excellence.
- Stay current with the latest trends and technologies in data engineering and recommend improvements to existing systems.
Required Skills & Qualifications:
- 5–9 years of hands-on experience in Data Engineering or a related field.
- Strong expertise in building and maintaining ETL pipelines using tools such as Apache Airflow, Spark, or similar frameworks.
- Extensive experience with the AWS cloud platform and its data services (S3, Redshift, Glue, Lambda, EMR, RDS).
- Solid understanding of data warehousing concepts, including schema design (star/snowflake), partitioning, and indexing.
- Proficiency in SQL and at least one programming language such as Python or Scala.
- Experience working with large-scale, distributed data systems and handling structured and unstructured data.
- Familiarity with real-time data processing and streaming technologies (e.g., Kafka, Kinesis) is a plus.
- Strong problem-solving skills and ability to work in a fast-paced, dynamic environment.
- Excellent communication and leadership skills, with experience leading or mentoring teams.
Nice to Have:
- Experience with modern data stack tools (e.g., dbt, Snowflake, Databricks).
- Exposure to data governance, lineage, and cataloging tools.
- Understanding of DevOps practices and CI/CD pipelines for data workflows.
…
Posted: March 27th, 2026