Senior Software Data Engineer Job Description
About HME:
For over 50 years, HME has created industry-leading products and services, beginning with the first wireless microphone for the professional audio market in 1974. Today, HME continues to innovate across multiple industries, delivering reliable, high-quality solutions that enable seamless connectivity and communication. HME values people beyond resumes, encouraging diverse ideas, collaboration, and a strong culture of quality, safety, and continuous improvement. As part of the HME GCC team, you will work with modern cloud technologies to solve complex data challenges in a manufacturing-focused environment.
Job Summary:
The Senior Software Data Engineer designs, builds, and maintains scalable enterprise data pipelines and platforms using Azure Data Lake and Databricks. This role is hands-on and focuses on developing robust ETL pipelines, implementing Medallion Architecture, and integrating data across ERP, CRM, PLM, and MES systems within a manufacturing environment. The engineer collaborates closely with business and IT stakeholders, contributes to data engineering best practices, and is deeply involved in implementation and delivery.
Essential Job Functions:
- Designs, develops, and maintains data pipelines using Azure Data Lake and Databricks following Medallion Architecture best practices.
- Builds and optimizes ETL pipelines for manufacturing data, ensuring data quality, consistency, and performance.
- Integrates enterprise systems including ERP (Microsoft D365 preferred), CRM, PLM, and MES using Azure services and iPaaS platforms such as Boomi or Workato.
- Collaborates with business and IT stakeholders to gather requirements and translate them into scalable data models.
- Contributes to data engineering best practices including CI/CD, version control, automated testing, and documentation.
- Monitors, troubleshoots, and supports data pipelines to ensure reliability, availability, and performance.
- Supports analytics and reporting use cases by delivering clean, well-modeled, and reliable datasets.
Key Competencies:
- Customer Driven: Demonstrates a strong commitment to quality service and customer-focused solutions.
- Lean Thinking: Continuously seeks opportunities to eliminate waste and improve data processes.
- Safety & Quality: Maintains high standards for accuracy, reliability, and secure data handling.
- Decision Making: Analyzes complex problems, gathers information, and makes sound technical decisions.
- Managing Priorities: Effectively balances multiple responsibilities in a fast-paced environment.
- Communication & Teamwork: Communicates clearly and collaborates effectively across teams.
Qualifications:
Experience:
- 5+ years of experience in data engineering roles.
- 4+ years of hands-on experience with Azure and Databricks.
- Proven experience implementing Medallion Architecture in Databricks.
- Strong experience designing and developing ETL pipelines in manufacturing or ERP-driven environments.
- Hands-on experience integrating ERP (Microsoft D365 preferred), CRM, PLM, or MES systems.
- Strong proficiency in SQL, Python, Spark, and PySpark.
- Experience with Azure services such as Data Factory, Synapse, Data Lake, and Key Vault.
- Familiarity with iPaaS platforms such as Boomi or Workato.
- Exposure to CI/CD practices and cloud security best practices.
Education:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field is required.
Travel:
- Up to 10% travel may be required.
Physical Demands & Work Environment:
The physical demands and work environment described are representative of those required to perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform these functions. The role involves regular use of computer equipment and collaboration in an office or hybrid work environment.
…