Company: ThreatXIntel
Location: Bangalore
Job Description:
Role Overview
We are hiring a Big Data Engineer with strong hands-on experience in DBT, Google BigQuery, and Kubernetes. The ideal candidate will work on building scalable data pipelines, managing cloud data platforms, and enabling real-time and batch data processing.
Key Responsibilities
- Design, build, and maintain scalable data pipelines using Apache Airflow DAGs
- Develop and optimize transformations using DBT (Data Build Tool)
- Work extensively with Google BigQuery for data warehousing and analytics
- Manage and deploy workloads on Kubernetes clusters
- Implement real-time data streaming using Apache Kafka
- Write efficient and optimized queries using SQL
- Develop data processing scripts and automation using Python
- Collaborate with cross-functional teams for data integration and reporting
- Maintain version control and CI/CD workflows using GitHub
Required Skills & Experience
- 5+ years of experience in Data Engineering / Big Data
- Strong expertise in:
  - DBT (mandatory)
  - Google BigQuery (mandatory)
  - Kubernetes / cluster management (mandatory)
- Hands-on experience with:
  - Apache Airflow (DAGs)
  - Apache Kafka (streaming)
  - Google Cloud Platform (GCP)
  - SQL (advanced level)
  - Python (data processing)
  - GitHub (version control)
Good to Have
- Experience with CI/CD pipelines
- Exposure to data lake / warehouse architectures
- Knowledge of monitoring and logging tools
…
Posted: March 29th, 2026