Databricks Consultant Job in Bengaluru | Deloitte Data Engineering Careers – May 2025

Are you ready to work at the forefront of data engineering and innovation? Deloitte, a global leader in consulting and professional services, is inviting talented professionals to join its dynamic team as a Databricks Analyst/Consultant in Bengaluru. This role is part of Deloitte’s Artificial Intelligence & Data (AI&D) practice, where data meets purpose-driven transformation.

Whether you’re passionate about building robust data pipelines, optimizing big data workflows, or enabling intelligent business decisions using Databricks, Spark, and cloud platforms, this opportunity could be your ideal next career move.


About Deloitte’s AI&D Practice

Deloitte’s Artificial Intelligence & Data (AI&D) practice enables organizations to unlock the full potential of their data assets. The practice provides end-to-end services including data modernization, advanced analytics, business intelligence, cloud solutions, and cognitive computing. The goal? To empower businesses with insight-driven decision-making that fuels sustainable growth and competitive advantage.

As part of a global network of professionals, Deloitte’s AI&D experts work across industries to deliver cutting-edge data solutions tailored to clients’ unique business objectives.


Position Overview

  • Job Title: Analyst / Consultant – Databricks

  • Requisition ID: 79149

  • Location: Bengaluru, India

  • Employment Type: Full-Time

  • Department: Engineering – Data Modernization Team

  • Date Posted: May 14, 2025

This role offers a chance to work on challenging, high-impact data engineering projects that support strategic decision-making for top-tier clients. You’ll be expected to apply advanced skills in Databricks, Apache Spark, Python, SQL, and cloud services like Azure, AWS, or GCP.


Key Responsibilities

As a Databricks Consultant at Deloitte, you’ll help clients design and implement modern data platforms that are scalable, efficient, and analytics-ready. Your key duties include:

1. Data Engineering on Databricks

  • Design and build end-to-end data pipelines and workflows using Databricks and Apache Spark.

  • Ingest, transform, and process large datasets from various structured and unstructured data sources.

  • Leverage Delta Lake for scalable and reliable data lakes.
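The pipeline duties above can be sketched in PySpark. This is an illustrative sketch, not an actual Deloitte codebase: the storage paths, column names, and Delta table location are invented, and the snippet assumes a Databricks (or other Spark-with-Delta) runtime, so it will not run as a standalone script.

```python
# Minimal sketch of a batch ingest -> transform -> Delta Lake write.
# Paths and column names are hypothetical; assumes a Databricks/Spark
# runtime with the Delta Lake libraries available.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Ingest: read raw JSON events from cloud storage (hypothetical path).
raw = spark.read.json("/mnt/raw/events/")

# Transform: basic cleansing and typing.
clean = (
    raw
    .filter(F.col("event_id").isNotNull())           # drop malformed rows
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .dropDuplicates(["event_id"])                    # keep re-runs idempotent
)

# Load: append to a Delta table partitioned by event date.
(
    clean
    .withColumn("event_date", F.to_date("event_ts"))
    .write.format("delta")
    .mode("append")
    .partitionBy("event_date")
    .save("/mnt/curated/events_delta")
)
```

Partitioning by date is a common choice here because downstream jobs typically filter on recent dates, letting Spark skip irrelevant files.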

2. Cloud Platform Integration

  • Deploy data engineering solutions on cloud platforms such as Microsoft Azure, Amazon Web Services (AWS), or Google Cloud Platform (GCP).

  • Optimize compute performance, resource allocation, and job scheduling.

3. ETL and ELT Design

  • Develop scalable ETL and ELT pipelines to support real-time and batch data processing needs.

  • Automate ingestion, transformation, and loading processes to ensure data availability for downstream analytics.
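Stripped of any particular engine, the extract-transform-load flow described above reduces to three composable stages. The sketch below illustrates the pattern in plain Python with in-memory data; in a real pipeline each stage would be a Spark job or a cloud-native service, and every record field here is invented for illustration.

```python
# Plain-Python illustration of a batch ETL flow: extract raw records,
# transform (validate + reshape), load into a keyed store. Field names
# are illustrative only.

def extract():
    """Stand-in for reading from a source system (file, API, table)."""
    return [
        {"id": "1", "amount": "19.99", "currency": "usd"},
        {"id": "2", "amount": "5.00", "currency": "eur"},
        {"id": None, "amount": "3.50", "currency": "usd"},  # malformed row
    ]

def transform(records):
    """Drop malformed rows, cast types, normalise values."""
    out = []
    for r in records:
        if r["id"] is None:
            continue  # a real pipeline would route this to a dead-letter sink
        out.append({
            "id": int(r["id"]),
            "amount": float(r["amount"]),
            "currency": r["currency"].upper(),
        })
    return out

def load(rows, store):
    """Upsert rows into a dict keyed by id (stand-in for a warehouse table)."""
    for row in rows:
        store[row["id"]] = row
    return store

warehouse = {}
load(transform(extract()), warehouse)
# warehouse now holds the two valid rows, keyed by integer id
```

In an ELT variant, the raw records would be loaded first and the `transform` step would run inside the warehouse (for example as Spark SQL over a Delta table) rather than before loading.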

4. Collaboration with Stakeholders

  • Work closely with data scientists, architects, and business analysts to gather requirements and build solutions aligned with business goals.

  • Communicate technical insights and findings effectively to non-technical stakeholders.

5. Data Modelling and Performance Tuning

  • Develop logical and physical data models to support reporting, dashboards, and machine learning workflows.

  • Implement performance optimization strategies for Spark jobs and data flows.

6. Data Quality and Governance

  • Ensure data integrity, consistency, and security through validation checks, auditing, and access controls.

  • Implement monitoring frameworks to track pipeline health and error logs.
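Validation checks of the kind listed above are often expressed as named rules evaluated against each batch, with failure counts fed into a monitoring framework. A minimal plain-Python sketch (rule names, fields, and sample rows are all invented for illustration):

```python
# Minimal data-quality check runner: each rule is a predicate applied
# per row; the report counts failures so a monitoring framework can
# alert when a threshold is breached. Rule names are illustrative.

RULES = {
    "id_not_null":     lambda row: row.get("id") is not None,
    "amount_positive": lambda row: isinstance(row.get("amount"), (int, float))
                                   and row["amount"] > 0,
}

def run_checks(rows):
    """Return {rule_name: failure_count} for a batch of rows."""
    report = {name: 0 for name in RULES}
    for row in rows:
        for name, predicate in RULES.items():
            if not predicate(row):
                report[name] += 1
    return report

batch = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 4.5},   # fails id_not_null
    {"id": 3, "amount": -2.0},     # fails amount_positive
]
report = run_checks(batch)
# report == {"id_not_null": 1, "amount_positive": 1}
```

The same rule-as-predicate structure scales up naturally: on Databricks the predicates become column expressions, and the failure counts become metrics emitted per pipeline run.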


Required Skills & Experience

To be successful in this role, candidates must demonstrate strong technical proficiency, problem-solving abilities, and communication skills.

Technical Expertise

  • Databricks platform expertise, including notebooks, clusters, jobs, and workspace management.

  • Proficiency in Apache Spark, using either PySpark or Spark SQL.

  • Hands-on experience with Python and/or SQL for data engineering tasks.

  • Experience working with Delta Lake for efficient data storage and streaming.

Cloud Platform Experience

  • Practical exposure to at least one major cloud provider: Azure, AWS, or Google Cloud.

  • Experience integrating with cloud-native services such as Azure Data Factory, AWS Glue, or Google Cloud Dataflow.

ETL and Data Modelling

  • Solid understanding of ETL/ELT frameworks and data modelling concepts, including star and snowflake schemas.

DevOps and CI/CD

  • Awareness of DevOps principles, including automated testing, CI/CD pipelines, version control (Git), and deployment automation.


Qualifications

  • Education: Bachelor’s or Master’s degree in Engineering, Computer Science, Data Science, or related fields.

  • Experience: 2 to 6 years of professional experience in data engineering, with proven work on Databricks implementations.


Preferred Qualifications

  • Familiarity with machine learning workflows, data science collaboration, and model lifecycle support.

  • Experience working in agile environments, using tools like Jira and Confluence.

  • Exposure to real-time data streaming and event-driven architecture (Kafka, Event Hub, etc.).


Soft Skills

  • Strong analytical thinking and problem-solving ability.

  • Excellent written and verbal communication skills.

  • Team-oriented with the ability to work independently and in collaborative settings.

  • Client-focused mindset with a proactive and solution-oriented attitude.


Leadership Expectations at Deloitte

At Deloitte, leadership is not confined to titles—it is embedded at every level. As a Consultant, you are expected to:

  • Inspire and support team members to deliver exceptional client service.

  • Collaborate across business functions and geographies to drive high performance.

  • Build trust-based relationships with stakeholders and clients.

  • Take initiative in identifying new opportunities and driving data-led innovation.


Career Growth & Learning Opportunities

Deloitte offers a career platform where your growth is nurtured through a combination of:

  • On-the-job learning

  • Access to Deloitte University and technical certifications

  • Mentorship from senior leaders

  • Structured development paths to roles such as:

    • Senior Consultant – Data Engineering

    • Manager – Data Platforms

    • Associate Director – Cloud & AI Solutions


Work Culture & Benefits

Deloitte is committed to creating a workplace that values diversity, inclusion, and continuous learning. As an employee, you will enjoy:

  • Hybrid work model tailored to your team’s needs

  • Competitive compensation and performance-based bonuses

  • Medical and wellness benefits

  • Learning allowances and upskilling initiatives

  • Access to global Deloitte knowledge resources and communities


Making an Impact at Deloitte

Deloitte’s purpose is to make an impact that matters—not just for clients, but for people and communities around the world. Our professionals are encouraged to align their careers with purpose, sustainability, and ethical innovation.

Whether it’s delivering a complex data modernization strategy or mentoring a team member, every day at Deloitte presents an opportunity to contribute meaningfully.


How to Apply

Interested candidates can apply via the Deloitte careers portal using Requisition ID: 79149. Be sure to include details of your Databricks projects, cloud certifications, and data engineering achievements.


Conclusion

If you’re a data engineer who thrives on solving complex challenges, enabling data-driven transformation, and working with next-gen technologies like Databricks and Apache Spark, then Deloitte’s Analyst/Consultant role in Bengaluru is the opportunity you’ve been looking for.

Join a team that’s shaping the future of data engineering—one pipeline at a time.
