Our talented Data & AI Practice is made up of globally recognized experts - and there's room for more analytical and ambitious data professionals. If you're passionate about helping clients make better data-driven decisions to tackle their most complex business issues, let's talk. Take your skills to a new level and launch a career where you can truly do what matters.
Key Responsibilities
Lead the Design: Define and implement robust data architectures utilizing the Databricks ecosystem, including Delta Lake, Unity Catalog, Delta Live Tables (DLT), and Databricks Workflows.
Hands-on Development: Serve as the most senior developer, writing high-quality, production-grade code in PySpark/Scala and SQL for complex batch and streaming ETL/ELT pipelines.
Performance Optimization: Lead performance tuning and optimization efforts for large-scale Spark jobs, ensuring efficient cluster utilization and cost management.
Standards & Best Practices: Define and enforce technical standards, code quality, testing frameworks (unit, integration), and DataOps/CI/CD pipelines for the engineering team.
Required Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
8+ years of progressive experience in Data Engineering, with 4+ years focused on large-scale data processing using Databricks and Apache Spark.
Expert proficiency in Python (PySpark) and advanced SQL.
Deep expertise in implementing Lakehouse architecture using Delta Lake and familiarity with the Medallion Architecture (Bronze, Silver, Gold).
Strong experience with data modeling techniques (e.g., dimensional modeling, data vault) suitable for large data volumes.
Proven experience leading technical teams, including defining tasks, performing code reviews, and providing architectural guidance.
Hands-on experience implementing DataOps/DevOps practices and CI/CD pipelines for data solutions (e.g., Terraform, Databricks Asset Bundles, Git).
Preferred Skills & Certifications:
Experience with Databricks features like Delta Live Tables (DLT), Databricks Workflows, and Unity Catalog.
Experience with streaming technologies (e.g., Kafka, Spark Streaming).
Familiarity with CI/CD tools and Infrastructure-as-Code (e.g., Terraform, Databricks Asset Bundles).
Databricks Certified Data Engineer Associate or Professional certification.