Databricks Architect

Kuala Lumpur, M14, MY, Malaysia

Job Description

Summary




Our talented Data & AI Practice is made up of globally recognized experts - and there's room for more analytical and ambitious data professionals. If you're passionate about helping clients make better data-driven decisions to tackle their most complex business issues, let's talk. Take your skills to a new level and launch a career where you can truly do what matters.

Key Responsibilities



  • Architectural Leadership: Define and own the end-to-end technical architecture and roadmap for the enterprise data platform on Databricks, including data ingestion, transformation, storage, and consumption layers.
  • Lakehouse Design: Design and champion the implementation of the Lakehouse architecture using Delta Lake, Databricks Unity Catalog, and Databricks SQL Warehouse to support all data, analytics, and AI/ML initiatives.
  • Data Governance & Security: Architect and enforce enterprise-level data governance, security, and access control policies using Unity Catalog (e.g., fine-grained access, lineage tracking, auditing).
  • Technical Guidance & Mentorship: Provide technical leadership, guidance, and mentorship to a team of Databricks engineers. Conduct architectural reviews and code audits, and ensure adherence to best practices and standards.
  • Performance & Cost Optimization: Define strategies for, and lead, major performance tuning and cost optimization initiatives across all Databricks workloads, clusters, and Delta Lake storage.
  • Cloud Integration: Lead the integration of Databricks with core cloud services (AWS, Azure, or GCP) and other enterprise systems (e.g., data catalogs, BI tools, ML platforms).
  • Stakeholder Collaboration: Engage with executive stakeholders, data scientists, and business leaders to understand complex requirements and translate them into robust, scalable technical designs and solution blueprints.
  • DataOps & Automation Strategy: Define the DevOps/DataOps strategy for the Databricks environment, including continuous integration/continuous delivery (CI/CD) pipelines and Infrastructure-as-Code (IaC) using tools such as Terraform or Databricks Asset Bundles.
  • Innovation: Evaluate new Databricks features, open-source technologies (e.g., MLflow), and industry trends to drive continuous platform improvement and competitive advantage.


Qualifications

  • 10+ years of experience in data architecture, data engineering, or a related senior technical role.
  • 5+ years of deep, hands-on experience designing and implementing large-scale data solutions on the Databricks Platform.
  • Expertise in Lakehouse architecture, Delta Lake, Apache Spark, and performance tuning techniques.
  • Proven experience implementing Databricks Unity Catalog for centralized governance.
  • Deep proficiency in Python/PySpark or Scala, plus advanced SQL.
  • Extensive experience with one major cloud platform and its security and data services.
  • Strong experience with data modeling (dimensional, data vault) and data warehouse concepts.
  • Demonstrated ability to lead technical discussions, document architectural designs, and communicate complex concepts to both technical and non-technical audiences.
  • Experience with real-time/streaming data architectures (Structured Streaming).
  • Experience with MLOps practices, including MLflow for model lifecycle management.
  • Expertise in Infrastructure-as-Code tools (e.g., Terraform) and automated CI/CD for Databricks.
  • Databricks Certified Data Engineer Professional, Databricks Certified Machine Learning Professional, or an equivalent cloud architecture certification.




Job Detail

  • Job Id
    JD1297426
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Kuala Lumpur, M14, MY, Malaysia
  • Education
    Not mentioned