Data Engineer (Contract)

Kuala Lumpur, Malaysia

Job Description

Role Overview:
This position sits within the technology transformation function of a multinational organisation, supporting enterprise-wide reporting, analytics, and modern data architecture initiatives. The role focuses on designing and operationalising automated data ingestion, transformation pipelines, and governed cloud-based data layers to enable business intelligence, process transparency, and AI-driven insights. It requires close collaboration with product, analytics, and portfolio teams to deliver scalable data solutions within a structured cloud environment.
Key Responsibilities:

  • Build and maintain automated ingestion pipelines from workflow tools and other enterprise data sources into cloud-based storage and analytics platforms.
  • Develop and tune pipelines supporting batch and near-real-time loads, including incremental ingestion from relational databases.
  • Design layered data architecture (raw to curated to consumption) and create models optimised for reporting, analytics, and downstream semantic layers.
  • Implement data quality checks, monitoring mechanisms, and remediation workflows covering completeness, consistency, timeliness, and lineage tracking.
  • Apply metadata governance, cataloguing, lifecycle rules, and policy enforcement using cloud-native governance tooling.
  • Develop and validate transformation logic using SQL/Python, incorporating unit/integration testing and CI/CD deployment patterns.
  • Collaborate with product owners and BI teams to translate reporting needs into data contracts, datasets, and model structures suitable for analytical tools.
  • Produce technical documentation including schema definitions, runbooks, SLO/SLA expectations, and reusable standards for future data products.
Key Requirements:
Must-Have:
  • Minimum 5 years' experience building production-grade cloud data pipelines.
  • Proven hands-on expertise with Google Cloud data services (e.g., BigQuery, Dataplex, Dataflow).
  • Strong SQL capabilities, including performance tuning, stored procedures, and CDC patterns across RDBMS.
  • Demonstrated experience integrating workflow platform APIs and processing structured files from enterprise repositories.
  • Solid understanding of dimensional modelling, lakehouse patterns, and governed data architecture frameworks.
  • Experience in data governance, quality validation rules, and monitoring frameworks.
  • Familiarity with BI consumption patterns and semantic layer development.
Nice-to-Have:
  • Exposure to insurance or financial services data domains.
  • Working knowledge of Power BI modelling and DAX optimisation considerations.
If this role aligns with your experience and career goals, please send your application to AviralBhargava@argyllscott.sg.
Argyll Scott Asia is acting as an Employment Business in relation to this vacancy.



Job Detail

  • Job Id: JD1395624
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Full Time
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: Kuala Lumpur, Malaysia
  • Education: Not mentioned