We are seeking a Data Engineer with strong expertise in Azure data services to work alongside our Data Analyst and Data Modeler on a high-impact, time-sensitive project. The role involves building and optimizing data pipelines, integrating multiple data sources and ensuring robust data transformations while collaborating in an Agile / Scrum environment using Azure DevOps (ADO). The ideal candidate will also excel at stakeholder engagement and delivering quality outcomes under tight deadlines.
Key Responsibilities:
- Collaborate with Data Analysts, Data Modeler and business stakeholders to understand data requirements and transformation logic.
- Design and develop data ingestion pipelines using Azure Data Factory (ADF) to integrate data from diverse on-premises and cloud sources.
- Ingest and integrate data from multiple sources including SFTP, REST APIs, SOAP services and SharePoint.
- Implement the Medallion (Bronze, Silver and Gold) data layer architecture, ensuring compliance with governance and anonymization requirements for sensitive fields.
- Build and optimise scalable data transformation workflows in Databricks using Python and SQL, leveraging Azure Data Lake Storage (ADLS) as an external location for efficient storage and processing, and configuring a Service Principal (SPN) for secure, automated access (see the first sketch after this list).
- Design, implement and manage secure data pipelines with AES and PGP encryption, protecting data at rest and in transit, and integrate Azure Key Vault for key management to meet enterprise security policies and regulatory requirements (see the second sketch after this list).
- Work within Agile / Scrum frameworks, participating in sprint planning, daily stand-ups, backlog grooming and sprint reviews.
- Utilise Azure DevOps (ADO) for work tracking, repository management, version control, collaboration and CI/CD to streamline development workflows and ensure continuous integration and delivery.
- Partner with Data Modelers to align data structures with conceptual, logical and physical models.
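For illustration only, the sketch below shows the kind of Databricks work these responsibilities describe: configuring OAuth access to ADLS Gen2 with a service principal and promoting Bronze data to a cleansed, partially anonymized Silver Delta table. All account, secret-scope, container and column names are hypothetical, and this is a minimal example rather than the project's actual pipeline.

```python
# Minimal illustrative sketch (hypothetical names): Databricks notebook code that
# configures service-principal (OAuth) access to ADLS Gen2, then promotes raw
# Bronze data to a cleansed Silver Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()   # already provided in a Databricks notebook

storage_account = "examplelake"              # hypothetical ADLS Gen2 account
# dbutils is supplied by the Databricks runtime; the secret scope is assumed to be
# backed by Azure Key Vault so the SPN credentials never appear in code.
tenant_id = dbutils.secrets.get("kv-scope", "tenant-id")
client_id = dbutils.secrets.get("kv-scope", "spn-client-id")
client_secret = dbutils.secrets.get("kv-scope", "spn-client-secret")

# Standard Spark settings for OAuth client-credentials access to ADLS Gen2
host = f"{storage_account}.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{host}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{host}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{host}", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{host}", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{host}",
               f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

# Bronze -> Silver: deduplicate and mask a sensitive column before publishing downstream
bronze = spark.read.format("delta").load(f"abfss://bronze@{host}/customers")
silver = (bronze
          .dropDuplicates(["customer_id"])
          .withColumn("email_hash", F.sha2(F.col("email"), 256))  # simple anonymization
          .drop("email"))
silver.write.format("delta").mode("overwrite").save(f"abfss://silver@{host}/customers")
```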
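Similarly, a minimal sketch of the encryption and key-management responsibility, assuming the azure-identity, azure-keyvault-secrets and cryptography packages: fetch a key from Azure Key Vault and encrypt an extract before it leaves the landing zone. The vault URL, secret name and file names are hypothetical, and Fernet (authenticated AES-128-CBC) stands in for whichever AES/PGP scheme the project mandates.

```python
# Minimal illustrative sketch (hypothetical vault, secret and file names): pull an
# encryption key from Azure Key Vault using a service principal or managed identity
# credential and encrypt an extract with Fernet (authenticated AES-128-CBC).
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from cryptography.fernet import Fernet

vault_url = "https://example-kv.vault.azure.net"   # hypothetical Key Vault
secret_client = SecretClient(vault_url=vault_url, credential=DefaultAzureCredential())

# The secret is assumed to store a urlsafe base64-encoded 32-byte Fernet key
key = secret_client.get_secret("pipeline-aes-key").value
fernet = Fernet(key.encode())

with open("extract.csv", "rb") as src:
    ciphertext = fernet.encrypt(src.read())
with open("extract.csv.enc", "wb") as dst:
    dst.write(ciphertext)
```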
Required Skills and Qualifications:
- Proven experience as a Data Engineer working in Azure environments.
- Strong working knowledge of Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Key Vault and related services.
- Experience with data ingestion from SFTP, REST, SOAP and SharePoint.
- Proficiency in ETL / ELT design and in implementing the medallion architecture.
- Familiarity with data anonymization / masking techniques for sensitive information.
- Hands-on experience with Azure DevOps (ADO) in Agile / Scrum delivery.
- Strong SQL skills and scripting abilities (Python / SQL).
- Excellent communication and stakeholder management.
- Ability to work effectively in time-sensitive, high-pressure project environments.
Preferred Qualifications:
- Experience with Delta Lake and data lakehouse patterns.
- Familiarity with CI/CD practices for data pipelines in Azure DevOps.
- Understanding of data warehousing and BI tools (Power BI).
- Experience with Enterprise Data Hub implementation.
- Experience in development using ADF and Databricks.
Job Type: Contract
Contract length: 12 months
Pay: RM6,500.00 - RM7,500.00 per month
Benefits:
Maternity leave
Opportunities for promotion
Parental leave
Professional development
Application Question(s):
How many years of experience do you have working in Azure environments?
How much is your expected salary?
Experience:
Data Engineer: 5 years (Preferred)
Work Location: In person
Job Detail
Job Id: JD1223122
Total Positions: 1
Job Type: Contract
Job Location: Kuala Lumpur, Malaysia