Lead the integration, operations, and monitoring of enterprise data pipelines and platforms, ensuring high standards of data quality, reliability, completeness, and availability. Drive proactive issue management, consumer satisfaction, cost efficiency, and ongoing innovation in the organization's data ecosystem.
Key Responsibilities:
End-to-End Data Integration
Oversee and document mapping, integration, and validation of all key data touchpoints.
Ensure data sources are inventoried and integrated consistently, accurately, and securely across all systems.
Pipeline Maintenance & Uptime
Design, implement, and monitor robust data pipelines, focusing on maximizing uptime and minimizing failures.
Maintain and regularly test failover, maintenance, and disaster recovery procedures.
Data Quality Management
Develop and operationalize data quality frameworks including periodic profiling, assessment reports, and cleansing.
Maintain logs of data issues, lead resolution efforts, and report ongoing improvements.
Deliver and update Data Quality Scorecards and benchmarking reports.
Incident Management
Lead detection and resolution of data-related incidents, ensuring efficient incident logging, root-cause analysis, and a low Mean Time To Resolution (MTTR).
Maintain, communicate, and update Incident Management SOPs.
Performance Optimization
Monitor and tune data pipeline performance to minimize latency from data ingestion to AI/analytics availability.
Maintain and analyze Pipeline Tuning & Optimization logs and dashboards.
Database Reliability
Monitor and report database replication status, backup health, and server storage utilization.
Set up proactive alerts and reporting for database infrastructure integrity.
Stakeholder Engagement & Satisfaction
Implement regular consumer surveys, analyze feedback, and communicate data product improvements.
Deliver clear and actionable Data Consumer Survey Reports and product feedback syntheses.
Cost Control
Track and optimize cost per data pipeline, run, or transaction.
Document and report cost-saving measures and realized efficiencies.
Innovation Pipeline
Lead research and execution of innovative data & AI projects, pilots, and proofs of concept (POCs).
Record and report business impact, outcomes, and roadmap for scalable innovation.
Key Success Measures:
% of key data touchpoints fully integrated, validated, and available
Data pipeline uptime (%)
Data completeness and accuracy rates
Incident MTTR (Mean Time To Resolution)
Average data latency (event to AI/analytics availability)
Stakeholder/data consumer satisfaction score
Cost per pipeline/run/transaction
Number/value of innovation pilots delivered
Required Skills & Qualifications:
Experience in data integration, pipeline operations, and monitoring (Mage, Airflow, etc.)
Strong ability in data quality frameworks and issue management
Incident management and troubleshooting skills
Knowledge of cloud or on-prem database deployment, replication, and backup monitoring
Experience with BI dashboards, survey/feedback methods, and cost control
Capacity for innovation, project leadership, and collaboration with cross-functional teams
Bachelor's or higher degree in Data Engineering, Computer Science, Information Systems, or related discipline
Job Type: Full-time
Pay: RM2,000.00 - RM3,000.00 per month
Benefits:
Professional development
Work Location: In person