PURPOSE:
This role is responsible for the system design, development, and implementation of regional frontend systems, and provides maintenance and support for production systems.
KEY ROLE:
• Design, develop, document, and implement end-to-end data pipelines and data integration processes, both batch and real-time. This includes data analysis, data profiling, data cleansing, data lineage, data mapping, data transformation, developing ETL / ELT jobs and workflows, and deployment of data solutions.
• Monitor, recommend, develop, and implement ways to improve data quality (reliability, efficiency, and cleanliness), and optimize and fine-tune ETL / ELT processes.
• Recommend, execute and deliver best practices in data management and data lifecycle processes, including modular development of ETL / ELT processes, coding and configuration standards, error handling and notification standards, auditing standards, and data archival standards.
• Prepare test data; assist in creating and executing test plans, test cases, and test scripts.
• Collaborate with Data Architect, Data Modeler, IT team members, SMEs, vendors, and internal business stakeholders, to understand data needs, gather requirements, and implement data solutions to deliver business goals.
• Provide BAU support for data issues and change requests; document all investigations, findings, recommendations, and resolutions.
QUALIFICATIONS / EXPERIENCE:
• Bachelor's degree in IT, Computer Science, or Engineering.
• At least 3-5 years of experience with Big Data technologies such as Azure and AWS big data solutions, Hadoop, Hive, HBase, Spark, Sqoop, Kafka, and Spark Streaming.
• Minimum 5 years of professional experience in data warehouse, operational data store, and large-scale data architecture implementations in Unix and/or Windows environments.
• At least 5 years of solid hands-on ETL development experience transforming complex data structures across multiple data sources.
• At least 5 years of data modeling (relational and/or data warehouse), data mart design, and implementation.
• Minimum 3-5 years of ETL programming in any of these languages: Python, Scala, Java, or R.
KNOWLEDGE & TECHNICAL SKILLS:
• Familiar with ETL/ELT frameworks, data warehousing concepts, and data management.
• Experienced in handling and processing different types of data (structured, semi-structured, and unstructured).
• Strong knowledge of various database technologies (RDBMS, NoSQL, and columnar).
• Good understanding of data analytics and data visualization preferred; Power BI and Tableau strongly preferred.
• Strong understanding of programming languages such as Python, Scala, Java, R, Shell, and PL/SQL.
• Good understanding of Master Data Management (MDM) and Data Governance tools; Informatica technologies preferred.
• Experience in the insurance industry is an added advantage.
• Ability to communicate and present technical information in a clear and unambiguous manner.
• Strong ability to work independently and cooperate with diverse teams in a multiple stakeholder environment.
• Strong sense of work ownership, high affinity with anything data, and a desire for constant improvements.