Senior Data Engineer

Batu Kawan, Malaysia

Job Description


Essential Duties and Responsibilities:

  • Build partnerships with key stakeholders and provide front-line support for business requests from various functions/groups
  • Lead the design and development of data delivery solutions in various fields/domains
  • Maintain and enhance scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity
  • Build complete data platform solutions, including storage, governance, security, and differentiated read/write access
  • Outline and contribute to team deliverables, including architecture, technical design documentation, standards, and code developed to high quality standards
  • Build new data pipelines and improve existing cloud-based pipelines for data transformation and aggregation
  • Address the majority of technical inquiries concerning customization, integration, enterprise architecture, and general features/functionality of Databricks or Snowflake
  • Guide the team in handling configurations for multiple departments/groups within a single environment stack
  • Partner with data engineers, data architects, domain experts, data analysts, and other teams to build foundational data sets that are trusted, well understood, aligned with business strategy, and enable self-service
  • Work on and deliver more than one project simultaneously in an individual contributor role
  • Perform unit, integration, and performance tests as needed for system changes
  • Develop documentation and training materials to support the data delivery organization
  • Participate in data cleansing and migration efforts to new systems
  • Be part of the Global Data team and assist in designing, realizing, testing, implementing, and supporting capabilities for UCT data requirements

Knowledge, Skills and Abilities:
  • Required knowledge & skills
o 3+ years of experience developing and deploying data solutions using Databricks on AWS or Azure, Power BI, and Apache Spark
o Experience designing data pipelines (ETL/ELT), data warehouses, and data marts
o Experience deploying enterprise-grade Power BI solutions, including app and workspace management, performance optimization, advanced visualizations, and data transformations
o Expertise in Snowflake: data modelling, ELT using Snowflake SQL, Snowflake task orchestration, implementing complex stored procedures, and standard DWH and ETL concepts
o Hands-on knowledge of Python and its main data libraries, such as Pandas, NumPy, and Beautiful Soup
o Experience with AWS S3
o Experience working with Agile development methodologies (Scrum, Kanban, Kanplan, etc.)
  • Preferred knowledge & skills
o Ability to work across multiple areas, such as ETL data pipelines, data modelling and design, and writing complex SQL queries
o Hands-on expertise with real-time data processing and analytics, data ingestion (batch and streaming), and data storage solutions
o Demonstrated strength in data management, orchestration, access control, etc.
o Proven expertise in writing optimized SQL for large data volumes
o Expertise in advanced Snowflake concepts, such as resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel, with an understanding of how to use these features
o Expertise in deploying Snowflake features such as data sharing, events, and lakehouse patterns
o Hands-on experience with Snowflake utilities (SnowSQL, Snowpipe) and big data modelling techniques using Python
o Experience migrating data from RDBMS sources to the Snowflake cloud data warehouse
o Deep understanding of relational and NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling)
o Experience with data security and with designing data access controls
o Experience with AWS Athena and AWS Glue
o Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management
o Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
o Ability to resolve an extensive range of complex data pipeline problems, both proactively and as issues surface
o Broad exposure to data visualization tools such as Qlik, Power BI, and Tableau
o Functional knowledge in the areas of Sales & Distribution, Materials Management, Finance, and Production Planning preferred
o Strong communication skills; effective and persuasive in both written and oral communication

Educational/Certification Requirement:
  • BA/BS degree in Computer Science, Math, Business, Operations, or a similar field
  • Snowflake cloud data warehouse architect certification (desirable)
  • Databricks certification (desirable)
Experience Requirement:
  • Must have 8+ years of total IT experience, including 6+ years in data warehouse, ETL, and BI projects and 3+ years working on data engineering or data lake solutions
  • Must have completed at least 1-2 end-to-end implementations of a Databricks or Snowflake cloud data warehouse, plus 3 end-to-end on-premise data warehouse implementations, preferably on Oracle, SAP Business Warehouse, or SAP HANA
  • Experience in the hi-tech, semiconductor, or manufacturing industries is preferred


Job Detail

  • Job Id: JD949131
  • Total Positions: 1
  • Job Type: Full Time
  • Employment Status: Permanent
  • Job Location: Batu Kawan, Malaysia