Work closely with the Software Engineering team, with responsibility for the following tasks:
Responsible for developing programs for large Application Services in the Big Data and Data Warehouse space.
In charge of executing the data mapping and lineage strategy for the organization.
Translate business requirements to technology implementation.
Work together with senior team members such as the CIO, Faculty Deans & Professors, Business Analysts, and QA to deliver the implementation.
Work on disparate data sets, carry out analysis of vast data stores and uncover insights.
Responsible for future Hadoop development and implementation.
Responsible for developing scalable and high-performance web services for data tracking.
Evaluate new decision-making dashboard technology options, recommending the adoption of the new dashboard and ETL technologies into the solution.
Implement digitalization and automation initiatives.
Requirements
Possess a bachelor's degree or Diploma in Computer Science / Big Data Engineering.
2+ years' experience designing and developing commercial-grade dashboard data models using tools like Grafana or Superset.
Experience integrating data from multiple data sources such as Oracle SQL, MSSQL, MySQL, PostgreSQL, and MongoDB.
Knowledge of various ETL techniques and frameworks, such as Pentaho, Stitch, or SSIS (SQL Server Integration Services).
Experience with relational data modeling.
Knowledge of web dashboard decision-making usability trends.
Experience with various messaging systems, such as Kafka or RabbitMQ.
Experience collaborating effectively with team members and partners in a distributed project environment.
Problem Solving: Ability to analyze and resolve complex infrastructure resource and application deployment issues.
Analytical mindset: in addition to technical expertise, applicants must be able to translate business requirements into efficient technical solutions.
Flexibility: IT is rapidly changing, so we seek individuals with a proven record of working within a fast-paced, dynamic, team-oriented environment.
Collaboration skills: Big Data Developers work with other developers, business analysts, ETL Architects, etc., and must therefore be comfortable collaborating across functional units, external stakeholders, and client organizations.
Communication skills: It is important that big data developers can listen actively to understand client needs and business requirements, and communicate with them both verbally and in writing.
Basic knowledge of Microsoft .NET will be advantageous.
Professional commitment to high quality, and a passion for learning new skills.
Detail-oriented individual with the ability to rapidly learn new concepts and technologies.
Must be a strong team player with the ability to communicate and collaborate effectively in a geographically dispersed working environment.
Desired Skills (not essential but beneficial to have):
Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O.
Knowledge: Applicants must possess experience in developing large-scale enterprise applications for internal or external customers using Big Data open-source solutions such as Hadoop, Spark, Kafka, and Elasticsearch.
Deep understanding of Agile principles, with experience in scrums, implementing user stories, writing designs, and delivering near-real-time applications. Applicants must also have experience with ETL, data quality, and data integration using tools such as Informatica and Ab Initio.
Deadline: 31 Mar 2023 | Last Update: 01 Jan 2023
Want to know more? Contact us today.
© Copyright 2021. UCSI Education Sdn. Bhd. [198901008177 (185479-U)]. All rights reserved. A member of