Owns and delivers the installation and configuration of AI products.
Participates in solution architecture discussions to estimate hardware sizing.
Analyzes defects and triages them effectively.
Works closely with engineering, product, and solution architects to resolve defects.
Proactively participates in the product release cycle and signs off on client releases.
Maintains current ETL designs and code to resolve defects in existing solutions.
Develops logical and physical data flow models for ETL applications.
Must-Have Skills:
2+ years of experience working with the big data application stack, including HDFS, YARN, Spark, Hive, and HBase.
1+ years of experience with AWS cloud setup.
1+ years of experience in enterprise software installation, especially packages.
Good-to-Have Skills:
1-2 years of experience in ETL in a Big Data Hadoop environment (Hive/HBase knowledge) and shell scripting or Python.
2+ years of experience in Hadoop Big Data projects.
Experience developing, tuning, and debugging code and Python/shell scripts for loading data into Hive, MariaDB, and MySQL.
Experience setting up enterprise security solutions, including Active Directory, firewalls, SSL certificates, Kerberos KDC servers, etc.
Experience working with CI/CD tools such as Jenkins, as well as test reporting and coverage tools.
Job Perks:
Attractive variable compensation package.
Flexible working hours - everything is results-oriented.
Opportunity to work with an award-winning organization in the hottest space in tech: artificial intelligence and advanced machine learning.