Hadoop Developer (6-Month Contract)

To start ASAP; will await the notice period if the candidate is currently on a contract and suitable for the position.

Our corporate client in the insurance industry, based in the CBD surrounds, requires your sound experience as a Hadoop Developer. The role plays a vital part in building new data pipelines from various structured and unstructured sources into Hadoop. You must be eager to wear multiple hats and capable of picking up new technologies at a fast pace.

Minimum Requirements
- Tertiary qualification with a major in Computer Science, Information Systems or similar
- Certification in Hadoop development (advantageous)
- In-depth knowledge of data warehouse and Big Data best practices
- Strong experience with the Hadoop ecosystem: HDFS, Hive, Spark, Sqoop and MapReduce
- Ability to write high-performance, reliable and maintainable modular code
- Good knowledge of database structures, theories, principles and practices
- Hands-on experience with HiveQL
- Ability to work in a Linux/Unix environment
- Practical experience with HDP or Cloudera
- Knowledge of workflow schedulers such as Control-M (beneficial)
- Analytical and problem-solving skills applied to the Big Data domain
- Good aptitude for multi-threading and concurrency concepts (advantageous)
- Python experience (essential)
- Financial services experience
- Knowledge and technical appreciation of the interconnectivities and interfaces between various technical platforms, operating systems and processes
- Good understanding of ITIL
- Understanding of the need to align IT and business strategies