

Senior Data Engineer

Company: Key


Details of the offer

Excellent opportunity to demonstrate your deep technical skills as a member of a top team of IT professionals engaged by one of South Africa’s most successful retail businesses based in Cape Town.

In this role you'll function as a core member of an agile team, building and supporting data pipelines and the data marts built on top of them, all of which must be scalable, repeatable, and secure.

Gathering data from a variety of sources in the correct format, ensuring that it conforms to data quality standards, and ensuring that downstream users can access that data timeously should come naturally to you.

As a Data Engineering professional, you'll be responsible for the infrastructure that turns raw data into insights: integrating diverse data sources seamlessly and enabling solutions that handle large volumes of data in batch and in real time, leveraging emerging technologies from both the big data and cloud spaces.

Additional responsibilities include developing proofs of concept and implementing complex big data solutions, with a focus on collecting, parsing, managing, analysing and visualising large datasets.
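As a hedged illustration of the collect-parse-manage flow described above (not part of the posting itself), here is a minimal sketch in plain Python rather than a big data stack; the field names and the single quality rule are hypothetical:

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Collect and parse: read raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Apply a hypothetical data-quality rule: drop rows with a missing
    amount and normalise the amount field to a float."""
    clean = []
    for row in rows:
        if row.get("amount"):
            clean.append({"id": row["id"], "amount": float(row["amount"])})
    return clean

# Tiny illustrative feed: one row fails the quality check and is dropped.
raw = "id,amount\n1,10.5\n2,\n3,4.0\n"
rows = transform(extract(raw))
```

In a production pipeline the same extract/transform shape would typically be expressed in PySpark or an ETL tool such as Talend, with the quality rules driven by the organisation's data standards.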

Your work involves:
Architecting the data analytics framework.
Translating complex functional and technical requirements into detailed architecture, design, and high performing software.
Leading data and batch/real-time analytical solutions leveraging transformational technologies.
Working on multiple projects as a technical lead driving user story analysis and elaboration, design and development of software applications, testing, and building automation tools.

Main focus areas:

Development and Operations
Database Development and Operations
Policies, Standards and Procedures
Communications
Business Continuity & Disaster Recovery
Research and Evaluation
Coaching/Mentoring
QUALIFICATIONS
Minimum: 4-year Bachelor's degree in computer science, computer engineering, or equivalent work experience
AWS Certification, at least to associate level

EXPERIENCE
Minimum 5 years Data Engineering or Software Engineering
3-5 years demonstrated experience leading teams of engineers
2+ years Big Data experience
5+ years’ experience with Extract Transform and Load (ETL) processes
2+ years Cloud AWS experience
At least 2 years demonstrated experience with agile or other rapid application development methods
Agile exposure, Kanban or Scrum
5 years demonstrated experience with object oriented design, coding and testing patterns as well as experience in engineering (commercial or open source) software platforms and large scale data infrastructures.

KNOWLEDGE AND SKILLS
Must be able to:
Create data feeds from ‘on premise’ to AWS Cloud (2 years)
Support data feeds in production on break fix basis (2 years)
Create data marts using Talend or similar ETL development tool (4 years)
Manipulate data using Python and PySpark (2 years)
Process data using the Hadoop paradigm particularly using EMR, AWS’s distribution of Hadoop (2 years)
Develop for Big Data and Business Intelligence including automated testing and deployment (2 years)

Must have:
Extensive knowledge in different programming or scripting languages
Expert knowledge of data modelling and understanding of different data structures and their benefits and limitations under particular use cases.

Further technical skills required:
Capability to architect highly scalable distributed systems, using different open source tools.
Big Data batch and streaming tools
Talend (1 year)
AWS: EMR, EC2, S3 (1 year)
Python (1 year)
PySpark or Spark (1 year) - Desirable
Business Intelligence Data modelling (3 years)
SQL (3 years)
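To illustrate the data-mart and BI data-modelling skills listed above, here is a minimal sketch using Python's built-in sqlite3 module in place of Talend or a warehouse engine; the star-schema table and column names are purely illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A tiny star schema: one dimension table and one fact table.
cur.execute("CREATE TABLE dim_store (store_id INTEGER PRIMARY KEY, region TEXT)")
cur.execute("CREATE TABLE fact_sales (store_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO dim_store VALUES (?, ?)",
                [(1, "Western Cape"), (2, "Gauteng")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 100.0), (1, 50.0), (2, 75.0)])

# The 'mart' query: sales aggregated by region for downstream BI users.
cur.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_store d ON f.store_id = d.store_id
    GROUP BY d.region
    ORDER BY d.region
""")
mart = cur.fetchall()
```

The same dimension/fact separation underpins real data marts; an ETL tool would populate the tables from upstream pipeline feeds rather than inline inserts.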


Source: Neuvoo1_Ppc
