iKhokha is looking to hire a Data Engineer. Do you have experience using machine learning techniques to build and maintain structures that enable the analysis of data?
So, what will you do?
You will work as a key member of a data-centric team to drive the development, execution, and continuous improvement of core data analytics infrastructure and processes at iKhokha.
You will rework existing frameworks to improve how they perform.
You will design and build infrastructure that makes big data accessible and analysable.
Sounds cool, right?
Deal Breakers
3-4 years’ experience working with programming languages (such as Python, R, Scala, Java, SQL).
2+ years’ experience working with large data sets and relational and non-relational databases (Snowflake, Aurora/Redshift).
2+ years’ experience in data migration from Aurora/Redshift to Snowflake using Snowpipe and other ETL tools.
3+ years’ experience with visualization tools such as Tableau, Power BI, Qlik.
3+ years’ experience applying strong analytical skills: collecting, organizing, analyzing, and disseminating significant amounts of information with attention to detail and accuracy.
Experience working across multiple disciplines and interdepartmental teams, and engaging with internal and external suppliers and stakeholders.
Strong oral and written communication skills, with the ability to effectively communicate and present data findings at all levels of the organization.
Additional Skills:
Experience with ETL tools such as Alteryx.
Experience with stream processing systems (such as Kafka).
What would you be responsible for?
Partner with functional leads to understand their data and reporting requirements and translate them into definitions and technical specifications.
Lead independent analysis and follow up consistently with all relevant stakeholders to drive completion of project deliverables, action items, and objectives.
Use analytics tools such as Tableau that draw on the data pipeline to provide actionable insights into operational efficiency, financial reporting, and other key business performance metrics.
Assemble large, complex data sets that meet functional business requirements.
Work with stakeholders, including the Executive, Product, Marketing, and Creative teams, to assist with data-related technical issues and support their data infrastructure needs.
Architect, build, and maintain scalable, automated data pipelines from the ground up.
Be an expert at stitching together and reconciling data across various data sources.
Participate in the data planning process, including inception, technical design, development, testing, and delivery of BI solutions.
Understand business processes in order to translate them into logical data models.
Document data processes and best practices for data quality.
Apply working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.