Our client is recruiting for an ICT Senior Data Engineer (Azure, Python & R) on a six (6) month fixed-term contract, aligned to the operational requirements of the ICT Division.
Duties & Responsibilities:
- Design, build, and optimize data pipelines and architectures that support advanced analytics, machine learning models, and data-driven decision-making.
- Implement robust and scalable data solutions using Microsoft Azure technologies (e.g., Azure Data Factory, Synapse Analytics, Databricks, Data Lake Storage).
- Develop and maintain Python and R scripts for data processing, automation, and statistical analysis.
- Collaborate closely with data scientists, analysts, and engineering teams to ensure seamless integration of data solutions across systems.
- Ensure data quality, compliance, and governance through systematic testing and monitoring.
- Support CI/CD processes for automation of data workflows and ensure adherence to data security best practices.
- Maintain and support the operational environment to ensure high availability and performance of data systems.
- Develop training materials, documentation, and best practices for effective use of open-source tools (Python & R) for corporate use.
- Facilitate knowledge transfer sessions to enhance analytics and data management capabilities across GEMS.
- Ensure compliance with organisational policies, POPIA, and relevant data protection regulations.
Qualification Requirements:
- A Diploma or University Degree in Computer Science, Information Management, Data Science, or a related field (NQF level 6).
- A minimum of 8 years’ relevant experience as a Data Engineer or in a similar role.
- Strong expertise in Microsoft Azure (Data Factory, Synapse Analytics, Databricks, Data Lake Storage Gen2).
- Proficiency in Python (NumPy, pandas, Matplotlib, Seaborn) and R for data processing and statistical modelling.
- Advanced knowledge of SQL, data warehousing, and data modelling principles.
- Experience with version control (e.g., Git) and DevOps CI/CD pipelines.
- Understanding of data governance and security best practices.
- Added advantage: experience with real-time data streaming (Kafka, Event Hubs); ML lifecycle tools (MLflow, Azure ML); BI tools (Power BI, Tableau); containerization (Docker); and orchestration (Kubernetes, Airflow).
Skills:
- Strong analytical and problem-solving skills.
- Excellent understanding of data engineering and systems integration.
- Proven experience delivering scalable, efficient, and quality data solutions.
- Strong knowledge of data processing frameworks and database administration.
- Excellent communication and documentation skills.
- Ability to collaborate effectively with cross-functional teams.
- Strong project and time management abilities.
Behavioural Competencies:
- Integrity: Demonstrates professionalism, fairness, and confidentiality.
- Innovation: Explores creative and modern data engineering techniques.
- Excellence Orientation: Focused on quality, accuracy, and continuous improvement.
- Collaboration: Builds constructive relationships with internal and external stakeholders.
- Member-Centric Approach: Acts in the best interest of Scheme members and organisational objectives.
- Analytical Thinking: Uses critical reasoning and data insights to solve challenges.
- Adaptability: Performs effectively in a dynamic and rapidly evolving environment.
- Communication: Clearly conveys technical and analytical concepts to non-technical audiences.