Data Engineer at Hollard Insurance

Company:

Hollard Insurance

Industry: Insurance

Deadline: Not specified

Job Type: Full Time

Experience: 5 years

Location: Gauteng

Field: Data, Business Analysis and AI, ICT / Computer

Role Objectives:

  • The primary objective of this role is to design, develop, and maintain robust and scalable Big Data Pipelines and data architectures, ensuring optimal extraction, transformation, and loading of data across multiple application platforms. The Data Engineer will act as a custodian of data, ensuring compliance with information classification requirements, and enabling data consumers to build and optimize data consumption effectively. This role demands profound technical expertise in Data Engineering, Data Warehouse Design, and advanced analytics, utilizing modern software engineering concepts and BI tools. The Data Engineer will leverage Azure Data Solutions and various data-related programming languages and frameworks to analyze data elements, perform root cause analysis, and collaborate with technology colleagues and data teams to deliver viable data solutions within architectural guidelines.

Key Responsibilities:

  • Responsible for building and maintaining Big Data Pipelines
  • Act as custodian of data, ensuring that data is shared in line with information classification requirements on a need-to-know basis
  • Experience using programming skills in data related programming languages and frameworks, such as Python, Spark, SQL
  • Experience with Azure Data Solutions: Azure Data Factory, Azure Data Explorer, Azure Databricks
  • Profound technical understanding of Data Engineering and Data Warehouse Design
  • Familiarity with modern software engineering concepts and know-how in advanced analytics and BI tools
  • Develop and maintain complete data architecture across several application platforms
  • Analyze data elements and systems
  • Build required infrastructure for optimal extraction, transformation and loading of data
  • Build, create, manage and optimize data pipelines
  • Create data tooling, enabling data consumers in building and optimizing data consumption
  • Execute on the design, definition and development of APIs
  • Develop across several application platforms
  • Experience performing root cause analysis on internal and external data and processes
  • Knowledge of integration patterns, styles, protocols and systems
  • Liaise and collaborate with technology colleagues and data teams to understand viable data solutions within architectural guidelines
  • Update technical documentation on data extracts and report functionality to facilitate future understanding and ongoing support. Respond to user queries, error logging, and enhancement requests to ensure reports are used and serve their intended purpose.
  • Provide OLAP support and end-user training on the various cubes used for downstream reporting
  • Data Modelling – emphasizes what data is needed and how it should be organized, rather than what operations are performed on the data. A data model is like an architect's building plan: it helps build a conceptual model and sets the relationships between data items. This capability is required in preparation for the Data Platform implementation, and will also align with the approach taken by Group Data to implement ERWIN as the tool of choice for Data Modelling
  • Data Architecture – ensures that data systems are composed of models, policies, rules, or standards that govern which data is collected and how it is stored, arranged, integrated, and put to use in data systems and in organizations. This capability is also aligned with the direction that Group Data is taking.
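For candidates unfamiliar with the extract-transform-load pattern these responsibilities describe, a minimal sketch in plain Python may help. The table and column names below are illustrative only (not from the posting); a production pipeline would use the listed tools such as Spark or Azure Data Factory rather than the standard library:

```python
import csv
import io
import sqlite3

# Illustrative raw source data (hypothetical policy records).
RAW = """policy_id,premium,status
P001,1200.50,active
P002,850.00,lapsed
P003,990.25,active
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV text into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: keep active policies and cast premium to float."""
    return [
        (r["policy_id"], float(r["premium"]))
        for r in rows
        if r["status"] == "active"
    ]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write the transformed rows into a target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS policies (policy_id TEXT, premium REAL)"
    )
    conn.executemany("INSERT INTO policies VALUES (?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW)), conn)
    total = conn.execute(
        "SELECT ROUND(SUM(premium), 2) FROM policies"
    ).fetchone()[0]
    print(total)  # 2190.75
```

The same extract/transform/load stages map directly onto the Azure Data Factory and Databricks work described above, only at far larger scale and with orchestration, monitoring, and data-quality checks around each stage.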

Required Knowledge and Experience    

Project Related Deliverables and Tasks

Assist the Data Engineering Lead on the below-listed tasks:

  • Select and implement the appropriate tools, software, applications, and systems to support cloud data technology goals on the chosen MS Azure platform
  • Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring good data quality.
  • Collaborate with Project & Data Stream Leads, Consultants and business unit leaders for all exercises pertaining to the Data Platform
  • Create and maintain data model and metadata policies and procedures for functional design.
  • Provide technical recommendations and engage with ETL/BI Architects, Business SMEs, and other stakeholders throughout the Solution/Data Architecture and implementation lifecycle, recommending effective solutions for high-performance, highly scalable data solutions (data marts/warehouses, data mining, and advanced analytics)
  • Raise and ensure data-related problems regarding systems integration, compatibility, and multiple-platform integration are resolved
  • Develop and implement key components as needed to create testing criteria that guarantee the reliability and performance of the data architecture. Ensure the testing process incorporates Test
  • Drive a Data Platform environment that enables maintenance of the current and accurate view of the larger data picture, an environment that supports a single version of the truth and is scalable to support future analytical needs.
  • Identify and develop opportunities for data reuse, migration, or retirement and platform upgrades.
  • Communicate with the customer and project team in a timely manner and escalate issues and risks appropriately.

Required Knowledge, Experience and Skills:

  • Deep knowledge of contemporary architectural principles across the data architectural domain
  • Understanding of the enterprise architecture to ensure ongoing alignment, driving the technical roadmap for key applications
  • Knowledge of different SDLC Methodologies, preferably Agile
  • Ability to produce solutions using:
  • Microsoft SQL Server Integration Services, MS SQL Analysis Services, MS SQL Reporting Services, MS Power BI
  • Python
  • Knowledge of Scaled Agile Framework (SAFe) practices will be advantageous
  • Experience with Cloud Services (Azure)
  • 5+ years' experience in cloud computing infrastructure (e.g., MS Azure, Amazon Web Services, etc.) and considerations for scalable, distributed systems

Educational Requirements    

  • Matric
  • BSc Computer Science, Data Informatics, Information Technology, or any other related degree
  • Cloud Certification is a key requirement
  • Microsoft Azure or AWS Certification


