Company: FNB South Africa
Industry: Banking / Financial Services
Deadline: November 30, 2025
Job Type: Full Time
Experience:
Location: Johannesburg
Province: Gauteng
Field: Data, Business Analysis and AI, ICT / Computer
Job Description
- To design, build, and maintain scalable data pipelines and data model environments that support business intelligence, analytics, and cloud migration initiatives. The role involves hands-on development, automation, troubleshooting, and collaboration with cross-functional teams to ensure data integrity, performance, and security.
Data Engineering & ETL Development
- Build and maintain data pipelines using Ab Initio ETL graphs from source systems to target platforms (the sketch after this list shows the same extract-transform-load shape).
- Develop and optimize complex SQL queries for data extraction, transformation, and loading.
- Configure and test Ab Initio graphs across various platforms including Hadoop, Kafka, Hive, and flat files.
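For context, here is a minimal sketch of the extract-transform-load pattern described above, written in PySpark (since Ab Initio graphs are built visually rather than scripted). The table name, column names, and target path are hypothetical placeholders, not systems named in this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical names for illustration only -- not systems named in this posting.
SOURCE_TABLE = "src_db.transactions"        # assumed Hive source table
TARGET_PATH = "/data/curated/transactions"  # assumed HDFS target path

spark = (
    SparkSession.builder
    .appName("transactions-etl")
    .enableHiveSupport()
    .getOrCreate()
)

# Extract: read the source table registered in the Hive metastore.
raw = spark.table(SOURCE_TABLE)

# Transform: the kind of SQL-style cleanup an ETL graph performs --
# de-duplicate, normalise types, drop incomplete records.
curated = (
    raw.dropDuplicates(["transaction_id"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("account_id").isNotNull())
)

# Load: write partitioned output to the target platform.
curated.write.mode("overwrite").partitionBy("business_date").parquet(TARGET_PATH)
```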
Version Control & CI/CD
- Create and manage Bitbucket repositories for code versioning.
- Set up and maintain CI/CD pipelines using Bamboo for automated builds and deployments.
- Review and validate developers' data model code for quality and consistency.
Cloud Migration & Big Data
- Support data migration projects to AWS Cloud Technologies including S3, Lake Formation, Athena, Glue, and Redshift (a post-migration validation sketch follows this list).
- Test and validate big data processing on HDFS, Hive, and Spark environments.
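The following is a hedged sketch of post-migration validation using the boto3 Athena client to count rows in a migrated table. The database, query, results bucket, and region are illustrative assumptions, not values from this posting.

```python
import time

import boto3

# Placeholder values for illustration; real database, table, and results
# bucket names are not given in the posting.
DATABASE = "curated_db"
QUERY = "SELECT COUNT(*) FROM transactions"
OUTPUT = "s3://example-athena-results/validation/"

athena = boto3.client("athena", region_name="af-south-1")

# Submit a row-count query against the migrated table.
run = athena.start_query_execution(
    QueryString=QUERY,
    QueryExecutionContext={"Database": DATABASE},
    ResultConfiguration={"OutputLocation": OUTPUT},
)
query_id = run["QueryExecutionId"]

# Poll until Athena finishes.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    # Row 0 is the header row; row 1 holds the COUNT(*) value.
    print("Migrated row count:", rows[1]["Data"][0]["VarCharValue"])
```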
Production Support & Troubleshooting
- Provide technical support for production ETL and analytics data model failures (a reconciliation check of the kind used in triage is sketched after this list).
- Collaborate with stakeholders to resolve data-related technical issues and infrastructure needs.
- Schedule and manage jobs using Control-M for automated data processing.
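As a triage illustration, here is a small PySpark reconciliation that compares per-day row counts between two layers of a warehouse; the table names are placeholders, and real checks would follow the team's own runbooks.

```python
from pyspark.sql import SparkSession

# Placeholder table names -- the real staging and curated layers are not
# named in this posting.
SOURCE = "staging_db.daily_loads"
TARGET = "curated_db.daily_loads"

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Compare per-day row counts between layers; a mismatch is often the first
# clue when a production load fails.
src = spark.table(SOURCE).groupBy("business_date").count().withColumnRenamed("count", "src_rows")
tgt = spark.table(TARGET).groupBy("business_date").count().withColumnRenamed("count", "tgt_rows")

mismatches = (
    src.join(tgt, "business_date", "full_outer")
       .filter("src_rows IS NULL OR tgt_rows IS NULL OR src_rows <> tgt_rows")
)
mismatches.show(truncate=False)
```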
Data Governance & Security
- Conduct security assessments and recommend disaster recovery strategies for data applications.
- Ensure data lineage is accurately maintained using Ab Initio Data Catalogue.
- Validate and cleanse sensitive data, ensuring compliance with data governance standards (see the masking sketch below this list).
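A minimal Python sketch of validate-and-cleanse logic for sensitive fields, assuming a 13-digit South African ID number; the field names and masking rule are illustrative, not the bank's actual governance standard.

```python
import re

# Illustrative field names and rules only; actual sensitive fields and
# masking standards are defined by the bank's data governance policies.
SA_ID_PATTERN = re.compile(r"^\d{13}$")  # South African ID numbers are 13 digits

def mask_digits(value: str) -> str:
    """Keep the last four characters, mask the rest."""
    return "*" * (len(value) - 4) + value[-4:]

def validate_and_cleanse(record: dict) -> dict:
    cleansed = dict(record)
    id_number = cleansed.get("id_number") or ""
    if SA_ID_PATTERN.match(id_number):
        # Valid: mask before the record moves downstream.
        cleansed["id_number"] = mask_digits(id_number)
    else:
        # Invalid: null the field and leave it for manual review.
        cleansed["id_number"] = None
    # Cleanse free-text fields: collapse whitespace, normalise case.
    cleansed["full_name"] = " ".join(cleansed.get("full_name", "").split()).title()
    return cleansed

print(validate_and_cleanse({"id_number": "8001015009087", "full_name": "  jane   DOE "}))
```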
Collaboration & Stakeholder Engagement
- Work closely with analysts, developers, and business stakeholders to understand data requirements.
- Assist in resolving technical challenges and improving data infrastructure reliability.
Core Technologies
- Ab Initio (ETL & Technical Repository Management)
- SQL (Netezza Proc SQL)
- Python
- Spark
- AWS (S3, Glue, Athena, Lake Formation, Redshift)
Additional Tools & Platforms
- Control-M
- Bitbucket
- Bamboo
- Zeppelin
- PuTTY / WinSCP
- Kafka
- Linux
- Hadoop / HDFS / Hive
Other Competencies
- Strong understanding of data validation, cleansing, and verification.
- Experience with data lineage and metadata management.
- Ability to troubleshoot complex data issues in production environments.
- Excellent communication and collaboration skills.
Qualifications
- Bachelor's Degree in Computer Science, Information Systems, or related field.
- Certifications in AWS, Big Data, or ETL tools are advantageous.
End Date: November 17, 2025