
Specialist Platform Engineer at Absa Bank

Job role insights

  • Date posted

    March 24, 2026

  • Closing date

    March 27, 2026

  • Hiring location

    Nairobi, Kenya

  • Qualification

    Bachelor's Degree

Description

Remote type: Hybrid
Location: Absa Headquarters (KE)
Time type: Full time
Time left to apply: End Date: March 27, 2026 (2 days left to apply)
Job requisition ID: R-15985029
Empowering Africa’s tomorrow, together…one story at a time.

With over 100 years of rich history and strongly positioned as a local bank with regional and international expertise, a career with our family offers the opportunity to be part of this exciting growth journey, to reset our future and shape our destiny as a proudly African group.

Job Summary

We are looking for an engineer to join our team to help build and maintain Kafka-based streaming applications and support the Kafka platform across on-prem and Confluent Cloud environments. The role blends development, platform, and observability responsibilities, offering a unique opportunity to work on distributed systems at scale.

Job Description

Core Responsibilities:

• Develop, maintain, and optimize Kafka-based applications and event streaming pipelines using Java (Spring / Spring Boot), Python, or .NET.

• Work with distributed systems concepts: partitions, replication, fault-tolerance, scaling, and event-driven architectures.

• Contribute to provisioning, managing, and securing Kafka clusters both on-prem and in Confluent Cloud.

• Implement and maintain security and authorization mechanisms, including ACLs, Kerberos, SSL, and OAuth for Confluent Cloud.

• Automate infrastructure deployment and configuration using Terraform, Ansible, CloudFormation, Docker, or Kubernetes.

• Configure, monitor, and maintain observability for Kafka clusters, including metrics, alerts, and dashboards (e.g., Prometheus, Grafana, Confluent Control Center, Elasticsearch).

• Assist in troubleshooting production issues and perform root cause analysis.

• Collaborate closely with developers, DevOps/SRE teams, and other stakeholders to ensure reliable and performant streaming systems.

• Contribute to best practices for connector configuration, high availability, disaster recovery, and performance tuning, including streaming applications and pipelines built with Kafka Streams, ksqlDB, Apache Flink, and TableFlow.
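To illustrate the partitioning concept the responsibilities above refer to, here is a minimal sketch of hash-based partition assignment. This is illustrative only: Kafka's default partitioner uses a murmur2 hash, not MD5, and the function name here is hypothetical.

```python
import hashlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition index (illustrative sketch).

    Kafka's default partitioner uses murmur2, not MD5; this just
    demonstrates the hash-then-modulo idea that keeps all records
    with the same key on the same partition, preserving per-key order.
    """
    digest = hashlib.md5(key).digest()
    # Use the first 4 bytes of the digest as an unsigned integer,
    # then reduce modulo the partition count.
    return int.from_bytes(digest[:4], "big") % num_partitions

# Records sharing a key always land on the same partition:
p1 = assign_partition(b"customer-42", 6)
p2 = assign_partition(b"customer-42", 6)
assert p1 == p2
```

Because ordering in Kafka is guaranteed only within a partition, keeping a stable key-to-partition mapping like this is what allows per-key event ordering in streaming pipelines.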

 

Required Skills:

• Strong programming experience in Java (Spring / Spring Boot), Python, or .NET, with the ability to write clean, maintainable, and performant code.

• Solid understanding of distributed systems principles and event-driven architectures.

• Hands-on experience with Kafka in production or strong ability to learn quickly.

• Knowledge of Kafka ecosystem components (Connect, Schema Registry, KSQL, MirrorMaker, Control Center, Kafka Streams, Apache Flink, TableFlow) is a plus.

• Familiarity with security best practices for Kafka, including ACLs, Kerberos, SSL, and OAuth.

• Experience with infrastructure as code and containerized environments.

• Experience with monitoring and observability tools for distributed systems.

 

Desirable Skills / Bonus Points:

• Experience with Confluent Cloud or other managed Kafka platforms.

• Experience with AWS.

• Experience building streaming pipelines across multiple systems and environments.

• Familiarity with CI/CD pipelines and automated deployments.

 

Behavioural / Soft Skills:

• Strong problem-solving and analytical skills.

• Excellent communication and interpersonal skills.

• Ability to work independently and prioritize across multiple BAU and project tasks.

• Product-minded approach, focusing on delivering value and scalable solutions.

Education

Bachelor's Degree: Information Technology
