Kafka Expert Needed to Implement Event Messaging Service

We are seeking an experienced Kafka expert to help us design and implement an efficient, scalable event messaging service. Our goal is to streamline event-driven processes and enhance real-time data flows within our architecture. The ideal candidate will have hands-on experience with Kafka, deep knowledge of event streaming, and a strong understanding of distributed systems. This role involves designing the service and integrating it with our existing systems.

Responsibilities:
Develop and implement a Kafka-based event messaging service (a minimal producer sketch follows this list)
Set up Kafka brokers and clusters, and manage message streams
Design and optimize data pipelines for real-time processing
Ensure high availability and fault tolerance in the messaging system
Collaborate with our development team to integrate Kafka with existing systems
Provide documentation and guidance on best practices for Kafka deployment and management
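For context on the kind of producer work this involves, here is a minimal sketch of a durable Kafka producer. The broker address, topic name, key, and JSON payload are illustrative assumptions, not details of our project:

    // Minimal sketch of a producer for an event messaging service.
    // Broker address, topic, key, and payload below are assumptions for illustration.
    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class EventProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092"); // assumed broker address
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            // Durability settings typical of a fault-tolerant messaging setup.
            props.put(ProducerConfig.ACKS_CONFIG, "all");
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Keying by entity id keeps related events on one partition, preserving their order.
                producer.send(new ProducerRecord<>("order-events", "order-123", "{\"status\":\"CREATED\"}"));
            }
        }
    }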

Requirements:
Proven experience with Apache Kafka and event-driven architectures
Strong understanding of distributed systems and data processing
Experience with Kafka ecosystem tools (e.g., Kafka Connect, Kafka Streams; a brief Kafka Streams sketch follows this list)
Familiarity with data serialization formats (e.g., Avro, JSON)
Ability to troubleshoot performance issues and optimize message delivery
Excellent communication skills and experience in documentation
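As a rough illustration of the stream-processing experience we are looking for, the sketch below filters one topic into another with Kafka Streams. The application id, broker address, and topic names are assumptions, not part of our setup:

    // Minimal Kafka Streams sketch: read a topic, drop empty payloads, write downstream.
    // Application id, broker address, and topic names are illustrative assumptions.
    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class StreamFilterSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-filter-sketch"); // assumed app id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");     // assumed broker
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> events = builder.stream("order-events");
            // Keep only non-empty payloads and route them to a downstream topic.
            events.filter((key, value) -> value != null && !value.isEmpty())
                  .to("order-events-clean");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }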

Preferred Qualifications:
Experience with stream processing frameworks
Knowledge of data engineering and data warehousing solutions
Familiarity with cloud-based Kafka solutions (e.g., Confluent, Amazon MSK)
