Location: Kuwait (On-site)
Employment Type: Full-time
We are seeking a talented and experienced Kafka Developer to join our innovative team in Kuwait. The ideal candidate will have a strong background in developing and managing Kafka-based solutions. As a Kafka Developer, you will be responsible for designing, building, and maintaining data streaming platforms to support our business operations and enable real-time data processing.
Responsibilities:
- Design and Develop Kafka Solutions: Create and maintain Kafka-based applications and infrastructure to support real-time data streaming and processing.
- Collaborate with Cross-Functional Teams: Work closely with data engineers, software developers, and other stakeholders to understand data requirements and deliver robust Kafka solutions.
- Implement Data Pipelines: Design and implement data pipelines to facilitate efficient and reliable data flow between various systems and applications.
- Monitor and Optimize Performance: Monitor Kafka cluster performance and optimize for reliability, scalability, and efficiency.
- Troubleshoot and Resolve Issues: Diagnose and resolve issues related to Kafka clusters, including performance bottlenecks, data inconsistencies, and connectivity problems.
- Ensure Data Security: Implement best practices for data security and ensure compliance with data governance policies.
- Stay Updated with Industry Trends: Keep up-to-date with the latest trends, tools, and technologies in data streaming and Kafka development.
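The reliability and durability work described above usually comes down to tuning topic-, broker-, and producer-level settings. As a hedged sketch (the values below are illustrative defaults, not requirements of this role), a durable write path is often configured like this:

```properties
# Illustrative Kafka settings for durable writes (values are examples only)

# Topic/broker level: a write is acknowledged only once at least
# 2 in-sync replicas have it, so a single broker failure loses no data.
min.insync.replicas=2

# Producer level: wait for all in-sync replicas to acknowledge each record.
acks=all

# Producer level: retry transient failures without duplicating
# or reordering records.
enable.idempotence=true
```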
Requirements:
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: 2-4 years of professional experience in software development, including at least 2 years working with Apache Kafka.
- Proficiency in Kafka, including Kafka Streams and Kafka Connect.
- Strong programming skills in languages such as Java, Scala, or Python.
- Experience with distributed systems and data streaming concepts.
- Familiarity with data serialization formats like Avro, Protobuf, or JSON.
- Knowledge of big data technologies such as Hadoop, Spark, and Flink is a plus.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and containerization (e.g., Docker, Kubernetes) is a plus.
- Proficiency with monitoring tools like Prometheus, Grafana, and Kafka Manager.
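To illustrate the serialization skills listed above, here is a minimal sketch, assuming JSON as the wire format (the event fields and function names are hypothetical, not part of this posting), of how a record is encoded for a Kafka producer and decoded by a consumer:

```python
import json


def serialize_event(event: dict) -> bytes:
    """Encode an event as UTF-8 JSON bytes, as a producer value serializer would."""
    return json.dumps(event, sort_keys=True).encode("utf-8")


def deserialize_event(payload: bytes) -> dict:
    """Decode JSON bytes back into a dict, as a consumer value deserializer would."""
    return json.loads(payload.decode("utf-8"))


# Hypothetical order event; field names are illustrative only.
event = {"order_id": "A-1001", "amount": 49.95, "currency": "KWD"}
payload = serialize_event(event)
assert deserialize_event(payload) == event
```

In practice the same round-trip is wired into the client via producer/consumer serializer settings, or replaced with Avro or Protobuf plus a schema registry when stronger schema guarantees are needed.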