Role & responsibilities
- Participate in defining the strategy and roadmap for the platform built on Confluent Kafka.
- Architect and design scalable, high-throughput, low-latency streaming data platforms using Confluent Kafka.
- Define best practices for Kafka topic design, schema evolution, data governance, security, and high availability.
- Lead Kafka migrations (e.g., Apache Kafka to Confluent Cloud or self-managed Confluent Platform).
- Collaborate with cross-functional teams (data engineering, application development, DevOps) to integrate Kafka with enterprise systems.
- Define and implement strategies for monitoring, alerting, and observability of Kafka workloads.
- Provide technical leadership and mentoring to engineering teams on Kafka-based solutions.
- Ensure data reliability, fault tolerance, and compliance in production streaming environments.

Required skills & experience
- Deep understanding of Kafka internals: partitions, replication, retention, in-sync replicas (ISR), and offset management.
- Proficiency with Confluent Kafka ecosystem components: Kafka Connect, Schema Registry, ksqlDB (formerly KSQL), REST Proxy, and Control Center.
- Strong experience with Confluent Cloud and Kafka deployment automation using Terraform, Helm, or Ansible.
- Expertise in Java, Scala, or Python for Kafka producer/consumer application development.
- Good understanding of IAM, TLS encryption, RBAC, and auditing in Kafka.
- Experience with cloud platforms (AWS, GCP, or Azure), Kubernetes, and CI/CD pipelines.
Experience range: 10 to 16 years
Job location: Anywhere in India