Java & JVM
Conference · 50 min
INTERMEDIATE

Spring into Apache Kafka® with Kotlin

This session demonstrates integrating Spring applications with Apache Kafka using Kotlin, leveraging Spring's compatibility with Kafka for event-driven design. It covers producing and consuming Kafka events with Spring Kafka, configuring applications for Confluent Cloud, and managing data schemas with Apache Avro and a schema registry, highlighting Kotlin's advantages in JVM-based development.

Sandon Jacobs
Confluent

When & Where

Wednesday, June 11, 09:45-10:35
Room 4A
Description
So, I hear you’re developing Spring applications and microservices. Along comes event streaming with Apache Kafka®, and you need to integrate. As fate would have it, Spring and Kafka are already pretty good friends. This means you can leverage your organization’s expertise in building, testing, deploying, and monitoring Spring applications, while also reaping the benefits of event-driven design.

But why bore ourselves with yet another Java microservice? Kotlin is a first-class citizen of the Spring framework. It’s proven itself as a popular language with constructs that simplify JVM-based development, and not just for cross-platform development, but for server-side implementations with frameworks like Spring, Ktor, and Micronaut, just to name a few.

In this session, I’ll walk you through writing a solution in Kotlin for producing and consuming Kafka events using Spring Kafka. We’ll highlight the Spring configuration involved in binding our application to a Kafka cluster in Confluent Cloud. We’ll use structured data, serialized with Apache Avro®, whose schemas are managed and governed by a schema registry.

When we’re done, you’ll be ready to explore this Spring-Kafka-Kotlin friendship for yourself.
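To make the moving parts concrete, here is a minimal, hypothetical sketch of what the Kotlin side of such a solution might look like with Spring Kafka. The `OrderEvent` type, topic name, and consumer group id are illustrative placeholders, not taken from the session; in the setup described above, the event class would be generated from an Avro schema, and the Confluent Cloud bootstrap servers, credentials, serializers, and schema registry URL would be supplied through Spring's `spring.kafka.*` configuration properties rather than hard-coded here.

```kotlin
import org.springframework.kafka.annotation.KafkaListener
import org.springframework.kafka.core.KafkaTemplate
import org.springframework.stereotype.Component

// Placeholder event type; with Avro and a schema registry this would typically be
// a class generated from the registered schema rather than a hand-written data class.
data class OrderEvent(val orderId: String, val amount: Double)

@Component
class OrderEventProducer(private val kafkaTemplate: KafkaTemplate<String, OrderEvent>) {

    // Publish an event keyed by order id. The configured value serializer (e.g. the
    // Confluent Avro serializer) and schema registry URL come from Spring configuration.
    fun publish(event: OrderEvent) {
        kafkaTemplate.send("orders", event.orderId, event)
    }
}

@Component
class OrderEventListener {

    // Spring Kafka builds the listener container and deserializes the payload
    // before invoking this method.
    @KafkaListener(topics = ["orders"], groupId = "orders-service")
    fun onEvent(event: OrderEvent) {
        println("Received $event")
    }
}
```

The point of the sketch is how little Kafka-specific plumbing shows up in the Kotlin code itself: the producer injects a `KafkaTemplate`, the consumer is a single annotated function, and the cluster binding lives entirely in configuration.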
spring
avro
kotlin
kafka
Speakers
Sandon Jacobs

Confluent

United States of America

Sandon Jacobs is a Developer Advocate at Confluent, based in Raleigh, NC. Sandon has two decades of experience designing and building applications, primarily with Java and Scala. His data streaming journey began while building data pipelines for real-time bidding on mobile advertising exchanges—and Apache Kafka was the platform to meet that need. Later experiences in television media and the energy sector led his teams to Kafka Streams and Kafka Connect, integrating data from various in-house and vendor sources to build canonical data models.

Outside of work, Sandon is actively involved in his Indigenous tribal community. He serves on the NC American Indian Heritage Commission, and also as a powwow singer and emcee at many celebrations around North America. Follow Sandon on Twitter @SandonJacobs or Instagram @_sandonjacobs, where he posts about his powwow travels, family, golf, and more.