Handling BIG DATA can feel like a nightmare! But not with Apache Kafka Streams! Its DSL is easy to use, not overly complex, and a perfect fit for large sets of data. It almost sounds like a sweet dream, or is it?
During this live coding session, we will explore the two APIs of Kafka Streams:
- The Streams DSL, which is a perfect fit for functional programming! Like the Stream API in Java, it lets you stream your data and apply common patterns such as branching, joining, and mapping (and more).
- The Processor API, which lets you build custom logic (such as aggregators) at a lower level.
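To give a small taste of those DSL patterns before the session: since the abstract compares the Streams DSL to Java's own Stream API, the mapping and branching ideas can be sketched with plain `java.util.stream`. This is only an illustrative analogy under that comparison, not Kafka Streams code itself (the DSL equivalents, roughly `KStream#mapValues` and `KStream#split`, need a running topology), and the class and method names below are made up for the example.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical sketch: "mapping" and "branching" with java.util.stream,
// analogous to the patterns the Kafka Streams DSL offers.
public class StreamPatterns {

    // Uppercase every word (a "mapping" step), then split the stream
    // into two branches based on a predicate (a "branching" step).
    public static Map<Boolean, List<String>> mapAndBranch(List<String> words) {
        return words.stream()
                .map(String::toUpperCase)
                .collect(Collectors.partitioningBy(w -> w.length() > 4));
    }

    public static void main(String[] args) {
        Map<Boolean, List<String>> branches =
                mapAndBranch(List.of("kafka", "dsl", "stream"));
        System.out.println("long:  " + branches.get(true));   // long:  [KAFKA, STREAM]
        System.out.println("short: " + branches.get(false));  // short: [DSL]
    }
}
```

In the real Kafka Streams DSL the same idea operates on records flowing through topics rather than on an in-memory list, which is exactly the shift the session demonstrates.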
After this session, you'll be able to handle BIG DATA with Kafka Streams.
Are you ready for some sweet dreams?
Ko Turk is an experienced developer and speaker working for adesso. He focuses on Java and Kotlin and isn't afraid to code in TypeScript. He likes to write articles for the Dutch NLJUG JavaMagazine, and he speaks regularly at international conferences about Apache Kafka Streams, Micrometer and Kotlin. Because he doesn't like bullet-slide presentations, you can find him (live) coding on stage. You can also find him at the UtrechtJUG, where he'd love to have a chat!