Introduction to Event Streams Development with Kafka Streams
Bill Bejeck
A presentation from ApacheCon @Home 2020
Developers today work with a lot of data, and much of it is available in near real time. This presents an opportunity for businesses and organizations to improve service and deliver more value to the users of today's applications. But how do you manage this incoming stream of records? Viewing incoming data as event streams is one way to think about working with it. In recent years, Apache Kafka has become a de facto standard for ingesting record streams.

To work with the incoming data, Apache Kafka provides Producer and Consumer interfaces as the basic building blocks for sending records to and reading records from Kafka. When building a Kafka-based microservice, using the Producer and Consumer clients directly means handling all the communication details yourself. To make building event-driven applications easier, Apache Kafka provides Kafka Streams, the native stream processing library for Apache Kafka.

In this talk, we'll review Kafka and how it can function as a central nervous system for incoming data. From there, we'll cover how Kafka Producers and Consumers work and how developers can build a microservice using these building blocks. Finally, we'll transition our application to a Kafka Streams application and demonstrate how Kafka Streams can simplify building a Kafka-based microservice.

Attendees of this presentation will gain the knowledge needed to understand how Kafka Streams works and how to get started using it to simplify the development of applications involving Apache Kafka. Additionally, developers who aren't yet familiar with Apache Kafka will gain an understanding of how it can help their business or organization make effective use of available event streams.
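To give a flavor of the simplification the abstract describes, here is a minimal sketch of a Kafka Streams topology. It assumes the `org.apache.kafka:kafka-streams` dependency on the classpath; the topic names ("orders-in", "orders-out"), the class name, and the uppercase transform are hypothetical placeholders, not examples from the talk. The consume-transform-produce loop that would otherwise require explicit Consumer polling and Producer sends is expressed in a few declarative lines:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;

public class UppercaseApp {

    // Build the processing topology: consume from one topic, transform, produce to another.
    static Topology buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> orders = builder.stream("orders-in"); // hypothetical topic
        orders.mapValues(v -> v.toUpperCase())                        // per-record transform
              .to("orders-out");                                      // hypothetical topic
        return builder.build();
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Kafka Streams handles polling, offset commits, retries, and consumer-group
        // rebalancing internally -- the boilerplate you would otherwise write yourself
        // with the plain Producer and Consumer clients.
        new KafkaStreams(buildTopology(), props).start();
    }
}
```

The same topology can be exercised without a running broker via `TopologyTestDriver` from the `kafka-streams-test-utils` artifact, which is the standard way to unit-test a topology like this one.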
Bill Bejeck works at Confluent as an integration architect on the Developer Relations team; before that, he was a software engineer on the Kafka Streams team for three years. He has been a software engineer for over 17 years and regularly contributes to Kafka Streams. Before Confluent, he worked on various ingest applications as a U.S. Government contractor using distributed software such as Apache Kafka, Apache Spark™, and Apache Hadoop®. Bill is a committer on Apache Kafka and has also written a book about Kafka Streams, titled Kafka Streams in Action.