Kafka Streams 101: Data Serialization (2023)

Apache Kafka® brokers only work with records as raw bytes, so data serialization is essential.

In this video, learn to convert objects and other useful types into bytes that can be sent across the network or put into a state store—and vice versa (deserialization).
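
A minimal Kafka Streams sketch of both directions, assuming string keys and values; the topic names are hypothetical. Consumed.with supplies the deserializers used when reading from a topic, and Produced.with supplies the serializers used when writing back:

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class SerializationExample {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Reading: the String Serdes deserialize the raw key/value bytes
        // coming off the broker into Java Strings.
        KStream<String, String> input = builder.stream(
                "input-topic",                                // hypothetical topic
                Consumed.with(Serdes.String(), Serdes.String()));

        // Writing: the same Serdes serialize the transformed records back
        // into bytes before they are sent to the broker.
        input.mapValues(value -> value.toUpperCase())
             .to("output-topic",                              // hypothetical topic
                 Produced.with(Serdes.String(), Serdes.String()));

        Topology topology = builder.build();  // ready to pass to new KafkaStreams(...)
    }
}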

#kafka #kafkastreams #streamprocessing #apachekafka #confluent

Comments

I still don't get why the Kafka API authors mixed the concerns of serialization and deserialization. The very notion of a Serde breaks the SRP; those are different things. And it's easy to know when you need a serializer: on the Producer, when you write something to the broker. And the place where you need a deserializer is on the Consumer, when you read something from Kafka into process memory. That should be obvious. Also, I'm pretty sure that more often than not you read records of type A from Kafka, apply a series of transformations that performs an A => B transition, and write B to Kafka, not A! So why, for god's sake, do you need to define serialization for type A and deserialization for type B? Those pieces of code may be required in completely different applications!

oleksandrsova
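
For context on the A => B case above: the Streams DSL does let the two halves be declared independently per operation. A minimal sketch, with hypothetical OrderPlaced (A) and OrderSummary (B) types and topic names; Serdes.serdeFrom pairs a serializer and a deserializer into one Serde, and here only serdeA's deserializer and serdeB's serializer are ever exercised:

import java.nio.charset.StandardCharsets;

import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class AToBTopology {

    // Hypothetical record types standing in for "A" and "B".
    record OrderPlaced(String orderId) {}   // A: the type we read
    record OrderSummary(String line) {}     // B: the type we write

    public static void main(String[] args) {
        // serdeFrom pairs a serializer and a deserializer into one Serde.
        // Only the deserializer half of serdeA is used below...
        Serde<OrderPlaced> serdeA = Serdes.serdeFrom(
                (topic, a) -> a.orderId().getBytes(StandardCharsets.UTF_8),
                (topic, bytes) -> new OrderPlaced(new String(bytes, StandardCharsets.UTF_8)));

        // ...and only the serializer half of serdeB.
        Serde<OrderSummary> serdeB = Serdes.serdeFrom(
                (topic, b) -> b.line().getBytes(StandardCharsets.UTF_8),
                (topic, bytes) -> new OrderSummary(new String(bytes, StandardCharsets.UTF_8)));

        StreamsBuilder builder = new StreamsBuilder();

        // Read A from the input topic (deserialization only).
        KStream<String, OrderPlaced> orders = builder.stream(
                "orders",                                   // hypothetical topic
                Consumed.with(Serdes.String(), serdeA));

        // Transform A => B, then write B (serialization only).
        orders.mapValues(a -> new OrderSummary("order=" + a.orderId()))
              .to("order-summaries",                        // hypothetical topic
                  Produced.with(Serdes.String(), serdeB));
    }
}

Whether that is a reasonable API shape is a fair debate; the bundling mainly pays off in stateful operations, where Streams needs both halves of the same Serde to write records into a state store and read them back out.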

I had to slow down to 0.75 speed to understand what this auctioneer was saying.

benjaminmcswain

Brilliant video on how to completely confuse anyone interested in Kafka [de]serialization. Don't you guys evaluate a person's teaching skills before bringing him or her on as a tutor and instructor? I have nothing against this nice girl; she may be a great engineer, but she is not a good explainer.

georgetsiklauri