Schema Evolution with Zero Down Time | Designing Event-Driven Microservices


In this video, we'll look at some techniques for evolving events by analyzing a specific use case in a banking fraud detection system.

It's rare in modern software to build a system that is static and unchanging. Most systems are affected by fluctuations in the business environment, and teams are forced to evolve their event schemas to adapt to new requirements. However, these evolutions must be performed in a live system, without incurring downtime. That requires careful planning to ensure that the producers and consumers of the data streams can be updated independently, avoiding a synchronized deployment.
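As a rough illustration of the additive-change technique the video covers (the field names here are hypothetical, not taken from the video), a new optional field can be made backward compatible by having the consumer supply a default. That way the producer and consumer can be deployed independently, in either order:

```python
import json

# v1 producers emit events without "device_fingerprint";
# v2 producers add it. The consumer supplies a default so it
# can handle both versions, decoupling the two deployments.

def parse_transaction(raw: str) -> dict:
    event = json.loads(raw)
    # Additive change: default in the missing optional field.
    event.setdefault("device_fingerprint", None)
    return event

v1 = '{"account": "A1", "amount": 42.5}'
v2 = '{"account": "A1", "amount": 42.5, "device_fingerprint": "abc123"}'

assert parse_transaction(v1)["device_fingerprint"] is None
assert parse_transaction(v2)["device_fingerprint"] == "abc123"
```

Schema registries (e.g. with Avro default values) formalize the same idea, but the principle is simply that consumers must tolerate the field's absence.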

RELATED RESOURCES

CHAPTERS
00:00 - Intro
00:55 - Digital Fingerprints in Fraud Detection
01:33 - Evolving Message Schemas with Additive Changes
02:26 - Consumer-First Approaches to Evolving a Schema
03:12 - Producer-First Approaches to Evolving a Schema
03:53 - Replaying Old Events
04:41 - Evolving Existing Fields in a Schema
05:54 - Versioning and Replacing Events
06:51 - Closing

--

ABOUT CONFLUENT

#microservices #apachekafka #kafka #confluent
COMMENTS

Hi! At 6:38 you say that once the old events are migrated, we then update the producer to emit messages with the new encryption. My question is: how can the old events ever be fully migrated, given that events are continuously flowing from the producer?

balasubramanianravichandra

How about putting the schema version in a message header and managing it via consumer groups?
Consumer group 1 would consume only v1, and consumer group 2 would consume only v2, always discarding messages with the wrong schema version.

v2 can be ready, up and running, and we only need to switch the traffic over to the v2 producer.
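The routing the comment above proposes might be sketched like this (the header key and payloads are hypothetical): each consumer is pinned to one schema version and silently drops records carrying a different version header.

```python
# Hypothetical sketch of header-based version routing: a
# consumer accepts only records whose "schema-version" header
# matches its pinned version and discards the rest.

def consume(records, accepted_version: str):
    for headers, payload in records:
        if headers.get("schema-version") != accepted_version:
            continue  # discard messages from the other schema
        yield payload

records = [
    ({"schema-version": "v1"}, "old-format"),
    ({"schema-version": "v2"}, "new-format"),
]

assert list(consume(records, "v1")) == ["old-format"]
assert list(consume(records, "v2")) == ["new-format"]
```

One trade-off worth noting: both consumer groups still read and then discard every record, so this spends bandwidth compared with splitting versions across topics.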

danielagostinho

Hey, Wade!
I noticed you didn't mention creating a second integration point, such as a topic for the new version, since it feels like a breaking change on the existing topic if some stream processing is required.
Was that intentional?

Fikusiklol

With the final encryption example, I was thinking: when v2 of a topic becomes available, as a consumer I'd probably have to remember my last committed v1 offset, so that when I start again I resume from that position.

But that's a given, I think.

Still, a checklist for particular evolution scenarios would be good; we all make mistakes.
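The bookkeeping described above could be sketched as follows (topic names and the in-memory store are hypothetical; a real Kafka consumer group commits offsets to the broker instead):

```python
# Hypothetical sketch: a consumer records its next offset per
# topic, so when it moves from the v1 topic to the v2 topic
# it resumes each one from the correct position.

committed = {}  # topic -> next offset to read

def process(topic: str, log: list):
    start = committed.get(topic, 0)
    for offset in range(start, len(log)):
        # ... handle log[offset] here ...
        committed[topic] = offset + 1

v1_log = ["e1", "e2", "e3"]
process("transactions-v1", v1_log)
assert committed["transactions-v1"] == 3  # done with v1

v2_log = ["f1"]
process("transactions-v2", v2_log)
assert committed["transactions-v2"] == 1  # resumes here next time
```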

ndewet