What is a Data Streaming Platform?

A Data Streaming Platform (DSP) is composed of six major interrelated components, covered in detail in this video. Each component has evolved over time to serve specific needs in the data streaming platform, but together they form a whole that is greater than the sum of its parts.

A DSP provides a complete solution for unified, end-to-end data streaming. It provides all the capabilities necessary to succeed in data streaming, including data connectivity, integration, discovery, security, and management. Ultimately, a DSP makes it easy to build, use, and share data across your organization for any use case.
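To make the data connectivity part concrete, here is a minimal sketch (not from the video) of producing an event to a Kafka topic with the confluent-kafka Python client. The broker address, topic name, and payload are illustrative assumptions, not values from the video.

    from confluent_kafka import Producer

    # Assumed local broker; replace with your cluster's bootstrap servers.
    producer = Producer({"bootstrap.servers": "localhost:9092"})

    def delivery_report(err, msg):
        # Called once per message to report delivery success or failure.
        if err is not None:
            print(f"Delivery failed: {err}")
        else:
            print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

    # Hypothetical topic and payload, for illustration only.
    producer.produce(
        "orders",
        key="order-1",
        value='{"order_id": 1, "amount": 9.99}',
        callback=delivery_report,
    )
    producer.flush()

In a full DSP, the payload would typically be serialized against a schema registered in Schema Registry (Part 2 in the chapters below) rather than sent as a raw JSON string.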

RELATED RESOURCES

CHAPTERS
00:00 - Introduction, a DSP’s Six Parts
00:30 - Pyramid Model of Data Streaming Evolution
00:46 - Apache Kafka (Part 1)
01:55 - Schema Registry (Part 2)
03:53 - Data Portal (Part 3)
05:44 - Connectors (Part 4)
07:30 - Stream Processor Flink (Part 5)
08:40 - Tableflow (Part 6)
10:33 - Conclusion



ABOUT CONFLUENT

#datastreaming #dsp #apachekafka #kafka #confluent
COMMENTS

As always, amazing content. The video is almost a short summary of the book "Building an Event-Driven Data Mesh: Patterns for Designing & Building Event-Driven Architectures". The building blocks are well defined and explained. Thanks, Adam, for sharing your knowledge.

douglaspiresmartins

Great job. I understood everything. I'd love to see practical examples.

romanstoianov

Great video. It would also be nice to get something similar for data mesh.

nas

Can we connect data lakes that use table formats, e.g. Iceberg, and use the Data Portal part to label what the column headings in those formats mean? If yes, the next question is: can we do governance in the Data Portal, like RBAC/ABAC, for different sources of data?

emonymph

While patching Kafka brokers, we usually patch C3 last, right? If we patch C3 first and then patch the controller broker, will any issue happen?

soloboy