Window Aggregations | Stream Processing

One of the most common use cases for stream processors is aggregating values over time windows. Aggregations range from sums and averages to maximums and minimums, and windows range from fixed-interval (tumbling) windows to sliding windows and even more complex session-based ones.
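The idea above can be sketched in plain Python (not taken from the video; the window size and event format are illustrative assumptions): each event's timestamp is aligned to the start of a fixed-interval (tumbling) window, and all values in the same window are aggregated together.

```python
from collections import defaultdict

WINDOW_SIZE = 60  # seconds; hypothetical tumbling-window length


def window_start(ts: float) -> int:
    """Align a timestamp to the start of its tumbling window."""
    return int(ts // WINDOW_SIZE) * WINDOW_SIZE


def aggregate(events):
    """Group (timestamp, value) events into windows and compute aggregates."""
    windows = defaultdict(list)
    for ts, value in events:
        windows[window_start(ts)].append(value)
    return {
        start: {
            "sum": sum(vals),
            "avg": sum(vals) / len(vals),
            "max": max(vals),
            "min": min(vals),
        }
        for start, vals in windows.items()
    }


# Events at 0s, 30s, 65s, 90s fall into two 60-second windows.
events = [(0, 10), (30, 20), (65, 5), (90, 15)]
print(aggregate(events))
```

A real stream processor would do the same grouping incrementally over an unbounded stream, keeping per-window state in memory rather than collecting all events first.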

🥹 If you found this helpful, follow me online here:

0:00 What is Windowing?
3:00 States & Memory
4:00 What Time to Use for Windowing?
6:00 Late Data & Windowing
9:20 Which Time to Use for Your Windowing?
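The "Late Data & Windowing" chapter concerns events that arrive after their window has closed. A common technique (sketched here with a hypothetical `ALLOWED_LATENESS` setting, not code from the video) is a watermark: the maximum event time seen so far, minus an allowed lateness; events whose timestamp falls behind the watermark are dropped or diverted.

```python
ALLOWED_LATENESS = 10  # seconds; hypothetical tolerance for late events


class LateDataFilter:
    """Track a watermark and reject events that are too late.

    The watermark trails the largest event time seen so far by
    ALLOWED_LATENESS seconds; events older than the watermark are rejected.
    """

    def __init__(self):
        self.watermark = float("-inf")

    def accept(self, event_time: float) -> bool:
        # Advance the watermark if this event pushes it forward.
        self.watermark = max(self.watermark, event_time - ALLOWED_LATENESS)
        return event_time >= self.watermark


# Events arrive in this order; 85 is more than 10s behind the max seen (105).
f = LateDataFilter()
print([f.accept(t) for t in [100, 105, 98, 85]])  # → [True, True, True, False]
```

Systems like Flink and Beam expose the same knob directly (e.g. allowed lateness on a window), often routing rejected events to a side output instead of discarding them.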

#streaming #flink #beam #kafka #programming
Comments

Thank you for providing such valuable content; please add more videos like this.

hariomsharma

Hey, just a quick callout: you pasted the edit URL for the document in the description, so people can delete or modify it unexpectedly.

saip

Also, something on knowledge graphs and how search engines use them behind the scenes, maybe building one with something like Python and Neo4j.

akshadk

Great video. What do you think about when to use batch processing vs. stream processing with respect to time window size? For example, if you are generating a report once a day, I think batch processing from raw log data makes more sense, whereas if you are trying to generate aggregated data in near real time, say every 2 minutes, then stream processing makes more sense. At what window size does switching from stream processing to batch processing become beneficial?

joydeepbanerjee

Hi, can you showcase how windowing can be achieved in Python on synthetically generated, randomly timestamped data consumed from Kafka?

anirudhsyal

Hey, can we have a video on designing a system based on a real-time data streaming service, something like Spotify (music streaming), Clubhouse (audio streaming), Hotstar Live (video streaming), or Google Meet (video conferencing)?

akshadk