Build a Reactive Data Streaming App with Python and Apache Kafka | Coding In Motion


In this episode of Coding in Motion we’re going to build a solution that brings some data to life. Join Kris Jenkins in another step-by-step build as he demonstrates how to turn a static data source—YouTube’s REST API—into a reactive system that:

► Uses Python to fetch and process data from a static web API
► Streams that data live, from Python into a Kafka topic
► Processes the incoming source data with ksqlDB, watching for important changes
► Then streams out live, custom notifications via Telegram
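
A minimal sketch of the first two bullets above (not the video's exact code), assuming the requests and confluent-kafka packages; GOOGLE_API_KEY, PLAYLIST_ID, the topic name youtube_videos, and the Confluent Cloud connection settings are placeholders:

import json
import requests
from confluent_kafka import Producer

GOOGLE_API_KEY = "<your-google-api-key>"
PLAYLIST_ID = "<your-playlist-id>"

def fetch_playlist_items(playlist_id, page_token=None):
    # Generator: yield every item in the playlist, following nextPageToken
    # transparently so callers never have to think about paging.
    payload = requests.get(
        "https://www.googleapis.com/youtube/v3/playlistItems",
        params={"key": GOOGLE_API_KEY, "playlistId": playlist_id,
                "part": "contentDetails", "pageToken": page_token},
    ).json()
    yield from payload["items"]
    if "nextPageToken" in payload:
        yield from fetch_playlist_items(playlist_id, payload["nextPageToken"])

def fetch_video(video_id):
    # Look up the title and view/like/comment counts for a single video.
    payload = requests.get(
        "https://www.googleapis.com/youtube/v3/videos",
        params={"key": GOOGLE_API_KEY, "id": video_id,
                "part": "snippet,statistics"},
    ).json()
    return payload["items"][0]

producer = Producer({
    "bootstrap.servers": "<confluent-cloud-bootstrap-server>",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<cluster-api-key>",
    "sasl.password": "<cluster-api-secret>",
})

for item in fetch_playlist_items(PLAYLIST_ID):
    video_id = item["contentDetails"]["videoId"]
    video = fetch_video(video_id)
    record = {
        "video_id": video_id,
        "title": video["snippet"]["title"],
        "views": int(video["statistics"].get("viewCount", 0)),
        "likes": int(video["statistics"].get("likeCount", 0)),
        "comments": int(video["statistics"].get("commentCount", 0)),
    }
    # Key by video_id so every update for a given video lands in the same partition.
    producer.produce("youtube_videos", key=video_id, value=json.dumps(record))
    producer.poll(0)  # serve delivery callbacks and keep the local queue drained

producer.flush()

Run this on a schedule (cron, or a loop with a sleep) and each pass appends the latest statistics for every video to the topic.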

LEARN MORE

TIMESTAMPS
00:00 Intro
00:27 What Are We Building?
01:24 Setting Up A Basic Python Program
02:57 Planning Our Approach
04:00 Fetching Data From Google ("So let's do that.")
07:28 Handling Paging With Python Generators
17:39 Fetching Specific Video Data
22:10 Setting Up A Kafka Cluster
24:26 Defining A Persistent Data Stream
26:03 Setting Up The Python Kafka Library
31:27 Serializing and Storing Our Data
35:02 Detecting Stream Changes With ksqlDB
39:59 Creating A Telegram Alert Bot
43:42 Setting Up An HTTP Sink Connector
46:58 Defining And Triggering The Alerts
50:59 Retrospective
53:02 Outro
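
The ksqlDB and Telegram steps can be sketched the same way. The statements below are illustrative, not the video's exact ones: they register the topic as a stream and build a table that keeps the last two like counts per video via LATEST_BY_OFFSET, which is what lets you spot a change. They are submitted over ksqlDB's REST API; the endpoint and API credentials come from your Confluent Cloud ksqlDB cluster (see the tip in the comments below).

import requests

KSQLDB_ENDPOINT = "https://<your-ksqldb-endpoint>"
KSQLDB_AUTH = ("<ksqldb-api-key>", "<ksqldb-api-secret>")

STATEMENTS = """
CREATE STREAM youtube_videos (
    video_id VARCHAR KEY,
    title    VARCHAR,
    views    BIGINT,
    likes    BIGINT,
    comments BIGINT
) WITH (KAFKA_TOPIC='youtube_videos', VALUE_FORMAT='JSON');

CREATE TABLE youtube_changes AS
    SELECT video_id,
           LATEST_BY_OFFSET(title)    AS title,
           LATEST_BY_OFFSET(likes, 2) AS likes_history
    FROM youtube_videos
    GROUP BY video_id;
"""

# Submit both statements in one request to ksqlDB's /ksql endpoint.
response = requests.post(
    f"{KSQLDB_ENDPOINT}/ksql",
    auth=KSQLDB_AUTH,
    json={"ksql": STATEMENTS, "streamsProperties": {}},
)
response.raise_for_status()
print(response.json())

From there, a query that filters rows where the two entries in likes_history differ gives you the change events, and the video wires those to Telegram with an HTTP Sink Connector that POSTs each record to the Telegram Bot API (typically the sendMessage endpoint).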

ABOUT CONFLUENT

#streamprocessing #python #apachekafka #kafka #confluent

COMMENTS

Rare to come across a session on YouTube where the instructor is this clean in their delivery and there's a one-to-one match between every instruction and the learner's experience (at least as of this comment). This was a treat!

safraz.rampersaud

First one of these I’ve watched. Beautifully explained, with all the required detail, really bringing it to life. Great work!

jonmaddison

Fantastic video -> event-driven architecture, data pipeline, notifications, Kafka, Telegram, code, best practices, and humour all in a single package🙂

ssdey

This was brilliant, perfect and fun. What a clear instructor and perfect way to explain and introduce Kafka. And that Python generator solution was great.

TannerBarcelos

love that you do everything in the terminal, no need for an IDE. Respect 😆

AnnChu-tbhp

Outstanding and extremely clean explanation

PritamBaral-jm

I liked this video very much. Just the relevant parts, leaving out all the fluff but with a lot of humour!

MilcoNuman

Great video, great energy, very didactic. I really enjoyed every minute of the video. Also, the way you talk transmits really nice vibes.

eduardogpisco

Thank you so much for this video. Made getting acquainted with Kafka as a beginner a pleasure.

ishaanme

Fabulous video. Very well taught, coded and explained.

____prajwal____

Finally, I have found the best video about Kafka. Thank you for this video

vohoang

Great video! I really appreciated the smooth and intuitive coding process. If you could, please consider refactoring for further improvements. In the meantime, I'll continue exploring Kafka Streams and the various applications that can be developed using it. Cheers! :)

balanced-living

Thanks for the session Kris, very interesting and touching on many topics: a lot of functionality with so little code!
Looking forward to seeing the next refactoring video.

Congrats!

viccos

This is the first video that I have watched on your channel, and I just loved it, especially because I have started learning Kafka lately. I would love to know how we could deploy these kinds of applications.

rohitlohia

I would LOVE to see a vim tutorial from Kris!!

codewithcarter

Loved the way you explained each and every part of the process. Hope you get a notification on your bot!! 😂😊

RobertoDelgadoDIY

Would love to see the higher-order function refactoring, and to see whether any more neat features have been added to Confluent Cloud since this video was made. Retrospectives in action ;)

freefortravelsshorts

Such a great video in less than one hour

danielruiz

This is very well explained!! I really enjoyed it

gramundi

For those who cannot find API Credentials:
1. Select your environment (probably 'default')
2. You'll see the "endpoint" in the bottom right corner. This is what you're looking for.

bernasiakk