From Zero to Hero with Kafka Connect

Integrating Apache Kafka with other systems in a reliable and scalable way is often a key part of a streaming platform. Fortunately, Apache Kafka includes the Connect API that enables streaming integration both in and out of Kafka. Like any technology, understanding its architecture and deployment patterns is key to successful use, as is knowing where to go looking when things aren't working.
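To make that concrete, here is a minimal sketch of what driving Kafka Connect looks like: connectors are just configuration submitted to the worker's REST API (port 8083 by default). The connector names, classes, and connection details below are illustrative assumptions, not the exact configs used in the demo.

# Stream data *into* Kafka from a database (source connector) -- illustrative config
curl -X PUT http://localhost:8083/connectors/source-jdbc-demo/config \
  -H "Content-Type: application/json" \
  -d '{
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:mysql://mysql:3306/demo",
        "connection.user": "connect_user",
        "connection.password": "asgard",
        "mode": "bulk",
        "topic.prefix": "mysql-"
      }'

# Stream data *out of* Kafka to Elasticsearch (sink connector) -- illustrative config
curl -X PUT http://localhost:8083/connectors/sink-elastic-orders/config \
  -H "Content-Type: application/json" \
  -d '{
        "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
        "connection.url": "http://elasticsearch:9200",
        "topics": "orders",
        "key.ignore": "true",
        "schema.ignore": "true"
      }'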

This talk covers:
* Key design concepts within Kafka Connect
* Deployment modes
* Live demo
* Diagnosing and resolving common issues encountered with Kafka Connect
* Single Message Transforms (see the config sketch after this list)
* Deployment of Kafka Connect in containers
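As a taste of Single Message Transforms, here is a hedged sketch of chaining two of the stock Kafka transforms onto the Elasticsearch sink from the sketch above; the connector, topic, and field names are illustrative assumptions, not from the demo.

# Two built-in Single Message Transforms chained onto a sink connector:
# one masks a field, the other stamps the Kafka offset and partition onto
# each message before it is written to Elasticsearch.
curl -X PUT http://localhost:8083/connectors/sink-elastic-orders/config \
  -H "Content-Type: application/json" \
  -d '{
        "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
        "connection.url": "http://elasticsearch:9200",
        "topics": "orders",
        "key.ignore": "true",
        "schema.ignore": "true",
        "transforms": "maskCard,addMeta",
        "transforms.maskCard.type": "org.apache.kafka.connect.transforms.MaskField$Value",
        "transforms.maskCard.fields": "credit_card_number",
        "transforms.addMeta.type": "org.apache.kafka.connect.transforms.InsertField$Value",
        "transforms.addMeta.offset.field": "_offset",
        "transforms.addMeta.partition.field": "_partition"
      }'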

⏱ Time codes:

00:00 What is Kafka Connect?
03:38 Demo streaming data from MySQL into Elasticsearch
11:43 Configuring Kafka Connect
12:33 👉 Connector plugins
13:33 👉 Converters
13:53 👉 Serialisation and Schemas (Avro, Protobuf, JSON Schema)
17:13 👉 Single Message Transforms
19:43 👉 Confluent Hub
19:51 Running Kafka Connect
20:24 👉 Connectors and Tasks
21:29 👉 Workers
21:56 👉 Standalone Worker
22:50 👉 Distributed Worker
23:10 👉 Scaling Kafka Connect
24:42 Kafka Connect on Docker
26:17 Troubleshooting Kafka Connect
27:56 👉 Dynamic Log levels in Kafka Connect
28:48 👉 Error handling and Dead Letter Queues
32:16 Monitoring Kafka Connect
32:59 Recap & Resources

--

☁️ Confluent Cloud ☁️

--

Comments

Hi Robin,
I never write comments on YouTube videos, but I deeply want to thank you for all your work!

thomaskaminski

Hi Robin,

I am a software engineer at a startup. Last year we built a pipeline to sync our Postgres data to Elasticsearch and Cassandra. It was all custom Java code with a lot of operational handling. Thank you for this video; I am planning to use Connect for those pipelines.

vivekshah

Your examples are always very well chosen. Thanks.

lossoth

Hi Robin, I am a new subscriber and fan here.

rajeshantony

Hi Robin, thanks for this video. I wonder whether 'mariadb-jdbc-connect' is available in this project. Thanks :)

배성현-sp

Thanks Robin. I have a question on the plugin.path you have given while installing the connector. Where does that path come from? Can I give any path? Where can I find the path to mention in the Dockerfile?

rishi
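(For reference, a hedged sketch of how plugin.path is commonly wired up when running Connect in Docker; the image tag, paths, topic names, and broker address are illustrative assumptions, not from the video.)

# plugin.path is a worker property listing the directories Connect scans for
# connector plugins. In the Confluent images it is set via the
# CONNECT_PLUGIN_PATH environment variable; plugins installed with
# confluent-hub land under /usr/share/confluent-hub-components by default,
# which is why that directory is included below.
docker run -d --name kafka-connect -p 8083:8083 \
  -e CONNECT_BOOTSTRAP_SERVERS="broker:9092" \
  -e CONNECT_REST_ADVERTISED_HOST_NAME="kafka-connect" \
  -e CONNECT_GROUP_ID="kafka-connect-group" \
  -e CONNECT_CONFIG_STORAGE_TOPIC="_connect-configs" \
  -e CONNECT_OFFSET_STORAGE_TOPIC="_connect-offsets" \
  -e CONNECT_STATUS_STORAGE_TOPIC="_connect-status" \
  -e CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR=1 \
  -e CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR=1 \
  -e CONNECT_STATUS_STORAGE_REPLICATION_FACTOR=1 \
  -e CONNECT_KEY_CONVERTER="org.apache.kafka.connect.storage.StringConverter" \
  -e CONNECT_VALUE_CONVERTER="org.apache.kafka.connect.json.JsonConverter" \
  -e CONNECT_PLUGIN_PATH="/usr/share/java,/usr/share/confluent-hub-components" \
  confluentinc/cp-kafka-connect:7.5.0

# The same install line can go in a Dockerfile RUN instruction; restart the
# worker afterwards so it picks up the new plugin.
docker exec kafka-connect \
  confluent-hub install --no-prompt confluentinc/kafka-connect-jdbc:10.7.4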

Hi Robin, is there a source connector for Adobe, or can we use a JSON connector as long as the streaming data is in JSON format?

esbee

Thanks Robin - from your newest fan and subscriber :) I'm really loving all the information coming from Confluent. Doing a top job. We are getting serious about implementing a solution centralized on Kafka (on a limited budget) - I guess there are just a lot of different ways and means. I will post on the community a bit later, but just wondering, off the top of your head: if you were combining web logs from multiple websites of a similar nature (the db schema is the same - although as per your suggestion I will look into Avro), would you combine all users into one topic (perhaps tagging where they originated), or set up a topic for each website? Ultimately queries are centralized on username, so origination is just FYI. Somewhere I heard/read about creating a topic per user, but this didn't seem right (for 10ks of users).

marcuspaget

Hello Robin, I connected Azure SQL with Kafka Connect by giving the table name, host name, and server name, but I am not able to specify the db schema name anywhere. Is there any way to specify the schema name? Without specifying the schema name it is creating a new table in the db.

aparnas
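(For reference, if this is the Confluent JDBC sink connector, one hedged possibility is schema-qualifying the target table via table.name.format; the connector name, connection details, topic, and schema below are illustrative assumptions.)

# Sketch of a JDBC sink connector that writes into a specific schema by
# schema-qualifying table.name.format (${topic} expands to the topic name).
curl -X PUT http://localhost:8083/connectors/sink-azuresql-orders/config \
  -H "Content-Type: application/json" \
  -d '{
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "connection.url": "jdbc:sqlserver://myserver.database.windows.net:1433;databaseName=mydb",
        "connection.user": "connect_user",
        "connection.password": "secret",
        "topics": "orders",
        "auto.create": "false",
        "table.name.format": "dbo.${topic}"
      }'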

Hi Robin,

Thanks for the amazing videos. We are implementing Kafka in our project, and whenever I get stuck your videos help a lot in clearing up the concepts and issues.

I have a small conceptual doubt.

Do Kafka and Kafka Connect support ENUM datatypes? We are facing an error along the lines of "type cast the data type" when syncing data from the source table to the sink table.

lohitraja

Hi Robin, I am facing an issue when creating a topic in Kafka: the decimal data type is stored as bytes. Is there any way to solve that?

AnkitSingh-dkqb
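(For reference, if the data is coming in through the JDBC source connector, one commonly used option is numeric.mapping; everything else in this sketch - names, credentials, connection URL - is an illustrative assumption.)

# numeric.mapping=best_fit asks the JDBC source connector to map
# NUMERIC/DECIMAL columns to plain INT/FLOAT/DOUBLE types where it can,
# instead of the byte-encoded Decimal logical type.
curl -X PUT http://localhost:8083/connectors/source-jdbc-demo/config \
  -H "Content-Type: application/json" \
  -d '{
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:mysql://mysql:3306/demo",
        "connection.user": "connect_user",
        "connection.password": "asgard",
        "mode": "incrementing",
        "incrementing.column.name": "id",
        "topic.prefix": "mysql-",
        "numeric.mapping": "best_fit"
      }'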

I get "The key format 'AVRO' is not currently supported" when using FORMAT='AVRO' in KSQL.

armenchakhalyan

In distributed mode, the Connect worker sometimes throws an error saying that the status.storage.topic cleanup.policy should be set to compact. I'm wondering why it throws that error only occasionally. And would setting log.cleanup.policy to compact on the Kafka broker fix the issue?

mitanshukr
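(For reference, a hedged sketch of making an existing Connect internal topic compacted; the topic name and broker address are illustrative assumptions - use whatever status.storage.topic is set to in the worker config.)

# Set cleanup.policy=compact on the status topic itself
# (the tool is kafka-configs.sh in a plain Apache Kafka install).
kafka-configs --bootstrap-server broker:9092 \
  --entity-type topics --entity-name connect-status \
  --alter --add-config cleanup.policy=compact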

Can you do a video on how to integrate MQ with a Kafka topic through the ibmmq source connector?

rinuellis

Can you share any documents on using MSK with sink connectors?

bavisettijyothsna

I'm getting
ERROR 1049 (42000): Unknown database 'demo'
while trying to connect to MySQL...

miristegal

Hey Robin, thanks for this video. But could you please guide us first on how to start Apache Kafka Connect, and how to check whether it is already running?

abhinavkumar-sefd
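(For reference, a hedged sketch of starting a distributed worker with the scripts that ship with Apache Kafka and checking it over its REST API; the paths assume a stock Kafka download.)

# Start a distributed Connect worker (adjust paths to your installation).
./bin/connect-distributed.sh config/connect-distributed.properties

# In another terminal: the root endpoint returns the worker version, and
# /connectors lists whatever connectors it is currently running.
curl -s http://localhost:8083/
curl -s http://localhost:8083/connectors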

Hi Robin,
How can we include the JSON schema in the message when a field is an array of objects? I don't have the option to use Avro.

rkravinderkumar
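(For reference, a hedged sketch of the schema/payload envelope the JsonConverter expects when schemas.enable=true, for a value whose "items" field is an array of structs; the field names, topic, and broker address are illustrative assumptions.)

# jq -c collapses the message to the single line kafka-console-producer needs.
jq -c . <<'EOF' | kafka-console-producer --bootstrap-server broker:9092 --topic orders
{
  "schema": {
    "type": "struct",
    "fields": [
      { "field": "order_id", "type": "int32" },
      { "field": "items", "type": "array",
        "items": {
          "type": "struct",
          "fields": [
            { "field": "sku", "type": "string" },
            { "field": "quantity", "type": "int32" }
          ]
        }
      }
    ],
    "optional": false
  },
  "payload": {
    "order_id": 42,
    "items": [
      { "sku": "ABC-1", "quantity": 2 },
      { "sku": "DEF-2", "quantity": 1 }
    ]
  }
}
EOF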

I hope it isn't too late to thank you, Robin!

radityoperwianto