Source MySQL table data to Kafka | Build JDBC Source Connector | Confluent Connector | Kafka Connect

The Kafka Connect JDBC Source connector allows you to import data from any relational database with a JDBC driver into an Apache Kafka® topic.

This video builds a JDBC Source connector to stream MySQL table data into a Kafka topic in real time. It walks through creating the source connector config and then deploying it to Confluent Platform using the Control Center GUI.
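
For reference, the connector config built in the video is a small set of key/value pairs. A minimal sketch is shown below, assuming a local MySQL database and using placeholder connection details, table name, and topic prefix rather than the exact values from the video:

{
  "name": "mysql-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://localhost:3306/testdb",
    "connection.user": "kafka_user",
    "connection.password": "********",
    "table.whitelist": "customers",
    "mode": "incrementing",
    "incrementing.column.name": "ID",
    "topic.prefix": "mysql-",
    "poll.interval.ms": "5000"
  }
}

In Control Center the same key/value pairs go into the connector setup form; when talking to the Kafka Connect REST API directly, this JSON can be POSTed to the worker (by default http://localhost:8083/connectors).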

Like | Subscribe | Share
Comments

Dear Vishal,
Thank you for your effort & time to produce & upload these tutorials.

oyeyemirafiuowolabi

Nice video bro. It worked for me. One quick question for you:
How do I run a query like this:
select id, CUST_LN_NBR FROM flex_activity limit 1;

In this query I am using the LIMIT option, and because of that it fails. If I use a simple query without LIMIT, it works fine.

The error I am getting:

You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'WHERE `ID` > -1 ORDER BY `ID` ASC' at line 1

This is not only with the LIMIT option; it happens whenever I apply any filter.

manindersingh
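
A note on the error above: the WHERE `ID` > -1 ORDER BY `ID` ASC fragment is appended by the connector itself. When the JDBC source connector runs in incrementing or timestamp mode with a custom query, it adds its own WHERE/ORDER BY clause to the end of that query, so a query that already ends in LIMIT or a filter becomes invalid SQL. A commonly suggested workaround is to wrap the query in a derived table so the connector's clause attaches to the outer SELECT; the sketch below reuses the table and column names from the comment and is illustrative, not tested against every MySQL version:

"query": "SELECT * FROM (SELECT id, CUST_LN_NBR FROM flex_activity) AS t",
"mode": "incrementing",
"incrementing.column.name": "id"

(When "query" is set, "table.whitelist" is left out, since the two options are mutually exclusive.)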

Thanks for the great tutorial. Did you have to create the Avro schema, or does it get generated?

resam

Awesome!! Can you also upload a video on a sink connector to an Oracle database or any RDBMS?

amankumar-fnr

Very helpful video. I see that 'timestamp.column.name' should have been TXN_DATE (as flashed up in the video), otherwise updates won't be captured. It would have been nice to see this working at the end, in addition to adding new records.

ttc
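
For anyone reproducing this: capturing updates as well as inserts generally means running the connector in timestamp+incrementing mode. A minimal sketch of the relevant keys, using the TXN_DATE column mentioned above (your own column names will differ):

"mode": "timestamp+incrementing",
"timestamp.column.name": "TXN_DATE",
"incrementing.column.name": "ID"

With this mode the timestamp column lets the connector pick up modified rows as well as new ones, and the incrementing column gives each row a unique position in the stream.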

Hi Vishal,
I am facing the issue "Invalid connector configuration: There are 2 fields that require your attention" while connecting to a MySQL DB. Does this require MySQL JDBC configuration as well? If yes, how can we do that?

Rajeshkumar-hshi

Hi Vishal, could you please help me with pushing Confluent topic data into a ScyllaDB table...

MohdMahebub

Hi Vishal, thanks for sharing the videos. Please let me know where I can check the errors/logs when connectors fail, and also the logs/errors for any issues in processing data. Please share the paths.

nareshkyv
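
On a self-managed Confluent Platform install, the quickest way to see why a connector failed is usually the Kafka Connect REST API. The commands below assume the Connect worker listens on its default port 8083 and that the connector is named mysql-jdbc-source (substitute your own connector name):

curl -s http://localhost:8083/connectors/mysql-jdbc-source/status
curl -s http://localhost:8083/connectors/mysql-jdbc-source/tasks/0/status

A FAILED task's status includes the stack trace; the Connect worker's own log file (its location depends on how Connect was started and on its log4j configuration) carries the full detail.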

Hi Vishal, thank you very much for this! :)
I have one question: how did you install the connector plugin for the sink?

carlaguelpa
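
In case it helps others: on a self-managed Confluent Platform installation, the JDBC connector plugin (which bundles both the source and the sink) is typically installed from Confluent Hub. A sketch, assuming the confluent-hub CLI is on the path and the default component directory is used:

confluent-hub install confluentinc/kafka-connect-jdbc:latest

After installing, restart the Connect worker so the new plugin is picked up. Note that the MySQL JDBC driver itself is not bundled with the connector, so the MySQL Connector/J jar usually has to be dropped into the connector's lib directory (or onto the worker's classpath) separately.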

I'm a beginner with Kafka and DB concepts. What exactly is a dialect in this case?

krushnapardeshi

Great video! Can you help with connecting to Snowflake as a source?

mirzauzair

Please, what is the best method or connector available to pull or ingest data that normally gets updated or changed, for example bank account statements?
Thanks.

oyeyemirafiuowolabi

Hey, I tried to connect to PostgreSQL and ended up with the error below in the logs:
Caused by: Error serializing Avro message
Caused by: Failed to serialize Avro data from topic <topic_name>
Any help?

pavankumarmantha

Hi Vishal,
Could you please help me with a Snowflake source connector (via JDBC) to unload the data to Confluent Kafka?
If there's any sample code, please share the GitHub details.

ravirty

Hi Vishal, when I tried to upload the connector I faced the issue "invalid connector class". I need advice. Thanks.

mdicopriatama

Hi,
Using your config, the connector gets connected, but its status shows FAILED and no data is fetched from the DB. Can I know what the reason might be?

muralidharan

Could you please do a video on loading tables into their respective topics (one topic per table), using SQL Server as a source, so that every time a table undergoes a DML action the new event is sent to the Confluent platform?

jinraven

My Confluent installation doesn't have the JDBC source connector by default. Please help me with how to install it.

sandeepravitej

Sir, nice video, but you could at least upload the code link.

shuchikumari

Can you please guide me on how to do this locally instead of on Confluent Cloud?
What I want to do is connect a database with Kafka Connect so that the data gets inserted into the Kafka cluster.

akshatkhandelwal