Kafka Tutorial - Callback & Acknowledgment

Spark Programming and Azure Databricks ILT Master Class by Prashant Kumar Pandey - fill out the Google form for course inquiries.
-------------------------------------------------------------------
Data Engineering is one of the highest-paid jobs of today.
It is set to remain among the top IT skills for years to come.

Are you in database development, data warehousing, ETL tools, data analysis, SQL, or PL/SQL development?
I have a well-crafted success path for you.
I will help you get prepared for the data engineer and solution architect role depending on your profile and experience.
We created a course that takes you deep into core data engineering technology and helps you master it.

If you are a working professional aiming to:
1. Become a data engineer.
2. Change your career to data engineering.
3. Grow your data engineering career.
4. Get the Databricks Spark certification.
5. Crack Spark data engineering interviews.

ScholarNest is offering a one-stop integrated Learning Path.
The course is open for registration.

The course delivers an example-driven approach and project-based learning.
You will practice the skills using MCQs, coding exercises, and capstone projects.
The course comes with the following integrated services.
1. Technical support and Doubt Clarification
2. Live Project Discussion
3. Resume Building
4. Interview Preparation
5. Mock Interviews

Course Duration: 6 Months
Course Prerequisite: Programming and SQL Knowledge
Target Audience: Working Professionals
Batch start: Registration Started
Fill out the form below for more details and course inquiries.

--------------------------------------------------------------------------
Best place to learn Data engineering, Bigdata, Apache Spark, Databricks, Apache Kafka, Confluent Cloud, AWS Cloud Computing, Azure Cloud, Google Cloud - Self-paced, Instructor-led, Certification courses, and practice tests.
========================================================

SPARK COURSES
-----------------------------

KAFKA COURSES
--------------------------------

AWS CLOUD
------------------------

PYTHON
------------------

========================================
We are also available on the Udemy Platform
Check out the below link for our Courses on Udemy

=======================================
You can also find us on Oreilly Learning

=========================================
Follow us on Social Media

========================================
Comments

Want to learn more Big Data technology courses? You can get lifetime access to our courses on the Udemy platform. Visit the link below for discounts and coupon codes.

ScholarNest

Your entire series is short and crisp; anyone can learn Kafka within a day. Thanks, buddy.

mkvjaipur

These are really very good videos: in-depth and explained very clearly. Thank you for providing these.

dasikalyan

Is there a tutorial on acknowledgment in the Kafka consumer? If yes, can you share the link? TIA

ruchibhagwat

Hello friends, I would like to thank Mr Mayank Pandey for creating this awesome, flawless, knowledge-rich series with a simple and easy-to-understand dialogue delivery. Without hesitation, this is the best Kafka series (not to mention he has more, equally good series of courses/videos on his channel). Why I mention this: participants who watched, commented on, and subscribed to his channel asked some very interesting questions and got very knowledgeable answers. So not only watch the videos but also read the questions and answers in the comment section. So hold tight and have a very interesting learning journey - and it's free. A great thank you to Mr Mayank Pandey... keep doing it.

skkkks

Best tutorial, thanks for your effort to educate us. Thanks a lot. Please let me know if there are Spark tutorials.

AlokSingh-fjnm

Is there a way to publish an event on the topic saying all messages are published, so our consumer just polls until that event is available and only then starts consuming?

ronakjain

Can I use this code over UDP when I want to make it reliable like TCP?

shareefalshareef

Thanks for such a useful demo. I have a problem getting the offset number after successfully publishing a message. When I send the message using the callback class, I check whether any error occurred, and using RecordMetadata I log the offset id - but I always get -1 as the offset for all the messages. Could you please suggest a fix?

harikrishnathariboyina
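An offset of -1 in the callback usually means the broker never reported one: with acks=0 the producer does not wait for a broker response, so RecordMetadata.offset() stays at -1 even for successful sends. A sketch of a callback that makes this visible, assuming the Java client (the class name is a hypothetical illustration):

```java
import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.RecordMetadata;

// Hypothetical callback illustrating the -1 offset case.
public class OffsetLoggingCallback implements Callback {
    @Override
    public void onCompletion(RecordMetadata metadata, Exception exception) {
        if (exception != null) {
            System.err.println("Send failed: " + exception.getMessage());
        } else if (metadata.hasOffset()) {
            System.out.println("Stored at offset " + metadata.offset());
        } else {
            // Typical when acks=0: the broker response is never read,
            // so no offset is available and offset() returns -1.
            System.out.println("Send completed, but no offset reported (-1)");
        }
    }
}
```

If you need the real broker-assigned offset in the callback, set acks=1 or acks=all on the producer.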

Hello Sir, I read somewhere: "For a producer we have three choices. On each message we can (1) wait for all in-sync replicas to acknowledge the message, (2) wait for only the leader to acknowledge the message, or (3) not wait for acknowledgement."
So is the callback usage in your video the 2nd option mentioned above?

arkadey
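For reference, those three choices map onto the producer's `acks` configuration in the Java client, and the callback works with any of them - it simply reports whatever acknowledgment (or error) the configured level produces. A minimal sketch; the broker address and topic name are placeholder assumptions:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class AcksDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // (1) wait for all in-sync replicas: acks=all
        // (2) wait for the leader only:      acks=1
        // (3) do not wait at all:            acks=0
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("demo-topic", "key", "value"),
                (metadata, exception) -> {
                    if (exception != null) {
                        exception.printStackTrace();
                    } else {
                        System.out.println("acked at offset " + metadata.offset());
                    }
                });
        } // close() flushes pending sends, so the callback fires before exit
    }
}
```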

Sir, can you help with Python, as it depends on poll? Also, does acks need to be set to all, 1, or 0 for a callback to work properly?

SP-xisu

Suppose the Kafka server is down - will the code catch the exception and report that the server is down?

SatishKumar-yiqi
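When the broker is unreachable, send() itself normally does not throw; the failure surfaces asynchronously in the callback, typically as a TimeoutException once delivery.timeout.ms expires. A sketch with shortened timeouts so the failure shows up quickly (broker address and topic name are placeholders):

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class BrokerDownDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Fail faster than the 2-minute default so the error is visible sooner.
        props.put(ProducerConfig.REQUEST_TIMEOUT_MS_CONFIG, "5000");
        props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, "10000");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("demo-topic", "hello"),
                (metadata, exception) -> {
                    if (exception != null) {
                        // Broker down: usually a TimeoutException after retries.
                        System.err.println("Delivery failed: " + exception);
                    }
                });
            producer.flush(); // blocks until the callback has fired
        }
    }
}
```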

Hey, what do you suggest for an AWS Lambda process writing to Kafka with no scope for message loss, since the callback will keep the Java function alive until it gets a response?

virenme

Hi Sir,

I have one question: if one message can be consumed by multiple consumers, why do we need to keep all those consumers in different consumer groups?
Can you please explain?

nitusharma

Great tutorial. Is there a way to receive a callback only in case of error (something like onError), instead of receiving a callback for everything (onCompletion)? Thanks

ragavkb
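The Java client exposes only onCompletion, but an onError-style hook can be emulated by returning early on success; a sketch (the class name is hypothetical):

```java
import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.RecordMetadata;

// Hypothetical callback that reacts only to failed sends.
public class ErrorOnlyCallback implements Callback {
    @Override
    public void onCompletion(RecordMetadata metadata, Exception exception) {
        if (exception == null) {
            return; // success: stay silent, emulating an onError-only hook
        }
        String where = (metadata != null)
                ? metadata.topic() + "-" + metadata.partition()
                : "unknown partition";
        System.err.println("Send to " + where + " failed: " + exception.getMessage());
    }
}
```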

Can you please share some real-life usages/examples of Kafka synchronous send?

amitsingh

Hi,
How will a multi-threaded producer work when you have a single source of streaming data? Let's say we have 3 producer threads. How will each producer know where to start reading the data from? We don't want each producer to pick up the same data (duplicates). Please let me know how we can achieve this. Thanks

mikelongjam