Building a distributed realtime stream processing system

An overview of how to build, scale, and monitor a stream processing pipeline, as presented at Velocity Conference London 2018.

The future of software is distributed. If you run a backend service of any consequence, you’re probably dealing with some sort of distributed system. Stream processing applications form the backbone of New Relic’s data pipeline, processing billions of data points a minute. As a result, the company has learned a few useful things about building scalable distributed stream processing systems.

While there are many great tools, such as Kafka and Docker orchestration, upon which to build feature-rich systems, you still need to understand how these building blocks work and how to apply them effectively and reliably at scale. Amy Boyle walks you through building, scaling, and monitoring a stream processing pipeline, drawing on examples from New Relic’s data pipeline.
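By way of illustration (this is not code from the talk), here is a minimal sketch of the consume-process loop at the core of such a Kafka-based pipeline stage, using the Apache Kafka Java client. The broker address, consumer group, topic name, and process helper are all hypothetical placeholders:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class MetricsConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumed broker address and consumer group; replace with your own.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "metrics-processor");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("metrics")); // hypothetical topic name
            while (true) {
                // Poll a batch of records and process each one in turn.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    process(record.value());
                }
            }
        }
    }

    private static void process(String dataPoint) {
        // Placeholder for the real per-record transformation.
        System.out.println("processed: " + dataPoint);
    }
}
```

Scaling a stage like this typically means adding more consumers to the same group so that Kafka rebalances partitions across them, and monitoring typically starts with tracking consumer lag per partition.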