A wide variety of use cases, such as banking, cyber threat detection, and e-commerce, require real-time processing of hundreds of thousands of events per second. This talk details how to build a platform at that scale while maintaining resilience and availability guarantees.
A few years ago I delivered a bespoke software streaming framework that, across 10 nodes, reached the dizzy heights of just 5,000 events per second. Fast forward to 2017 and we're easily achieving 150,000 events per second for pipelines across the same number of nodes. What's more, these solutions scale linearly: double the nodes, double the throughput. The question is, how do we achieve this?
Dan Cook is a Lead Developer at Scott Logic. He led the development of the UK Hydrographic Office's Hadoop-as-a-Service offering, built software streaming frameworks long before the rise of Spark and Storm, and more recently worked on a cyber threat detection platform for BT. When not coding, he can normally be found going up a hill very slowly on a bicycle.