The Criteo Big-Data Platform
Speakers: Thierry Lefort, Corentin Chary, François Visconte, Rémy Saissy, Yann Schwartz
In this presentation we go through some of the challenges we face in our day-to-day work maintaining and scaling the Criteo platform. The platform handles 2 million HTTP requests per second on average, with peaks at 2.6 million. To handle this load we run more than 7,000 front-end servers and 1,515 caching servers; this represents 80 TB of data transferred every day over intercontinental links to a 1,050-server Hadoop cluster.
- Thierry Lefort: Criteo's Data Architecture 10,000ft Overview
- Corentin Chary: Scaling Graphite
- François Visconte: Stream Processing at Criteo with Kafka
- Rémy Saissy: Lessons Learned from Scaling Hadoop
- Yann Schwartz: Challenges of Building a Multi-Datacenter Infrastructure for Fraud Detection
marc.shapiro (at) acm.org