Streaming MySQL Changes to ClickHouse: Designing an End-to-End CDC Pipeline

May 27-29, 2026 • Computer History Museum, California
Date, time, and room will be announced soon.
Many organizations rely on MySQL for operational workloads, while analytics teams increasingly need fast access to fresh data for reporting, experimentation, and data-driven applications. Running complex analytical queries directly on production databases can quickly become a bottleneck and introduce operational risk.
Change Data Capture (CDC) offers a practical solution by streaming database changes in real time to systems designed for analytics.
In this session we will walk through an end-to-end architecture that streams changes from MySQL to ClickHouse using binlog-based CDC. Starting from the MySQL binary log, we will follow the lifecycle of a change event as it is captured, transported through a streaming pipeline, and delivered into ClickHouse for high-performance analytical queries.
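As a concrete reference point, binlog-based CDC requires the MySQL server to write row-level events to its binary log. A minimal sketch of the relevant server-side checks and settings (these are standard MySQL system variables; the specific values shown are illustrative, not the configuration used in the talk):

```sql
-- Verify the binary log is enabled and row-based, as CDC tools require:
SHOW VARIABLES LIKE 'log_bin';        -- expect ON
SHOW VARIABLES LIKE 'binlog_format';  -- expect ROW

-- Illustrative settings, typically placed in my.cnf:
-- server_id        = 1          (must be unique among replicas/CDC clients)
-- log_bin          = mysql-bin
-- binlog_format    = ROW
-- binlog_row_image = FULL       (emit full before/after row images)
```

CDC clients typically connect as replicas, so they also need a user with `REPLICATION SLAVE` and `REPLICATION CLIENT` privileges.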
Beyond the architecture itself, the talk focuses on lessons learned operating this pattern at scale, handling billions of records. We will discuss practical challenges that appear in production environments, including data consistency, pipeline reliability, and operational troubleshooting.
Attendees will learn:
• How MySQL binlog-based CDC works
• How to design a reliable pipeline for streaming database changes
• Data modeling considerations when moving from OLTP to columnar analytics
• Lessons learned operating CDC pipelines at scale
• Common pitfalls and operational challenges in real production environments
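To make the OLTP-to-columnar modeling point concrete: a CDC stream carries inserts, updates, and deletes, while ClickHouse MergeTree tables are append-only. One common pattern (a hypothetical sketch, not necessarily the design presented in the session) is a ReplacingMergeTree table with a version column and a soft-delete flag:

```sql
-- Hypothetical ClickHouse sink for CDC events from a MySQL `orders` table.
-- ReplacingMergeTree keeps, per sorting key, the row with the highest
-- _version during background merges; deleted rows are filtered at query time.
CREATE TABLE orders_cdc
(
    order_id    UInt64,
    customer_id UInt64,
    amount      Decimal(18, 2),
    updated_at  DateTime64(3),
    _version    UInt64,          -- e.g. derived from the binlog position or GTID
    _deleted    UInt8 DEFAULT 0  -- 1 when the source row was deleted
)
ENGINE = ReplacingMergeTree(_version)
ORDER BY order_id;

-- Read the current state; deduplication is eventual, FINAL forces it:
SELECT * FROM orders_cdc FINAL WHERE _deleted = 0;
```

The trade-off behind this design: writes stay cheap append-only inserts, while correctness is deferred to merges (or `FINAL`), which is one of the consistency considerations the talk addresses.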
By the end of the session, participants will understand how to design and operate a modern architecture that keeps MySQL optimized for transactional workloads while enabling real-time analytics with ClickHouse.
Speaker

Javier Zon is a database and platform engineer with more than 20 years of experience working with large-scale data systems and open source technologies. He began his journey in the MySQL ecosystem many years ago and …

