I recently switched my ClickHouse server from daily full backups in S3 to incremental backups.
Daily full backups are useful, but they take up a lot of space. Since most of the data does not change from one day to the next, you only need to store the parts that did change. The process looks like this:
- Get a list of tables
- Freeze each table
- Copy the metadata folder into your frozen backup folder
- Use rsync to make an incremental backup using the `--link-dest` option (both parts are sketched below)
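Concretely, the first three steps can be scripted with clickhouse-client. This is a minimal sketch, not my exact script: it assumes the default data directory of /var/lib/clickhouse and uses a hypothetical backup name of "incremental", so adjust both for your setup.

```sh
#!/bin/sh
set -eu

# Assumptions: default ClickHouse data directory and a backup name of
# "incremental"; adjust both for your installation.
DATA_DIR=/var/lib/clickhouse
BACKUP_NAME=incremental

# Step 1: get a list of tables, skipping the system databases.
# FREEZE only applies to MergeTree-family tables, so filter on the engine.
clickhouse-client --format TSV --query "
    SELECT database, name
    FROM system.tables
    WHERE database NOT IN ('system', 'INFORMATION_SCHEMA', 'information_schema')
      AND engine LIKE '%MergeTree'
" | while read -r db table; do
    # Step 2: freeze each table. FREEZE creates hard-linked snapshots of
    # the data parts under $DATA_DIR/shadow/$BACKUP_NAME/.
    clickhouse-client --query \
        "ALTER TABLE \`$db\`.\`$table\` FREEZE WITH NAME '$BACKUP_NAME'"
done

# Step 3: copy the metadata folder (the CREATE TABLE statements) into the
# frozen backup folder so the schema travels with the data.
cp -a "$DATA_DIR/metadata" "$DATA_DIR/shadow/$BACKUP_NAME/metadata"
```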
I store my incremental backups on rsync.net, but any SSH host, or any path you can mount on Linux, works just as well.
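The last step is the rsync call with `--link-dest`, pointing at the previous day's backup on the remote. The dates, destination, and shadow path below are placeholders matching the assumptions above; `--link-dest` paths are resolved on the receiving side, relative to the destination directory.

```sh
#!/bin/sh
set -eu

# Placeholder dates and destination; any SSH-reachable host works.
YESTERDAY=2025-09-28
TODAY=2025-09-29
DEST=user@rsync.net:backups

# Step 4: incremental upload. Files that are unchanged since yesterday's
# backup are hard-linked on the remote instead of being transferred and
# stored again, so each day only costs the space of the changed parts.
rsync -a --delete \
    --link-dest="../$YESTERDAY" \
    /var/lib/clickhouse/shadow/incremental/ \
    "$DEST/$TODAY/"

# Optionally clear the local shadow folder so the next FREEZE starts
# from a clean slate.
rm -rf /var/lib/clickhouse/shadow/incremental
```

On the very first run there is no previous backup to link against; rsync just warns that the `--link-dest` directory does not exist and copies everything.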
Citation
If you find this work useful, please cite it as:
@article{yaltirakli,
  title   = "ClickHouse incremental backup",
  author  = "Yaltirakli, Gokberk",
  journal = "gkbrk.com",
  year    = "2025",
  url     = "https://www.gkbrk.com/clickhouse-incremental-backup"
}