
For now, handrolled. The idea is to be able to do either `cat *.sql' or just `cat LAST-PART.sql'. I run mysqldump once per large table with a --where="_rowid >= $FROM AND _rowid < $TO" argument, calling mysqldump in a loop with consecutive $FROM and $TO values. It works and gets the job done, but it's not transaction safe.

That `_rowid' is a reserved word in MySQL: it refers to the table's PRIMARY KEY, but only if that key is a single integer column. In the usual case, this means the script doesn't have to know the table's PRIMARY KEY.
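The loop above can be sketched roughly like this (database, table, and range values are illustrative, not from the original setup; the mysqldump invocation is echo'd as a dry run):

```shell
#!/bin/sh
# Chunked-dump sketch: one mysqldump call per _rowid range.
DB=mydb            # assumed database name
TABLE=big_table    # assumed table name
STEP=100000        # rows of _rowid per chunk
MAX=300000         # one past the largest _rowid to dump
FROM=0
while [ "$FROM" -lt "$MAX" ]; do
    TO=$((FROM + STEP))
    # echo'd as a dry run; drop the `echo` to actually write the chunk files,
    # which `cat *.sql` can later reassemble in order.
    echo mysqldump "$DB" "$TABLE" --no-create-info \
        --where="_rowid >= $FROM AND _rowid < $TO" \
        --result-file="$(printf '%s-%010d.sql' "$TABLE" "$FROM")"
    FROM=$TO
done
```

Zero-padding the $FROM value in the file name keeps the chunks in lexicographic order, so a plain glob concatenates them correctly. Note there is no transaction spanning the chunks, which is exactly the "not transaction safe" caveat above.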

Another way would be to use a `rolling checksum' to split the files; the concept is described in http://beeznest.wordpress.com/2005/02/03/rsyncable-gzip/ But you could end up with dump files split in the middle of an SQL statement, which is not very cool.
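A minimal sketch of the rolling-checksum idea at the compression layer (file names are illustrative; --rsyncable requires GNU gzip 1.7+ or a distro-patched gzip). The flag restarts the compressor at content-defined boundaries, so rsync can match unchanged regions between successive compressed dumps:

```shell
# Stand-in for real dump output; in practice this would be `mysqldump ... |`.
printf 'INSERT INTO t VALUES (%d);\n' 1 2 3 > chunk-demo.sql
# Compress with rsync-friendly block boundaries.
gzip --rsyncable -c chunk-demo.sql > chunk-demo.sql.gz
```

Unlike splitting the raw .sql files, this keeps each dump whole, so there is no risk of cutting an SQL statement in half.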


