Monday, July 23, 2012

split large sql dump (the hard way)

I exported a table from a MySQL database using phpMyAdmin, and got a 45 MiB dump. After bzip2-compressing it, its size dropped below 2 MiB, the file size limit for importing.
After that, I deleted the table from the original database (I know, I should have renamed it ... too late).

When I tried to import the dump back into the database, I got a 500 error from the server. After a bit of experimenting, it turned out that the file to be imported has to be under 2 MiB uncompressed!

So I was stuck with a huge dump file containing a single giant INSERT statement.
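For context, this is roughly what such a dump looks like: phpMyAdmin’s extended-insert format puts every row on its own line, comma-separated, all inside one statement. The table and column names below are made up for illustration:

    SET SQL_MODE = "NO_AUTO_VALUE_ON_ZERO";
    ...
    -- schematic only; the real dump had my table's columns here
    INSERT INTO `tablename` (`id`, `data`) VALUES
    (1, 'first row'),
    (2, 'second row'),
    ...
    (99999, 'last row');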

Here is what I did:

  • I searched Stack Overflow for ‘split large text file’, read the Q&A Easy Way to Split a Large Text File?, and facepalmed, since I had not known about the standard Unix utility split
  • I cut the “header” of the dump (SET SQL_MODE... etc, and the INSERT ... VALUES line) out into a file named sql_header, so it wouldn’t end up duplicated in the first fragment
  • split --line-bytes 2000000 tablename.sql
    • this way no fragment exceeds 2,000,000 bytes (safely under the 2 MiB limit) and no line gets split in the middle
  • for i in xa? ; do cat sql_header $i | sed '$ s/,$/;/' > tablename_$i.sql ; done
    • this way the trailing comma at the end of each fragment’s last line becomes a ;, so every fragment is a complete INSERT statement (the whole procedure is sketched as one script after this list)
    • note that in sed the address $ selects the last line of the input for the operation
  • and then I imported the fragments in alphabetical order.
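Putting it all together, here is the whole procedure as one script. This is a sketch rather than exactly what I typed: it assumes the dump is named tablename.sql and that phpMyAdmin put the VALUES keyword at the end of its own line (it does with the default extended-insert formatting, but check your dump first).

    #!/bin/sh
    # Split a phpMyAdmin dump containing one big extended INSERT
    # into fragments that each fit under the 2 MiB upload limit.

    DUMP=tablename.sql

    # Cut the dump in two: everything up to and including the
    # "INSERT ... VALUES" line is the header, the rest is data rows.
    # Keeping them separate avoids duplicating the header in the
    # first fragment.
    sed -n '1,/VALUES$/p' "$DUMP" > sql_header
    sed '1,/VALUES$/d' "$DUMP" > sql_body

    # Split the data rows into pieces of at most 2,000,000 bytes,
    # never breaking a line in the middle.
    split --line-bytes=2000000 sql_body

    # Prepend the header to each fragment and replace the trailing
    # comma on its last line with a semicolon, so each fragment is a
    # complete INSERT statement. (The final fragment already ends in
    # ';', so the substitution simply doesn't match there.)
    # Note: xa? only covers the first 26 fragments, which was enough
    # for a ~45 MiB dump cut into ~2 MB pieces.
    for i in xa? ; do
        cat sql_header "$i" | sed '$ s/,$/;/' > "tablename_$i.sql"
    done

After that, importing tablename_xaa.sql, tablename_xab.sql, and so on in order recreates the table, since split names the pieces so that alphabetical order matches the original order.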