Copying a large MySQL table in the background
I need to re-create a big (80 million records) MySQL table: the new table needs an additional index, and using ALTER TABLE is slow / runs out of RAM.
I've tried running a MySQL script containing INSERT INTO table2 SELECT * FROM table1 in a screen instance so I can detach and exit the SSH session, but for some reason it only did the first 20 million rows (which took a couple of hours) and then randomly stopped. Does this method require a lot of memory? How else can I do this fast and in the background?
Dump the table to a flat file:
mysqldump -u [username] -p -t -T /path/to/directory [database] [table] --fields-terminated-by=,
Note that -t skips the CREATE TABLE statements and -T takes a directory; the server writes [table].txt into it. Then create the new table with the proper schema (including the extra index),
and load the dumped data into the new table:
LOAD DATA INFILE '/path/to/directory/[table].txt' INTO TABLE database.new_table FIELDS TERMINATED BY ',';
FYI: these commands aren't tested and may need adjusting, but they should give you the idea.
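To address the background part of the question: since the screen session died mid-copy, one alternative is to run the whole dump-and-reload pipeline under nohup so it survives an SSH disconnect. A minimal sketch, assuming hypothetical names (mydb, big_table, new_table, /tmp/export) that you would substitute with your own; it builds the two commands and echoes the final nohup invocation for review rather than executing it:

```shell
# All names below (mydb, big_table, new_table, /tmp/export) are
# placeholders -- substitute your own. The MySQL server process
# must be able to write into the export directory.
DB=mydb
SRC=big_table
DST=new_table
DIR=/tmp/export

# -t skips CREATE TABLE statements; -T makes the server write
# $SRC.txt (one row per line, comma-separated fields) into $DIR.
DUMP="mysqldump -u root -p -t -T $DIR $DB $SRC --fields-terminated-by=,"

# Bulk-load the flat file into the new table (created beforehand
# with the extra index already in place).
LOAD="mysql -u root -p $DB -e \"LOAD DATA INFILE '$DIR/$SRC.txt' INTO TABLE $DST FIELDS TERMINATED BY ',';\""

# nohup detaches the job from the terminal, so unlike the screen
# session it keeps running after logout; output lands in copy.log.
# Echoed for review here -- drop the echo to actually launch it.
echo "nohup sh -c '$DUMP && $LOAD' > copy.log 2>&1 &"
```

One caveat with -p in a non-interactive job: there is no terminal to type the password into, so you would put the credentials in an option file (e.g. ~/.my.cnf) instead of prompting.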