Efficiently loading a CSV file into a MySQL table
The way I currently load the file is:

load data local infile 'file_name' into table tableA fields terminated by ',' enclosed by '"' lines terminated by '\n';

Is this the optimal way to load it into the table on a Unix machine? And how should I create the table so that it has the optimal size? I want the table to take the smallest space.
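As a rough sketch of the "smallest space" part (the column names and types below are hypothetical, not from the question), the main lever is choosing the narrowest data types that still fit the CSV's data and only creating the indexes you actually need:

-- hypothetical example: pick the smallest types that fit the data
create table tableA (
    id     int unsigned not null,    -- 4 bytes instead of bigint's 8
    code   char(2) not null,         -- fixed 2 bytes instead of a wide varchar
    amount decimal(8,2) not null,    -- compact exact numeric
    primary key (id)
);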
MyISAM
If the table is MyISAM, you should do the following:
set bulk_insert_buffer_size = 1024 * 1024 * 256;
alter table tableA disable keys;
load data local infile 'file_name' into table tableA fields terminated by ',' enclosed by '"' lines terminated by '\n';
alter table tableA enable keys;

InnoDB
If the table is InnoDB, you should do the following:
set bulk_insert_buffer_size = 1024 * 1024 * 256;
load data local infile 'file_name' into table tableA fields terminated by ',' enclosed by '"' lines terminated by '\n';

No, this does not take the least space (you are loading into an empty table), but the rows are buffered in a tree-like structure in memory, sized by bulk_insert_buffer_size, which caches the data so the reload goes quicker.
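As a small illustration of that buffer (standard MySQL syntax, not part of the original answer), you can check its current value and raise it for your session only, so the change does not affect other connections:

show variables like 'bulk_insert_buffer_size';            -- current value, default is 8MB
set session bulk_insert_buffer_size = 1024 * 1024 * 256;  -- 256MB for this session only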
If you are worried about ibdata1 exploding in size, you need to convert your InnoDB tables to use innodb_file_per_table. Please use the InnoDB cleanup steps from: Howto: Clean a mysql InnoDB storage engine?
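A minimal sketch of that conversion, assuming MySQL 5.6 or later where innodb_file_per_table is a dynamic variable and tableA already exists (an already-bloated ibdata1 only shrinks via the full cleanup steps linked above):

show variables like 'innodb_file_per_table';  -- check whether per-table tablespaces are on
set global innodb_file_per_table = ON;        -- enable; also add it to my.cnf under [mysqld] to persist
alter table tableA engine = innodb;           -- rebuild so the data moves out of ibdata1 into tableA.ibd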