Efficiently loading a CSV file into a MySQL table


The current way of loading the file is:

    load data local infile 'file_name'
    into table tablea
    fields terminated by ',' enclosed by '"'
    lines terminated by '\n';

Is this the optimal way to load the file into the table on a Unix machine? And how should the table be created so it takes up the smallest amount of space?

MyISAM

If the table is MyISAM, you should do the following:

    set bulk_insert_buffer_size = 1024 * 1024 * 256;
    alter table tablea disable keys;
    load data local infile 'file_name'
    into table tablea
    fields terminated by ',' enclosed by '"'
    lines terminated by '\n';
    alter table tablea enable keys;
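After the keys are rebuilt, you can check how much space the data and indexes actually occupy via information_schema. A minimal sketch; the schema name `mydb` is an assumption, replace it with your own database:

```sql
-- Sketch: report the on-disk footprint of tablea.
-- 'mydb' is an assumed schema name, not from the original question.
select table_name,
       data_length  / 1024 / 1024 as data_mb,
       index_length / 1024 / 1024 as index_mb
from information_schema.tables
where table_schema = 'mydb'
  and table_name   = 'tablea';
```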

InnoDB

If the table is InnoDB, you should do the following:

    set bulk_insert_buffer_size = 1024 * 1024 * 256;
    load data local infile 'file_name'
    into table tablea
    fields terminated by ',' enclosed by '"'
    lines terminated by '\n';
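For InnoDB specifically, the MySQL documentation on bulk data loading also suggests wrapping the load in a single transaction and temporarily deferring uniqueness and foreign-key checks. A hedged sketch of that variant (be sure to restore the settings afterwards, and only skip the checks if you trust the input data):

```sql
-- Sketch: InnoDB bulk load with checks deferred during the load.
set unique_checks = 0;
set foreign_key_checks = 0;
set autocommit = 0;
load data local infile 'file_name'
into table tablea
fields terminated by ',' enclosed by '"'
lines terminated by '\n';
commit;
set unique_checks = 1;
set foreign_key_checks = 1;
set autocommit = 1;
```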

To make the table take the least space when loading into an empty table, the rows are buffered in a tree-like structure in memory (sized by bulk_insert_buffer_size), which caches the data so it can be written more quickly during the load.

If you are worried about ibdata1 exploding in size, you need to convert your InnoDB tables to use innodb_file_per_table. Please follow the InnoDB cleanup steps in: Howto: Clean a MySQL InnoDB storage engine?
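Enabling innodb_file_per_table is done in the server configuration (on older MySQL versions it requires a restart; from 5.6 it is the default). A minimal sketch of the my.cnf fragment, assuming the usual /etc/my.cnf location:

```ini
# my.cnf fragment (assumed path /etc/my.cnf): give each InnoDB table
# its own .ibd file instead of growing the shared ibdata1 tablespace.
[mysqld]
innodb_file_per_table = 1
```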

