MySQL Data Import Optimization

1. When bulk importing data, the import of the data file itself can be optimized to a certain extent.

LOAD DATA LOCAL INFILE '/root/a.sql' INTO TABLE order_by_1 FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';


Importing data that is already ordered is highly efficient.


For example, when importing a data file of 10,000,000 rows into a table with an auto-increment primary key, the import is very efficient if the rows in the file are sorted in primary-key order, because InnoDB stores rows in a clustered index ordered by the primary key, so sorted rows can simply be appended.
If the file is ordered by some other field, or the primary-key values are present but not sorted, insertion efficiency is much lower, since out-of-order inserts cause page splits and extra random I/O.
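
Put together, a minimal sketch of an ordered import; only the table name order_by_1 and the file path /root/a.sql appear above, so the id/name column layout here is an assumption for illustration:

-- Hypothetical table layout; only order_by_1 and /root/a.sql come from the article.
CREATE TABLE order_by_1 (
    id   INT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(100)
) ENGINE = InnoDB;

-- /root/a.sql should hold rows already sorted by id, e.g.:
--   1,aaa
--   2,bbb
--   3,ccc
-- (local_infile must be enabled on both the server and the client.)
LOAD DATA LOCAL INFILE '/root/a.sql'
INTO TABLE order_by_1
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';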

2. Turn off uniqueness checks

If the table we are inserting into has a unique index,
i.e. UNIQUE KEY `index_name` (`index_column`),

we can turn off the uniqueness check before inserting the data,
with the command:

SET unique_checks = 0;

and re-enable the unique check once the insertion is finished.
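
Put together, a sketch of the whole sequence; the LOAD DATA here reuses the command from section 1:

-- Skip duplicate checking during the bulk load; the data must already
-- be known to contain no duplicate keys, because violations will not
-- be detected while the check is off.
SET unique_checks = 0;

LOAD DATA LOCAL INFILE '/root/a.sql' INTO TABLE order_by_1
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';

-- Re-enable the check once the load has finished.
SET unique_checks = 1;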

 

3. Commit transactions manually

SET autocommit = 0;

Turning off autocommit stops MySQL from committing after every single statement; instead, commit manually once the whole batch of inserts is complete, then turn autocommit back on.
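
A minimal sketch of the pattern, reusing the hypothetical order_by_1 table from section 1:

SET autocommit = 0;

-- All inserts accumulate in a single transaction instead of
-- paying a commit (and a log flush) per statement.
INSERT INTO order_by_1 (name) VALUES ('aaa');
INSERT INTO order_by_1 (name) VALUES ('bbb');
INSERT INTO order_by_1 (name) VALUES ('ccc');

COMMIT;              -- commit once at the end
SET autocommit = 1;  -- restore the default behavior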

