Stress testing often requires inserting a large amount of data into the database. Below are the scripts I used to insert test data into two databases.
SQL Server
declare @maxSum int,
@lid nvarchar(64), -- 'lid' is the table's id column
@cid int,
@userid nvarchar(64),
@oper_time nvarchar(26),
@oper_type nvarchar(10),
@oper_host nvarchar(64),
@permission nvarchar(100),
@status nvarchar(10),
@details nvarchar(max),
@version int
set @maxSum=1
set @cid=1
set @userid='1'
set @oper_time='2020-10-26 12:15:07.000761'
set @oper_type='7'
set @oper_host='127.0.0.1'
set @permission='system'
set @status='0'
set @details='{"msg":"Login for User:admin.","logBeans":null}'
set @version=0
begin tran
while @maxSum<=200000 -- '<=' so exactly 200,000 rows are inserted
begin
    set @lid='LID'+convert(nvarchar(20),@maxSum) -- prefix the id with 'LID' for easy identification
    insert into T_AT_LOG (LID,CID,USERID,OPER_TIME,OPER_TYPE,OPER_HOST,PERMISSION,STATUS,DETAILS,VERSION)
    values(@lid,@cid,@userid,@oper_time,@oper_type,@oper_host,@permission,@status,@details,@version)
    set @maxSum=@maxSum+1
end
commit
The data needs to be committed in batches: here each transaction commits 200,000 rows at once, because the database cannot handle several million rows in a single commit.
A single run takes 28 seconds.
The efficiency of this script is not very satisfactory. If you have a more efficient script, please share it in the comments.
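One commonly faster approach in SQL Server is to replace the row-by-row loop with a single set-based INSERT ... SELECT. The sketch below is an assumption-laden alternative, not the original script: it generates the numbers 1..200000 with ROW_NUMBER() over a cross join of the sys.all_objects catalog view (any sufficiently large view works) and reuses the table and column names from the script above.

```sql
-- Set-based alternative (sketch): one INSERT ... SELECT instead of 200,000
-- single-row inserts. The derived table generates row numbers 1..200000.
insert into T_AT_LOG (LID,CID,USERID,OPER_TIME,OPER_TYPE,OPER_HOST,PERMISSION,STATUS,DETAILS,VERSION)
select 'LID'+convert(nvarchar(20),n), -- same 'LID' prefix as the loop version
       1, '1', '2020-10-26 12:15:07.000761', '7', '127.0.0.1',
       'system', '0', '{"msg":"Login for User:admin.","logBeans":null}', 0
from (
    select top (200000) row_number() over (order by (select null)) as n
    from sys.all_objects a cross join sys.all_objects b
) as numbers
```

Because the engine processes this as one statement, it avoids 200,000 separate statement executions and typically finishes in a fraction of the loop's time.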
Oracle
BEGIN
    -- the original script declared an unused variable 'a'; it has been removed
    FOR i IN 1 .. 200000 LOOP
        INSERT INTO T_AT_LOG (LID,CID,USERID,OPER_TIME,OPER_TYPE,OPER_HOST,PERMISSION,STATUS,DETAILS,VERSION)
        VALUES (i,'1','1','2020-10-13 10:25:38.000176','8','127.0.0.1','system','0','{"msg":"Login for User:admin.","logBeans":null}','0');
    END LOOP;
    COMMIT;
END;
Here all 200,000 rows are committed as one batch after the loop finishes.
A single run takes 33 seconds.
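Oracle also supports a set-based alternative that avoids the PL/SQL loop entirely. The sketch below is an assumption (not the original script): it uses the standard CONNECT BY LEVEL trick on DUAL to generate 200,000 rows and inserts them in one statement, reusing the column values from the block above.

```sql
-- Set-based alternative (sketch): generate 200,000 rows via CONNECT BY LEVEL
-- and insert them with a single statement, then commit once.
INSERT INTO T_AT_LOG (LID,CID,USERID,OPER_TIME,OPER_TYPE,OPER_HOST,PERMISSION,STATUS,DETAILS,VERSION)
SELECT LEVEL, '1', '1', '2020-10-13 10:25:38.000176', '8', '127.0.0.1',
       'system', '0', '{"msg":"Login for User:admin.","logBeans":null}', '0'
FROM dual
CONNECT BY LEVEL <= 200000;
COMMIT;
```

A single multi-row INSERT skips the per-iteration context switching between the PL/SQL and SQL engines, which is usually where the loop version loses most of its time.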