Thread: Speeding up multiple inserts

Messages 1 to 10 of 10 total
Tue, Feb 1 2011 10:49 AM

John Taylor

Using DBIsam 4.29 (not client server)

I need to insert multiple records into a table.  Using SQL 'INSERT INTO ....
blah blah' I have found a typical insert of, say, 500 records
will take 5 minutes or more on a relatively fast machine.  How can I speed
up this process?  Multiple users will be
inserting records into the same table residing on a server, so as I
understand it using transactions is out of the question.

Is there another way to speed up multiple inserts?

Thanks
JT

Wed, Feb 2 2011 4:46 AM

Jose Eduardo Helminsky

HPro Informatica

John

<<
How can I speed  up this process ?
>>
You should use transactions.

begin transaction;
insert into ...;
commit transaction;

<<
Multiple users will  be inserting records into the same table residing on a
server so as I
understand it using transactions is out of the question.
>>
This is not a problem: when one user is performing the inserts inside a
transaction, the other users are "waiting" for the first one to release it
(commit/rollback). When the first user ends the process the second starts,
and so on.
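
If I remember the DBISAM defaults right, how long those waiting users keep
retrying before they get a lock error is governed by the session's lock
settings. A sketch, with property names as I recall them, so check the manual:

Session.LockRetryCount := 30;  { how many times a lock attempt is retried }
Session.LockWaitTime := 100;   { milliseconds between retries }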

Eduardo

Thu, Feb 3 2011 8:28 AM

Roy Lambert

NLH Associates

Team Elevate

John


That sounds exceptionally slow - 100 records a minute. I know there were a lot of postings and a lot of benchmarks run (can't remember whether it was V3 or V4) which showed DBISAM slowing down on very big tables (1,000,000 records). I found one reference:

DBISAM 3.30: 50,000 records inserted in 3.9 sec (12,903 records/sec)

and can't be bothered to look for more.

In terms of what might be causing it here's a few thoughts:

1.    You don't say whether this speed was measured over the network or on a local machine. If the former, I'd bet on network speed / traffic.
2.    Aggressive antivirus scanning slowing things down.
3.    Loads of indices, especially if full text indexing is used.
4.    Some kind of after-post processing built into the engine as a UDF.
5.    Depending on how you're doing it (eg building a script with a wadge of inserts and firing that off, versus sending each insert off separately) network traffic and lock contention could be drastically affected.

Transactions may help, depending on how you're doing it. To expand on what Eduardo posted

begin transaction;
insert into ...;
commit transaction;

is BAD (sorry Eduardo, I know what you mean :-)).

Do not wrap each individual insert in its own transaction. You need to wrap a batch of them:

begin transaction;
insert into ...;
insert into ...;
insert into ...;
insert into ...;
commit transaction;

When you run a transaction it's initially carried out in memory; nothing actually touches the database until you reach the commit, at which point the whole lot is "posted" in one go. That should help to reduce both network traffic and contention.

A final point: with a transaction it's all or nothing, so you have to alter your programming to handle the case where the whole transaction fails, rather than just a single post failing.
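
In Delphi terms that means wrapping the batch along these lines (a sketch - Database is whichever TDBISAMDatabase the inserts run against):

Database.StartTransaction;
try
  { ... the whole batch of inserts goes here ... }
  Database.Commit;    { everything hits the table at this point }
except
  Database.Rollback;  { the entire batch is undone }
  raise;              { or report / retry the batch as appropriate }
end;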


Roy Lambert [Team Elevate]
Tue, Feb 8 2011 10:42 AM

Tim Young [Elevate Software]

Elevate Software, Inc.

Email: timyoung@elevatesoft.com

John,

<< I need to insert multiple records into a table.  Using sql 'INSERT INTO
.....  blah blah' I have found a typical insert of say 500 records
will take 5 minutes or more on a relatively fast machine.  How can I speed
up this process ? >>

As Roy indicated, this is unusually slow, and you should try to determine if
something like AV software is interfering with the operation of DBISAM.  As
for speeding things up, transactions are the way to go, as Eduardo
indicated.

--
Tim Young
Elevate Software
www.elevatesoft.com
Wed, Feb 9 2011 10:25 AM

John Taylor

Tim,

Actually, my timing was on a local data set, not over a network; no AV
problem.

Here is what I'm trying to do...

I have a list of records in a TList. I want to insert a record into the
table for each item in the list. A blob field
is involved which will be loaded from a file; the file is the same for all
records. I'm doing it like this short example...

Q := TDBIsamQuery.Create(nil);
Q.Sql.Add('INSERT INTO ' + sTableName);
Q.Sql.Add('(JOB_NAME,JOB_DATE,JOB_BLOB)');
Q.Sql.Add('VALUES(:pName,:pDate,:pBlob)');
Q.ParamByName('pBlob').LoadFromFile(sFileName, ftBlob);  { same file for every record }
Q.ParamByName('pDate').AsDate := Date;
for I := 0 to Pred(List.Count) do
begin
  p := List[I];
  Q.ParamByName('pName').AsString := p^.sName;
  Q.ExecSql;
end;

How do I apply transactions to this?

Help appreciated!

John


"Tim Young [Elevate Software]" <timyoung@elevatesoft.com> wrote in message
news:FAC01BB7-1AD1-4C22-A3FB-226A26A82687@news.elevatesoft.com...
> John,
>
> << I need to insert multiple records into a table.  Using sql 'INSERT INTO
> ....  blah blah' I have found a typical insert of say 500 records
> will take 5 minutes or more on a relatively fast machine.  How can I speed
> up this process ? >>
>
> As Roy indicated, this is unusually slow, and you should try to determine
> if something like AV software is interfering with the operation of DBISAM.
> As for speeding things up, transactions are the way to go, as Eduardo
> indicated.
>
> --
> Tim Young
> Elevate Software
> www.elevatesoft.com
Wed, Feb 9 2011 11:02 AMPermanent Link

Roy Lambert

NLH Associates

Team Elevate

John

It's a long time since I used DBISAM but I think you'll need to build a script with START TRANSACTION as the first line and COMMIT as the last. Alternatively you could create a database component and use its transaction methods. Check out the manual.
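
The script would be shaped something like this (table name and values invented for illustration; the columns are from your example):

START TRANSACTION;
INSERT INTO jobs (JOB_NAME, JOB_DATE) VALUES ('job1', CURRENT_DATE);
INSERT INTO jobs (JOB_NAME, JOB_DATE) VALUES ('job2', CURRENT_DATE);
COMMIT;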

However, since you've now supplied some more information I have some more questions - what's the size of the blob and what's the blobblocksize?


If the blob is large then you might not be able to build a script / run the transaction in memory.

Roy Lambert [Team Elevate]

Wed, Feb 9 2011 1:52 PM

John Taylor

Roy,

The blob could be quite large: it might be a one-page document (TIFF file) to
be faxed, or it might be 40 pages.
The blob block size is 2048.

John

"Roy Lambert" <roy.lambert@skynet.co.uk> wrote in message
news:C1D689B7-460D-4BBC-A224-72A16E7B5428@news.elevatesoft.com...
> John
>
> Its a long time since I used DBISAM but I think you'll need to build a
> script and have a START TRANSACTION as the first line and COMMIT as the
> last. Alternatively you could create a database and use that. Check out
> the manual.
>
> However, since you've now supplied some more information I have some more
> questions - what's the size of the blob and what's the blobblocksize?
>
>
> If the blob is large then you might not be able to build a script / run
> the transaction in memory.
>
> Roy Lambert [Team Elevate]
>
>
Thu, Feb 10 2011 5:51 AMPermanent Link

Roy Lambert

NLH Associates

Team Elevate

John


Try upping the blob block size - try 16K as a starter.

There have been a few discussions on this, but I can't remember whether it's read speed alone or write speed as well that suffers. I'd suspect both.
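
To put rough numbers on it: a 2MB fax image stored with a 2,048 byte blob block size needs around 1,000 blocks per record, against about 128 blocks at 16K, so the per-block overhead drops sharply. From memory the restructure is something like the following, but do check the exact ALTER TABLE syntax in the manual - I may have the clause wrong:

ALTER TABLE jobs BLOB BLOCK SIZE 16384;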

Roy Lambert [Team Elevate]
Tue, Feb 15 2011 9:06 AM

Tim Young [Elevate Software]

Elevate Software, Inc.

Email: timyoung@elevatesoft.com

John,

<< Actually, my timing was on a local data set not over a network, no av
problem. >>

Anti-virus software is *especially* meddlesome with local data, so I would
double-check and make sure that the AV software isn't scanning or trying to
validate any of the database files.

Use this code for optimal transaction usage:

var
  Tables: TStrings;
begin
  Tables := TStringList.Create;
  Tables.Add(sTableName);
  Q := TDBIsamQuery.Create(nil);
  Q.Sql.Add('INSERT INTO ' + sTableName);
  Q.Sql.Add('(JOB_NAME,JOB_DATE,JOB_BLOB)');
  Q.Sql.Add('VALUES(:pName,:pDate,:pBlob)');
  Q.Prepare;
  Q.Database.StartTransaction(Tables); { restricted transaction: locks only the listed tables }
  try
    Q.ParamByName('pBlob').LoadFromFile(sFileName, ftBlob);
    Q.ParamByName('pDate').AsDate := Date;
    for I := 0 to Pred(List.Count) do
    begin
      p := List[I];
      Q.ParamByName('pName').AsString := p^.sName;
      Q.ExecSql;
    end;
    Q.Database.Commit;
  except
    Q.Database.Rollback;
    raise;
  end;
  Q.Free;
end;

You'll want to try..finally the allocations/deallocations, of course. :-)
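
Wired in, that would look something like this (still a sketch - Q's DatabaseName/SessionName are assumed to be set elsewhere, and PJobRec stands in for whatever record type the TList actually holds):

var
  Tables: TStrings;
  Q: TDBIsamQuery;
  I: Integer;
  p: PJobRec; { hypothetical pointer type for the TList items }
begin
  Tables := TStringList.Create;
  try
    Tables.Add(sTableName);
    Q := TDBIsamQuery.Create(nil);
    try
      Q.Sql.Add('INSERT INTO ' + sTableName);
      Q.Sql.Add('(JOB_NAME,JOB_DATE,JOB_BLOB)');
      Q.Sql.Add('VALUES(:pName,:pDate,:pBlob)');
      Q.Prepare;
      Q.Database.StartTransaction(Tables);
      try
        Q.ParamByName('pBlob').LoadFromFile(sFileName, ftBlob);
        Q.ParamByName('pDate').AsDate := Date;
        for I := 0 to Pred(List.Count) do
        begin
          p := List[I];
          Q.ParamByName('pName').AsString := p^.sName;
          Q.ExecSql;
        end;
        Q.Database.Commit;
      except
        Q.Database.Rollback;
        raise;
      end;
    finally
      Q.Free;
    end;
  finally
    Tables.Free;
  end;
end;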

--
Tim Young
Elevate Software
www.elevatesoft.com
Tue, Feb 15 2011 8:05 PM

John Taylor

Thanks Tim, I will give this a try

John


"Tim Young [Elevate Software]" <timyoung@elevatesoft.com> wrote in message
news:CC7412DA-CE05-45C3-8401-87B58EEFD4BD@news.elevatesoft.com...
> John,
>
> << Actually, my timing was on a local data set not over a network, no av
> problem. >>
>
> Anti-virus software is *especially* meddlesome with local data, so I would
> double-check and make sure that the AV software isn't scanning or trying
> to validate any of the database files.
>
> Use this code for optimal transaction usage:
>
> var
>   Tables: TStrings;
> begin
> Tables:=TStringList.Create;
> Tables.Add(sTableName);
> Q := TDBIsamQuery.Create(nil);
> Q.Sql.Add('INSERT INTO ' + sTableName);
> Q.Sql.Add('(JOB_NAME,JOB_DATE,JOB_BLOB)')
> Q.Sql.Add('VALUES(Tongueame,Tongueate,Tonguelob)');
> Q.Prepare;
> Q.Database.StartTransaction(Tables);
> try
>    Q.ParamByName('pBlob').LoadFromFile(sFileName,ftBlob);
>    Q.ParamByName('pDate').AsDate := Date;
>    for I := 0 to Pred(List.Count) do
>    begin
>    p := List[I];
>    Q.ParambyName('pName') .asString := p^.sName;
>    Q.ExecSql;
>    end;
>   Q.Database.Commit;
> except
>    Q.Database.Rollback;
>    raise;
> end;
> Q.Free;
> end;
>
> You'll want to try..finally the allocations/deallocations, of course. Smile
>
> --
> Tim Young
> Elevate Software
> www.elevatesoft.com
Image