Speed Issue

Messages 1 to 10 of 22 total
Sun, Nov 26 2006 8:10 PM

"Alan Questell"
I'm coming back to DBISAM after some time away. (I just upgraded to 4.24 a
few days ago from version 3.) I'm writing a program that goes through the
records in a CSV file and divides the info up into DBISAM tables for further
processing. My test file has only about 5,200 records. The following code
checks a table to see if a record has already been created for a certain
type of record and, if not, creates it.

This is running very slowly for some reason and I can't figure out why. It
takes several minutes just to run through this small set.

I did a comparison with Absolute Database using the same database structure
and indexes, and it was several orders of magnitude faster. (All I did was
drop their components on the same data form and use the exact same code.)
I'd like to use DBISAM because I need client/server later.

There ends up being only a few hundred records in this table because of
duplicates.

I've tried several things to speed this up. I haven't used DBISAM in a
couple of years and it just seems much slower than I remember. Any
suggestions would be appreciated.

Using Turbo Delphi Pro, by the way.

begin
  qryCourses.Close;
  qryCourses.SQL.Clear;
  qryCourses.SQL.Add('SELECT ID FROM COURSES WHERE ID = ' +
    QuotedStr(CourseID));  { ID is the primary index }
  qryCourses.Open;
  if qryCourses.Eof then begin
    qryCourses.Close;
    { Not found, so insert it; TYPE is the first character of SECTION }
    tSQL := 'INSERT INTO COURSES (ID, TERM, DEPT, CAT, SECTION, TYPE) ' +
            'VALUES (' + QuotedStr(CourseID) + ',' +
            QuotedStr(Term) + ',' + QuotedStr(Dept) + ',' +
            QuotedStr(Cat) + ',' + QuotedStr(Section) + ',' +
            QuotedStr(Copy(Section, 1, 1)) + ')';
    qryGen.SQL.Clear;
    qryGen.SQL.Add(tSQL);
    qryGen.ExecSQL;
  end;
end;

Mon, Nov 27 2006 4:26 AM

Roy Lambert

NLH Associates

Team Elevate

Alan


I'd use prepared queries and parameters, or a simple TDBISAMTable.
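
Something like this, as a rough, untested sketch (tblCourses is an assumed
TDBISAMTable attached to the COURSES table, opened once before the import
loop rather than per record):

  { Per CSV record: }
  if not tblCourses.Locate('ID', CourseID, []) then
  begin
    tblCourses.Append;
    tblCourses.FieldByName('ID').AsString := CourseID;
    tblCourses.FieldByName('TERM').AsString := Term;
    tblCourses.FieldByName('DEPT').AsString := Dept;
    tblCourses.FieldByName('CAT').AsString := Cat;
    tblCourses.FieldByName('SECTION').AsString := Section;
    tblCourses.FieldByName('TYPE').AsString := Copy(Section, 1, 1);
    tblCourses.Post;
  end;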

Roy Lambert
Mon, Nov 27 2006 8:41 AM

"Alan Questell"
I don't think a parameterized query is any faster if the parameter changes
each time.

I also tried a table with a Locate/Append/Post sequence, but it is no
faster.

"Roy Lambert" <roy.lambert@skynet.co.uk> wrote in message
news:2ECD4B45-DCDD-4067-A9B8-C2A3B22FE5EE@news.elevatesoft.com...
> Alan
>
>
> I'd use prepared queries and parameters, or a simple TDBISAMTable.
>
> Roy Lambert
>

Mon, Nov 27 2006 9:12 AM

"Jose Eduardo Helminsky"
Alan

Have you tried the same code using transactions? I think it will speed this
operation up a lot.
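
A rough sketch of the shape (dbMain, CSVFile and ProcessOneCSVRecord are
illustrative names, not from Alan's code; dbMain is the TDBISAMDatabase the
tables are attached to):

  dbMain.StartTransaction;
  try
    while not CSVFile.Eof do
      ProcessOneCSVRecord;  { the check-and-insert code from the first message }
    dbMain.Commit;
  except
    dbMain.Rollback;
    raise;
  end;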

Eduardo

Mon, Nov 27 2006 9:22 AM

Roy Lambert

NLH Associates

Team Elevate

Alan


A parameterised prepared query IS faster.

A normal query has to open the tables each time. A prepared one holds them
open until it is unprepared, which saves a lot of time.
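
In code the pattern could look roughly like this (a sketch, untested; the
point is that the SQL is assigned and Prepare called once, and only the
parameter values change inside the loop):

  { Once, before the import loop: }
  qryCourses.SQL.Text := 'SELECT ID FROM COURSES WHERE ID = :ID';
  qryCourses.Prepare;
  qryGen.SQL.Text :=
    'INSERT INTO COURSES (ID, TERM, DEPT, CAT, SECTION, TYPE) ' +
    'VALUES (:ID, :TERM, :DEPT, :CAT, :SECTION, :TYPE)';
  qryGen.Prepare;

  { Per CSV record: }
  qryCourses.ParamByName('ID').AsString := CourseID;
  qryCourses.Open;
  if qryCourses.Eof then
  begin
    qryGen.ParamByName('ID').AsString := CourseID;
    qryGen.ParamByName('TERM').AsString := Term;
    qryGen.ParamByName('DEPT').AsString := Dept;
    qryGen.ParamByName('CAT').AsString := Cat;
    qryGen.ParamByName('SECTION').AsString := Section;
    qryGen.ParamByName('TYPE').AsString := Copy(Section, 1, 1);
    qryGen.ExecSQL;
  end;
  qryCourses.Close;

  { Once, after the loop: }
  qryCourses.UnPrepare;
  qryGen.UnPrepare;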

If the data isn't too big or confidential, email it to me and I'll have a
look for you.

Roy Lambert
Mon, Nov 27 2006 10:09 AM

Chris Holland

SEC Solutions Ltd.

Team Elevate

Hi Alan,

If CourseID is case-insensitive, it might be worth trying

... UPPER(ID) = UPPER(CourseID) ...

just to make sure that it is using the primary index.
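
Applied to the query from the first message, that would be something like:

  qryCourses.SQL.Add('SELECT ID FROM COURSES WHERE UPPER(ID) = UPPER(' +
    QuotedStr(CourseID) + ')');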

Chris Holland


Mon, Nov 27 2006 11:08 AM

Roy Lambert

NLH Associates

Team Elevate

Chris


Good one. I always forget that.

Roy Lambert
Mon, Nov 27 2006 11:12 AM

"Donat Hebert \(Worldsoft\)"
Why not simply read your data into a temporary table, ensure your PK is set,
then
perform a single SQL insert into your target.

Insert into target ( fn...)
Select fn...
from Source s
left join target t on s.pk = t.pk
Where t.pk is null;

If large, you can set a commit interval say .. 500 or 1000 records .
Should just fly this way.  We process very large import files this way.
hth.

Donat.

Mon, Nov 27 2006 11:51 AM

"Alan Questell"
I've thought about that and will do it that way if that's what it takes...
it just seems that it should be faster.

As I said, I tried the exact same code, just substituting Absolute
Database's components, and it must have been at least 40x faster.

"Donat Hebert (Worldsoft)" <donat.hebert@worldsoft.ca> wrote in message
news:216ADAEB-97AA-4C91-9060-C83F4C8A86D1@news.elevatesoft.com...
> Why not simply read your data into a temporary table, ensure your PK is
> set, then
> perform a single SQL insert into your target.
>
> Insert into target ( fn...)
> Select fn...
> from Source s
> left join target t on s.pk = t.pk
> Where t.pk is null;
>
> If large, you can set a commit interval say .. 500 or 1000 records .
> Should just fly this way.  We process very large import files this way.
> hth.
>
> Donat.
>

Mon, Nov 27 2006 11:52 AM

"Alan Questell"
I've now tried a parameterized query and notice no difference in speed. I'm
going to keep experimenting.


"Roy Lambert" <roy.lambert@skynet.co.uk> wrote in message
news:01707F1B-F425-459E-B0C5-A1287949EF54@news.elevatesoft.com...
> Alan
>
>
> A parameterised prepared query IS faster.
>
> A normal query has to open the tables each time. A prepared one holds them
> open until unprepared which saves a lot of time.
>
> If the data isn't too big, or confidential email it to me and I'll have a
> look for you.
>
> Roy Lambert
>
