Thread: Speed Issue
Messages 11 to 20 of 22 total
Mon, Nov 27 2006 12:21 PM

Roy Lambert

NLH Associates

Team Elevate

Alan


Did you specifically prepare it?

Roy Lambert
Mon, Nov 27 2006 12:23 PM

Roy Lambert

NLH Associates

Team Elevate

Donat


That actually brings up an interesting thought:

Import into a table without any indices. Once that's complete, restructure to introduce a unique primary key and accept the data loss.

Roy Lambert
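
For illustration, a sketch of what that restructure step could look like, run once the unindexed bulk load finishes (the COURSES table and columns are borrowed from Roy's later snippet, the key choice is hypothetical, and DBISAM 4's exact ALTER TABLE syntax should be verified against its SQL manual):

// Run after the raw import completes. Adding the unique primary key in a
// single restructure pass is where duplicate rows would be discarded - the
// "data loss" Roy mentions accepting.
qryGen.SQL.Text :=
  'ALTER TABLE COURSES ADD PRIMARY KEY (ID, TERM, DEPT, CAT, SECTION)';
qryGen.ExecSQL;
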
Mon, Nov 27 2006 12:24 PM

"Donat Hebert \(Worldsoft\)"
Don't forget, though, that you're not comparing the same type of database.
I find that if you use either transactions in the database or the SQL
approach, speed is very good. Case-sensitive keys or Upper(fn), as noted
earlier. For large imports (e.g. 1 million payroll records), we always
place the data into temporary tables with appropriate keys and perform the
bulk processing via SQL. Customers are very happy with the speed and want
to stick with DBISAM given its stability, features, and simplicity of
support.

Donat.

"Alan Questell" <alanq@pinehurst.net> wrote in message
news:73765F4D-D632-44A8-845F-314FFCEF817F@news.elevatesoft.com...
> I've thought about that and will do it that way if that is what it
> takes...it just seems that it should be faster.
>
> As I said, I tried the exact same code just substituting components with
> Absolute Database and it must have been at least 40x faster.
>
> "Donat Hebert (Worldsoft)" <donat.hebert@worldsoft.ca> wrote in message
> news:216ADAEB-97AA-4C91-9060-C83F4C8A86D1@news.elevatesoft.com...
>> Why not simply read your data into a temporary table, ensure your PK is
>> set, then
>> perform a single SQL insert into your target.
>>
>> Insert into target ( fn...)
>> Select fn...
>> from Source s
>> left join target t on s.pk = t.pk
>> Where t.pk is null;
>>
>> If large, you can set a commit interval say .. 500 or 1000 records .
>> Should just fly this way.  We process very large import files this way.
>> hth.
>>
>> Donat.
>>
>
>

Mon, Nov 27 2006 12:35 PM

"Alan Questell"
No, I've used DBISAM in the past for some major C/S programs that ran for
years with no problems. So I'm well aware of DBISAM's stability and speed. I
just haven't used it in a couple of years as I've been away from programming
and this is my first time using version 4.

I just haven't found the answer to this yet.


"Donat Hebert (Worldsoft)" <donat.hebert@worldsoft.ca> wrote in message
news:94B309C7-43DA-4A18-982A-F807C309FE6E@news.elevatesoft.com...
> Don't forget though not comparing same type of db..  I find that if you
> use either transactions in db or SQL approach, speed
> is very good.  Case sensitive keys or Upper(fn) as noted earlier..  For
> large imports (ie 1 million payroll records), we always
> place into temporary tables with appropriate keys and perform bulk
> processing via SQL.  Customers are very happy with
> speed and want to stick with DBISAM given its stability, features, and
> simplicity for support.
>
> Donat.
>
> "Alan Questell" <alanq@pinehurst.net> wrote in message
> news:73765F4D-D632-44A8-845F-314FFCEF817F@news.elevatesoft.com...
>> I've thought about that and will do it that way if that is what it
>> takes...it just seems that it should be faster.
>>
>> As I said, I tried the exact same code just substituting components with
>> Absolute Database and it must have been at least 40x faster.
>>
>> "Donat Hebert (Worldsoft)" <donat.hebert@worldsoft.ca> wrote in message
>> news:216ADAEB-97AA-4C91-9060-C83F4C8A86D1@news.elevatesoft.com...
>>> Why not simply read your data into a temporary table, ensure your PK is
>>> set, then
>>> perform a single SQL insert into your target.
>>>
>>> Insert into target ( fn...)
>>> Select fn...
>>> from Source s
>>> left join target t on s.pk = t.pk
>>> Where t.pk is null;
>>>
>>> If large, you can set a commit interval say .. 500 or 1000 records .
>>> Should just fly this way.  We process very large import files this way.
>>> hth.
>>>
>>> Donat.
>>>
>>
>>
>
>

Mon, Nov 27 2006 12:48 PM

Roy Lambert

NLH Associates

Team Elevate

Alan

Try this

qryGen.SQL.Text := 'INSERT INTO COURSES (ID, TERM, DEPT, CAT, SECTION, TYPE) ' +
                   'VALUES (:CourseID, :Term, :Dept, :Cat, :Section, :SubSection)';
qryGen.Prepare;
// per-record loop body:
begin
 qryGen.Close;
 qryGen.ParamByName('CourseID').AsString := CourseID;
 qryGen.ParamByName('Term').AsString := Term;
 qryGen.ParamByName('Dept').AsString := Dept;
 qryGen.ParamByName('Cat').AsString := Cat;
 qryGen.ParamByName('Section').AsString := Section;
 qryGen.ParamByName('SubSection').AsString := Copy(Section, 1, 1); // TYPE gets the first character of Section
 try
  qryGen.ExecSQL;
 except
  // swallow failed inserts (e.g. duplicate keys) so the import continues
 end;
end;


Roy Lambert
Mon, Nov 27 2006 12:50 PM

Tim Young [Elevate Software]

Elevate Software, Inc.

Email timyoung@elevatesoft.com

Alan,

<< I don't think a parameterized query is any faster if the parameter
changes each time. >>

Actually, that's the whole purpose of a parameterized query: to allow for
variable substitution for constants in an SQL statement.  Make sure to
manually call Prepare before starting the reading process for the CSV and
you should see a big difference in performance.  Another idea would be to
simply drop the INSERT statement and just use a live query result.  That way
you can insert into the TDBISAMQuery component directly using the
Insert/Post methods without having to execute another SQL statement.

--
Tim Young
Elevate Software
www.elevatesoft.com
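
A minimal sketch of the live-result idea Tim describes (qryCourses, ReadNextCSVRecord, and the variables are hypothetical; the field names come from Roy's snippet):

// With RequestLive, DBISAM hands back an editable result set, so rows are
// appended with Insert/Post rather than by executing an INSERT per record.
qryCourses.SQL.Text := 'SELECT * FROM COURSES';
qryCourses.RequestLive := True;
qryCourses.Open;
try
  while ReadNextCSVRecord do   // hypothetical CSV reader
  begin
    qryCourses.Insert;
    qryCourses.FieldByName('ID').AsString := CourseID;
    qryCourses.FieldByName('TERM').AsString := Term;
    // ... remaining fields ...
    qryCourses.Post;
  end;
finally
  qryCourses.Close;
end;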

Mon, Nov 27 2006 12:58 PM

"Alan Questell"
I'll try it.

"Roy Lambert" <roy.lambert@skynet.co.uk> wrote in message
news:885EC7F9-B98A-414B-8235-B447FB07E4F2@news.elevatesoft.com...
> Alan
>
> Try this
>
> qryGen.SQL.Text := 'INSERT INTO COURSES (ID, TERM, DEPT, CAT, SECTION,
> TYPE VALUES( :CourseID, :Term, :Dept, :Cat, :Section, :SubSection'
> qryGen.Prepare
> begin
> qryGen.Close;
> qryGen.ParamByName(':CourseID').AsString := CourseID;
> qryGen.ParamByName(':Term').AsString := Term;
> qryGen.ParamByName(':Dept').AsString := Dept;
> qryGen.ParamByName(':Cat').AsString := Cat;
> qryGen.ParamByName(':Section').AsString := Section;
> qryGen.ParamByName(':SubSection').AsString := Copy(Section,0,1);
> try
> qryGen.ExecSQL;
> except
> end;
> end;
>
>
> Roy Lambert

Mon, Nov 27 2006 1:00 PM

"Alan Questell"
I'll try this, but I think it's the query that looks for the key in the
first place, not the INSERT, that is slow...I'll do some testing to see.

"Roy Lambert" <roy.lambert@skynet.co.uk> wrote in message
news:885EC7F9-B98A-414B-8235-B447FB07E4F2@news.elevatesoft.com...
> Alan
>
> Try this
>
> qryGen.SQL.Text := 'INSERT INTO COURSES (ID, TERM, DEPT, CAT, SECTION,
> TYPE VALUES( :CourseID, :Term, :Dept, :Cat, :Section, :SubSection'
> qryGen.Prepare
> begin
> qryGen.Close;
> qryGen.ParamByName(':CourseID').AsString := CourseID;
> qryGen.ParamByName(':Term').AsString := Term;
> qryGen.ParamByName(':Dept').AsString := Dept;
> qryGen.ParamByName(':Cat').AsString := Cat;
> qryGen.ParamByName(':Section').AsString := Section;
> qryGen.ParamByName(':SubSection').AsString := Copy(Section,0,1);
> try
> qryGen.ExecSQL;
> except
> end;
> end;
>
>
> Roy Lambert

Mon, Nov 27 2006 1:02 PM

"Alan Questell"
I tried both of these and don't see any difference.

But it's probably me doing (or not doing) something dumb. :-)

I'll keep playing with it. It's been a while since I've worked with DBISAM
and I just upgraded to 4 this past weekend.

"Tim Young [Elevate Software]" <timyoung@elevatesoft.com> wrote in message
news:2E942D64-66C0-40B9-BC34-66A0F18FF2C6@news.elevatesoft.com...
> Alan,
>
> << I don't think a parameterized query is any faster if the parameter
> changes each time. >>
>
> Actually that's the whole purpose of a parameterized query - to allow for
> variable substitution for constants in an SQL statement.  Make sure to
> manually call Prepare before starting the reading process for the CSV and
> you should see a big difference in performance.  Another idea would be to
> simply drop the INSERT statement and just use a live query result.  That
> way you can just insert into the TDBISAMQuery component directly using the
> Insert/Post methods without having to execute another SQL statement.
>
> --
> Tim Young
> Elevate Software
> www.elevatesoft.com
>

Mon, Nov 27 2006 5:53 PM

"Alan Questell"
This was the kind of dumb thing I was afraid I would find. After leaving it
for a few hours and coming back to it, I saw that the way I was doing
something caused the engine to have to connect on each iteration instead
of once. No wonder it was slow.

It's very fast now...and I'm getting back up to speed myself after being
away for a while.

"Alan Questell" <alanq@pinehurst.net> wrote in message
news:031325F8-52CC-48C6-968C-519120C04E07@news.elevatesoft.com...
> I'm coming back to DBISAM after some time away. (I just upgraded to 4.24 a
> few days ago from 3.) I'm writing a program to go through records from a
> CSV
