
Messages 1 to 9 of 9 total
Thread: How to optimize for speed by manipulating buffers
Fri, Jun 16 2006 12:46 PM

Joe Real
We have a Win2K Pro machine with a 120 GB hard drive, dual CPUs, and 1 GB of RAM.

DBISAM ver. 4.19, Delphi 7.

I have 5 physical tables, with the major one in the loop having close to 1 million records
and 15 indices.

I ran a local program on it that updates local DBISAM tables. Not satisfied with the speed,
I tweaked the default buffer counts and sizes as follows:

 with engine do begin
   // Defaults are 8192 / 32768 for the data buffers and
   // 8192 / 65536 for the index buffers; multiplied here
   // by 4 (data) and 6 (index).
   MaxTableDataBufferCount  := 8192 * 4;
   MaxTableDataBufferSize   := 32768 * 4;
   MaxTableIndexBufferCount := 8192 * 6;
   MaxTableIndexBufferSize  := 65536 * 6;
 end;

Then I ran my program, and watched memory consumption and CPU performance.

Before the memory modifications, I ran the program and both CPUs swung back and forth
between 20% and 80% utilization, and memory consumption stayed around 325 MB total
with 575 MB available while the local program was running.

After the memory modifications, the program took four times as long to run, both CPUs
swung between only 7% and 20%, and memory consumption stayed around 330 MB with 570 MB available.

What happened here?

In which direction should I manipulate memory in order to optimize for speed?
Fri, Jun 16 2006 1:00 PM

Tim Young [Elevate Software]

Elevate Software, Inc.


Email timyoung@elevatesoft.com

Joe,

<< Before the memory modifications, I ran the program and both CPUs swung back and
forth between 20% and 80% utilization, and memory consumption stayed around 325 MB
total with 575 MB available while the local program was running.

After the memory modifications, the program took four times as long to run, both
CPUs swung between only 7% and 20%, and memory consumption stayed around
330 MB with 570 MB available. >>

You haven't said what is going on in this loop (transactions, exclusive
access, etc.). Lowered CPU usage, for example, usually indicates that the OS is
spending more time writing to the hard drive.

--
Tim Young
Elevate Software
www.elevatesoft.com

Fri, Jun 16 2006 1:18 PM

Joe Real
Thanks for the fast response Tim!

Exclusive is False; the tables must be shared across the network and are also accessed by
the DBISAM Server. I have a utility program that accesses the tables directly, and I am
running that utility program locally. I don't use transactions for this loop; should I use
StartTransaction/Commit logic for the local table updates? I do call FlushBuffers
on the main table every 100 record updates; should I increase this interval?
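For reference, the loop described above would look roughly like this (a sketch only; the
table name MainTable, the batch size, and the field edits are hypothetical):

```delphi
// Sketch of the current update loop: FlushBuffers is called on
// the main table after every 100 record updates. MainTable is
// assumed to be a TDBISAMTable already open on the main table.
procedure UpdateRecords;
var
  Count: Integer;
begin
  Count := 0;
  MainTable.First;
  while not MainTable.Eof do
  begin
    MainTable.Edit;
    // ... modify fields here ...
    MainTable.Post;
    Inc(Count);
    if Count mod 100 = 0 then
      MainTable.FlushBuffers; // force dirty buffers out to disk
    MainTable.Next;
  end;
end;
```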

Thanks for any tip.

Joe


"Tim Young [Elevate Software]" <timyoung@elevatesoft.com> wrote:

Joe,

<< Before the memory modifications, I ran the program and both CPUs swung back and
forth between 20% and 80% utilization, and memory consumption stayed around 325 MB
total with 575 MB available while the local program was running.

After the memory modifications, the program took four times as long to run, both
CPUs swung between only 7% and 20%, and memory consumption stayed around
330 MB with 570 MB available. >>

You haven't said what is going on in this loop (transactions, exclusive
access, etc.). Lowered CPU usage, for example, usually indicates that the OS is
spending more time writing to the hard drive.

--
Tim Young
Elevate Software
www.elevatesoft.com

Fri, Jun 16 2006 1:37 PM

Roy Lambert

NLH Associates

Team Elevate

Joe


The other thing you haven't mentioned is what the performance was before and after altering the buffers, i.e. the time taken.

Roy Lambert
Fri, Jun 16 2006 1:42 PM

Joe Real
Roy,

I mentioned that: it took 4 times longer, when the only modification I made was changing
the buffer sizes. Memory consumption did not increase much, but CPU usage dropped and the
loop took 4 times longer to finish once the buffer sizes were changed from their default
settings. The data buffer count and size were increased 4 times and the index buffer count
and size 6 times, yet program performance decreased by a factor of 4; I had been expecting
the opposite.

Joe


Roy Lambert <roy.lambert@skynet.co.uk> wrote:

Joe


The other thing you haven't mentioned is what the performance was before and after
altering the buffers, i.e. the time taken.

Roy Lambert
Fri, Jun 16 2006 1:55 PM

Tim Young [Elevate Software]

Elevate Software, Inc.



Joe,

<< Exclusive is False; the tables must be shared across the network and are also
accessed by the DBISAM Server. I have a utility program that accesses the
tables directly, and I am running that utility program locally. I don't use
transactions for this loop; should I use StartTransaction/Commit
logic for the local table updates? >>

Well, it will help the speed a bit.  The problem is most likely caused by
the fact that DBISAM has to constantly scan the buffers list when it goes to
flush any dirty buffers to disk after each update.  When you increase the
buffer size, it increases the size of this list and thus increases the total
time it takes.  Transactions "group" the updates and therefore reduce the
effect of the buffer sizes on the individual updates.

<< I do call FlushBuffers on the main table every 100 record updates; should I
increase this interval? >>

No, but the transactions will help.
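A minimal sketch of the transaction approach described above (assuming a TDBISAMDatabase
component named Database1 connected to the same directory as the tables; the names and
batch size are hypothetical):

```delphi
// Batch the updates into transactions. Per the advice above, the
// default Commit flushes buffers, so the separate per-100-records
// FlushBuffers call is no longer needed.
procedure UpdateRecordsInTransactions;
var
  Count: Integer;
begin
  Count := 0;
  Database1.StartTransaction;
  try
    MainTable.First;
    while not MainTable.Eof do
    begin
      MainTable.Edit;
      // ... modify fields here ...
      MainTable.Post;
      Inc(Count);
      // Commit in batches so locks are not held for the whole run.
      if Count mod 100 = 0 then
      begin
        Database1.Commit; // default FlushBuffers parameter is True
        Database1.StartTransaction;
      end;
      MainTable.Next;
    end;
    Database1.Commit; // commit the final partial batch
  except
    if Database1.InTransaction then
      Database1.Rollback;
    raise;
  end;
end;
```

Since the transaction "groups" the updates, DBISAM only scans and flushes the dirty
buffers once per batch instead of once per record update.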

--
Tim Young
Elevate Software
www.elevatesoft.com

Fri, Jun 16 2006 2:07 PM

Joe Real
Tim,

Thanks for the tips. Just a few more questions...

So if I use transactions, will increasing the buffer counts and sizes help improve
performance?

If I use transactions, can I forgo the FlushBuffers calls?

Thanks,

Joe



"Tim Young [Elevate Software]" <timyoung@elevatesoft.com> wrote:

Joe,

Well, it will help the speed a bit.  The problem is most likely caused by
the fact that DBISAM has to constantly scan the buffers list when it goes to
flush any dirty buffers to disk after each update.  When you increase the
buffer size, it increases the size of this list and thus increases the total
time it takes.  Transactions "group" the updates and therefore reduce the
effect of the buffer sizes on the individual updates.

<< I do call FlushBuffers on the main table every 100 record updates; should I
increase this interval? >>

No, but the transactions will help.

--
Tim Young
Elevate Software
www.elevatesoft.com

Fri, Jun 16 2006 4:47 PM

Tim Young [Elevate Software]

Elevate Software, Inc.



Joe,

<< So if I use transactions, will increasing the buffer counts and sizes help
improve performance? >>

Not really; the transactions will use more buffer space as needed anyway.

<< If I use transactions, can I forgo the FlushBuffers calls? >>

Yes, if you use the default Commit with the default FlushBuffers parameter
set to True.

--
Tim Young
Elevate Software
www.elevatesoft.com

Fri, Jun 16 2006 5:09 PM

Joe Real
Tim,

Thank you so much for all the help!

I'll be tweaking the code soon.

Regards,

Joe

"Tim Young [Elevate Software]" <timyoung@elevatesoft.com> wrote:

Joe,

<< So if I use transactions, will increasing the buffer counts and sizes help
improve performance? >>

Not really; the transactions will use more buffer space as needed anyway.

<< If I use transactions, can I forgo the FlushBuffers calls? >>

Yes, if you use the default Commit with the default FlushBuffers parameter
set to True.

--
Tim Young
Elevate Software
www.elevatesoft.com
