Thread: Caching Lookup Data (Possible Speed Improvement?)
Sun, May 27 2007 11:26 PM

"Adam H."
Hi Tim,

I'm currently looking for ways I might be able to improve the speed of my
application, and have found one area that seems to be a possibility.

I have some large queries that I use as lookup data for lookup combo boxes,
etc. I've noticed that the more lookup combo boxes I add with the query as
their lookup source, the longer it takes to open the query and populate the
lookup components. (Understandable.)

However - what I have noticed is that there seems to be a lot of traffic
going back and forth over the LAN the more lookup combo boxes I have.

I've done some experimenting, and found the following:

Using stLocal or stRemote, if I run the following query, it takes 0.2
seconds to execute:

SELECT * FROM Names



If I add a number of lookup combo boxes that use the TDBISAMQuery as the
lookup source, this extends the execute time to 2.3 seconds.



If I change the SQL to the following, it reduces that time to 0.7 seconds:

SELECT INTO Memory\MyNameCache * FROM Names


I figure that what I've effectively done is told the query to cache the data
locally, so any work the lookup components need to do is done against the
cached data in memory.


However - if I attempt to do the same with stRemote, the data (obviously) is
cached in the server's memory, and it still takes 2.3 seconds to open the
TDBISAMQuery.

Is it possible for me to cache data locally when using stRemote so I can
benefit from this kind of caching for lookup data?

(DBISAM 4.25b4)

Thanks & Regards

Adam.

Mon, May 28 2007 3:37 AM

Dave M
Adam H. wrote:


> However - what I have noticed is that there seems to be a lot of traffic
> going back and forth over the Lan the more lookup combo boxes I have.
>
Hope you don't mind a reply, since you addressed your question to Tim.
Some time ago I built a quick "monitor" for DBISAM, using RemoteTrace.
If I remember correctly, part of the TDataSet architecture is that
anything that moves the record pointer to the first or last record
causes a re-request of RemoteReadSize rows. Thus, making the read size
bigger can make the problem worse. If I remember correctly, it happens
even with cached updates. The only workarounds I can recall are streaming
or ClientDataSets. Please note the record pointer will be moved to BOF to
fill the ClientDataSet.

Dave M.
Mon, May 28 2007 5:24 AM

Dave M

Did a recheck and have some corrections. Whatever causes an InternalFirst
or InternalLast causes the reread. CachedUpdates *does* help.

 Dave M.
Mon, May 28 2007 6:56 PM

"Adam H."
Hi Dave,

> Hope you don't mind a reply, since you addressed your question to Tim.

Not at all - the more help, the better. :-)

> Some time ago I built a quick "monitor" for DBISAM, using RemoteTrace. If
> I remember correctly, part of the TDataSet architecture is that anything
> that moves the record pointer to the first or last record causes a
> re-request of RemoteReadSize rows. Thus, making the read size bigger can
> make the problem worse. If I remember correctly, it happens even with
> cached updates. The only workarounds I can recall are streaming or
> ClientDataSets. Please note the record pointer will be moved to BOF to
> fill the ClientDataSet.

> Did a recheck and have some corrections. Whatever causes an InternalFirst
> or InternalLast causes the reread. CachedUpdates *does* help.

The problem that I have is that the slowness occurs during the opening of
the dataset. I can't call BeginCachedUpdates until the dataset is opened,
due to the lookup combo boxes.

I'd love to be able to make an inherited dataset component from
TDBISAMQuery, give it a cache property, and tell it to ignore the refreshing
of data when it hits the end/start of a result set (or ignore refreshing
data at all unless it's closed and reopened), but I think it's over my head.
(Although Tim, if you want to consider adding this function in a future
release, I'd be very grateful. :-) I think it would help to speed up some
applications significantly. However, I'm not sure whether this would be
practicable with the internals of DBISAM.)

Instead, for the time being, I decided to take your advice about streaming
the data, and have created the procedure below that allows me to use local
tables to cache the data, without having to add any components to the form,
simply by adding the following lines to the BeforeOpen event of a
TDBISAMQuery. This seems to do the job, and allows a number of existing
'lookup queries' to use local cache mode without the need to modify too
much of the application.

It requires one table on the form for storing the original SQL, in case the
query is closed and reopened for refreshing, and another TDBISAMDatabase
and TDBISAMSession component pointing to a temporary local location for the
cached data. I delete the files in that location when closing down the
application (probably not the optimum way, but the only practicable option
I can see at present). Maybe it can be of some help to someone else in the
future. (Or, please feel free to comment if you see me going into an area I
shouldn't be. :-)

procedure TDMTickets.NameListQBeforeOpen(DataSet: TDataSet);
begin
  // A simple checkbox that allows me to turn this option on and off
  // within the application
  if FormSetup.cxCacheLookup.Checked then
    CacheLocally(NameListQ, 'DMTickets_NameListQ');
end;


procedure TDMTickets.CacheLocally(SourceQuery: TDBISAMQuery;
  CacheName: string);
var
  NQ: TDBISAMQuery;
  MS: TStream;
  TmpTable: TDBISAMTable;
begin
  CacheM.Open;
  // Check to see if we have already changed the SQL of this query
  if CacheM.Locate('Component', SourceQuery.Name, []) then
  begin
    // If so, obtain the original SQL and reassign it to the query's
    // SQL property
    SourceQuery.SQL.Clear;
    SourceQuery.SQL.Text := CacheMOrigSQL.AsString;
    SourceQuery.DatabaseName := MainForm.DB.DatabaseName;
    SourceQuery.SessionName := MainForm.DBS.SessionName;
  end
  else
  begin
    // If not, add the original SQL to an internal memory table so we
    // can retrieve it if the query is closed and reopened later
    CacheM.Insert;
    CacheMComponent.Value := SourceQuery.Name;
    CacheMOrigSQL.AsString := SourceQuery.SQL.Text;
    CacheM.Post;
  end;

  MS := TMemoryStream.Create;
  NQ := TDBISAMQuery.Create(Self);
  TmpTable := TDBISAMTable.Create(Self);
  try
    // Run the original query and capture its result set in a stream
    NQ.Name := CacheName;
    NQ.DatabaseName := SourceQuery.DatabaseName;
    NQ.SessionName := SourceQuery.SessionName;
    NQ.DataSource := SourceQuery.DataSource;
    NQ.SQL.Assign(SourceQuery.SQL);
    NQ.Open;
    NQ.SaveToStream(MS);
    MS.Position := 0;

    // Recreate the result set as a local table in the temporary
    // database/session and load the streamed rows into it
    TmpTable.SessionName := MainForm.DBSTemp.SessionName;
    TmpTable.DatabaseName := MainForm.DBTemp.DatabaseName;
    TmpTable.TableName := 'Cache_' + NQ.Name;
    NQ.FieldDefs.Update;
    TmpTable.FieldDefs.Assign(NQ.FieldDefs);
    if TmpTable.Exists then
      TmpTable.DeleteTable;
    TmpTable.CreateTable;
    TmpTable.Open;
    TmpTable.LoadFromStream(MS);
    TmpTable.Close;

    // Point the source query at the local cache table from now on
    SourceQuery.Close;
    SourceQuery.SQL.Clear;
    SourceQuery.SQL.Add('SELECT * FROM ' + TmpTable.TableName);
    SourceQuery.SessionName := TmpTable.SessionName;
    SourceQuery.DatabaseName := TmpTable.DatabaseName;
  finally
    NQ.Close;
    NQ.Free;
    TmpTable.Free;
    MS.Free;
  end;
end;


Cheers

Adam.

Mon, May 28 2007 10:48 PM

Dave M
Adam H. wrote:
> Hi Dave,
> The problem that I have is that the slowness occurs during the opening of
> the dataset. I can't call BeginCachedUpdates until the dataset is opened,
> due to the lookup combo boxes.

Maybe I'm missing something here, but what I had in mind was keeping the
combo box datasources set to nil, then looping through the form's
components looking for the correct combo boxes and setting their
datasources after doing the query exec and cached updates. Less code.

>
> I'd love to be able to make an inherited dataset component from
> TDBISamQuery, give it a cache property, and tell it to ignore the refreshing
> of data when it hits the end/start of a resultset (or ignore refreshing data
> at all, unless it's closed and reopened), but I think it's over my head.

I ran into this problem with ClientDataSets via the web. My guess is that
EDB may change this behavior. I had the same idea, but it sounded like too
much work.

dave m
Tue, May 29 2007 6:18 PM

Tim Young [Elevate Software]

Elevate Software, Inc.

Email timyoung@elevatesoft.com

Dave,

<< I ran into this problem with ClientDataSets via the web. My guess is that
edb may change this behavior. >>

EDB doesn't change things significantly in this area except that it can
navigate cached rows bi-directionally, which DBISAM did not do.  However, it
still doesn't use cached rows for finds, ranges, etc.   The only way to
cause something to cache all rows locally with DBISAM is to use the cached
updates functionality:

BeginCachedUpdates
CancelCachedUpdates to throw away the rows, or ApplyCachedUpdates if no row
changes were made
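For reference, a minimal sketch of that pattern (LookupQuery is a hypothetical TDBISAMQuery used purely as a lookup source):

```pascal
// Minimal sketch of the cached updates pattern described above.
LookupQuery.Open;
LookupQuery.BeginCachedUpdates;  // all rows are now cached locally
try
  // ... navigate the result set, use it as a lookup source, etc. ...
finally
  // Throw away the cached rows when done (or ApplyCachedUpdates
  // if no row changes were made)
  LookupQuery.CancelCachedUpdates;
end;
```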

--
Tim Young
Elevate Software
www.elevatesoft.com

Tue, May 29 2007 6:35 PM

"Adam H."
Good Morning Tim,

Firstly - please thank that wonderful wife of yours for dealing with my
sales email so quickly! It really made my day. (And made me look good in
front of my clients.) :-)

> EDB doesn't change things significantly in this area except that it can
> navigate cached rows bi-directionally, which DBISAM did not do.  However,
> it still doesn't use cached rows for finds, ranges, etc.   The only way to
> cause something to cache all rows locally with DBISAM is to use the cached
> updates functionality:
>
> BeginCachedUpdates
> CancelCachedUpdates to throw away the rows, or ApplyCachedUpdates if no
> row changes were made

If I'm only using a TDBISAMQuery as lookup data (as explained in my
previous email), is there a problem if I call BeginCachedUpdates after
opening the dataset, and then never call CancelCachedUpdates or
ApplyCachedUpdates, but just leave CachedUpdates active until the
datamodule is freed?

Or am I better off streaming the data locally in a C/S application as per
my previous post?

Thanks & Regards

Adam.

Tue, May 29 2007 6:36 PM

"Adam H."
Hi Dave,

> Maybe I'm missing something here, but what I had in mind was keeping the
> combo box datasources set to nil, then looping through the form's
> components looking for the correct combo boxes and setting their
> datasources after doing the query exec and cached updates. Less code.

Ahh - I think it was me who missed something. :-)

Your post gave me another idea. Instead of updating the combo boxes'
datasource property, it may be better still if I just disable the
TDataSource component used to link the combo boxes to the dataset until
after I've done the cached updates.
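Something along these lines, perhaps (LookupSource and LookupQuery are hypothetical names):

```pascal
// Sketch of the idea: disable the linking TDataSource while the
// query opens and starts caching, then re-enable it so the combo
// boxes repopulate from the locally cached rows.
LookupSource.Enabled := False;
try
  LookupQuery.Open;
  LookupQuery.BeginCachedUpdates;
finally
  LookupSource.Enabled := True;
end;
```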

Cheers

Adam.

Wed, May 30 2007 9:34 AM

"David Farrell-Garcia"
Adam, I have had very good luck using ClientDataSets to cache lookup data.
I have a BaseLookupDataModule and inherit from that for each specific
lookup type. In my large application there may be over 100 different
lookups, so I don't load them all at application start, as some are quite
large. Each LookupDataModule is only loaded the first time it is requested,
so there is a small delay the first time an end-user, for example, does a
customer lookup. But each subsequent customer lookup is very fast, and so
on. The ClientDataSet requests the records before any datasources are
actually connected, so it is pretty fast. The user has the ability to
refresh the lookups.

That all works very well, but there was a problem if a user was not looking
up records but simply entering a value into an edit form, which then
performed the lookup behind the scenes. I added a function to the
LookupDataModule that performs a query (separate from the lookup query) and
returns the record if found, or notifies the user if not found.

I have expanded the BaseLookupDataModule over time to include a number of
other functions such as FindByName, FindByID, AddToLookup (when a user adds
a new record via a lookup table maintenance screen), etc., which are used
by the application in various scenarios. Adding a new lookup is a piece of
cake since all the basic functionality is included in the
BaseLookupDataModule.

The LookupDataModules are reference counted, so if the reference count gets
to zero I dump the table into a local temp folder in native CDS format, and
when it is referenced again it loads from local disk rather than hitting
the server.
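A sketch of what that dump/reload might look like (FRefCount, FCachePath and LookupCDS are hypothetical names, not David's actual code; TClientDataSet can save and load its rows in the native binary CDS format via SaveToFile/LoadFromFile):

```pascal
// Sketch of reference-counted load/unload for a lookup data module.
procedure TBaseLookupDataModule.ReleaseLookup;
begin
  Dec(FRefCount);
  if FRefCount = 0 then
  begin
    // Dump the table to a local temp folder in native CDS format
    // and free the in-memory copy
    LookupCDS.SaveToFile(FCachePath + Name + '.cds');
    LookupCDS.Close;
  end;
end;

procedure TBaseLookupDataModule.AcquireLookup;
begin
  Inc(FRefCount);
  if not LookupCDS.Active then
    if FileExists(FCachePath + Name + '.cds') then
      // Reload from local disk rather than hitting the server
      LookupCDS.LoadFromFile(FCachePath + Name + '.cds')
    else
      LookupCDS.Open;  // first load fetches from the server
end;
```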

This has resulted in a very fast lookup system, mostly transparent to the
end user, and it minimizes large memory use by dumping the tables locally
when they are no longer being used. The server is hit only when the lookup
table is first loaded or refreshed (not a common occurrence). It also gives
me the advantage of being able to use lookup fields for display, which is
not usually efficient in a C/S scenario.


--
David Farrell-Garcia
Whidbey Island Software, LLC
Wed, May 30 2007 12:36 PM

Tim Young [Elevate Software]

Elevate Software, Inc.

Email timyoung@elevatesoft.com

Adam,

<< Firstly - please thank that wonderful wife of yours for dealing with my
sales email so quickly! It really made my day. (And made me look good in
front of my clients.) :-) >>

I certainly will, and thank you for the order.

<< If I'm only using a TDBISAMQuery as lookup data (as explained in my
previous email), is there a problem if I call BeginCachedUpdates after
opening the dataset, and then never call CancelCachedUpdates or
ApplyCachedUpdates, but just leave CachedUpdates active until the
datamodule is freed? >>

Sure, that's fine.  If you don't call Apply* or Cancel*, DBISAM will simply
cancel the cached updates when you close the table or query.

<< Or am I better off Streaming the data locally in a c/s application as per
my previous post? >>

What you are doing there is exactly what cached updates do for you.

--
Tim Young
Elevate Software
www.elevatesoft.com
