Thread: Remote Database Performance
Thu, Mar 27 2014 4:47 PM

Owen

So far I have been doing all my development locally with a copy of the database. I now have my remote connection piece working and I can run against the remote server databases.

Now my little application, which worked rather quickly with a local database, takes 39 seconds from submit to data fill. If I run the same SQL in the Database System Utility, it returns data in less than a second.

In my application, I am populating a clientdataset with 4137 records.

With DBISAM, where do I start? Any suggestions?
Thu, Mar 27 2014 5:03 PM

Owen

Attached is the plan:


================================================================================
>>>>> 4137 rows affected in 0.687 seconds



Attachments: SQL Plan.txt
Thu, Mar 27 2014 5:50 PM

Raul

Team Elevate

On 3/27/2014 4:47 PM, Owen wrote:
> So far I have been doing all my development locally with a copy of the database.    I have  my remote connection piece working now and I can run against the remote server databases.
>
> Now my little application that worked rather quickly with a local database takes 39 seconds from submit to data fill.  If I run the same SQL in the Database System Utility, it returns data in less than a second.
>
> In my application, I am populating a clientdataset with  4137 records.
>
> With DBISAM where do I start.  Any suggestions?
>

The remote slowdown is usually due to application "chattiness" - that is, the number of request/response round trips between your app and the server. Add network latency and suddenly the time goes up.

A couple of things you can do:

The session has an OnRemoteTrace event that you can hook into to see how "chatty" your app actually is.
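
As a rough illustration, hooking the trace event might look something like the sketch below. Note that the handler signature and the TTraceRecord field names here are assumptions for illustration only - verify them against the DBISAM manual:

```delphi
// Hypothetical sketch - the event handler signature and the TTraceRecord
// field names below are assumptions; check the DBISAM documentation.
procedure TMainForm.SessionRemoteTrace(Sender: TObject;
  TraceRecord: TTraceRecord);
begin
  // One log line per request makes the round trips easy to count
  FTraceLines.Add(TraceRecord.TraceRequest);
end;

procedure TMainForm.FormCreate(Sender: TObject);
begin
  FTraceLines := TStringList.Create;
  DBISAMSession1.OnRemoteTrace := SessionRemoteTrace;
end;
```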

Also check out the dataset's RemoteReadSize property - experiment with larger values to see if it improves things.
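
For example (the query component and table name here are hypothetical):

```delphi
// RemoteReadSize sets how many rows come back per network request;
// larger values mean fewer round trips at the cost of more memory.
DBISAMQuery1.RemoteReadSize := 500;               // try 100, 500, 1000 and compare
DBISAMQuery1.SQL.Text := 'SELECT * FROM Orders';  // hypothetical table
DBISAMQuery1.Open;
```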

Also check out BeginCachedUpdates/ApplyCachedUpdates, as they can speed commits up quite a bit.
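
A minimal sketch of the cached-updates pattern (field values and the loop body are invented for illustration):

```delphi
// Cache edits locally and send them to the server in one batch rather
// than one round trip per Post. Field names here are made up.
DBISAMQuery1.BeginCachedUpdates;
try
  while not DBISAMQuery1.Eof do
  begin
    DBISAMQuery1.Edit;
    DBISAMQuery1.FieldByName('Status').AsString := 'Processed';
    DBISAMQuery1.Post;                // held in the local update cache
    DBISAMQuery1.Next;
  end;
  DBISAMQuery1.ApplyCachedUpdates;    // one batched trip to the server
except
  DBISAMQuery1.CancelCachedUpdates;   // discard the cached edits
  raise;
end;
```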

If you're using data-bound controls, they can cause extra trips to the server, so see if you can identify any of those (using the trace).

Try the steps above and see how much they help. There are some other options, such as moving certain expensive operations into server-side (dbsrvr) functions, or working against a local in-memory copy of the server tables instead of the remote ones, but see if the steps above get the performance up to acceptable levels first.

Raul
Mon, Mar 31 2014 5:15 PM

Owen

Thanks Raul.

My SQL is basically pulling the data to create a clientdataset and then fill a datagrid. Then, in the application, the user selects a record to do editing, etc.

I did play around with RemoteReadSize; it cut the time down from 36 seconds to 26 seconds.

I ran a remote trace log and have it attached. It looks to me like it is formulating a roughly 1900-byte record. It does an OPENBLOB/FREEBLOB thing; I don't know if that is the issue. I have included a cut-down version: I edited the log file to include the beginning and end, and cut out the repetitive stuff in the middle.

I suspect it could be one of two things: the size of the record, or the blob thing.



Attachments: RemoteTrace.log
Tue, Apr 1 2014 11:14 AM

Raul

Team Elevate

On 3/31/2014 5:15 PM, Owen wrote:
> My SQL is basically pulling data for the creating a clientdataset and then filling a datagrid.  Then in the application, the user selects a record to do editing etc.
>
>   I did play around with the REMOTEREADSIZE, it cut it down from 36 second to 26 seconds.
>
>   I ran a remotetracelog and have it attached.  It looks to me like it is formulating @1900 byte record.  It does a OPENBLOB/FREEBLOB thing, I don't know if that is the issue.  I have included a cut down version.  I edited the log file and included the beginning and end and cut out the repetitive stuff in the middle.
>
> I suspect it could be two things, size of record perhaps or the blob thing.

Looks like the blob thing. You mentioned you have 4300 records, so 4000 blob operations likely means you have a blob or memo field as part of the record. Just to test it, can you temporarily eliminate the memo/blob fields from the SQL query
and see what happens with the timing?
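
One way to run that test (all column and table names below are invented for illustration) is to name the columns explicitly and substitute empty literals for the blob/memo fields:

```delphi
// Naming the columns avoids SELECT *, which drags every BLOB across
// the wire; the empty-string literals stand in as placeholders.
// All identifiers here are hypothetical.
DBISAMQuery1.SQL.Text :=
  'SELECT ID, Name, OrderDate, ' +
  '       '''' AS Notes, '''' AS Photo ' +   // placeholder columns
  'FROM Orders';
DBISAMQuery1.Open;
```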

Raul
Tue, Apr 1 2014 1:40 PM

Owen

Thanks Raul, I have two blob fields. I turned them into placeholders and it ran in 3 seconds. I have attached the new trace log. Is that about the best I am going to get, or is there some other tweak I can make?



Attachments: RemoteTrace.log
Wed, Apr 2 2014 7:17 PM

Raul

Team Elevate

On 4/1/2014 1:40 PM, Owen wrote:
> Thanks Raul, I have two blob fields.  I turned them into place holders and it ran in 3 seconds.  I have attached the new tracelog.  Is that about the best I am going to get or is there some other tweak I can make?
>

I'm not sure about easy tweaks at this point - you might have to look at some code changes. It looks like you're loading the whole result set (all 4000+ records) on the client, so none of the paging logic would help.

One suggestion to try would be to use SaveToStream and then LoadFromStream to bring the data across in one pass and then access it from a local in-memory table, for example.
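
A minimal sketch of that idea, assuming the remote query and the local in-memory table (component names invented here) are already configured:

```delphi
// Copy the remote result set locally in one pass, then browse/edit the
// local in-memory copy so the UI causes no further network trips.
var
  MS: TMemoryStream;
begin
  MS := TMemoryStream.Create;
  try
    RemoteQuery.Open;                 // remote DBISAM query
    RemoteQuery.SaveToStream(MS);     // one pass over the wire
    MS.Position := 0;
    LocalMemTable.LoadFromStream(MS); // local in-memory table copy
  finally
    MS.Free;
  end;
end;
```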

Hopefully others have better suggestions as well.

Raul
Thu, Apr 3 2014 2:44 AM

Roy Lambert

NLH Associates

Team Elevate

Owen


About the only thing I can think of is this: if you don't need the entire ClientDataset loaded straight away, load the needed portion in the main thread and then delegate the remainder to a background thread. Overall it will probably take longer, but it will seem faster to the user.
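
In outline, that might look like the sketch below. The two helper routines are hypothetical, and note that DBISAM requires each thread to use its own session/database/dataset components, so the background fetch must not touch the main form's components:

```delphi
// Outline only - FetchRemainingRows and AppendRowsToClientDataSet are
// hypothetical routines. The background thread must use its own DBISAM
// session/database/query, per DBISAM's multithreading rules.
TThread.CreateAnonymousThread(
  procedure
  begin
    FetchRemainingRows;             // loads the rows beyond the first page
    TThread.Synchronize(nil,
      procedure
      begin
        AppendRowsToClientDataSet;  // merges results into the UI dataset
      end);
  end).Start;
```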

Roy Lambert
Mon, Apr 28 2014 11:02 AM

Tim Young [Elevate Software]

Elevate Software, Inc.


Email timyoung@elevatesoft.com

Owen,

<< Now my little application that worked rather quickly with a local
database takes 39 seconds from submit to data fill.  If I run the same SQL
in the Database System Utility, it returns data in less than a second.

In my application, I am populating a clientdataset with  4137 records. >>

The difference here is the TClientDataSet, not DBISAM.  The TClientDataSet
is causing all of the rows to get pulled over the wire, including BLOBs.
The number of requests required to do all of that, especially with the
BLOBs, will kill performance.

Is there any particular reason that you're using the TClientDataSet component?
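
If the TClientDataSet is only there to feed the grid, one alternative (sketched here with invented component and table names) is to bind the grid to the DBISAM dataset directly:

```delphi
// With the grid bound to the DBISAM query itself, rows can be fetched
// as needed instead of copying the whole result set (BLOBs included)
// into a TClientDataSet first. All component names are hypothetical.
DBISAMQuery1.SQL.Text := 'SELECT * FROM Orders';  // hypothetical table
DataSource1.DataSet := DBISAMQuery1;              // bypass the ClientDataSet
DBGrid1.DataSource  := DataSource1;
DBISAMQuery1.Open;
```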

Tim Young
Elevate Software
www.elevatesoft.com

