Thread: Strict change detection revisited
Tue, Jun 26 2018 9:32 PM

Arthur Williams

Zarksoft

It seems there is a problem with lazy change detection, in that it doesn't detect any changes made by other sessions. The manual says it will spot changes to a record that is about to be edited and a couple of other change cases.

In my example, a separate application adds a record to the table. The first application then sets a filter and reads the table selecting matched records. Records added by another session are not detected.
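
A minimal sketch of that scenario, as I read it. The table component, the filter expression, and the ProcessRecord handler are illustrative only; FindFirst/FindNext and Filter/Filtered are standard TDataSet members that TDBISAMTable inherits:

procedure ScanMatches(Table: TDBISAMTable);
begin
  // Filtered pass over the table.
  Table.Filter := 'Status = ''PENDING''';   // illustrative filter
  Table.Filtered := True;
  if Table.FindFirst then
    repeat
      ProcessRecord(Table);                 // hypothetical per-record handler
    until not Table.FindNext;
  // A second application meanwhile appends rows that match the filter.
  // Under lazy change detection, calling ScanMatches again in this
  // session can be answered entirely from its local cache, so those
  // new rows are never seen.
end;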

Now I can understand lazy not detecting changes to records. Adding a new record updates some global values, not least of which would be the number of records to be scanned on any query. Shouldn't lazy change detection catch that case? Otherwise I see no point in allowing it; you must always use strict detection, because lazy will never detect some changes.

This is 4.43 build 3 I think, in C/S mode. About to upgrade to 4.45.
Tue, Jun 26 2018 9:51 PM

Raul

Team Elevate

On 6/26/2018 9:32 PM, Arthur Williams wrote:
> It seems there is a problem with lazy change detection, in that it doesn't detect any changes made by other sessions. The manual says it will spot changes to a record that is about to be edited and a couple of other change cases.

Where does it say that exactly? Lazy detection, AFAIK, only checks for changes when it needs to read new data or when you're posting actual changes (it switches to strict for those).

> In my example, a separate application adds a record to the table. The first application then sets a filter and reads the table selecting matched records. Records added by another session are not detected.

I believe that means the data was already cached by the session - reading cached data with lazy detection will not run any additional change detection. Calling Refresh when setting the filter should fix this.
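
A hedged sketch of that suggestion, reusing the illustrative names from the earlier sketch; Refresh here is the standard TDataSet method:

procedure ScanMatchesWithRefresh(Table: TDBISAMTable);
begin
  Table.Refresh;                            // force a change check before reading
  Table.Filter := 'Status = ''PENDING''';   // illustrative filter
  Table.Filtered := True;
  if Table.FindFirst then
    repeat
      ProcessRecord(Table);                 // hypothetical per-record handler
    until not Table.FindNext;
end;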


> Now I can understand lazy not detecting changes to records. Adding a new record updates some global values, not least of which would be the number of records to be scanned on any query. Shouldn't lazy change detection catch that case? Otherwise I see no point in allowing it; you must always use strict detection, because lazy will never detect some changes.

It should detect changes IF it needs to go and retrieve more data.
Otherwise you'd need to refresh manually.

Otherwise strict detection does exactly what you want - it would check
for changes when setting filters etc.

However, this will incur the extra cost of connections and data transfer, so it really comes down to app and data design.

If you want DBISAM to handle this, use strict; or, if you only have a couple of key points where this is a problem, lazy + Refresh might be more efficient.
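
A sketch of the two options. The StrictChangeDetection property name is my recollection of the TDBISAMSession documentation, so verify it against your version's manual:

// Option 1: have DBISAM check for other sessions' changes on filters,
// finds, navigation, etc., at the cost of extra round trips.
procedure UseStrictDetection(Session: TDBISAMSession);
begin
  Session.StrictChangeDetection := True;    // assumed property name, per the manual
end;

// Option 2: stay with lazy detection and force a check only at the few
// points that matter, e.g. Table.Refresh just before a filtered pass.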

Raul
Wed, Jun 27 2018 1:47 PM

Arthur Williams

Zarksoft

Actually, a strictly literal interpretation of the manual is that lazy change detection means no change detection. It says "Lazy change detection works by only checking for changes by other sessions when DBISAM cannot find the desired data locally in its cache". However, this case doesn't occur. Since it doesn't detect changes by other sessions, the only time it won't be able to find the data would be on the initial opening of the table. Once a few records have been loaded, it will never need to do a refresh.

Since it never notices records being added or deleted, as long as there are some records in the cache there can never be a "not found" case. If the requested record is not in the cache, then no result is returned; no check for changes is performed, because no change can have occurred.

Lazy plus refresh would be worse than strict because at least strict will not cause a cache dump if there weren't actually any changes, whereas refresh will. So I'll just have to go through my projects and turn strict on everywhere. I had thought lazy meant no detection of field changes to cached records, rather than no detection of any table changes.
Thu, Jun 28 2018 1:39 AM

Roy Lambert

NLH Associates

Team Elevate

Arthur


From personal experience with older DBISAMs before switching to ElevateDB I'd say you'll regret using strict detection everywhere and your performance will suffer overall.

Roy Lambert
Thu, Jun 28 2018 5:10 PM

Arthur Williams

Zarksoft

Roy Lambert wrote:

Arthur


From personal experience with older DBISAMs before switching to ElevateDB I'd say you'll regret using strict detection everywhere and your performance will suffer overall.

Roy Lambert

>>>>>>>>>>>>>>>>

Well that may be, but poor performance is preferable to not getting correct results returned. Slow is ok, wrong is not.
Fri, Jun 29 2018 2:17 AM

Roy Lambert

NLH Associates

Team Elevate

Arthur


>Well that may be, but poor performance is preferable to not getting correct results returned. Slow is ok, wrong is not.

As a developer, the emphasis on correctness over speed is good, but, in general, users want speed. On a more positive note, all the internet apps are nicely training users to wait :)

Roy
Fri, Jun 29 2018 12:45 PM

Tim Young [Elevate Software]

Elevate Software, Inc.

Email timyoung@elevatesoft.com

Arthur,

<< Lazy plus refresh would be worse than strict because at least strict will not cause a cache dump if there weren't actually any changes, whereas refresh will. >>

No, calling the Refresh method will not do this - it simply checks to see if any changes took place, just like the strict change detection, and will not cause a cache dump if there weren't any changes.

It *will* cause a dump of the TDataSet record cache, but a) we have no control over that, b) that typically only means 1 record for non-grid-connected datasets, and c) in the case that no changes were found, the TDataSet-descendant instance will only end up re-reading records that are in the DBISAM cache.
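
Where that TDataSet record-cache dump is a concern for a grid-connected dataset, a common TDataSet idiom (nothing DBISAM-specific, just standard DB-unit calls) is to wrap the refresh so the UI stays on the current row - a sketch:

procedure RefreshInPlace(DS: TDataSet);
var
  BM: TBookmark;
begin
  DS.DisableControls;                 // suspend attached grids/controls
  try
    BM := DS.GetBookmark;             // remember the current row
    try
      DS.Refresh;                     // change check; unchanged data is re-read from the local cache
      if DS.BookmarkValid(BM) then
        DS.GotoBookmark(BM);          // return to the remembered row if it still exists
    finally
      DS.FreeBookmark(BM);
    end;
  finally
    DS.EnableControls;
  end;
end;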

<< So I'll just have to go through my projects and turn strict on everywhere. I had thought lazy meant no detection of field changes to cached records, rather than no detection of any table changes. >>

The term is "change detection", so it definitely implies that the different methods (strict vs lazy) perform the process of detecting changes in different ways, and the manual makes it pretty clear that the difference is *when and how often* the change detection takes place.

Tim Young
Elevate Software
www.elevatesoft.com
Fri, Jun 29 2018 12:50 PM

Tim Young [Elevate Software]

Elevate Software, Inc.

Email timyoung@elevatesoft.com

Arthur,

<< Well that may be, but poor performance is preferable to not getting correct results returned. Slow is ok, wrong is not. >>

There is no such thing as "wrong" in this respect. Is the data "wrong" that has been sitting there on a user's desktop in a grid control, but has been updated by other users/sessions? The user could read the data and physically act on it without issue when using *any database engine/server on the planet*, and it still wouldn't be a flaw in the database engine/server.

DBISAM allows *you* to control how "fresh" you want data to be - it is the developer's job to determine how to balance an application's needs in terms of fresh data vs. performance.

Tim Young
Elevate Software
www.elevatesoft.com
Fri, Jun 29 2018 4:42 PM

Arthur Williams

Zarksoft

Tim Young [Elevate Software] wrote:

Arthur,

<< Well that may be, but poor performance is preferable to not getting correct results returned. Slow is ok, wrong is not. >>

There is no such thing as "wrong" in this respect. Is the data "wrong" that has been sitting there on a user's desktop in a grid control, but has been updated by other users/sessions? The user could read the data and physically act on it without issue when using *any database engine/server on the planet*, and it still wouldn't be a flaw in the database engine/server.

DBISAM allows *you* to control how "fresh" you want data to be - it is the developer's job to determine how to balance an application's needs in terms of fresh data vs. performance.

Tim Young
Elevate Software
www.elevatesoft.com

>>>>>>>>>>>>>>>>>>>>>>

It's not wrong that DBISAM doesn't detect field-level changes to existing records. I can accept that as a lazy change detection issue. What is wrong is when a brand new record is added to the dataset and DBISAM says the dataset has not changed. So if I have a grid with 100 records in it and another session adds 5,000 records, for DBISAM to say on a filtered read, yep, still only 100 records, that's just wrong.
Fri, Jun 29 2018 6:01 PM

Arthur Williams

Zarksoft

Tim Young [Elevate Software] wrote:


The term is "change detection", so it definitely implies that the different methods (strict vs lazy) perform the process of detecting changes in different ways, and the manual makes it pretty clear that the difference is *when and how often* the change detection takes place.

Tim Young
Elevate Software
www.elevatesoft.com

>>>>>>>>>>>>>>>>>>

Except you're wrong. It says "Lazy change detection works by only checking for changes by other sessions when DBISAM cannot find the desired data locally in its cache". What this means is that once the session is loaded, DBISAM doesn't check for changes anymore. It can always find the desired data, and if it's not there, then it's just not in the data. It does not check to see if it's missing anything. So lazy detection means no detection.

I know this is true because of my example. I do a filtered pass with FindFirst/FindNext. Hours later another session adds brand new records, which match the filter. Another filtered pass does not produce any new records. DBISAM missed relevant data and did not check for anything new. Other databases (Nexus/MySQL) do not do this; they will detect newly added records, even though, like DBISAM, they don't detect field-level changes.

I can accept that updates to locally cached records are not detected when lazy change detection is in effect. The fact that global table changes (like the next AutoInsert #) are not detected is incomprehensible. I haven't tested it, but I assume I could delete the table and reload the grid, and DBISAM would return all the records as if nothing had happened. It's not correct behaviour.