Messages 11 to 12 of 12 total
Thread DBISAM Capacity Question
Tue, Feb 27 2018 6:43 AM

Roy Lambert

NLH Associates

Team Elevate

Jose

>Querying tables with more than 10 million records is a bit slow because DBISAM needs to build a bitmap of pointers and sometimes it needs to read all the records.

I had forgotten that bit.

>I have tried (only in my lab) a software called RamDisk (https://www.softperfect.com/products/ramdisk) and this is a fantastic way to solve or decrease the speed of filtering and reading data.

If it's faster than memory tables, I wonder why.

Roy
Tue, Feb 27 2018 1:56 PM

Tim Young [Elevate Software]

Elevate Software, Inc.


Email timyoung@elevatesoft.com

Arthur,

<< The technical capacity isn't really relevant. I want to know what the approximate size of table can be used in DBISAM and still remain practical enough to use. I'm also not talking about memory tables, just tables in general. I already know that DBISAM is no good for tables with millions of records. The statement:

MyTable.Filtered := True;

when you have constructed a filter string takes between 10 and 20 seconds to execute if the table has several million records. That's not practical for real-world use.  >>

Check the filter optimization level using this property:

https://www.elevatesoft.com/manual?action=viewprop&id=dbisam4&product=rsdelphiwin32&version=10T&comp=TDBISAMDataSet&prop=FilterOptimizeLevel

You can also see this property in the status bar of the Database System Utility. You'll want your filter to be at least partially optimized and, ideally, fully optimized.
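As a rough sketch of how that check might look in code, assuming a `TDBISAMTable` component named `MyTable`: the filter expression and the enumeration value names below are illustrative assumptions, not taken from this thread, so verify the exact `FilterOptimizeLevel` values against the linked manual page.

```delphi
// Assign a filter expression, enable it, then inspect how well
// DBISAM was able to optimize it using the available indexes.
MyTable.Filter := 'Balance > 1000';   // hypothetical filter expression
MyTable.Filtered := True;

// Enum value names are assumptions - check the DBISAM manual.
case MyTable.FilterOptimizeLevel of
  folNone:    ShowMessage('Unoptimized - a full table scan is likely');
  folPartial: ShowMessage('Partially optimized');
  folFull:    ShowMessage('Fully optimized - satisfied from indexes');
end;
```

The practical point of the check is that an unoptimized filter forces DBISAM to read every record, which is what produces the multi-second delays described above; adding an index on the filtered column is the usual way to move a filter up an optimization level.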

Tim Young
Elevate Software
www.elevatesoft.com