Thread: Trouble with repairing a 2-million-record table
Sun, Feb 11 2007 1:23 PM

"Frans van Daalen"
Using 4.22 B1 dbsys on a table with more than 2,000,000 records, both repair and
verify take forever. Windows Task Manager shows dbsys at less than 2% CPU, and
after several hours hardly any progress is noticeable.

Any hints on how to improve or solve this?

Kind regards,

Frans

Mon, Feb 12 2007 3:46 PM

Tim Young [Elevate Software]

Elevate Software, Inc.


Email timyoung@elevatesoft.com

Frans,

<< Using 4.22 B1 dbsys on a table with more than 2,000,000 records, both repair
and verify take forever. Windows Task Manager shows dbsys at less than 2% CPU,
and after several hours hardly any progress is noticeable. >>

The process is becoming disk-bound, which is why the CPU usage is so low.
There's really nothing that can be done about this other than increasing the
memory buffering settings for the TDBISAMSession:

MaxTableDataBufferCount
MaxTableDataBufferSize
MaxTableIndexBufferCount
MaxTableIndexBufferSize
MaxTableBlobBufferCount
MaxTableBlobBufferSize

--
Tim Young
Elevate Software
www.elevatesoft.com
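
A minimal sketch of applying Tim's suggestion before a long repair or verify
run. The property names are the ones listed above; the helper procedure, the
concrete values, and the assumption that the Size settings are byte counts are
illustrative only, so check them against the DBISAM 4.x documentation for your
build (in some versions the same-named settings may sit on the global engine
object rather than the session).

// Sketch only -- values are illustrative, not recommendations.
procedure RaiseBuffersForRepair(Session: TDBISAMSession);
begin
  Session.MaxTableDataBufferCount  := 8192;
  Session.MaxTableDataBufferSize   := 64 * 1024 * 1024;   // ~64 MB for record data (assumed bytes)
  Session.MaxTableIndexBufferCount := 8192;
  Session.MaxTableIndexBufferSize  := 256 * 1024 * 1024;  // indexes are what repair/verify hit hardest
  Session.MaxTableBlobBufferCount  := 1024;
  Session.MaxTableBlobBufferSize   := 16 * 1024 * 1024;
end;

On a machine with 2 GB of RAM the index figure could go considerably higher,
which is essentially what Sam suggests further down the thread.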

Tue, Feb 13 2007 6:18 AM

Roy Lambert

NLH Associates

Team Elevate

Tim

>The process is becoming disk-bound, which is why the CPU usage is so low.
>There's really nothing that can be done about this other than increasing the
>memory buffering settings for the TDBISAMSession:
>
>MaxTableDataBufferCount
>MaxTableDataBufferSize
>MaxTableIndexBufferCount
>MaxTableIndexBufferSize
>MaxTableBlobBufferCount
>MaxTableBlobBufferSize

Hadn't thought about this as a way to speed up repair/validate/optimize. Is there a recommendation for the maximum useful settings, maybe based on the RAM available?

Roy Lambert
Tue, Feb 13 2007 11:40 AM

Sam Karl
Roy Lambert <roy.lambert@skynet.co.uk> wrote:

<<Hadn't thought about this as a way to speed up repair/validate/optimize. Is
there a recommendation for the maximum useful settings, maybe based on the RAM available?>>

I know that with other databases, throwing as much RAM as physically possible
at the indexes will speed things up dramatically when rebuilding tables. If the
index is built entirely in memory, it is on the order of 100x faster than if it
is built on disk (paging). So throw a couple more gigs in your box and set the
memory for indexes to the max (at least 50% of installed memory, more if you
can get away with it).

Sam
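
Sam's rule of thumb can be roughed out in code. The Delphi sketch below reads
the installed physical RAM via the standard Windows GlobalMemoryStatus call and
takes half of it as an index-buffer budget; applying that budget through
MaxTableIndexBufferSize (one of the settings from Tim's list), and treating
that property as a byte count, are assumptions to verify against the DBISAM
documentation.

uses Windows, dbisamtb;  // dbisamtb assumed to be the DBISAM 4 component unit

// Half of physical RAM for index buffering, per Sam's "at least 50%" rule.
function HalfOfInstalledRAM: Cardinal;
var
  Status: TMemoryStatus;
begin
  Status.dwLength := SizeOf(Status);
  GlobalMemoryStatus(Status);          // fine for a 2 GB box; GlobalMemoryStatusEx would be needed above 4 GB
  Result := Status.dwTotalPhys div 2;
end;

procedure ApplyIndexBudget(Session: TDBISAMSession);
begin
  Session.MaxTableIndexBufferSize := HalfOfInstalledRAM;  // assumed to be a byte cap
end;

The idea would be to call ApplyIndexBudget on the session before kicking off
the repair.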
Tue, Feb 13 2007 12:08 PM

Roy Lambert

NLH Associates

Team Elevate

Sam


Good idea Sam, the only problem is it's often not my box :-)

Roy Lambert

PS: what I mean is that I'm not paying to upgrade other people's machines.
Tue, Feb 13 2007 3:03 PM

"Frans van Daalen"

"Sam Karl" <sam> wrote in message
news:CF0DCB2C-BA70-4CAD-9365-57AD94DD35FA@news.elevatesoft.com...
> So throw a couple more gigs in your box and set the memory for indexes to
> the max (at least 50% of installed memory, more if you can get away with it).
>
It's a 2 GB machine, but this table is taking more than 10 (!) hours to repair.
Even worse, after one run of the application that uses this table I will get
AVs on database access and another repair is needed. This started happening
only after the table grew above 1,800,000 records :<

Tue, Feb 13 2007 3:07 PM

Tim Young [Elevate Software]

Elevate Software, Inc.


Email timyoung@elevatesoft.com

Frans,

<< It's a 2 GB machine, but this table is taking more than 10 (!) hours to
repair. Even worse, after one run of the application that uses this table I
will get AVs on database access and another repair is needed. This started
happening only after the table grew above 1,800,000 records :<  >>

Are you using LargeFileSupport:=True ?  If so, are you certain that all
applications accessing the database are using LargeFileSupport:=True ?  If
not, then you could corrupt the tables because one application won't see the
other application's locks.

--
Tim Young
Elevate Software
www.elevatesoft.com
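
For reference, a minimal sketch of the point Tim is making here: the
LargeFileSupport flag has to be set the same way in every process that opens
these tables, including the database server. The Engine reference and the unit
name below are assumptions about where the setting lives in DBISAM 4, so
confirm them against your version's documentation.

uses dbisamtb;  // unit name assumed for the DBISAM 4 components

procedure ConfigureLargeFileSupport;
begin
  // Set this identically (True or False) in every client and in the DBISAM
  // server, before any table is opened. A mismatch means one process cannot
  // see the other's locks, which is the corruption path Tim describes.
  Engine.LargeFileSupport := True;
end;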

Wed, Feb 14 2007 7:06 AM

"Frans van Daalen"

"Tim Young [Elevate Software]" <timyoung@elevatesoft.com> wrote in message
news:FD335B4F-BC1E-4120-9ED2-13476793C025@news.elevatesoft.com...
> Frans,
>
> Are you using LargeFileSupport:=True ?  If so, are you certain that all
> applications accessing the database are using LargeFileSupport:=True ?  If
> not, then you could corrupt the tables because one application won't see
> the other application's locks.
>

No, no LargeFileSupport. And there is only one application (in a VPC, Windows
XP with 512 MB) with access to this table, via a DBISAM server (running on the
PC with 2 GB). The table is around 1.6 GB and the index 1.2 GB.

The application will access this table and some other tables (via the server)
around 150,000+ times (both inserts and edits) and then, without warning, start
generating AVs on access to the database. After some time the application
disappears without a hint, even in the madExcept logfile, other than the AV.

Debugging it is a bit hard with a repair of 10 hours :-)

Wed, Feb 14 2007 8:21 AM

Allan Brocklehurst
Frans van Daalen wrote:
> No, no LargeFileSupport. And there is only one application (in a VPC, Windows
> XP with 512 MB) with access to this table, via a DBISAM server (running on
> the PC with 2 GB). The table is around 1.6 GB and the index 1.2 GB.
>
> Debugging it is a bit hard with a repair of 10 hours :-)


This may tick everyone off, but try defragging the disk first, before the
repair.

Allan
Wed, Feb 14 2007 9:56 AM

"Frans van Daalen"

"Allan Brocklehurst" <brock@ns.sympatico.ca> wrote in message
news:45D30B3B.8040502@ns.sympatico.ca...
> Frans van Daalen wrote:
>>
>> Debugging it is a bit hard with a repair of 10 hours :-)
>
> This may tick everyone off, but try defragging the disk first, before the
> repair.
>
I just did and am now waiting for the repair to finish :-)
