Thread: DBISAM v4.29 build 4 "Table is full and cannot contain any more data"
Sun, Dec 26 2010 8:27 AM

John Taylor

I have this bug report data (madexcept) from a customer:

exception class   : EDBISAMEngineError
exception message : DBISAM Engine Error # 9479 The table 'Table_OutGoingFax'
is full and cannot contain any more data.

main thread ($2d0):
00a1929b +017 SF5.exe    dbisamtb  2935   +1 DBISAMError
00a61bcb +087 SF5.exe    dbisamen  5478  +19 TDataEngine.RaiseError
00ab6e4c +094 SF5.exe    dbisamen 42040   +2 TBlobFile.CheckMaxBuffers
00ab69f9 +0ad SF5.exe    dbisamen 41913   +5 TBlock.PutBlockData
00a69c82 +01a SF5.exe    dbisamen  8633   +1 TBuffer.Initialize
00ab676c +024 SF5.exe    dbisamen 41873   +2 TBlock.Initialize
00476c26 +02a SF5.exe    Classes   2924   +7 TList.Add
00a6a3e0 +03c SF5.exe    dbisamen  8907   +7 TBufferedFile.GetBuffer
00a79f55 +02d SF5.exe    dbisamen 16513   +1 TDataTable.GetBlock
00a7f551 +031 SF5.exe    dbisamen 18983   +2 TDataCursor.GetBlock
00a7f4f5 +021 SF5.exe    dbisamen 18964   +1 TDataCursor.GetNextFreeBlock
00aa929c +224 SF5.exe    dbisamen 37388  +49 TDataCursor.WriteBlob
00aa9fdb +15f SF5.exe    dbisamen 37800  +31 TDataCursor.FlushBlobBuffers
00a9b58e +5fe SF5.exe    dbisamen 31175 +128 TDataCursor.ModifyRecord
00a238c8 +040 SF5.exe    dbisamtb  8967   +5 TDBISAMDataSet.InternalPost
009e63a9 +029 SF5.exe    DB       10816   +5 TDataSet.CheckOperation
009e6044 +048 SF5.exe    DB       10673   +7 TDataSet.Post
00a23a7a +00a SF5.exe    dbisamtb  9024   +1 TDBISAMDataSet.Post


The process here is importing data from .dat files in DBISAM Version 3 into
DBISAM Version 4.  What I'm doing is copying the .dat, .blb and .idx files
to a work directory, changing the file extensions to .sfdat, .sfblb and
.sfidx, upgrading the table using SQL, and then importing the data into the
version 4 table, which has already been created.  The SQL that creates the
table being imported into used the default BLOB block size.  The destination
table has 3 BLOB fields, but the original table upgraded from version 3 of
DBISAM (the source table in the import) has only 1 BLOB field.
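
Roughly, the copy/rename/upgrade step looks like this (a simplified sketch,
not the actual code; it assumes the DBISAM engine has already been
configured to use the .sfdat/.sfidx/.sfblb extensions, and that the
directory strings end with '\'):

uses
  Windows, dbisamtb;

// Simplified sketch of the copy/rename/upgrade step.
// SrcDir, WorkDir and TableName are placeholders.
procedure PrepareAndUpgradeTable(const SrcDir, WorkDir, TableName: string);
var
  Query: TDBISAMQuery;
begin
  // Copy the v3 physical files into the work directory under the custom
  // extensions the v4 application is configured to use
  CopyFile(PChar(SrcDir + TableName + '.dat'),
           PChar(WorkDir + TableName + '.sfdat'), False);
  CopyFile(PChar(SrcDir + TableName + '.idx'),
           PChar(WorkDir + TableName + '.sfidx'), False);
  CopyFile(PChar(SrcDir + TableName + '.blb'),
           PChar(WorkDir + TableName + '.sfblb'), False);

  // Upgrade the renamed table in place to the v4 format
  Query := TDBISAMQuery.Create(nil);
  try
    Query.DatabaseName := WorkDir;
    Query.SQL.Text := 'UPGRADE TABLE "' + TableName + '"';
    Query.ExecSQL;
  finally
    Query.Free;
  end;
end;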

Is the BLOB block size the problem here, or is it something else?  I know
the source table is nowhere near capacity.

The BLOB field stores a TIFF file, which is a typical black and white image
of a faxed page.  The image size of one page would be 1728 by 2236 pixels;
the TIFF file will be at least 1 page but perhaps many pages, so it's not
possible to say how large a typical BLOB would be.  What is the best BLOB
block size for me to use when creating the destination table for this type
of BLOB? (if this is even the problem here)

Thanks

John Taylor
Mon, Dec 27 2010 3:32 AM

Roy Lambert

NLH Associates

Team Elevate

John


I don't have the source, and I switched to ElevateDB ages ago, so I could be totally wrong here.

My bet would be a corrupted TIFF or a damaged .blb file.

The BLOB block size is low for what you're storing, but all that should do is slow things down, since the engine has to read more blocks.

First I'd recommend running a repair and seeing if that sorts it; if not, it's probably a matter of trying to find the damaged TIFF (or TIFFs). I don't know how robust export and import are, but you could also try exporting the table, emptying it, upgrading it and then importing the data back.
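
Something along these lines should do it (off the top of my head and
untested, so treat it as a sketch; the directory and table names are
placeholders):

uses
  dbisamtb;

// Verify, then repair, the work copy of the table
procedure RepairWorkTable(const WorkDir, TableName: string);
var
  Table: TDBISAMTable;
begin
  Table := TDBISAMTable.Create(nil);
  try
    Table.DatabaseName := WorkDir;
    Table.TableName := TableName;
    Table.VerifyTable;  // check the physical files for damage
    Table.RepairTable;  // rebuild the table files in place
  finally
    Table.Free;
  end;
end;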

Roy Lambert [Team Elevate]
Tue, Dec 28 2010 6:06 PM

Tim Young [Elevate Software]

Elevate Software, Inc.


John,

<< Is the blob block size the problem here or is it something else?  I know
the source table is nowhere near capacity. >>

How big is the source DBISAM 3.x .blb?  Did you use a larger BLOB block
size in the source DBISAM 3.x table?  That is the only way that there could
be that much of a difference in the resultant DBISAM 4.x table.

<< What is the best blob size for me to use when creating the destination
table for this type of blob? (if this is even the problem here) >>

I would use a BLOB block size of 2-4K for the table, and turn on BLOB
compression for that BLOB field.
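
In DDL terms, something along these lines (a sketch only; the column list
here is illustrative, the BLOB clauses are what matter):

uses
  dbisamtb;

// Create the destination table with a small BLOB block size and
// compression on the BLOB field (column names are illustrative)
procedure CreateFaxTable(const Dir: string);
var
  Query: TDBISAMQuery;
begin
  Query := TDBISAMQuery.Create(nil);
  try
    Query.DatabaseName := Dir;
    Query.SQL.Text :=
      'CREATE TABLE "Table_OutGoingFax" ' +
      '("FaxID" INTEGER, ' +
      ' "FaxImage" BLOB COMPRESSION 6, ' +  // compression level 0-9
      ' PRIMARY KEY ("FaxID")) ' +
      'BLOB BLOCK SIZE 4096';               // in the suggested 2-4K range
    Query.ExecSQL;
  finally
    Query.Free;
  end;
end;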

--
Tim Young
Elevate Software
www.elevatesoft.com
Wed, Dec 29 2010 8:24 AM

John Taylor

Tim,

The BLOB block size in the version 3 table is much larger.

Actually, I misstated the problem a bit.  What I was doing was this:

1.  Copy the version 3 table to a work folder
2.  Rename the .dat, .blb and .idx files with the new file extensions being
used in v4 (.sfdat, .sfblb, .sfidx)
3.  Upgrade the table using SQL
4.  Step through the table to save the BLOB field to file, unzip it, then
reload the BLOB field like this...

Note: the v3 table BLOB contains a TIFF file that was compressed with VCLZip
(no BLOB compression on the BLOB field)

Table.First;
while not Table.Eof do
begin
  // Save the zipped TIFF out of the BLOB field to disk
  (Table.FieldByName('BLOB_FIELD') as TBlobField).SaveToFile(sZipName);
  if UnzipBlob(sZipName, sFileName) then
  begin
    Table.Edit;
    (Table.FieldByName('BLOB_FIELD') as TBlobField).Clear;
    // Reload the BLOB field with the unzipped TIFF
    (Table.FieldByName('BLOB_FIELD') as TBlobField).LoadFromFile(sFileName);
    Table.Post;  // <-- crash here with 9479 "table is full"
  end;
  Table.Next;
end;

My intention was to then use SQL to insert into a newly created v4 table
from this upgraded v3 table, with the BLOB field now containing a TIFF file
that is no longer zipped.  The new v4 table uses BLOB compression.

I hope this makes sense

John


"Tim Young [Elevate Software]" <timyoung@elevatesoft.com> wrote in message
news:F7FD3632-4713-41C4-BC3C-7CD0CCD64CD4@news.elevatesoft.com...
> John,
>
> << Is the blob block size the problem here or is it something else ?  I
> know the source table is nowhere near capacity. >>
>
> How big is the source DBISAM 3.x .blb ?  Did you use a larger BLOB block
> size in the source DBISAM 3.x table ?  That is the only way that there
> could be that much of a difference in the resultant DBISAM 4.x table.
>
> << What is the best blob size for me to use when creating the destination
> table for this type of blob ? (if this is even the problem here) >>
>
> I would use a BLOB block size of 2-4k for the table, and turn on BLOB
> compression for that BLOB field.
>
> --
> Tim Young
> Elevate Software
> www.elevatesoft.com
Thu, Dec 30 2010 5:28 PM

Tim Young [Elevate Software]

Elevate Software, Inc.


John,

<< The BLOB block size in the version 3 table is much larger. >>

At what point during the conversion process do you change the BLOB block
size on the DBISAM 4.x table?  And how much smaller is it?

--
Tim Young
Elevate Software
www.elevatesoft.com
Fri, Dec 31 2010 6:35 AM

John Taylor

Tim,

I don't change the BLOB block size at all.

The process operates on the version 3 table that was upgraded to version 4
using the SQL 'UPGRADE TABLE' statement.

After upgrading the table, I'm reading each record, saving the BLOB field to
a zip file, unzipping the file, clearing the BLOB field, reloading it with
the unzipped file, and then calling .Post on the table, and that is where
the crash occurs.

The BLOB block size on the version 3 table is quite large, about 30K+.

After reloading the BLOB field for all records in the upgraded table, I then
use SQL to insert into a new table from the upgraded table, but that is not
where the problem occurs.
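
That final step is just a plain SQL copy, along these lines (the table and
field names here are made up):

// Copy the reloaded rows into the freshly created v4 table
// (this step completes without error)
Query.SQL.Text := 'INSERT INTO "NewFaxTable" ("FaxImage") ' +
                  'SELECT "BLOB_FIELD" FROM "UpgradedFaxTable"';
Query.ExecSQL;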

John


"Tim Young [Elevate Software]" <timyoung@elevatesoft.com> wrote in message
news:0B2D9EC3-B0C2-4560-B947-985A5B5384BC@news.elevatesoft.com...
> John,
>
> << The BLOB block size in the version 3 table is much larger. >>
>
> At what point during the conversion process do you change the BLOB block
> size on the DBISAM 4.x table ?  And how much smaller is it ?
>
> --
> Tim Young
> Elevate Software
> www.elevatesoft.com
Fri, Dec 31 2010 7:14 AM

Roy Lambert

NLH Associates

Team Elevate

John


Does the crash occur for all records when you run the write-out/read-back process, or just for some of them?

Roy Lambert [Team Elevate]
Fri, Dec 31 2010 12:26 PM

John Taylor

I've only gotten two bug reports from the field, and I have not been able
to reproduce it here.  I don't know if it crashes on the first record or
what.

John


"Roy Lambert" <roy.lambert@skynet.co.uk> wrote in message
news:C6C0E98B-D4C3-407C-AF27-F0A25ED0B7B0@news.elevatesoft.com...
> John
>
>
> Does the crash occur for all records when you're trying the write out /
> read back process or just some of them?
>
> Roy Lambert [Team Elevate]
>
Sat, Jan 1 2011 4:24 AM

Roy Lambert

NLH Associates

Team Elevate

John



On the basis that the upgrade works and it's the unload/reload that's the problem, my bet is still on corruption in one or more of the TIFFs. Whether this could be caused by the DBISAM upgrade on a zipped TIFF I don't know.

Here's a wild and wacky idea. What happens if you unzip and restore into the V3 table and then upgrade?

Roy Lambert [Team Elevate]
Sat, Jan 1 2011 10:33 AM

John Taylor

Then the app would have to use both v3 and v4 of DBISAM; is that even
possible?

Actually, I've taken a different approach to the import procedure: read the
upgraded table, extract the BLOB, unzip it, then call a procedure to insert
it into the new table, which I hope will resolve the issue.
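
In outline, the new approach looks like this (a simplified sketch; apart
from UnzipBlob, which is my helper from the earlier post, the table, field
and procedure names are placeholders):

uses
  DB, dbisamtb;

// For each source record: save the zipped TIFF, unzip it to disk, then
// insert it into the new v4 table through a parameterised query
procedure ImportRecord(Source: TDBISAMTable; InsQuery: TDBISAMQuery;
  const sZipName, sFileName: string);
begin
  (Source.FieldByName('BLOB_FIELD') as TBlobField).SaveToFile(sZipName);
  if UnzipBlob(sZipName, sFileName) then
  begin
    InsQuery.SQL.Text :=
      'INSERT INTO "NewFaxTable" ("FaxImage") VALUES (:FaxImage)';
    // Load the unzipped TIFF straight into the BLOB parameter
    InsQuery.ParamByName('FaxImage').LoadFromFile(sFileName, ftBlob);
    InsQuery.ExecSQL;
  end;
end;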

John

"Roy Lambert" <roy.lambert@skynet.co.uk> wrote in message
news:33750269-BD5B-4D5F-B4B8-0CC7220287DE@news.elevatesoft.com...
> John
>
>
>>I've only gotten two bug reports from the field, I have not been able to
>>reproduce it here. I don't know if it crashes on
>>the first record or what.
>
> On the basis that the upgrade works and its the unload/reload that's the
> problem my bet is still on corruption in one or more of the TIFFs. Wether
> this could be caused by the DBISAM upgrade on a zipped TIFF I don't know.
>
> Here's a wild and whacky idea.What happens if you unzip and restore into
> the V3 table and then upgrade?
>
> Roy Lambert [Team Elevate]