Messages 1 to 9 of 9 total |
DBISAM Engine Error # 15002 Error uncompressing data |
Tue, Jun 26 2012 4:54 AM | Permanent Link |
Danie van Eeden | Hi,
we are currently running DBISAM 4.29 build 4. We have a table containing large amounts of BLOB data (raw email data) and get this error every now and then. When it occurs we eventually end up with "Buffers Corrupt" / "Header information corrupt" errors (usually due to partial processing caused by the 15002 error). The only remedy so far has been to remove the table and start from scratch with a clean table.

The table structure is as follows (we use compression for the memo fields and for the indexes, and we have Large File Support enabled):

With FieldDefs Do
Begin
   Clear;
   Add('Number', ftAutoInc, 0, False);
   Add('From', ftString, 100, False);
   Add('To', ftString, 100, False);
   Add('Subject', ftString, 100, False);
   Add('Status', ftInteger, 0, False);
   Add('StatusData', ftMemo, 0, False, '', '', '', '', fcNoChange, BEST_COMPRESSION);
   Add('StatusDataSize', ftInteger, 0, False);
   Add('EmailAccount', ftInteger, 0, False);
   Add('Data', ftMemo, 0, False, '', '', '', '', fcNoChange, BEST_COMPRESSION);
   Add('DataSize', ftInteger, 0, False);
   Add('Created', ftDateTime, 0, False);
   Add('Modified', ftDateTime, 0, False);
   Add('Company', ftString, 3, False);
End;

With IndexDefs Do
Begin
   Clear;
   Add('', 'Number', [ixPrimary, ixUnique]);
   Add('Status', 'Status;Number', [], '', icFull);
   Add('Company', 'Company', [], '', icFull);
End;

Any help / insight would be greatly appreciated. It seems the problem is getting the best of me.

Many thanks

Regards
Danie van Eeden |
Tue, Jun 26 2012 5:11 AM | Permanent Link |
Roy Lambert NLH Associates Team Elevate | Danie
Reading the help:

<<This error occurs when DBISAM attempts to uncompress a buffer and the uncompression fails. This error should never occur, so if you receive this error you should immediately contact Elevate Software for more information on how to resolve this issue.>>

I suggest you contact Tim directly rather than waiting for him to look at the newsgroups.

Roy Lambert [Team Elevate] |
Tue, Jun 26 2012 5:16 AM | Permanent Link |
Danie van Eeden | Hi, many thanks for the reply.
I have emailed him. Awaiting his reply. Kind Regards Danie |
Tue, Jun 26 2012 8:18 AM | Permanent Link |
Raul Team Elevate | Out of curiosity, have you tried it with a newer build? While I see nothing related to that error in the incident reports, there are quite a few fixes, especially in 4.30.

Raul

On 6/26/2012 4:54 AM, Danie van Eeden wrote:
> currently running on DBISAM 4.29 build 4. We have a table containing large amounts of BLOB data (raw email data) and get this error every now and then. When it occurs we eventually end up with Buffers Corrupt / Header information corrupt (usually due to partial processing because of the 15002 error). The only remedy so far was to remove the table and start from scratch with a clean table. |
Tue, Jun 26 2012 8:23 AM | Permanent Link |
Danie van Eeden | Hi, no, I have not yet tried with the newer build (the reason being that we compile against source and have some local changes). I am hoping to get some kind of confirmation that the newer build has addressed similar issues before attempting the upgrade.
Kind Regards
Danie

Raul wrote:

> Out of curiosity have you tried it with newer build? While i see nothing to that error in the incident reports there are quite a few fixes especially in 4.30.
>
> Raul
>
> On 6/26/2012 4:54 AM, Danie van Eeden wrote:
>> currently running on DBISAM 4.29 build 4. We have a table containing large amounts of BLOB data (raw email data) and get this error every now and then. When it occurs we eventually end up with Buffers Corrupt / Header information corrupt (usually due to partial processing because of the 15002 error). The only remedy so far was to remove the table and start from scratch with a clean table. |
Tue, Jun 26 2012 8:39 AM | Permanent Link |
Raul Team Elevate | There is nothing directly referencing this error in the incident reports, as far as I can see: http://www.elevatesoft.com/incident?category=dbisam

I believe this happens either because of some external influence (network issues for remote sessions, AV software interfering with the communications, etc.) or, if you're using a custom-compiled DBISAM version, because of the build itself, for example not all compiler flags being set properly.

Contacting Elevate support directly, as Roy suggested, is the best route here.

Raul

On 6/26/2012 8:23 AM, Danie van Eeden wrote:
> Hi, no I have not yet tried with the newer build (reason being that we are compiling against source and have some local changes). I am hoping to get some kind of confirmation on whether the newer build has addressed similar issues before attempting the upgrade.
>
> Kind Regards
> Danie
>
> Raul wrote:
>
>> Out of curiosity have you tried it with newer build? While i see nothing to that error in the incident reports there are quite a few fixes especially in 4.30.
>>
>> Raul
>>
>> On 6/26/2012 4:54 AM, Danie van Eeden wrote:
>>> currently running on DBISAM 4.29 build 4. We have a table containing large amounts of BLOB data (raw email data) and get this error every now and then. When it occurs we eventually end up with Buffers Corrupt / Header information corrupt (usually due to partial processing because of the 15002 error). The only remedy so far was to remove the table and start from scratch with a clean table. |
Tue, Jun 26 2012 8:53 AM | Permanent Link |
Danie van Eeden | Hi Raul,
thanks again for the reply. Our local changes aren't major and don't go anywhere near that processing depth (we set some properties, prevent programmers from doing things they shouldn't, and perform some debug logging). This is the first time we have used compression on memo / BLOB fields, and also the first time the error has occurred. I will take your other advice into consideration.

Danie

Raul wrote:

> There is nothing directly referencing this error in incident reports as far as i can see: http://www.elevatesoft.com/incident?category=dbisam I believe this happens either if there is some external influence - network issues for remote sessions, AV interfering with the communications, etc. If you're using a custom compiled DBISAM version then there is also a possibility of that affecting it - for example not all compiler flags being set properly or such. Elevate support direct as suggested by Roy is the best route here
>
> Raul
>
> On 6/26/2012 8:23 AM, Danie van Eeden wrote:
>> Hi, no I have not yet tried with the newer build (reason being that we are compiling against source and have some local changes). I am hoping to get some kind of confirmation on whether the newer build has addressed similar issues before attempting the upgrade.
>>
>> Kind Regards
>> Danie
>>
>> Raul wrote:
>>
>>> Out of curiosity have you tried it with newer build? While i see nothing to that error in the incident reports there are quite a few fixes especially in 4.30.
>>>
>>> Raul
>>>
>>> On 6/26/2012 4:54 AM, Danie van Eeden wrote:
>>>> currently running on DBISAM 4.29 build 4. We have a table containing large amounts of BLOB data (raw email data) and get this error every now and then. When it occurs we eventually end up with Buffers Corrupt / Header information corrupt (usually due to partial processing because of the 15002 error). The only remedy so far was to remove the table and start from scratch with a clean table. |
Thu, Jul 12 2012 4:58 AM | Permanent Link |
Danie van Eeden | Just an update,
Haven't received any response thus far. I have found (testing with a client) that moving from BEST_COMPRESSION to DEFAULT_COMPRESSION on the BLOB fields seems to have resolved the "Error uncompressing data" issue. Still not sure why. I suppose less compression is better than a broken table, even though the data will grow very quickly. |
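[Editor's note: DBISAM's 0-9 BLOB compression levels follow zlib's conventions, where 9 is best compression and 6 is the default; whether the two settings above map exactly onto zlib levels 9 and 6 is an assumption here. Under that assumption, the behaviour in question can be sketched with Python's zlib module. The sample payload is hypothetical; the point is that both levels round-trip losslessly, so a decompression failure like error 15002 indicates a corrupted stored buffer rather than a problem with the level itself:]

```python
import zlib

# Hypothetical stand-in for a large raw-email BLOB value.
payload = b"Subject: test\r\nFrom: a@example.com\r\n\r\nbody text\r\n" * 500

best = zlib.compress(payload, 9)     # assumed analogue of BEST_COMPRESSION
default = zlib.compress(payload, 6)  # assumed analogue of DEFAULT_COMPRESSION

# Both levels must round-trip losslessly: the level only trades CPU for size.
assert zlib.decompress(best) == payload
assert zlib.decompress(default) == payload
print(len(payload), len(best), len(default))  # default is slightly larger

# Flipping one byte in the stored stream reproduces the failure mode:
# decompression raises zlib.error (bad deflate data or a failed Adler-32
# check), which is the moral equivalent of "Error uncompressing data".
corrupted = bytearray(best)
corrupted[len(corrupted) // 2] ^= 0xFF
try:
    zlib.decompress(bytes(corrupted))
except zlib.error as e:
    print("decompression failed:", e)
```

This is consistent with Tim's reply below: changing the level changes the bytes written, which can mask a corruption trigger without removing it.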
Thu, Aug 2 2012 2:51 PM | Permanent Link |
Tim Young [Elevate Software] Elevate Software, Inc. timyoung@elevatesoft.com | Danie,
<< Haven't received any response thus far. So far I have found (testing with client) that by moving from Best_Compression to Default_Compression on the blob fields seems to have solved the "Error Uncompressing Data" issue. Still not sure why. I suppose less compression is better than a broken table - even though data will become big very quickly. >>

My guess is that this is just a coincidence and that the underlying cause is still there, just not currently showing up.

Thanks for the update,

Tim Young
Elevate Software
www.elevatesoft.com |