Incident Report

Severity: Serious
Reported By: Kick Martens
Reported On: 4/30/2010
For: Version 2.03 Build 13
# 3205 Using IMPORT TABLE to Import a Text File Containing Large CLOB Columns Can Cause Memory Exhaustion

I am trying to import a large file into a table, but I get the following error message every time:

Error #100 (a write operation did not complete and a repair is needed)

It fails around the same record every time, but when I make a separate file with just the records around the problem area, they import fine, so it seems to have something to do with the size. I have checked the "Support large files" option on the server. I do not have any threads, just one connection to the server. The file I am trying to import is 1.8 GB and contains a field that I import into a CLOB column. When I skip this field in the import, it works fine.
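
For context, the failing operation is an ElevateDB IMPORT TABLE statement along these lines (a rough sketch only: the table name, file name, store name, and clause values below are placeholders, and the exact clause syntax and quoting should be confirmed against the ElevateDB SQL manual):

IMPORT TABLE "LogData"
FROM "logdata.csv" IN STORE "Import"
DELIMITER CHAR ','
QUOTE CHAR '"'
WITH HEADERS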


Comments
The problem was that ElevateDB was internally setting the high-water mark for the buffer managers incorrectly, causing the allocation size to grow continuously and eventually exhaust the 2 GB address space available to the 32-bit process.


Resolution
Fixed Problem on 5/2/2010 in version 2.03 build 14


Products Affected
ElevateDB Additional Software and Utilities
ElevateDB DAC Client-Server
ElevateDB DAC Client-Server with Source
ElevateDB DAC Standard
ElevateDB DAC Standard with Source
ElevateDB DAC Trial
ElevateDB LCL Standard with Source
ElevateDB VCL Client-Server
ElevateDB VCL Client-Server with Source
ElevateDB VCL Standard
ElevateDB VCL Standard with Source
ElevateDB VCL Trial
