Thread: Out of memory
Messages 21 to 30 of 40 total
Wed, Oct 25 2006 9:42 AM

Roy Lambert

NLH Associates

Team Elevate

Abdulaziz


Without the data it's difficult to see what's happening. Just looking at the code I can't readily see a reason for what's going on.

I just extracted the code below from the Swiss Delphi Centre website. Try putting this in your RunSQL function (I'd write the results into a memo table) and see if the memory loss is progressive or just happens with a bang.

Roy Lambert


procedure TForm1.Button1Click(Sender: TObject);
var
 memory: TMemoryStatus;
begin
 memory.dwLength := SizeOf(memory);
 GlobalMemoryStatus(memory);
 ShowMessage('Total memory: ' + IntToStr(memory.dwTotalPhys div (1024*1024)) + ' Mb'+#13+
 'Available memory: ' + IntToStr(memory.dwAvailPhys div (1024*1024)) + ' Mb');
end;
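
As a rough sketch, it could be wired into a RunSQL routine like this so the figures accumulate as each statement runs (memLog is just an assumed TMemo on the form, and Windows/SysUtils are assumed in the uses clause):

procedure LogMemory(const Stage: string);
var
 memory: TMemoryStatus;
begin
 memory.dwLength := SizeOf(memory);
 GlobalMemoryStatus(memory);
 //One line per call, so the progression is visible after the run
 Form1.memLog.Lines.Add(Format('%s: %d MB free of %d MB total',
   [Stage, memory.dwAvailPhys div (1024*1024),
    memory.dwTotalPhys div (1024*1024)]));
end;

procedure RunSQL(const sSQL: String);
begin
 LogMemory('Before ' + sSQL);
 //... existing query setup and ExecSQL goes here ...
 LogMemory('After ' + sSQL);
end;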
Wed, Oct 25 2006 1:45 PM

Abdulaziz Jasser
Roy,


<<Without the data it's difficult to see what's happening. Just looking at the code I can't readily see a reason for what's going on.>>

Yes, you are right, but the problem is the size of those tables.  I am thinking of sending the database to Tim on a CD by mail.


<<I just extracted the code below from the Swiss Delphi Centre website. Try putting this in your RunSQL function (I'd write the results into a memo
table) and see if the memory loss is progressive or just happens with a bang. >>

I did it the way you said and it shows a lot of memory being lost when running the INSERT statement.
Wed, Oct 25 2006 6:17 PM

Tim Young [Elevate Software]

Elevate Software, Inc.


Email timyoung@elevatesoft.com

Abdulaziz,

<< This was my first thing to be tested.  But the same problem is happening
even without opening a transaction. >>

There's no way that DBISAM will buffer all 100,000 inserts if you aren't
using a transaction.  It simply won't do it, nor will it expand the
in-memory table further because you're selecting data *from* it, not
inserting data into it.  IOW, without a transaction, if the first INSERT
into the in-memory table works, i.e. there is enough memory for it to
complete, then you should be home free without any memory issues.
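
As a rough illustration of the difference (DB1 here stands for the TDBISAMDatabase the inserts run against, RunSQL for the existing helper, and the table names follow the test code used in this thread):

//Without a transaction each INSERT is written through as it executes,
//so once the first INSERT succeeds memory should stay roughly flat:
RunSQL('INSERT INTO "MEMORY\Table_M" SELECT * FROM "C:\Table1.Dat"');

//Inside an explicit transaction the changes are buffered until Commit,
//which is where a very large batch can exhaust memory:
DB1.StartTransaction;
try
 RunSQL('INSERT INTO "MEMORY\Table_M" SELECT * FROM "C:\Table1.Dat"');
 DB1.Commit;
except
 DB1.Rollback;
 raise;
end;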

--
Tim Young
Elevate Software
www.elevatesoft.com

Thu, Oct 26 2006 2:33 AM

Roy Lambert

NLH Associates

Team Elevate

Abdulaziz


What are the memory usage figures just before the INSERT and just after? Also, what are the sizes of the files involved in the tables?


Roy Lambert
Thu, Oct 26 2006 10:06 AM

Abdulaziz Jasser
Roy,


<<What are the memory usage figures just before the INSERT and just after? Also, what are the sizes of the files involved in the tables?>>

280 MB before the insert and 350 MB after the first insert.  But when repeating the same code for another table the used memory keeps growing.
The tables are between 20 and 30 MB in size.
Thu, Oct 26 2006 10:09 AM

Abdulaziz Jasser
Tim,

I know it's a weird problem.  Therefore I will send you a small project with the real data on a CD to the address below.  You should receive it within a
week.

Elevate Software, Inc.
168 Christiana Street
North Tonawanda, NY 14120
Thu, Oct 26 2006 6:55 PM

Tim Young [Elevate Software]

Elevate Software, Inc.


Email timyoung@elevatesoft.com

Abdulaziz,

<< I know it's a weird problem.  Therefore I will send you a small project
with the real data on a CD to the address below.  You should receive it within a
week. >>

Thanks.  I will take a look as soon as it arrives.

--
Tim Young
Elevate Software
www.elevatesoft.com

Sat, Oct 28 2006 6:37 PM

Charles Tyson
Abdulaziz,

I've run some experiments based on your code, using dummy tables to insert from (some
with .BLB files up to 75 MB on disk).  If the series of events isn't wrapped in a
transaction, I never see a memory allocation that isn't released as soon as the
DeleteTable method is called.  If there is one overall transaction, I do get an "out of
memory" error after the program has allocated several hundred megabytes of memory.

Maybe Tim will come up with something, but in the meantime have you considered discarding
memory tables and performing your operations on disk-based tables?  If Windows does its
file caching properly, the extra time required may be much less than you expect.

Here's the unit I tested with...




unit Unit1;

interface

uses
 Windows, Messages, SysUtils, Classes, Graphics, Controls, Forms, Dialogs,
 Db, dbisamtb, StdCtrls;

type
 TdmAccount = class(TForm)
   Button1: TButton;
   DBISAMSession1: TDBISAMSession;
   DB1: TDBISAMDatabase;
   Label1: TLabel;
   Edit2: TEdit;
   procedure Button1Click(Sender: TObject);
 private
   { Private declarations }
 public
   { Public declarations }
 end;

var
 dmAccount: TdmAccount;

implementation

{$R *.DFM}

procedure RunSQL(const sSQL : String);
var
 qrySQL : TDBISAMQuery;
begin
 qrySQL := TDBISAMQuery.Create(Application);
 try
   qrySQL.SessionName  := dmAccount.DBISAMSession1.SessionName;
   qrySQL.DatabaseName := dmAccount.DB1.DatabaseName;
   qrySQL.SQL.Clear;
   qrySQL.SQL.Add(sSQL);
   qrySQL.ExecSQL;
 finally
   //Free the query even if ExecSQL raises, so the test itself doesn't leak
   qrySQL.Free;
 end;
end;

procedure TdmAccount.Button1Click(Sender: TObject);
var
 sSQL   : String;
 oTable : TDBISAMTable;
 x, y: integer;
begin
 Label1.Caption := '';
 Label1.Repaint;
 // DB1.StartTransaction;

 //Edit2 asks how many dummy tables are available to be inserted...they are named
 //Table1, Table2, etc.
 y := StrToIntDef( Edit2.Text, 2 );
 for x := 1 to y do begin

  //Create the memory table;
  oTable := TDBISAMTable.Create(Application);
  oTable.SessionName  := DBISAMSession1.SessionName;
  oTable.DatabaseName := 'MEMORY';
  oTable.TableName    := 'Table_M';

  //TableA, Table1, Table2 etc. all have this structure
  oTable.FieldDefs.Add('Field1',ftString ,20,False);
  oTable.FieldDefs.Add('Field2',ftInteger, 0,False);
  //BLOBfield requires lots more memory but otherwise doesn't seem to change
  //the results
  //oTable.FieldDefs.Add('Blob1',ftBlob,0 ,False);

  oTable.CreateTable;

  //Load the data from a local table.
  sSQL := 'INSERT INTO "MEMORY\Table_M" SELECT * FROM "C:\Table' + inttostr(x) + '.Dat"';
  Label1.Caption := 'Stage ' + inttostr( x ) + ': insert from disk to memtable';
  Label1.Repaint;
  RunSQL(sSQL);

  //Update some fields in the memory table.
  sSQL := 'UPDATE "MEMORY\Table_M" SET Field2 = 0';
  Label1.Caption := 'Stage ' + inttostr( x ) + ': update memtable';
  Label1.Repaint;
  RunSQL(sSQL);

  //Now move the data from the memory table to another table inside the database.

  //TableA is the table receiving the inserts--it should be emptied before
  //beginning the experiment
  sSQL := 'INSERT INTO "C:\TableA" SELECT * FROM "MEMORY\Table_M"';
  Label1.Caption := 'Stage ' + inttostr( x ) + ': insert from memtable to TableA';
  Label1.Repaint;
  RunSQL(sSQL);

  //Now delete the memory and free the memory table.
  oTable.DeleteTable;
  oTable.Free;
  oTable := Nil;

 end;
 // DB1.Commit;
 Label1.Caption := 'Finished';
end;

end.
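
For the disk-based alternative mentioned above, a minimal variant of the loop body might look like this (C:\Temp as a scratch directory and the Table_T name are assumptions, not something I tested):

  //Create the work table on disk instead of in the MEMORY database
  oTable := TDBISAMTable.Create(Application);
  oTable.SessionName  := DBISAMSession1.SessionName;
  oTable.DatabaseName := 'C:\Temp';
  oTable.TableName    := 'Table_T';
  oTable.FieldDefs.Add('Field1',ftString ,20,False);
  oTable.FieldDefs.Add('Field2',ftInteger, 0,False);
  oTable.CreateTable;

  //Same INSERT/UPDATE/INSERT sequence, just pointed at the disk table
  RunSQL('INSERT INTO "C:\Temp\Table_T" SELECT * FROM "C:\Table' + inttostr(x) + '.Dat"');
  RunSQL('UPDATE "C:\Temp\Table_T" SET Field2 = 0');
  RunSQL('INSERT INTO "C:\TableA" SELECT * FROM "C:\Temp\Table_T"');

  //Drop the work table from disk when done and free the component
  oTable.DeleteTable;
  oTable.Free;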
Sun, Oct 29 2006 9:21 AM

Abdulaziz Jasser
Tim,

<<Thanks.  I will take a look as soon as it arrives.>>

I FOUND THE PROBLEM.

While copying the customer database to the CD I promised to send, I noticed the sizes of the BLB files are very big.  So I opened one of the tables to
see if there is some data in the memo field and found NULL data in all 184,000 records in that table!!! The table size is almost 25 MB and the
BLB file is 92 MB!!!  I tried to fix the problem by running "Repair" from DBSYS but it did not solve the problem.  So I dropped the field and added it
again (using an ALTER clause) and now the size of the BLB file is 1 KB!!!  I ran the program which causes the out-of-memory problem and it works like a
dream.  How did the file become so big without any data?  This problem is happening with almost all the tables that contain memo fields and have BLB
files.  Attached is a sample table.
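
For reference, the drop-and-re-add can be expressed as two statements of roughly this shape (the table path and the Notes column name are made up here, and the exact ALTER TABLE syntax should be checked against the DBISAM manual):

ALTER TABLE "C:\MyTable" DROP COLUMN Notes

ALTER TABLE "C:\MyTable" ADD COLUMN Notes MEMO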

Regards,
Abdulaziz Jasser
Mon, Oct 30 2006 4:41 PM

Tim Young [Elevate Software]

Elevate Software, Inc.


Email timyoung@elevatesoft.com

Abdulaziz,

Please do not post attachments in this newsgroup, especially 17 meg
attachments.  I removed the attachment from the message that you posted.
All attachments must be posted in the Binaries newsgroup.

<< While copying the customer database to the CD I promised to send, I
noticed the sizes of the BLB files are very big.  So I opened one of the tables
to see if there is some data in the memo field and found NULL data in all
184,000 records in that table!!! The table size is almost 25 MB and
the BLB file is 92 MB!!!  I tried to fix the problem by running "Repair"
from DBSYS but it did not solve the problem.  So I dropped the field and
added it again (using an ALTER clause) and now the size of the BLB file is 1 KB!!!
I ran the program which causes the out-of-memory problem and it works like a
dream.  How did the file become so big without any data?  This problem is
happening with almost all the tables that contain memo fields and have BLB
files.  Attached is a sample table. >>

Which field did you drop and re-add in the ALTER TABLE?  The memo field?
If so, then that is what one would expect from those operations: dropping and
re-adding the memo field rebuilds the BLB file, so the dead space disappears.
As to how the NULL data got there in the first place, I have no idea.
However, I do know that DBISAM didn't just put it there by itself.  Do you
have any code that updates that memo field?

--
Tim Young
Elevate Software
www.elevatesoft.com
