Mass Inserts |
Fri, Feb 17 2006 8:32 AM |
methad | I'm writing a procedure that reads input fields from an XML file (4 fields), and I need to insert this data into an underlying DBISAM table. This is just the standard TDBISAMTable component, created at runtime; I'm using it because it seemed faster than running a bunch of INSERT statements. I'm using v2.06 (I believe). I've got a total of 50K records I need to insert, and it is taking over 15 minutes, so I know I've got to be doing something wrong. I've been looking on the forum, and I've tried the database transaction/commit approach at various increments (100, 500, 1000, etc.). I've only been using DBISAM for about 3 months, so I don't know all the tricks yet. Any help would be appreciated.

    while read file do ...

I've used tblTmp.Insert and .Append (P.S., is one better than the other?), and I'm trying .InsertRecord right now.

    tblTmp.Fields[2].AsString := strEmpID;
    tblTmp.Fields[5].AsString := strPrdBegDate;
    tblTmp.Fields[6].AsString := strPrdEndDate;
    tblTmp.Fields[3].AsString := strTimeDate;

Below this I've got some code that counts records and, when the count mod 1000 = 0, commits and starts a new transaction on the database.

Oh yeah, I'm doing this directly on my machine; this isn't even over a network. And there are no secondary indexes, just one primary index that is an auto-increment. Thanks, and if I need to provide more info, let me know.
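[Editor's note: for readers following along, the batching pattern being described can be sketched as below. This is a minimal sketch, not the poster's actual code; the field indexes, variable names, and the 1000-record batch size are taken from the snippets in this thread, while the loop condition and the surrounding setup of dbTmp/tblTmp are assumed.]

```pascal
var
  iCnt: Integer;
begin
  iCnt := 1;
  dbTmp.StartTransaction;            // open the first batch
  while not XmlDone do               // placeholder for the XML read loop
  begin
    tblTmp.Append;                   // Append adds at the end; Insert adds at the cursor
    tblTmp.Fields[2].AsString := strEmpID;
    tblTmp.Fields[5].AsString := strPrdBegDate;
    tblTmp.Fields[6].AsString := strPrdEndDate;
    tblTmp.Fields[3].AsString := strTimeDate;
    tblTmp.Post;
    if (iCnt mod 1000) = 0 then
    begin
      dbTmp.Commit;                  // flush the whole batch in one write
      dbTmp.StartTransaction;        // open the next batch
    end;
    Inc(iCnt);
  end;
  if dbTmp.InTransaction then
    dbTmp.Commit;                    // commit the final partial batch
end;
```

For this to batch anything, the table must be attached to the same database component that owns the transaction, which is the point John Hay raises later in the thread.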
Fri, Feb 17 2006 11:29 AM |
"Donat Hebert (WSI)" | Search for Ole Willy Tuv's TowDBISAMBatchMove Version 2.0 component. I get between 3,000 and 4,000 records per second downloading. The code is all there for you to review. Donat.
Fri, Feb 17 2006 11:54 AM |
Tim Young [Elevate Software] Elevate Software, Inc. timyoung@elevatesoft.com | << I'm doing a procedure that reads input fields from an xml file (4 fields), and I need to insert this data into an underlying dbisam table. This is just the standard TDBISAMTable component, created at runtime. I'm using this, because it seemed faster than running a bunch of insert statements. I'm using v 2.06 (I believe). I've got a total of 50K records I need to insert, and it is taking over 15 minutes. I know I've got to be doing something wrong. I've been looking on the forum, and I've tried the database trans/commit stuff, at various increments 100, 500, 1000, etc. I've only been using the DBISAM for like 3 months, so I don't know all the tricks yet. >>

Could you post your code with the transaction code included?

Thanks,

--
Tim Young
Elevate Software
www.elevatesoft.com
Fri, Feb 17 2006 12:11 PM |
methad | << Could you post your code with the transaction code included? >>

If you see anything else, beyond the issue at hand, in how I'm handling the components, feel free to let me know. I come from a background of purely client development with Paradox and Delphi; I haven't done the transactional stuff before, nor "real" client/server, etc. Also, I just put the code together, ran it, and then had to do something else, so I haven't tested it much and may be missing something, but here it is:

    //Create the table/db objects, before the loop
    dbTmp := TDBISAMDatabase.Create(nil);
    dbTmp.Directory := 'C:\Development\MSXMLLibParserTesting';
    dbTmp.DatabaseName := 'Test';
    dbTmp.Open;
    dbTmp.StartTransaction;

    //Wasn't sure if this was proper
    tblTmp := TDBISAMTable.Create(nil);
    tblTmp.DatabaseName := 'C:\Development\MSXMLLibParserTesting';
    tblTmp.TableName := 'M125PDAR';
    tblTmp.Active := true;

    //---------- Loop code, parsing up the xml, this is taking no time at all
    //---------- I timed it, the lag is in the inserts
    tblTmp.Append;
    tblTmp.Fields[2].AsString := strEmpID;
    tblTmp.Fields[5].AsString := strPrdBegDate;
    tblTmp.Fields[6].AsString := strPrdEndDate;
    tblTmp.Fields[3].AsString := strTimeDate;
    tblTmp.Post;
    if (iCnt MOD 1000) = 0 then
    begin
      dbTmp.Commit;
      dbTmp.StartTransaction;
      //This was out of ignorance, and desperation
      //I'm not too sure about the session stuff.
      //I saw where there was a db.flush buffers somewhere
      //in the samples on the forum, but I don't think I
      //have the same db component, and I don't think this
      //line made a difference either way
      tblTmp.FlushBuffers;
    end;
    inc(iCnt);

    //-------------- after the loop
    if dbTmp.InTransaction then
      dbTmp.Commit;

By the way, I changed the code to chunk the data out of the XML as 10,000 records in 5 files, and it processed in about 4 1/2 minutes versus 20 minutes, so we can still do what we need to do. The data may actually be coming into the logic in chunks like that anyway, but I'm still curious; I thought for sure it would go quicker than that, which is what led me to believe I wasn't coding something correctly. Thanks for the response. P.S. I love DBISAM compared to Paradox; I spent years with the BDE, and it gave me nightmares :-)
Fri, Feb 17 2006 12:13 PM |
methad | << what led me to believe I wasn't coding something correctly. >>
Ummm ... that would be incorrectly |
Mon, Feb 20 2006 6:49 AM |
"John Hay" | > dbTmp := TDBISAMDatabase.Create(nil);
> dbTmp.Directory := 'C:\Development\MSXMLLibParserTesting';
> dbTmp.DatabaseName := 'Test';
> dbTmp.Open;
> dbTmp.StartTransaction;
>
> //Wasn't sure if this was proper
> tblTmp := TDBISAMTable.Create(nil);
> tblTmp.DatabaseName := 'C:\Development\MSXMLLibParserTesting';

I don't think you are using the dbTmp database, so you are not getting the benefit of the transactions. Change the line above to:

    tblTmp.DatabaseName := 'Test';

John
Mon, Feb 20 2006 8:00 AM |
methad | "John Hay" <j.haywithoutdot@crbsolutionsremoveallthis.co.uk> wrote:

> I don't think you are using the dbTmp database so you are not getting the
> benefit of the transactions - change the line above to
> tblTmp.DatabaseName := 'Test';

I posted a message right after this one about that. I was moving back and forth, and I had actually run the code like that, with the table pointing to 'Test'; I'm not sure why that message didn't post. So, if you imagine that the table is set to the proper database, can you see anything else that would keep it running that slow? Like I said, I iterated through five files of 10,000 records each, calling this procedure on each file, and it ran in 4 to 5 minutes for the 50,000 records, which is about what I expected, but I thought the transaction approach would match that?

Thanks
Mon, Feb 20 2006 9:29 AM |
"John Hay" | > it ran between 4/5 minutes for the 50,000 records, which is about what I
> expected, but I thought that the transaction stuff would emulate that?

4 to 5 minutes still sounds excessive for this operation; on a reasonable P4 I would expect it to be well under a minute. Is the table structure a lot bigger than what you are including in the append? Can you post the table structure?

Can I suggest that you exclude the XML parsing etc., initialise the variables (strEmpID etc.) to a fixed value, and just try running something like the following to test the append speed:

    dbTmp := TDBISAMDatabase.Create(nil);
    dbTmp.Directory := 'C:\Development\MSXMLLibParserTesting';
    dbTmp.DatabaseName := 'Test';
    dbTmp.Open;
    tblTmp := TDBISAMTable.Create(nil);
    tblTmp.DatabaseName := 'Test';
    tblTmp.TableName := 'M125PDAR';
    tblTmp.Active := true;
    dbTmp.StartTransaction;
    iCnt := 1;
    for i := 1 to 50000 do
    begin
      tblTmp.Append;
      tblTmp.Fields[2].AsString := strEmpID;
      tblTmp.Fields[5].AsString := strPrdBegDate;
      tblTmp.Fields[6].AsString := strPrdEndDate;
      tblTmp.Fields[3].AsString := strTimeDate;
      tblTmp.Post;
      if (iCnt MOD 1000) = 0 then
      begin
        dbTmp.Commit;
        dbTmp.StartTransaction;
      end;
      Inc(iCnt);
    end;
    if dbTmp.InTransaction then
      dbTmp.Commit;

John
Mon, Feb 20 2006 11:07 AM |
methad | "John Hay" <j.haywithoutdot@crbsolutionsremoveallthis.co.uk> wrote:

> 4/5 minutes still sounds excessive for this operation. On a reasonable P4 I
> would expect it to be well under a minute. Is the table structure a lot
> bigger than what you are including in the append? Can you post the table
> structure?

No need to post the table structure; the time is going into the XML read-through. I had timed that earlier, but I guess that was when it didn't have so much data in it, so I'll have to work on that piece of the code. It seems that after a certain amount it starts to slow down: with only around 10,000 records coming in on the XML file it's quick, but with 50,000 it's very slow. I'm using the MSXML parser through the DOM interfaces. I'm so bad about missing the obvious; thanks for your time.
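[Editor's note: the superlinear slowdown described here is a well-known pitfall of the MSXML DOM when a loop indexes into childNodes, since each item(i) call can rescan the child list from the start, making the whole pass O(n^2). A common workaround is to walk the sibling chain instead. Below is a minimal sketch under assumptions: it presumes Delphi's MSXML2_TLB import and a flat list of record elements; the procedure name and document shape are hypothetical, not from the thread.]

```pascal
uses
  MSXML2_TLB;

procedure WalkRecords(const Doc: IXMLDOMDocument);
var
  Node: IXMLDOMNode;
begin
  // O(n) traversal: follow nextSibling rather than indexing childNodes.item(i)
  Node := Doc.documentElement.firstChild;
  while Node <> nil do
  begin
    // ... read the four field values from Node and append to the table here ...
    Node := Node.nextSibling;
  end;
end;
```

If the parse is still slow at 50K records, a streaming reader (the SAX interfaces MSXML also exposes) avoids building the DOM tree entirely.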
Mon, Feb 20 2006 12:26 PM |
Tim Young [Elevate Software] Elevate Software, Inc. timyoung@elevatesoft.com | John,

<< I don't think you are using the dbTmp database so you are not getting the benefit of the transactions - change the line above to tblTmp.DatabaseName := 'Test'; >>

Absolutely correct. He needs to do exactly as you stated; otherwise he won't be getting the benefit of the transaction.

--
Tim Young
Elevate Software
www.elevatesoft.com