Importing a very lasrger 1.13million record test file |
Mon, Oct 1 2007 9:31 AM | Permanent Link |
Allan Brocklehurst | hello;
I am importing 1.1 million records from a text file. There are 5 variations of record layouts that I have to be aware of, so I have to do a ReadLn on each record and parse out all of the fields. I then do a table.Insert and then a Post. The read/write process seems to be slowing down and I'm only at 250K so far. Questions: should I do a flush buffer after a given number of writes, say 3K or so? Or is there nothing I can do but wait? TIA Allan |
Mon, Oct 1 2007 9:32 AM | Permanent Link |
Allan Brocklehurst | The subject line should have read "Importing a very Large 1.13 million record text file" (~_~)
Allan |
Mon, Oct 1 2007 10:31 AM | Permanent Link |
Dave Harrison | Allan Brocklehurst wrote:
> I am importing 1.1 million records from a text file. The read/write process seems to be slowing down and I'm only at 250K so far. [...]

Allan,
I've had the same problem with all databases I've tried. The problem is caused by the table maintaining its indexes: as the index gets deeper, it takes longer to update.

Solution:

1) Speed is pretty consistent if you remove all indexes from the table, insert the data, and rebuild the indexes after all of the data has been loaded. Unfortunately, building indexes in DBISAM isn't going to be fast.

2) Use transactions with Commit(False) so the transactions are not flushed. You will need to tune the number of records per transaction, because performance will improve if you hit the sweet spot (anywhere from 500 to 50,000, and you have to be within say 500 records - that means a lot of benchmarking).

Dave |
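Dave's two suggestions can be sketched in a few lines. The following is a minimal illustration using Python's built-in sqlite3 module, not the ElevateDB/Delphi API; the table and index names are made up, but the shape of the technique carries over: drop the indexes, insert in batched transactions, and rebuild the indexes once at the end.

```python
import sqlite3

BATCH_SIZE = 5000  # the "sweet spot" has to be found by benchmarking

def bulk_load(conn, rows):
    # 1) Remove the index so each insert doesn't pay for index maintenance.
    conn.execute("DROP INDEX IF EXISTS idx_name")
    pending = 0
    for row in rows:
        conn.execute("INSERT INTO people (name, city) VALUES (?, ?)", row)
        pending += 1
        if pending == BATCH_SIZE:
            conn.commit()  # flush one batch as a single transaction
            pending = 0
    conn.commit()          # commit the final partial batch
    # 2) Rebuild the index once, after all of the data is loaded.
    conn.execute("CREATE INDEX idx_name ON people (name)")
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, city TEXT)")
conn.execute("CREATE INDEX idx_name ON people (name)")
bulk_load(conn, [(f"p{i}", f"c{i % 10}") for i in range(12345)])
print(conn.execute("SELECT COUNT(*) FROM people").fetchone()[0])  # 12345
```

The same idea applies whatever the engine: index maintenance is paid once per index build instead of once per insert.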
Mon, Oct 1 2007 10:35 AM | Permanent Link |
Dave Harrison | Dave Harrison wrote:
> I've had the same problem with all databases I've tried. [...]

Allan,
Oops, sorry - I thought you were using DBISAM (too early Monday morning to be doing this). EDB can build indexes faster than DBISAM, and the rest of what I said still goes: don't use ForceFlush, and build the indexes after all of the data has been loaded.

Dave |
Mon, Oct 1 2007 10:40 AM | Permanent Link |
Roy Lambert NLH Associates Team Elevate | Allan
Try wrapping the inserts in a transaction and committing every 1K records.

Roy Lambert |
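Roy's suggestion is easy to measure. Here is a small timing sketch, again using sqlite3 purely for illustration (the absolute numbers for ElevateDB will differ): the same rows are loaded once with a commit per insert and once with a commit every 1K records, so each commit's flush to disk is paid 1,000 times less often.

```python
import os
import sqlite3
import tempfile
import time

def timed_load(commit_every, n=2000):
    """Insert n rows, committing every `commit_every` inserts."""
    path = os.path.join(tempfile.mkdtemp(), "bench.db")
    conn = sqlite3.connect(path)  # file-backed so commits actually hit disk
    conn.execute("CREATE TABLE t (id INTEGER, payload TEXT)")
    conn.commit()
    start = time.perf_counter()
    for i in range(n):
        conn.execute("INSERT INTO t VALUES (?, ?)", (i, "x" * 40))
        if (i + 1) % commit_every == 0:
            conn.commit()  # end one transaction, start the next
    conn.commit()          # commit any final partial batch
    elapsed = time.perf_counter() - start
    count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    conn.close()
    return count, elapsed

rows_a, t_a = timed_load(commit_every=1)     # a transaction per insert
rows_b, t_b = timed_load(commit_every=1000)  # Roy's batched commits
print(f"per-row commits: {t_a:.3f}s  batched commits: {t_b:.3f}s")
```

Both runs load the same data; only the commit granularity changes, which is why tuning the batch size is pure benchmarking with no effect on correctness.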
Mon, Oct 1 2007 10:44 AM | Permanent Link |
Allan Brocklehurst | Thanks boys.. this is just what I was looking for.

Allan |
Mon, Oct 1 2007 11:10 AM | Permanent Link |
Roy Lambert NLH Associates Team Elevate | Dave
How's the test bed for Tim coming? I haven't seen any sign of a million record table hitting the binaries yet.

Roy Lambert |
Mon, Oct 1 2007 11:57 AM | Permanent Link |
Dave Harrison | Roy Lambert wrote:
> How's the test bed for Tim coming? I haven't seen any sign of a million record table hitting the binaries yet

I'm working on it now. Nag, nag, nag.

Dave |
Mon, Oct 1 2007 3:44 PM | Permanent Link |
"Jose Eduardo Helminsky" | Roy
Recently, I had to import 8M records using DBISAM and it took almost 12 hours. Eduardo |
Tue, Oct 9 2007 10:02 PM | Permanent Link |
Dave Harrison | Roy Lambert wrote:
> How's the test bed for Tim coming? I haven't seen any sign of a million record table hitting the binaries yet

Roy,
I've uploaded the test program "Benchmark program to time adding 20 million rows to table" to the binaries. If you think you can improve on the speed of the program, feel free to jump in with both feet.

Dave |