Home » Technical Support » ElevateDB Technical Support » Support Forums » ElevateDB SQL
Inserting a large number of records / stored procedure limits |
Thu, Mar 27 2008 7:41 PM
"Carlton Craighead"

Hi:

I realize that SQL statements submitted to EDB can only contain a single statement, and that procedures are the way to execute a number of statements en masse. I am having two issues with using procedures:

1. What is the best approach to inserting a large number of records remotely and quickly? The record count could be up to half a million rows. A single large procedure? Multiple procedures, executed one at a time, each with a small subset of the records? Or perhaps a CSV file that I somehow get onto the server machine? Trying the batch approach with a procedure of 500 inserts and a record size of about 100 bytes, EDB seems to insert records at a rate of only about 8 per second. I did test the import approach and its speed is impressive, but one has to worry about building enough infrastructure to get the CSV file onto the server's file system.

2. Trying the batch approach with a large set of inserts (say 2,000 records), I find that EDB raises an exception. I don't have the details handy but will run it again. What is the size limit of a procedure? Is there one?

Thanks.

C
Fri, Mar 28 2008 3:13 AM
Roy Lambert, NLH Associates (Team Elevate)

Carlton

As long as you can guarantee the correctness of the data in the CSV file, I'd probably go for that. Alternatively, if it's a lot easier to get the data into a straight text file, I'd investigate both a procedure that is passed one line at a time and a mega-procedure holding all the lines.

I'd also look at running it in a transaction, committing at intervals, say every 500, 1000, or 1500 rows.

Roy Lambert [Team Elevate]
Fri, Mar 28 2008 2:47 PM
Tim Young [Elevate Software], Elevate Software, Inc. (timyoung@elevatesoft.com)

Carlton,

<< 1. What is the best approach to inserting a large number of records remotely and quickly? The record count could be up to 1/2 million rows. >>

Well, the issue is to get the data to the server first, and then execute the insertions. Trying to perform the insertions over the wire is futile with that many rows. This leaves a CSV import, or sending over a binary table with the rows to insert.

<< 2. In trying the batch approach with a large set of inserts (say 2000 records), I find that EDB produces an exception. I don't have the details handy but will run it again. What is the size limit of a procedure? Is there one? >>

No, there isn't one. If you could find out what the exception was that you received, I can look into it further.

--
Tim Young
Elevate Software
www.elevatesoft.com