Messages 1 to 10 of 12 total |
Creating a WAN "Backup" routine for DBISAM |
Wed, Apr 25 2007 10:57 AM | Permanent Link |
adam | Dear All,
I have just had a distraught email from a customer of mine whose server has blown up (they work in Uganda & the machine was fried despite sitting behind a UPS/current regulator!). They have not backed up their data regularly (despite _very_clear_ instruction from me that they should) and have now lost about 3 weeks of data entry by 8 staff. Of course it is not my fault & there is no DBISAM error / fault either, but I am thinking about writing something into my DB framework:

- A simple field on each table that timestamps the latest change. This timestamp would be updated on the application side by Delphi each time a user called "Post".
- A minor rewrite of the DBISAM DBSVR application to include a periodic event, controlled by a timer or the system clock, to:
  1. SELECT from all databases any record with TimeStamp > LastBackUpTime.
  2. UPDATE/INSERT INTO the remote database all the records from step 1.

I realise that the above scheme is a gross oversimplification, but it gives the general picture. If I wrote this I would be able to offer my users a "remote / safe backup". Of course it would be dependent on their ability to connect to the remote DB, and it would provide only a snapshot backup, but it would really help the user I am currently trying to console.

QUESTION:

1. Has anyone else already written something like what I am suggesting above, or do you know of 3rd-party tools that will do it?

Adam |
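The two-step scheme above (select rows newer than the last backup time, then update/insert them remotely) can be sketched in a few lines. This is only an illustration of the logic, not DBISAM code: Python with SQLite standing in for the databases, and the table/column names (`id`, `payload`, `last_changed`) are invented for the example.

```python
import sqlite3

def sync_changes(local, remote, table, last_backup_time):
    """Copy every row stamped after the last backup into the remote DB.
    INSERT OR REPLACE emulates the UPDATE/INSERT-INTO step, so a row
    re-sent on a later pass simply overwrites its remote copy."""
    rows = local.execute(
        f"SELECT id, payload, last_changed FROM {table} "
        "WHERE last_changed > ?", (last_backup_time,)).fetchall()
    for row in rows:
        remote.execute(
            f"INSERT OR REPLACE INTO {table} (id, payload, last_changed) "
            "VALUES (?, ?, ?)", row)
    remote.commit()
    return len(rows)
```

As Robert points out below, this says nothing about deletes or referential integrity; it only mirrors inserts and updates.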
Wed, Apr 25 2007 1:58 PM | Permanent Link |
"Robert" | "adam" <adam@nospamplease.fmfoods.co.uk> wrote in message news:6D688927-0C4C-49F5-981B-A0CCDD3EB6A5@news.elevatesoft.com...
> - SELECT from all databases any record with TimeStamp > LastBackUpTime.
> - UPDATE/INSERT INTO remote database all the records from step 1.

What about deletes? Referential integrity? Application logic that relates one table's content to another's? You are getting into a process that might be more complicated than it seems. Without knowing your application, your scheme might work just fine, but this type of synchronization between databases is not trivial. Is it possible to just do a daily DBISAM database backup? Just stop all processes and back up the database?

Robert |
Wed, Apr 25 2007 9:57 PM | Permanent Link |
David | I figured that simple was best. So I modified the server program and added
simple backup functionality to it myself. Here is a screenshot: http://www.activebiometrics.com/products/activeserver/files/BIGblocks_image_7_1.png

Click a button to disable the DB, click 'Backup', and it will compress the DB directory into a .CAB file. When they want to restore a backup, they click a button to disable the DB, click a button to pick the .CAB file, and the DB is restored.

On 4/25/07 10:57 AM, in article 6D688927-0C4C-49F5-981B-A0CCDD3EB6A5@news.elevatesoft.com, "adam" <adam@nospamplease.fmfoods.co.uk> wrote:
> QUESTION:
>
> 1. Has anyone else already written something like what I am suggesting above, or do you know of 3rd party tools that will do it?
>
> Adam |
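The disable-then-compress approach David describes boils down to archiving the whole database directory and unpacking it again on restore. A minimal sketch, with Python's `zipfile` standing in for the .CAB step (the original is a Delphi server modification):

```python
import os
import zipfile

def backup_db(db_dir, archive_path):
    """Compress every file in the (disabled) database directory into a
    single archive -- the equivalent of the 'Backup' button above."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name in sorted(os.listdir(db_dir)):
            zf.write(os.path.join(db_dir, name), arcname=name)

def restore_db(archive_path, db_dir):
    """Unpack the chosen archive back into the database directory."""
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(db_dir)
```

The key point is the one David makes: the DB must be disabled first, so no table file is half-written while it is being archived.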
Thu, Apr 26 2007 2:44 AM | Permanent Link |
Roy Lambert NLH Associates Team Elevate | David
I think that code should live on the binaries Roy Lambert |
Thu, Apr 26 2007 3:54 AM | Permanent Link |
Charles | I go with David's idea. Just before midnight I stop my kbmMW server and back up all the database files onto a USB drive. This USB drive is a small unit that can slip into a handbag, or some such. The company has a minimum of three of them. One person in the office has the responsibility of simply swapping the one in her handbag for the one next to the server as she leaves the office at 5pm. This means I have a backup at home, one in transit and one in the office. In case of fire, save the USB device first, then your own life!

A backup solution is only as good as how easy it is to carry out, because you'll probably never need it. The above is the best one I have yet found. Remote backups are complex, as has been said. You can't rely on anyone doing a backup, and even backing up once a day isn't ideal. Simply hot-swapping a self-powered 2.5" USB drive is as simple as it gets IMO. The downside of USB is that the files need to be split if greater than 2GB (or is it 4GB?).

Let's not forget that now would be the best time to implement a good backup policy with your client, so a quick fix might be in order, and remote backups would not be classed as a quick fix IMO.

Good luck --
Charles. |
Thu, Apr 26 2007 12:27 PM | Permanent Link |
David | I would pull that hard drive and check it. Just because the machine is
fried does not necessarily mean that the drive is bad. If for some reason the drive isn't recognized then I would definitely look into data recovery. Losing 3 weeks of 8 people's work can easily justify the several thousand dollar cost of recovery. On 4/25/07 10:57 AM, in article 6D688927-0C4C-49F5-981B-A0CCDD3EB6A5@news.elevatesoft.com, "adam" <adam@nospamplease.fmfoods.co.uk> wrote: > Dear All, > > I have just had a distraught email from a customer of mine whos server has > blown up (they > work in Uganda & the machine was fried despite sitting behind a USP/Current > regulator!). |
Thu, Apr 26 2007 4:55 PM | Permanent Link |
adam | Thanks for all these responses.
Definitely the implementation would be a lot harder than my original post suggested ... referential integrity etc., etc. However, the user I was thinking of is basically a government office that enters scads of data & it is pretty much one-way, i.e. enter data XXX, post & forget ... move on to the next bit of data ... so a very simple scheme would actually achieve most of what they need.

Having thought about it, perhaps the simplest way to express the idea is something more like this:

1. Compress the whole DB into a CAB.
2. FTP / upload it over a broadband internet connection to a "safe backup" location.

The downside is that the transfer size would be very large. However, such a system would work like a dream if you were only uploading the DELTA, i.e. creating the ZIP file & then somehow comparing it with the master & passing across the "new bits" ... is such an idea possible? I actually know of a commercial product that does it, but has anyone tried to implement anything similar? |
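Uploading only the "new bits" is possible; rsync-style tools do it with rolling checksums so that an insertion doesn't shift every later block. A much simpler fixed-block version of the idea, hash the master copy's blocks and send only the blocks whose hash changed, looks like this. One caveat worth noting: it should run over the uncompressed data, because compressing first tends to scatter even a small change across the whole ZIP.

```python
import hashlib

BLOCK = 4096  # fixed block size; a tuning choice, not a requirement

def block_hashes(data):
    """Hash each fixed-size block; the remote end keeps this list."""
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def delta(new_data, old_hashes):
    """Return (block_index, bytes) for every block that differs from the
    remote master's copy -- the only data that needs uploading."""
    changed = []
    for index in range((len(new_data) + BLOCK - 1) // BLOCK):
        block = new_data[index * BLOCK:(index + 1) * BLOCK]
        digest = hashlib.sha256(block).hexdigest()
        if index >= len(old_hashes) or old_hashes[index] != digest:
            changed.append((index, block))
    return changed
```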
Fri, Apr 27 2007 2:20 AM | Permanent Link |
Roy Lambert NLH Associates Team Elevate | adam
Why not implement a very simple form of transaction logging to a separate drive, as well as having a "normal" backup system? Recovery is a major pain if the backup was taken too long ago, but it's not too difficult to set up and it works.

Roy Lambert |
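Roy's suggestion can be as little as an append-only file of posted changes plus a replay routine run after restoring an older backup. A sketch; the JSON-lines format and field names are invented for the example:

```python
import json

def log_change(log_path, table, record_id, values):
    """Append one posted change to a plain-text transaction log,
    ideally kept on a separate physical drive from the database."""
    with open(log_path, "a") as f:
        f.write(json.dumps({"table": table, "id": record_id,
                            "values": values}) + "\n")

def replay_log(log_path, apply_change):
    """Re-apply every logged change in order, e.g. against a freshly
    restored backup. Returns the number of changes replayed."""
    count = 0
    with open(log_path) as f:
        for line in f:
            entry = json.loads(line)
            apply_change(entry["table"], entry["id"], entry["values"])
            count += 1
    return count
```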
Fri, Apr 27 2007 7:48 AM | Permanent Link |
Tim Young [Elevate Software] Elevate Software, Inc. timyoung@elevatesoft.com | Adam,
<< Of course it is not my fault & there is no DBISAM error / fault either, but I am thinking about writing something into my DB framework: >>

Is there any reason why you can't just use the backup facilities in DBISAM within a scheduled event in the database server?

--
Tim Young
Elevate Software
www.elevatesoft.com |
Tue, May 1 2007 7:08 AM | Permanent Link |
adam | >Is there any reason why you can't just use the backup facilities in DBISAM
>within a scheduled event in the database server ?

Yes I can do this Tim & I do. However the user often backs up to the local machine, which is great if there is a table corruption & you want to roll back to an earlier version, but is no good if someone runs off with the server in the back of a white van! Of course users should also back up to external drives etc., etc., but I am trying to make it _really_ easy for them, by taking that task off their hands. I would _like_ to write something that actually sends their data away at the end of the day to a remote location.

Something as simple as a Backup + SendMail type routine would do a good job ... but for the size of the DB. My users don't have a massive internet connection, but they do have a fairly massive DB (several hundred MB). Even zipped by your excellent backup facility (because there is compression on the larger fields anyway) it is still too big to send the whole thing over the internet.

A scheme to do it would be:

1. Find new records for day XXX & query these out into their own tables.
2. Use DB.Backup to effectively "zip" these & email them to the remote location.
3. Store a sequence of these ZIPs at the remote location.
4. It would then be possible (though I would have to scratch my head a bit) to re-create a DB for any period by un-zipping the appropriate backups & writing SQL to merge their data.

The reason I am posting is really to get advice from other users who have done this particular piece of work in the past, so I can avoid horrible pitfalls. The obvious problem is how to cope with simple changes to a record, but I think I have a way round this: my system already has an "auditing" table which logs every change made to every other table (just recording the date-time of the change, the record's ID & table name, plus the UserID). So it is fairly easy for me to execute step 1 above, just by recalling all DISTINCT tables changed and all the IDs changed from this table. I will use INDY to transfer the data, probably just as an email to my company, with the ZIP as an attachment. We can look after these & keep them somewhere safe.

Step 4 also looks pretty nasty ... Overall, I was just wondering whether anyone else has done this (probably better!). Once I have done it I will post it on the newsgroups with all the source ... but don't hold your breath! |
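The audit-table query described above, "all DISTINCT tables changed and all the IDs changed", reduces to two DISTINCT selects. A sketch with SQLite standing in for DBISAM; the audit-table columns (tablename, recordid, changedat, userid) are guessed from the description and are not the poster's actual schema:

```python
import sqlite3

def changed_since(db, last_backup_time):
    """Read the auditing table and return {table: [record ids]} for
    everything changed since the last backup -- step 1 of the scheme."""
    changed = {}
    tables = db.execute(
        "SELECT DISTINCT tablename FROM audit WHERE changedat > ?",
        (last_backup_time,)).fetchall()
    for (table,) in tables:
        ids = db.execute(
            "SELECT DISTINCT recordid FROM audit "
            "WHERE tablename = ? AND changedat > ?",
            (table, last_backup_time)).fetchall()
        changed[table] = sorted(i for (i,) in ids)
    return changed
```

Each returned ID list can then be queried out into its own table and zipped, per step 2.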
This web page was last updated on Wednesday, April 24, 2024 at 11:07 AM | © 2024 Elevate Software, Inc. All Rights Reserved