Messages 1 to 10 of 11 total
Thread: How fast is "Quite Fast" for backup
Mon, Jul 14 2008 12:15 PM

Marcin
Hello,

I have an application running 24 hours a day. It may sometimes write a couple of thousand
records per hour. Even with periodic cleanups, 1GB for some tables is a normal situation.
From the documentation for ElevateDB "When the backup executes, it obtains a read lock for
the entire database that prevents any sessions from
performing any writes to any of the tables in the database until the backup completes.
However, since the execution of a backup is quite fast, the time during which the tables
cannot be changed is usually pretty small."

1. How fast is "quite fast" in my situation? Will my database be locked for 20
minutes an hour, or for 3 hours?
2. I am allowed to include a table list in the SQL BACKUP statement. Will that narrow
the lock to those tables only, or will it still lock the whole database?

Marcin.
Mon, Jul 14 2008 1:11 PM

"Eduardo [HPro]"
Marcin

> 1. How fast is "quite fast" in my situation? Will my database be locked
> for 20 minutes an hour, or for 3 hours?
Backing up database applications is not as easy (or as fast) as all of us would
like, but you can be sure it will take only some minutes rather than hours.
Everything depends on how big the database is.

> 2. I am allowed to include a table list in the SQL BACKUP statement. Will
> that narrow the lock to those tables only, or will it still lock the whole
> database?
No, as far as I know the entire database is locked to prevent breaking referential integrity.

Eduardo

Mon, Jul 14 2008 1:45 PM

Roy Lambert

NLH Associates

Team Elevate

Marcin


As well as Eduardo's points, you have to think of the speed of the connection between your data and where your backup will be, and the speed of the source and destination PCs.

E.g. my normal connection to my little DataTank server is a 100Mb LAN, but sometimes (say when sitting "watching the TV") I run over a 56Mb wireless connection and it's about 20% of the speed.

Backing up my email & news in DBISAM over the wired LAN takes about 20 minutes for 2.5GB. The wireless LAN I don't even bother with.

Response timings etc were much simpler (but slower) back in the old mainframe days.

Roy Lambert [Team Elevate]
Mon, Jul 14 2008 3:39 PM

Tim Young [Elevate Software]

Elevate Software, Inc.

Email timyoung@elevatesoft.com

Roy,

<< E.g. my normal connection to my little DataTank server is a 100Mb LAN, but
sometimes (say when sitting "watching the TV") I run over a 56Mb wireless
connection and it's about 20% of the speed.

Backing up my email & news in DBISAM over the wired LAN takes about 20 minutes
for 2.5GB. The wireless LAN I don't even bother with. >>

I would never recommend backing up your database directly over a LAN.  You
should always back up the database locally on the server's hard drive, and
then transfer the backup at your leisure.  Otherwise, you're simply causing
the database engine to wait on the network most of the time.

--
Tim Young
Elevate Software
www.elevatesoft.com

Mon, Jul 14 2008 3:55 PM

Tim Young [Elevate Software]

Elevate Software, Inc.

Marcin,

<< 1. How fast is "quite fast" in my situation? Will my database be locked
for 20 minutes an hour, or for 3 hours? >>

Our 4.9GB database takes about 12 minutes to back up.

<< 2. I am allowed to include a table list in the SQL BACKUP statement. Will
that narrow the lock to those tables only, or will it still lock the whole
database? >>

No, it still has to lock the entire database in order to preserve any
referential integrity in the backup.  It must make sure that the backup is a
consistent snapshot of the database, even if it is only a partial snapshot.
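The partial backup from question 2 would look something like the statement below. The store and table names here are hypothetical, and the exact clauses should be checked against the BACKUP DATABASE topic in the ElevateDB manual; the point is that even with a table list, the read lock still covers the entire database:

```sql
-- Hypothetical partial backup: only the listed tables are written to the
-- backup file, but the read lock is still taken on the whole database.
BACKUP DATABASE "MyDatabase" AS "Orders-Backup" TO STORE "Backups"
TABLES "Orders", "OrderItems"
```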

--
Tim Young
Elevate Software
www.elevatesoft.com

Mon, Jul 14 2008 4:02 PM

Tim Young [Elevate Software]

Elevate Software, Inc.

Marcin,

Just to clarify - that 12-minute estimate was wrong. I just checked our
server log, and our 4.9GB database actually backs up in 9 minutes.  This is
on a dual-Xeon server with 1GB of memory, backing up to a SCSI hard drive
that is local to the server machine.  The backup is then transferred to DVD
media directly afterwards; we use a server job to do so.

Also, this is a DBISAM database, not an ElevateDB database.  However,
ElevateDB backup times should be the same or better than DBISAM's, so you
can expect at least the times above with your ElevateDB database.

--
Tim Young
Elevate Software
www.elevatesoft.com

Tue, Jul 15 2008 3:06 AM

Marcin
"Tim Young [Elevate Software]" <timyoung@elevatesoft.com> wrote:

>Just to clarify - that 12 minutes estimate was wrong - I just checked our
>server log and our 4.9GB database actually backs up in 9 minutes.   This is
>on a dual-Xeon server with 1GB of memory backing up to a SCSI hard drive
>that is local to the server machine.  The backup is then transferred to DVD
>media directly afterwards.  We use a server job to do so.

Thank you very much for all your answers; they were very helpful.
What is the advantage of using a server job? Do you plan to include "live backup"
functionality in ElevateDB, as that would be the best solution for my database?

Marcin.
Tue, Jul 15 2008 3:34 AM

Roy Lambert

NLH Associates

Team Elevate

Tim

>I would never recommend backing up your database directly over a LAN. You
>should always back up the database locally on the server's hard drive, and
>then transfer the backup at your leisure. Otherwise, you're simply causing
>the database engine to wait on the network most of the time.

Hence my comment about the speed of the connection between source and backup locations. TMaN (my mail and news reader) resides on my notebook, and occasionally I can be bothered to back it up. When I do, I usually start it running and get on with something else. I'm not bothered about the database engine - after all, it needs rest as well :-)

My business data resides on the DataTank, gets backed up on a weekday cycle using Acronis to a USB hard drive, and once a week(ish) gets dumped onto DVD.

I'm fortunate that I don't have anything running 24x7.

Roy Lambert
Tue, Jul 15 2008 1:08 PM

Tim Young [Elevate Software]

Elevate Software, Inc.

Marcin,

<< Thank you very much for all your answers; they were very helpful. What is
the advantage of using a server job? >>

The advantage of using a server job is that you can automate the backup
process so that it occurs at night or in the early morning, when there is
less activity.  You can also automate the naming of the backups, the copying
of the backup file(s) from the hard drive to removable media such as DVD or
tape, and so on.
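As a rough sketch of such a server job, a nightly backup might look like the following. The job name, user, schedule, paths, and database name are all assumptions, and the exact CREATE JOB clauses should be verified against the ElevateDB manual:

```sql
-- Hypothetical nightly job: runs once a day in the 2:00-3:00 AM window
CREATE JOB "NightlyBackup"
RUN AS "Administrator"
FROM DATE '2008-07-15' TO DATE '2038-01-01'
DAILY
BETWEEN TIME '02:00:00' AND TIME '03:00:00'
BEGIN
   -- Name the backup after the current date
   EXECUTE IMMEDIATE 'BACKUP DATABASE "MyDatabase" AS "' +
                     CAST(CURRENT_DATE AS VARCHAR(10)) +
                     '-MyDatabase" TO STORE "Backups" INCLUDE CATALOG';
END
```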

<< Do you plan to include "live backup" functionality in ElevateDB, as that
would be the best solution for my database? >>

You can do live backups now.  You simply need to have a local store defined
as the target for the backup, and then you can use the BACKUP DATABASE
statement on demand.  You can see this in action in the ElevateDB Manager by
clicking on a database and then clicking the Backup Database task link or
menu option.

Here's a script that does a "backup on demand":

SCRIPT
BEGIN
  DECLARE ConfigCursor CURSOR FOR ConfigStmt;

  USE Configuration;

  -- Create necessary store if not present

  PREPARE ConfigStmt FROM 'SELECT * FROM Stores WHERE Name = ?';
  OPEN ConfigCursor USING 'Backups';

  IF ROWCOUNT(ConfigCursor) = 0 THEN
     EXECUTE IMMEDIATE 'CREATE STORE "Backups" AS LOCAL
                       PATH ''c:\myapp\backups''';
  END IF;

  CLOSE ConfigCursor;

  -- Back up the database, prefixing the backup name with today's date

  EXECUTE IMMEDIATE 'BACKUP DATABASE "MyDatabase" AS "' +
                    CAST(CURRENT_DATE AS VARCHAR(10)) +
                    '-MyDatabase" TO STORE "Backups" INCLUDE CATALOG';
END

--
Tim Young
Elevate Software
www.elevatesoft.com

Wed, Jul 16 2008 10:14 AM

Marcin
Tim,

<< You can do live backups now.  You simply need to have a local store defined
as the target for the backup, and then you can use the BACKUP DATABASE
statement on demand. >>

I think you misunderstood me. Right now I create backups the way you showed in your
script, and it works fine for me. Maybe I should use a "hot backup" instead: I want a
backup that does not lock the database and that runs while users are using it.

Thank you for job usage clarification.

Marcin