Thread: monitoring metadata changes
Wed, Jul 3 2013 8:25 AM

Geziel

Roy,


I'll think about it. Many thanks to you and Tim for the help.
Wed, Jul 3 2013 11:56 AM

Barry

Geziel,

You do have access to Information.Tables, Information.Views, Information.TemporaryTables, etc., so if these are the changes you need to monitor, then you could write a job that runs every 5 minutes or so and compares the contents of these Information tables to a prior copy of them.

It might look something like this:
1) Create a temporary table from Information.Tables and call it _CurInformationTables.
2) Compare the rows from _CurInformationTables to _OldInformationTables.
3) If there are any changes, log them.
4) Copy _CurInformationTables to _OldInformationTables
5) Repeat every 5 minutes
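A rough SQL sketch of steps 1 and 2 is below. The snapshot table names, the "Name" column, and the CREATE TABLE ... AS ... WITH DATA form are assumptions for illustration only, so check them against the ElevateDB manual before relying on them:

-- Step 1: snapshot the catalog (CREATE TABLE ... AS ... WITH DATA is assumed;
-- use whatever "copy a result set into a table" form your EDB version supports).
CREATE TEMPORARY TABLE "_CurInformationTables" AS
SELECT * FROM Information.Tables
WITH DATA;

-- Step 2: list tables that were added or dropped since the last snapshot.
-- "Name" is an assumed column in Information.Tables; catching ALTERs as well
-- would mean comparing the column-level catalog tables the same way.
SELECT 'ADDED' AS Change, Cur."Name"
FROM "_CurInformationTables" Cur
LEFT OUTER JOIN "_OldInformationTables" Old ON Old."Name" = Cur."Name"
WHERE Old."Name" IS NULL

UNION ALL

SELECT 'DROPPED' AS Change, Old."Name"
FROM "_OldInformationTables" Old
LEFT OUTER JOIN "_CurInformationTables" Cur ON Cur."Name" = Old."Name"
WHERE Cur."Name" IS NULL;

-- Steps 3-5: log any rows returned above, copy _CurInformationTables over
-- _OldInformationTables, and schedule the whole thing as a recurring job.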

There is also a Configuration.LogEvents table that you may find useful.

Barry
Fri, Jul 5 2013 10:00 AM

Geziel


Barry,

Thank you for the help. My problem is that there are many objects in the database, so I worry about performance issues with this approach.
Fri, Jul 5 2013 10:33 AM

Roy Lambert

NLH Associates

Team Elevate

Geziel


Can you post some details about what your application is and how/why users are making changes to it? There may be a better overall approach.

Roy Lambert [Team Elevate]
Fri, Jul 5 2013 1:08 PM

Barry

Roy,

I think you're right, there should be a better way.

Why not force all table changes to be passed through a Stored Procedure?

The Stored Procedure gets the DDL SQL as a CLOB parameter and logs it in a table, then attempts to execute the SQL. If it succeeds, the SP updates the logged SQL row as having succeeded; otherwise it logs it as failed along with the exception message. This log table also contains the timestamp, user, IP address, etc. of the user. The log can also store the before structure of the table/view etc. prior to the SQL being executed (it is fairly easy to decipher which object the SQL is acting on and then get the metadata from the correct table before AND after the SQL has been executed).

If the CLOB parameter has several SQL statements, delimited by ';' or '!', then the SP parses them out and executes them one by one.
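A rough sketch of such a procedure follows. The procedure name, the log-table layout, and the SQL/PSM details (EXECUTE IMMEDIATE, the EXCEPTION block, ERRORMSG()) are assumptions drawn from generic SQL/PSM, not confirmed ElevateDB syntax, and only the ';' delimiter is handled for brevity:

-- Hypothetical log table (layout is an assumption):
CREATE TABLE "DDLLog"
(
   "LoggedAt"  TIMESTAMP,
   "UserName"  VARCHAR(64),
   "Statement" CLOB,
   "Succeeded" BOOLEAN,
   "ErrorText" VARCHAR(2048)
);

-- Hypothetical procedure: splits the CLOB on ';' and runs each statement,
-- recording success or failure for every one.
CREATE PROCEDURE "ExecuteLoggedDDL" (IN "DDLText" CLOB)
BEGIN
   DECLARE "Stmt"   CLOB;
   DECLARE "Rest"   CLOB;
   DECLARE "DelPos" INTEGER;

   SET "Rest" = "DDLText";

   WHILE CHAR_LENGTH(TRIM("Rest")) > 0 DO
      -- Peel off the next statement at the first ';' delimiter.
      SET "DelPos" = POSITION(';' IN "Rest");
      IF "DelPos" = 0 THEN
         SET "Stmt" = "Rest";
         SET "Rest" = '';
      ELSE
         SET "Stmt" = SUBSTRING("Rest" FROM 1 FOR "DelPos" - 1);
         SET "Rest" = SUBSTRING("Rest" FROM "DelPos" + 1);
      END IF;

      BEGIN
         EXECUTE IMMEDIATE "Stmt";
         INSERT INTO "DDLLog" VALUES
            (CURRENT_TIMESTAMP, CURRENT_USER, "Stmt", TRUE, NULL);
      EXCEPTION
         -- ERRORMSG() stands in for whatever error-message function EDB provides.
         INSERT INTO "DDLLog" VALUES
            (CURRENT_TIMESTAMP, CURRENT_USER, "Stmt", FALSE, ERRORMSG());
      END;
   END WHILE;
END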

I think this would give you much better control over who, what, when, where starts changing your database structure and you will have a log of the changes. By preventing the user from modifying the database outside of this SP via ROLES/PRIVILEGES, you force them to use this SP which has its own ROLE defined with the necessary privileges. It can also act as a traffic cop with its own security table to restrict users to modifying only certain database objects at a certain time.
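For the ROLES/PRIVILEGES side, the lock-down might look roughly like this; the privilege names, the ON DATABASE form, and the "AppUsers" role name are assumptions, so the exact ElevateDB grammar needs checking:

-- Ordinary users lose direct DDL rights and may only run the logging procedure.
REVOKE CREATE, ALTER, DROP ON DATABASE "MyDB" FROM "AppUsers";
GRANT EXECUTE ON PROCEDURE "ExecuteLoggedDDL" TO "AppUsers";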

Just a thought.

Barry
Sat, Jul 6 2013 3:41 AM

Roy Lambert

NLH Associates

Team Elevate

Barry


Interesting take on it as always. I was thinking more about system design, e.g. rather than having users alter the "standard" tables (which is what I assume is happening here), have a set of user tables that they can alter to their hearts' content and that are linked to the standard tables with joins, or even just have CLOB columns with row-specific tables in them.
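A minimal sketch of that split, with made-up table names: a fixed standard table, a user-owned extension table keyed the same way, and the join the application reads from.

-- Standard table: never altered by users.
CREATE TABLE "Customer"
(
   "CustomerID" INTEGER,
   "Name"       VARCHAR(100),
   CONSTRAINT "PK_Customer" PRIMARY KEY ("CustomerID")
);

-- User extension table: users add their own columns here (or just keep a
-- CLOB of row-specific data), leaving the standard table untouched.
CREATE TABLE "CustomerUser"
(
   "CustomerID" INTEGER,
   "UserData"   CLOB,
   CONSTRAINT "PK_CustomerUser" PRIMARY KEY ("CustomerID")
);

-- The application reads the two together.
SELECT C."CustomerID", C."Name", U."UserData"
FROM "Customer" C
LEFT OUTER JOIN "CustomerUser" U ON U."CustomerID" = C."CustomerID";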

Roy Lambert
Mon, Jul 8 2013 3:39 PM

Tim Young [Elevate Software]

Elevate Software, Inc.

Email: timyoung@elevatesoft.com

Geziel,

<< Users will alter the structure with DDL statements. After that, I need to recreate some data that should reflect these changes. >>

So you need an actual trigger for the DDL execution, not a log of the DDL that occurred. Is that correct?

Tim Young
Elevate Software
www.elevatesoft.com