Thread: UPDATES table format
Thu, Nov 7 2013 5:33 PM

Jeff Cook

Aspect Systems Ltd

Hi

I'm sure that I have seen somewhere in the Help or Docs details of the
UPDATES file format ... BUT I can't find it again!

In my current DBISAM3 application I log data changes in gruesome detail by
comparing OldValues/NewValues in the BeforePost (and BeforeDelete) events of
key tables.

This is useful in the occasional "discussion" with users when they claim
that e.g. "The Customer record just vanished, no one did anything to it ..."
and I can look at the log and say that Fred on COMPUTER101 at 2013-11-08
11:29:32 deleted it.

Converting to ElevateDB - mainly to get the replication facilities - I see
that we now have a file that contains all the data changes - so perhaps I
can drop all my logging stuff and save all the UPDATES files - but how do I
read those files?

Cheers

Jeff

--
Jeff Cook
Aspect Systems Ltd
www.aspect.co.nz

Fri, Nov 8 2013 3:21 AM

Uli Becker

Jeff,

> I'm sure that I have seen somewhere in the Help or Docs details of the
> UPDATES file format ... BUT I can't find it again!
>
> Converting to ElevateDB - mainly to get the replication facilities - I see
> that we now have a file that contains all the data changes - so perhaps I
> can drop all my logging stuff and save all the UPDATES files - but how do I
> read those files?

You can use a trigger (BeforeDelete) and write basic information
(timestamp, customer ID, etc.) to a log table.

If you want to use the update files created by replication you can read
these files like this:

CREATE TABLE Temp_Updates
FROM UPDATES "UpdateFile_2013-09-19 17-07-07.500" IN STORE Replication_In

Don't use the file extension in this statement.

Then you can filter or query the table however you like.
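
For example, once the table exists you can query it like any other table. A minimal sketch - the "TableName" column here is a hypothetical placeholder, since the actual structure of the dumped table depends on the published tables, so inspect it first:

-- "TableName" is a hypothetical column name; check the actual
-- structure of Temp_Updates before filtering on anything
SELECT *
FROM Temp_Updates
WHERE "TableName" = 'Customers'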

I'd prefer a trigger. Easy to use. If you need a sample, just tell me.

Regards Uli


Fri, Nov 8 2013 4:13 AM

Uli Becker

Here is a sample of how to write a trigger that uses a log table with the
fields "CustomerID", "TimeStampDeleted", and "Computer":

CREATE TRIGGER "WriteLog" AFTER DELETE
ON "MyCustomers"
BEGIN

  EXECUTE IMMEDIATE
    'INSERT INTO Log (CustomerID, TimeStampDeleted, Computer)
     VALUES (?,?,?)'
    USING OldRow.CustomerID, CURRENT_TIMESTAMP, CURRENT_COMPUTER;

END
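
For reference, the log table used above could be declared along these lines - a sketch only, since the column types here are my assumptions:

CREATE TABLE "Log"
(
  "CustomerID" INTEGER,           -- match the type used in "MyCustomers"
  "TimeStampDeleted" TIMESTAMP,   -- filled by CURRENT_TIMESTAMP
  "Computer" VARCHAR(64)          -- filled by CURRENT_COMPUTER
)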

Regards Uli
Fri, Nov 8 2013 5:04 AM

Roy Lambert

NLH Associates

Team Elevate

Jeff


According to my locally stored data, it was in a post by Tim on 6/9/12 (that's proper calendar order, not the rubbish the Yanks use) in a thread started by Barry (21/08/12) with the title "Re: What is the best way to implement Audit Trail with EDB v2.09?"

with the ID

<1ECF4DCD-988A-42A1-B406-E529273F4F79@news.elevatesoft.com>

Roy Lambert [Team Elevate]
Sun, Nov 10 2013 3:24 PM

Jeff Cook

Aspect Systems Ltd

Thanks Uli

"Uli Becker" <johnmuller54@gmail.com> wrote in message
news:7DF7A8BD-3439-4CF7-BBA1-C10E666A4FCE@news.elevatesoft.com...
>
> CREATE TABLE Temp_Updates
> FROM UPDATES "UpdateFile_2013-09-19 17-07-07.500" IN STORE Replication_In
>


That's the bit I was missing!

I knew I could do it in Delphi with my BeforePost and BeforeDelete events
just like I have in DBISAM3, but I knew that I would have these Update files
lurking in the background - just didn't know how to access them.

Now that I know, I look in the manual and see that it is blindingly obvious:

[FROM UPDATES <UpdateName> IN STORE <StoreName>]

....

Now the debate is whether to use these files, use TRIGGERs, or ditch the
logging entirely.

Cheers

Jeff
--
Jeff Cook
Aspect Systems Ltd
www.aspect.co.nz

Sun, Nov 10 2013 3:43 PM

Jeff Cook

Aspect Systems Ltd

"Roy Lambert" <roy@lybster.me.uk> wrote in message
news:282936BC-0EF9-4FE0-ADCE-0C91DE4CB8F7@news.elevatesoft.com...
> Jeff
>
>
> According to my locally stored data, it was in a post by Tim on 6/9/12
> (that's proper calendar order, not the rubbish the Yanks use) in a thread
> started by Barry (21/08/12) with the title "Re: What is the best way to
> implement Audit Trail with EDB v2.09?"
>

Thanks Roy

Interesting discussion - both the HOW TO and whether to log changes at all.
My post stems from a discussion with my colleague who does the majority of
the customer support work.  My initial point was "why are we logging all this
stuff if the customers NEVER use it?"  He reckons he uses it a few times a
year ...  Why double the number of postings by logging every change?

Tucking these UPDATE files away and developing an easy way of accessing them
for customer support seems like a good compromise.

I don't know the format of these tables at present, but if, as one of the
posts indicated, the real username and computer name aren't available in the
UPDATES file, then adding those fields to every table you are interested in
and setting them in the BeforePost event would partially solve that problem.
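
That stamping could also be done server-side with a trigger instead of the Delphi BeforePost event - a sketch, assuming hypothetical "LastUser" and "LastComputer" columns on the table and that EDB permits assigning to NEWROW in a BEFORE trigger (check the SQL/PSM section of the manual):

CREATE TRIGGER "StampWho" BEFORE UPDATE
ON "Customers"
BEGIN
  -- NEWROW is the row about to be posted; both column names are
  -- placeholders and must already exist on the table
  SET NEWROW."LastUser" = CURRENT_USER;
  SET NEWROW."LastComputer" = CURRENT_COMPUTER;
END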

Someone suggested the same scheme as I currently use in DBISAM, logging the
date/time/who/where/EditOrDelete - mentioning that the actual change itself
wasn't recorded.  I solve this by using a memo field to store
FieldName:OldValue=>NewValue#13#10.
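
A similar per-field diff could be captured in an EDB trigger as well - a sketch for a single field, where the "ChangeLog" table and all of its columns are placeholders:

CREATE TRIGGER "LogNameChange" AFTER UPDATE
ON "Customers"
BEGIN
  -- only log the field when it actually changed
  IF NEWROW."Name" <> OLDROW."Name" THEN
    EXECUTE IMMEDIATE
      'INSERT INTO ChangeLog (CustomerID, FieldName, OldValue, NewValue)
       VALUES (?,?,?,?)'
      USING OLDROW."CustomerID", 'Name', OLDROW."Name", NEWROW."Name";
  END IF;
END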

Thanks for the pointers

--
Jeff Cook
Aspect Systems Ltd
www.aspect.co.nz

Mon, Nov 11 2013 3:48 AM

Roy Lambert

NLH Associates

Team Elevate

Jeff


Many, many years ago, before I moved into recruitment, when I was a full-time paid IT professional, I had to set up full logging for a couple of systems. Back in those days disk was expensive and you didn't fill it with crap, and I managed to get the logging taken out after a couple of years of the files just growing.

At another job, where I was IT Manager, we ran Unisys mainframes which had full keystroke logging built into the OS. We used it to prove that a plonker would lock up his terminal by putting in the wrong password umpteen times, and once he'd locked up his terminal he went around other people's trying to log in there and failing. Still couldn't get him fired for stupidity though :-(

He used to come in early, and the net result was people would turn up to work, find they couldn't access the system, and an operator then had to clear things down.

You may think I was harsh trying to get someone fired for this, but this was the same person who stayed on imperial when the rest of the factory went metric. Part of his job was to order pallets. He'd be told the measurement in metric, convert, and order in imperial. The place that made them would convert back to metric. About 1 in 100 was out.

Roy Lambert
Mon, Nov 11 2013 5:27 PM

Jeff Cook

Aspect Systems Ltd

"Roy Lambert" <roy@lybster.me.uk> wrote in message
news:67AD20E0-7BE4-4684-B8A2-C3C475F0B5F7@news.elevatesoft.com...
> Jeff
>
>
> Many, many years ago, before I moved into recruitment, when I was a full-time
> paid IT professional, I had to set up full logging for a couple of systems.
> Back in those days disk was expensive and you didn't fill it with crap, and
> I managed to get the logging taken out after a couple of years of the files
> just growing.
>
> At another job, where I was IT Manager, we ran Unisys mainframes which had
> full keystroke logging built into the OS. We used it to prove that a
> plonker would lock up his terminal by putting in the wrong password
> umpteen times, and once he'd locked up his terminal he went around other
> people's trying to log in there and failing. Still couldn't get him fired
> for stupidity though :-(
>
> He used to come in early, and the net result was people would turn up to
> work, find they couldn't access the system, and an operator then had to
> clear things down.
>
> You may think I was harsh trying to get someone fired for this, but this
> was the same person who stayed on imperial when the rest of the factory
> went metric. Part of his job was to order pallets. He'd be told the
> measurement in metric, convert, and order in imperial. The place that made
> them would convert back to metric. About 1 in 100 was out.

We try to make our systems fool-proof, then idiot-proof, and all it does is
train the idiots to be more cunning.  Then there are the plonkers ... What
can we do about them?  Go sit on a tropical island?

Cheers

--
Jeff Cook
Aspect Systems Ltd
www.aspect.co.nz

Tue, Nov 12 2013 3:27 AM

Roy Lambert

NLH Associates

Team Elevate

Jeff


>all it does is
>train the idiots to be more cunning.

<vbg>

Roy
Wed, Nov 20 2013 2:27 PM

Tim Young [Elevate Software]

Elevate Software, Inc.

Jeff,

<< Now the debate is whether to use these files, use TRIGGERs, or ditch the
logging entirely. >>

The only catch/decision with using published tables and update files for
audit logging is how often you should execute a SAVE UPDATES statement to
save the updates to an update file.  If you want fairly quick access to the
audit log, then you'll want a scheduled job that saves the updates every
minute (or slightly longer).  I would also recommend that you do the
following in the job:

1) Execute a SAVE UPDATES.
2) Execute a CREATE TEMPORARY TABLE ... FROM UPDATES to dump the update file.
3) Insert the contents of that temporary table into a permanent table that
has the exact same structure.
4) Delete the updates file.
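
Steps 2 and 3, as concrete statements - a sketch in which the store, update file, and table names ("Replication_In", "CurrentUpdates", "TempUpdates", "AuditLog") are all placeholders, and where the exact SAVE UPDATES and CREATE JOB syntax for steps 1 and 4 should be taken from the manual:

-- 2) dump the saved update file into a temporary table
CREATE TEMPORARY TABLE "TempUpdates"
FROM UPDATES "CurrentUpdates" IN STORE "Replication_In";

-- 3) append it to a permanent audit table with the same structure
INSERT INTO "AuditLog"
SELECT * FROM "TempUpdates";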

This will give you a permanent log that will serve you well for such
purposes.

Tim Young
Elevate Software
www.elevatesoft.com