Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
Pakl (Starting Member, 10 Posts) - Posted 2005-06-13 : 08:49:30
Hi! I am trying to log changes in the database by recording the old and new values of each column. My update trigger looks something like this:

    declare @bit int, @field int, @char int,
            @FName varchar(50), @ID int,
            @OldValue nvarchar(100), @NewValue nvarchar(100)

    declare Cur cursor local for SELECT ID FROM inserted
    open Cur
    fetch Cur into @ID
    while (@@fetch_status = 0)
    begin
        set @field = 0   -- reset per row; otherwise only the first row is processed
        while @field < (SELECT MAX(colid) FROM syscolumns
                        WHERE id = (SELECT id FROM sysobjects WHERE name = 'TestTable'))
        begin
            SET @field = @field + 1
            SET @bit = (@field - 1) % 8 + 1
            SET @bit = power(2, @bit - 1)
            SET @char = ((@field - 1) / 8) + 1
            if (substring(COLUMNS_UPDATED(), @char, 1) & @bit > 0)
            begin
                SELECT @FName = name FROM syscolumns
                WHERE colid = @field
                  and id = (SELECT id FROM sysobjects WHERE name = 'TestTable')
                --getOldValue (from deleted)
                --getNewValue (from inserted)
                EXEC LogUpdate 'TestTable', @ID, @FName, @OldValue, @NewValue
            end
        end
        fetch Cur into @ID
    end
    close Cur
    deallocate Cur

My question is: how can I get the value of a specific column from the inserted and deleted tables? The problem is that I can't use dynamic SQL, because the inserted and deleted tables are not visible inside it. Any idea? It would be really great if someone could give me a clue...

thx
pakl
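As an aside, the byte/bit arithmetic against COLUMNS_UPDATED() can be checked in isolation. This is a sketch only (@ord is a hypothetical 1-based column ordinal, not part of the original trigger): the byte index is (ordinal - 1) / 8 + 1 and the bit mask within that byte is power(2, (ordinal - 1) % 8).

```sql
-- Sketch: compute which byte of the COLUMNS_UPDATED() bitmask holds a
-- column's flag, and which bit within that byte to test.
declare @ord int, @byteIndex int, @bitMask int
set @ord = 10                               -- e.g. the 10th column
set @byteIndex = ((@ord - 1) / 8) + 1       -- byte 2 for column 10
set @bitMask   = power(2, (@ord - 1) % 8)   -- bit mask 2 for column 10
select @byteIndex as ByteIndex, @bitMask as BitMask

-- Inside a trigger the test would then be:
-- if substring(COLUMNS_UPDATED(), @byteIndex, 1) & @bitMask > 0 ...
```

Note that for @ord = 8 this gives byte 1, bit mask 128, which is why the byte index must use (@ord - 1) / 8 rather than @ord / 8.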
Kristen (Test, 22859 Posts) - Posted 2005-06-13 : 10:12:23
Something like:

    INSERT INTO dbo.MyAuditTable
    SELECT CASE WHEN I.ID IS NULL AND D.ID IS NOT NULL THEN 'Deleted'
                WHEN I.ID IS NOT NULL AND D.ID IS NULL THEN 'Inserted'
                ELSE 'Updated'
           END as [Action],
           GetDate() as [AuditDate],
           MyTable.*
    FROM dbo.MyTable
         LEFT OUTER JOIN inserted I ON I.ID = MyTable.ID
         LEFT OUTER JOIN deleted D ON D.ID = MyTable.ID

MyAuditTable needs to have initial columns [Action] (VARCHAR) and [AuditDate], followed by all the same columns as the original table.

Alternatively, JUST store the details from DELETED with a WHERE clause of "D.ID IS NOT NULL" - that way the actual MyTable stores the CURRENT value, and the audit table ONLY stores the previous values. Saves a bit of space ...

Kristen
TG (Master Smack Fu Yak Hacker, 6065 Posts) - Posted 2005-06-13 : 10:17:24
Is your objective to use a single log table to capture changes from multiple tables, with a single column to hold all old values and a single column to hold all new values, no matter what the data type? That is not the typical way to log changes. A more typical method is to have one log table for each table you're logging changes for, and just write the new values (for the entire row) as inserts/updates/deletes occur.

It also looks like your logic is faulty, even if you could retrieve the old and new values where you want to. Your cursor is looping through rows, but your code seems to be written as if you were looping through columns (in a row). It's possible to do this, but one big issue I would have is storing all old and new values as a single data type. Am I misunderstanding your objective?

Be One with the Optimizer
TG
rockmoose (SQL Natt Alfen, 3279 Posts) - Posted 2005-06-13 : 11:10:19
Cool approach.... But!!! I think there are easier ways to accomplish logging from triggers.

You will have to tweak something like this into the trigger (yuk):

    use northwind
    go
    set nocount on

    -- select * from orders
    -- I use northwind..orders as an example.
    -- In the trigger you will be working with the inserted + deleted tables.

    -- Say this is the current field we are parsing...
    declare @FName varchar(50)
    set @FName = 'ShipName'   -- 'ShipName' 'CustomerID' 'OrderDate'

    -- Get the data into a temp table, use 1 row as an example
    -- (the temp table will be available in the dynamic sql)
    select top 1 * into #inserted from orders

    -- need some variables :)
    declare @dynamicsql nvarchar(1000)
    declare @parameterdef nvarchar(1000)
    declare @myvalue sql_variant

    set @parameterdef = '@myvalue sql_variant output'
    set @dynamicsql = 'select @myvalue=' + @FName + ' from #inserted'

    -- execute the dynamic sql and retrieve the value of the field
    -- as an output parameter
    exec sp_executesql @dynamicsql, @parameterdef, @myvalue output

    -- your value is available here
    select @myvalue as [dynamically retrieved value]

    -- cleanup
    drop table #inserted

Have Fun! Good Luck!
rockmoose

EDIT: I only did this so I could use the sql_variant datatype. And TG is right about the logic: I think you need an inner loop inside the cursor in the trigger.
X002548 (Not Just a Number, 15586 Posts)
TG (Master Smack Fu Yak Hacker, 6065 Posts) - Posted 2005-06-13 : 11:21:25
Well, if you wanted to take that logging approach, I'd probably go with something like this. (I would still use one log table per data table, though.)

    set nocount on

    --data table
    create table junk (
        junkid int identity(1,1) primary key
        ,c1 varchar(10)
        ,c2 varchar(10)
        ,c3 varchar(10)
        ,c4 varchar(10)
        ,c5 varchar(10)
        ,c6 int
        ,c7 int
        ,c8 int
        ,c9 int
        ,c10 int
    )
    go

    --log table
    create table EditLog (
        EditLogId int identity(1,1) primary key
        ,tableName varchar(30)
        ,actionType varchar(6)
        ,junkid int
        ,ColName varchar(30)
        ,OldValue varchar(30)
        ,NewValue varchar(30)
        ,updateBy varchar(30)
        ,updateDate datetime
    )
    go

    --insert sample data
    insert junk (c1,c2,c3,c4,c5,c6,c7,c8,c9,c10)
    select '1','1','1','1','1',1,1,1,1,1 union
    select '2','2','2','2','2',2,2,2,2,2 union
    select '3','3','3','3','3',3,3,3,3,3
    go

    --trigger for UPDATE
    create trigger trJunk on junk for update as
    begin
        insert EditLog (tablename, ActionType, junkid, colName,
                        oldValue, newValue, updateBy, updateDate)
        select 'junk'
            ,'Update'
            ,i.junkid
            ,c.Column_Name
            --convert every branch so the CASE stays a single character type
            ,OldValue = case
                when ordinal_position =  2 then convert(varchar(10), d.c1)
                when ordinal_position =  3 then convert(varchar(10), d.c2)
                when ordinal_position =  4 then convert(varchar(10), d.c3)
                when ordinal_position =  5 then convert(varchar(10), d.c4)
                when ordinal_position =  6 then convert(varchar(10), d.c5)
                when ordinal_position =  7 then convert(varchar(10), d.c6)
                when ordinal_position =  8 then convert(varchar(10), d.c7)
                when ordinal_position =  9 then convert(varchar(10), d.c8)
                when ordinal_position = 10 then convert(varchar(10), d.c9)
                when ordinal_position = 11 then convert(varchar(10), d.c10)
            end
            ,NewValue = case
                when ordinal_position =  2 then convert(varchar(10), i.c1)
                when ordinal_position =  3 then convert(varchar(10), i.c2)
                when ordinal_position =  4 then convert(varchar(10), i.c3)
                when ordinal_position =  5 then convert(varchar(10), i.c4)
                when ordinal_position =  6 then convert(varchar(10), i.c5)
                when ordinal_position =  7 then convert(varchar(10), i.c6)
                when ordinal_position =  8 then convert(varchar(10), i.c7)
                when ordinal_position =  9 then convert(varchar(10), i.c8)
                when ordinal_position = 10 then convert(varchar(10), i.c9)
                when ordinal_position = 11 then convert(varchar(10), i.c10)
            end
            ,system_user
            ,getdate()
        from inserted i
        join deleted d on i.junkid = d.junkid
        --cross join to column list (one row for each column)
        cross join information_schema.columns c
        where c.table_name = 'junk'
          --exclude the identity column
          and c.ordinal_position > 1
          --only the columns updated; note the byte index must be
          --(ordinal_position - 1) / 8 + 1, or columns 8, 16, ... land in the wrong byte
          and substring(columns_updated(), ((c.ordinal_position - 1) / 8) + 1, 1)
              & power(2, (c.ordinal_position - 1) % 8) > 0
    end
    go

    --perform some updates
    update junk
    set c2 = '10', c8 = 10
    where junkid in (1,2)

    update junk
    set c1 = '11', c2 = '11', c10 = 11
    where junkid in (3)

    --check the log table
    select * from EditLog
    go

    --clean up
    drop trigger trJunk
    go
    drop table editLog
    go
    drop table junk
    go

Be One with the Optimizer
TG
TG (Master Smack Fu Yak Hacker, 6065 Posts) - Posted 2005-06-13 : 11:23:55
>> Why does everyone over think this?

Because it's there... I didn't overthink what I would do; I agree with you, Brett. But that was a fun challenge (for me). Plus I'm bored.

Be One with the Optimizer
TG
rockmoose (SQL Natt Alfen, 3279 Posts) - Posted 2005-06-13 : 11:28:55
Very nice TG, envy.

Another caveat in the original approach: don't trust colid in syscolumns.

    create table x (a int, b int, c int)
    go
    select colid, name from syscolumns where id = object_id('x')
    alter table x drop column b
    go
    select colid, name from syscolumns where id = object_id('x')
    drop table x

rockmoose
Kristen (Test, 22859 Posts) - Posted 2005-06-13 : 11:29:39
We've done logging both ways. Of the two, the full-row audit table is MUCH MUCH MUCH better - i.e. one audit table per "real data" table.

The single-table-logs-all-changes-column-by-column approach suffers from three problems:

1. It is a nightmare maintaining the triggers for the tables (the above methods would work, but IME the performance was dreadful, so we unwrapped the code to handle each column by name - the trigger script was many-MANY-megabytes big).

2. It was a COMPLETE nightmare to write reports against.

3. The audit table was HUGE and had TERRIBLE performance - so many rows with so little useful indexable information ...

Don't do it. Just don't do it.

Kristen
Pakl (Starting Member, 10 Posts) - Posted 2005-06-14 : 03:39:03
Thank you all for the great input!!! (helped me a lot, TG, rockmoose, Kristen). Currently I am trying to incorporate these suggestions - I will post again when I am done.

A few comments:

1. Yes, I wanted (and still want) to log all changes into a single table; performance isn't an issue. I don't want to administer hundreds of log tables by hand...

2. In the original trigger there is the "column" loop (I forgot to post it) - sorry. I updated the code.

3. Why can't I trust colids in syscolumns? In your sample everything seems just fine!? These fields can't change while the trigger is executing, or can they?
Pakl (Starting Member, 10 Posts)
spirit1 (Cybernetic Yak Master, 11752 Posts) - Posted 2005-06-14 : 04:49:05
yeah... Nigel rocks!

Go with the flow & have fun! Else fight the flow
Pakl (Starting Member, 10 Posts) - Posted 2005-06-14 : 05:47:19
Okay... after looking at Nigel's excellent example I created this trigger, which does (at the moment) exactly what I want:

    CREATE TRIGGER TestTable_UpdateT ON [dbo].[TestTable]
    FOR UPDATE
    AS
    declare @bit int
        ,@field int
        ,@maxfield int
        ,@char int
        ,@fieldname varchar(128)
        ,@oldvalue varchar(1000)
        ,@newvalue varchar(1000)
        ,@ID int
        ,@TableName varchar(128)
        ,@sql varchar(2000)
        ,@IsOk int
    begin
        create table #Audit (FieldName varchar(128), OldValue varchar(1000),
                             NewValue varchar(1000), ID int)

        select * into #inserted from inserted
        select * into #deleted from deleted

        set @TableName = 'TestTable'

        select @field = 0, @maxfield = max(ORDINAL_POSITION)
        from INFORMATION_SCHEMA.COLUMNS
        where TABLE_NAME = @TableName

        while @field < @maxfield
        begin
            select @field = min(ORDINAL_POSITION)
            from INFORMATION_SCHEMA.COLUMNS
            where TABLE_NAME = @TableName and ORDINAL_POSITION > @field

            select @bit = (@field - 1) % 8 + 1
            select @bit = power(2, @bit - 1)
            select @char = ((@field - 1) / 8) + 1

            if substring(COLUMNS_UPDATED(), @char, 1) & @bit > 0
            begin
                select @fieldname = COLUMN_NAME
                from INFORMATION_SCHEMA.COLUMNS
                where TABLE_NAME = @TableName and ORDINAL_POSITION = @field

                if (dbo.GetDatatype4Field(@TableName, @fieldname) in
                    ('binary','varbinary','char','varchar','nchar','nvarchar',
                     'datetime','smalldatetime','decimal','numeric','float','real',
                     'bigint','int','smallint','tinyint','bit'))
                begin
                    set @sql = 'INSERT INTO #Audit (FieldName, OldValue, NewValue, ID)
                                SELECT ''#FIELDNAME#'' as FieldName,
                                       convert(varchar(1000), d.#FIELDNAME#) as OldValue,
                                       convert(varchar(1000), i.#FIELDNAME#) as NewValue,
                                       i.ID as ID
                                FROM #inserted i
                                     full outer join #deleted d on i.ID = d.ID'
                    set @sql = REPLACE(@sql, '#FIELDNAME#', @fieldname)
                    exec (@sql)
                end
            end
        end

        declare Cur cursor local for
            select FieldName, OldValue, NewValue, ID from #Audit
        open Cur
        fetch Cur into @fieldname, @OldValue, @NewValue, @ID
        while (@@fetch_status = 0)
        begin
            exec LogUpdate @TableName, @ID, @fieldname, @OldValue, @NewValue
            fetch Cur into @fieldname, @OldValue, @NewValue, @ID
        end
        close Cur
        deallocate Cur

        drop table #Audit
        drop table #inserted
        drop table #deleted
    end

And I created this small function...

    CREATE FUNCTION dbo.GetDatatype4Field (@Table varchar(255), @Field varchar(255))
    RETURNS varchar(50)
    AS
    BEGIN
        declare @returnValue varchar(50)
        select @returnValue = DATA_TYPE
        from INFORMATION_SCHEMA.COLUMNS
        where lower(TABLE_NAME) = lower(@Table)
          and lower(COLUMN_NAME) = lower(@Field)
        return @returnValue
    END

Thanks to all!
pakl

Things I didn't know before:
1. You can just dump the data from inserted/deleted into local temp tables.
2. You can reference local temp tables in dynamic SQL.
3. You can use INFORMATION_SCHEMA instead of joining over syscolumns.
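Point 2 in the list above is easy to verify outside a trigger. This is a minimal sketch (the #demo table and its columns are invented for illustration) showing that a local temp table created in the outer batch is visible to dynamic SQL run with sp_executesql in the same session:

```sql
-- Sketch: a local temp table created in the outer scope is visible
-- to dynamic SQL executed from the same session.
create table #demo (ID int, Name varchar(50))
insert #demo values (1, 'hello')

declare @sql nvarchar(200), @def nvarchar(100), @val sql_variant
set @def = N'@val sql_variant output'
set @sql = N'select @val = Name from #demo where ID = 1'

exec sp_executesql @sql, @def, @val output
select @val as RetrievedValue

drop table #demo
```

The reverse does not hold: a temp table created inside the dynamic SQL is dropped when that inner batch ends, which is why the trigger above copies inserted and deleted into #inserted and #deleted first.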
rockmoose (SQL Natt Alfen, 3279 Posts) - Posted 2005-06-14 : 07:29:56
Cool, I guess you prefer generality over performance. Nigel rocks.

<reminder to self> before doing anything, check if Nigel did it better before... </>

syscolumns is ok; it's just that the colid numbering sequence can contain gaps. But that's not an issue in this case.

rockmoose
Kristen (Test, 22859 Posts) - Posted 2005-06-15 : 07:41:47
"I don't want to administer hundreds of log tables by hand"

Me neither ... all of our "Auditing Triggers" are mechanically generated - i.e. a couple of SQL statements will create them for all appropriate tables.

Kristen
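Kristen doesn't show the generator itself, but as a hedged illustration of the idea, a script can build one CREATE TRIGGER statement per table from INFORMATION_SCHEMA. The Audit_ naming convention and the assumption that every table already has an audit twin are inventions here, not Kristen's code:

```sql
-- Sketch only: emits one CREATE TRIGGER statement per base table.
-- Assumes each table T already has an audit table Audit_T with the extra
-- [Action]/[AuditDate] columns, as in Kristen's earlier post.
select 'CREATE TRIGGER tr_' + TABLE_NAME + '_audit ON [' + TABLE_NAME + '] ' +
       'FOR INSERT, UPDATE, DELETE AS ' +
       'INSERT INTO Audit_' + TABLE_NAME + ' ' +
       'SELECT ''Changed'', GETDATE(), d.* FROM deleted d'
from INFORMATION_SCHEMA.TABLES
where TABLE_TYPE = 'BASE TABLE'
  and TABLE_NAME not like 'Audit_%'
-- copy the result rows into a query window and execute them
```

A real generator would also distinguish insert/update/delete in the [Action] column and regenerate the audit tables when the base schema changes; this sketch only shows the metadata-driven shape of the approach.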
Pakl (Starting Member, 10 Posts) - Posted 2005-06-17 : 05:31:30
quote: Originally posted by Kristen

    "I don't want to administer hundreds of log tables by hand"

    Me neither ... all of our "Auditing Triggers" are mechanically generated - i.e. a couple of SQL statements will create them for all appropriate tables.

    Kristen

You're right, of course. That's a great way to do that...