Centralized Asynchronous Auditing with Service Broker

By Mladen Prajdić on 16 July 2007 | Tags: Administration , Service Broker

Service Broker is a new feature in SQL Server 2005. It is an integrated part of the database engine and provides queuing and reliable, direct, asynchronous messaging, currently only between SQL Server 2005 instances; support for non-SQL Server instances is planned for the future. This article shows how to use Service Broker and triggers to capture data changes.

How Service Broker works

Service Broker works by conversing with other service brokers: it exchanges messages with them in a dialog conversation. Imagine two people talking to each other. The words they exchange are messages and their conversation is a dialog. To understand the basics we must first become familiar with the terminology and what each piece does. I will show only the basic commands; more information can be found in SQL Server Help, better known as Books Online.

The commands are explained in the order in which the objects must be created for a Service Broker conversation.

The basis of everything is a message type. A message type defines what validation will be performed on the XML messages that are exchanged.

The general syntax is:

CREATE MESSAGE TYPE message_type_name
 [ AUTHORIZATION owner_name ]
 [ VALIDATION = { NONE
   | EMPTY
   | WELL_FORMED_XML
   | VALID_XML WITH SCHEMA COLLECTION schema_collection_name } ]

This is how to create a simple message type whose validation requires well formed XML:

CREATE MESSAGE TYPE [//Audit/Message] VALIDATION = WELL_FORMED_XML

The next step is to create a contract. A contract defines which message types are allowed in a conversation. For example, if we take our two people talking from before, a contract means that they are only allowed to talk about sports. Anything else is rejected by both persons as garbage.

The general syntax is:

CREATE CONTRACT contract_name
 [ AUTHORIZATION owner_name ]
 ( { { message_type_name | [ DEFAULT ] }
     SENT BY { INITIATOR | TARGET | ANY }
   } [ ,...n ] )

In the SENT BY clause we specify which messages each end of the conversation is allowed to send. So we can specify that the initiator may talk about sports and women, while the target may only talk about sports.

This is how to create a simple contract using the previously created message type, which may only be sent by the initiator:

CREATE CONTRACT [//Audit/Contract] ([//Audit/Message] SENT BY INITIATOR)

Next comes a queue. A queue holds every message received by each end of the conversation. Each end of the conversation has its own queue in which the received messages wait for processing.

The general syntax is:

CREATE QUEUE [ database_name. [ schema_name ] . | schema_name. ] queue_name
 [ WITH
   [ STATUS = { ON | OFF } [ , ] ]
   [ RETENTION = { ON | OFF } [ , ] ]
   [ ACTIVATION (
      [ STATUS = { ON | OFF } , ]
      PROCEDURE_NAME = <procedure_name> ,
      MAX_QUEUE_READERS = max_readers ,
      EXECUTE AS { SELF | 'user_name' | OWNER } ) ] ]

This is how to create a simple queue which specifies:

  • the stored procedure to execute when a new message arrives in the queue
  • the maximum number of concurrently running copies of that procedure (for very busy queues)
  • the user under whose context the procedure will run

CREATE QUEUE dbo.TargetAuditQueue
 WITH ACTIVATION (
  PROCEDURE_NAME = usp_WriteAuditData, -- sproc to run when the queue receives a message
  MAX_QUEUE_READERS = 50, -- max concurrently executing instances of the sproc
  EXECUTE AS 'dbo' ); -- user context under which the sproc runs

Every queue is associated with a service. A service exposes the functionality of its associated contracts to other services and defines which message types the associated queue will receive; other types are rejected. If no contract is specified, the service can only be the initiator of a conversation. When creating a service we have to specify which queue it will use and which contracts are allowed.

The general syntax is:

CREATE SERVICE service_name
 [ AUTHORIZATION owner_name ]
 ON QUEUE [ schema_name. ]queue_name
 [ ( contract_name | [DEFAULT] [ ,...n ] ) ]

This is how to create a simple service, with dbo authorization, that uses the previously created contract on our previously created queue:

CREATE SERVICE [//Audit/DataWriter]
 AUTHORIZATION dbo
 ON QUEUE dbo.TargetAuditQueue ([//Audit/Contract])

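With the message type, contract, queue and service in place, a conversation can be started. The sketch below shows a minimal example of opening a dialog and sending one message; note that the initiator service name [//Audit/DataSender] is an assumption for illustration only and is not one of the objects created above.

```sql
-- Minimal sketch: open a dialog from a hypothetical initiator service
-- [//Audit/DataSender] to the target service created above and send one message.
DECLARE @dialogHandle UNIQUEIDENTIFIER;

BEGIN DIALOG CONVERSATION @dialogHandle
  FROM SERVICE [//Audit/DataSender]   -- assumed initiator service
  TO SERVICE '//Audit/DataWriter'     -- target service created above
  ON CONTRACT [//Audit/Contract]
  WITH ENCRYPTION = OFF;

-- The body must be well formed XML to pass the message type's validation.
SEND ON CONVERSATION @dialogHandle
  MESSAGE TYPE [//Audit/Message]
  (N'<Audit><Source>TestDb1</Source></Audit>');
```

Note that TO SERVICE takes the service name as a string literal, while FROM SERVICE, ON CONTRACT and MESSAGE TYPE take identifiers.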
Building the Centralized Asynchronous Auditing System

Now that we're familiar with the basics of Service Broker we can go on with building our auditing system. Auditing is the means of tracking changes to your data. It provides you with a log of who did what and when. In the US and EU it is also required by law for sensitive data.

Auditing is usually done by inserting the changed data into an accompanying audit table from a trigger on the source table. Alternatively it can be part of the insert, update and delete stored procedures, in which case no triggers are needed. The simplest way to implement auditing is to give every table an audit table with the same structure. The downside is that these audit tables grow very fast, which increases database size and backup/restore times. With many tables this becomes cumbersome and hard to maintain. Imagine the work if you have 10 databases, each with 40 tables: that's 10 x 40 x 2 = 800 tables you have to create, and those are small databases table-wise.

Another option is to use third-party Log Readers but they aren't fun, are they? :)

I went about it differently. I wanted a single database to hold all the audited data from every database on the server. This way my other databases would be free of bloated audit data, and if I wanted to query the audited data I could simply select from one table that holds everything. I also didn't want this to hurt performance. How to implement this? Service Broker to the rescue.

Service Broker's reliable asynchronous messaging was the perfect solution. I still used triggers, but the technique can easily be applied in stored procedures with the OUTPUT clause. I created two databases, each with its own Service Broker. The auditing database is called MasterAuditDatabase and the database to be audited is called TestDb1. Both databases have SET TRUSTWORTHY ON, which enables cross-database Service Broker communication without the use of certificates. Each database has its own error table that holds errors that happen in Service Broker communication. Yes, they may happen :)

When an insert, update or delete happens our audit trigger simply takes appropriate data from the inserted and deleted pseudo tables, turns it into our well formed XML message and uses Service Broker to send it to another Service Broker in the Master Audit Database which then saves it to our auditing table. Every time a new message arrives to our target queue, a stored procedure is executed that inserts the queued message into the audit table.
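A hedged sketch of what such a trigger might look like is below; the table dbo.Orders, the trigger name and the initiator service [//Audit/DataSender] are assumptions for illustration, not the actual downloadable code.

```sql
-- Sketch of an audit trigger on an assumed table dbo.Orders.
CREATE TRIGGER dbo.trg_Orders_Audit ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
  SET NOCOUNT ON;

  -- Build one well formed XML message from the inserted and deleted pseudo tables.
  DECLARE @msg XML;
  SET @msg =
    (SELECT
       (SELECT * FROM inserted FOR XML PATH('row'), ROOT('inserted'), TYPE),
       (SELECT * FROM deleted  FOR XML PATH('row'), ROOT('deleted'),  TYPE)
     FOR XML PATH('Audit'));

  -- Send it to the Service Broker in MasterAuditDatabase and return immediately.
  DECLARE @dialogHandle UNIQUEIDENTIFIER;
  BEGIN DIALOG CONVERSATION @dialogHandle
    FROM SERVICE [//Audit/DataSender]   -- assumed initiator service in TestDb1
    TO SERVICE '//Audit/DataWriter'     -- target service in MasterAuditDatabase
    ON CONTRACT [//Audit/Contract]
    WITH ENCRYPTION = OFF;

  SEND ON CONVERSATION @dialogHandle
    MESSAGE TYPE [//Audit/Message]
    (@msg);
END;
```

The trigger only builds and sends the message; all the writing happens later, on the target side.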

On busy systems a single stored procedure instance couldn't possibly cope with all the incoming messages. That is why we set MAX_QUEUE_READERS = 50, allowing up to 50 concurrent (parallel) queue readers. Because this is a completely asynchronous operation, there is no performance impact on the initiator side: the triggers return immediately.
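A hedged sketch of what the activation procedure on the target queue might look like follows; the audit table dbo.AuditData and its columns are assumptions, not the actual downloadable code.

```sql
-- Sketch of the activation procedure attached to dbo.TargetAuditQueue.
CREATE PROCEDURE dbo.usp_WriteAuditData
AS
BEGIN
  SET NOCOUNT ON;

  DECLARE @dialogHandle UNIQUEIDENTIFIER,
          @messageType  SYSNAME,
          @messageBody  XML;

  WHILE (1 = 1)
  BEGIN
    -- Wait up to 5 seconds for a message; if none arrives, let the
    -- activated procedure exit so it isn't spinning on an empty queue.
    WAITFOR (
      RECEIVE TOP (1)
        @dialogHandle = conversation_handle,
        @messageType  = message_type_name,
        @messageBody  = CAST(message_body AS XML)
      FROM dbo.TargetAuditQueue
    ), TIMEOUT 5000;

    IF @@ROWCOUNT = 0 BREAK;

    -- Write audit messages to the assumed audit table; end-dialog and
    -- error messages just end the conversation below.
    IF @messageType = N'//Audit/Message'
      INSERT INTO dbo.AuditData (AuditDate, AuditXml)
      VALUES (GETDATE(), @messageBody);

    END CONVERSATION @dialogHandle;
  END;
END;
```

Up to MAX_QUEUE_READERS copies of this procedure are started automatically as the queue fills up.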

The code can be downloaded and is well documented so I hope it speaks for itself. The code is broken into two scripts.

  • The Master Audit Database (the database that holds all audited information) and its Service Broker infrastructure
  • Sample audited database and its Service Broker infrastructure

Note: You will need to copy the GUID returned by the first script and use it in the second script.

This method of auditing proved to be very good, with no noticeable impact on performance. Of course the master audit database grows fast, so horizontal partitioning will surely be needed. The technique can also be used across multiple servers, with one server acting as the keeper of audited data from the others. For that we'd have to create TCP/IP endpoints, users, certificates and other security measures, which I will show how to do in the next article.


Service Broker is a great addition to SQL Server and its use has barely scratched the surface. The future is leaning toward service-oriented architecture (SOA) and loosely coupled applications. With the coming of .NET 3.5 and the adoption of Windows Communication Foundation and Windows Workflow Foundation, Service Broker will fit in nicely. So learn it.

