May 28, 2008
 

A question came up a few days ago on the MSDN SQL Forums asking how to audit server log-offs.  The poster mentioned that there is a logon trigger, but couldn’t find a logout trigger.  After a bit of thought, and a quick discussion with Rob Farley, we figured that Service Broker should be able to process the AUDIT_LOGOUT events and capture the results to a table.

Here’s the code:

 
USE master;
GO
CREATE DATABASE AuditLogOuts;
GO

-- Enable Service Broker for the new database
ALTER DATABASE AuditLogOuts SET ENABLE_BROKER;
GO

USE AuditLogOuts;
GO
-- Create a queue
CREATE QUEUE Queue_Audit_LogOut; 
GO
-- Create a service
CREATE SERVICE Service_Audit_LogOut
ON QUEUE Queue_Audit_LogOut([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);
GO
-- Create a route
CREATE ROUTE Logon_Triggers_Route 
WITH SERVICE_NAME = N'Service_Audit_LogOut', 
ADDRESS = N'LOCAL';
GO
-- Create the event notification at the server level for the AUDIT_LOGOUT event
CREATE EVENT NOTIFICATION Notification_Audit_LogOut ON SERVER 
FOR AUDIT_LOGOUT TO SERVICE 'Service_Audit_LogOut', 'current database';
GO

CREATE TABLE AuditLogOuts(
      DateAudited datetime not null default getdate()
    , PostTime datetime
    , SPID int
    , DatabaseID int
    , NTUserName nvarchar(128)
    , NTDomainName nvarchar(128)
    , HostName nvarchar(128)
    , ApplicationName nvarchar(128)
    , LoginName nvarchar(128)
    , Duration bigint
    , SourceData XML
);
GO

-- Create the stored procedure that will handle the events
-- QUOTED_IDENTIFIER must be ON so the procedure can use the XML data type methods
SET QUOTED_IDENTIFIER ON;
GO
CREATE PROCEDURE usp_Process_Audit_LogOut AS
BEGIN
    SET NOCOUNT ON;
    -- Use an endless loop to receive messages
    WHILE (1 = 1)
    BEGIN
        DECLARE @messageBody VARBINARY(MAX);
        DECLARE @messageTypeName NVARCHAR(256);
        WAITFOR ( 
            RECEIVE TOP(1) @messageTypeName = message_type_name, @messageBody = message_body
            FROM Queue_Audit_LogOut
        ), TIMEOUT 500
        -- If there is no message, exit
        IF @@ROWCOUNT = 0
        BEGIN
            BREAK;
        END;

        -- If the message type is EventNotification do the actual work 
        IF (@messageTypeName = 'http://schemas.microsoft.com/SQL/Notifications/EventNotification')
        BEGIN
    
            DECLARE
                  @xmldata xml
                , @PostTime datetime
                , @SPID int
                , @DatabaseID int
                , @NTUserName nvarchar(128)
                , @NTDomainName nvarchar(128)
                , @HostName nvarchar(128)
                , @ApplicationName nvarchar(128)
                , @LoginName nvarchar(128)
                , @Duration bigint

            SET @xmldata = CAST(@messageBody AS XML);

            SET @PostTime = @xmldata.value('(/EVENT_INSTANCE/PostTime)[1]', 'datetime');
            SET @SPID = @xmldata.value('(/EVENT_INSTANCE/SPID)[1]', 'int');
            SET @DatabaseID = @xmldata.value('(/EVENT_INSTANCE/DatabaseID)[1]', 'int');
            SET @NTUserName = @xmldata.value('(/EVENT_INSTANCE/NTUserName)[1]', 'nvarchar(128)');
            SET @NTDomainName = @xmldata.value('(/EVENT_INSTANCE/NTDomainName)[1]', 'nvarchar(128)');
            SET @HostName = @xmldata.value('(/EVENT_INSTANCE/HostName)[1]', 'nvarchar(128)');
            SET @ApplicationName = @xmldata.value('(/EVENT_INSTANCE/ApplicationName)[1]', 'nvarchar(128)');
            SET @LoginName = @xmldata.value('(/EVENT_INSTANCE/LoginName)[1]', 'nvarchar(128)');
            SET @Duration = @xmldata.value('(/EVENT_INSTANCE/Duration)[1]', 'bigint');

            INSERT INTO AuditLogOuts(SourceData, PostTime, SPID, DatabaseID, NTUserName, 
                NTDomainName, HostName, ApplicationName , LoginName, Duration) 
            VALUES (@xmldata, @PostTime, @SPID, @DatabaseID, @NTUserName, 
                @NTDomainName, @HostName, @ApplicationName, @LoginName, @Duration);

            -- Process your cleanup, based on your username, or SPID, or whatever
        END;
    END;
END;
GO

-- Link the stored procedure to the Queue_Audit_LogOut 
ALTER QUEUE Queue_Audit_LogOut WITH STATUS=ON,
   ACTIVATION (STATUS=ON, PROCEDURE_NAME = usp_Process_Audit_LogOut, MAX_QUEUE_READERS = 1, EXECUTE AS SELF);
GO


-- Open and close some connections, then have a look in the table:
SELECT * FROM AuditLogOuts;
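
-- As a rough example of what you can then do with the captured data, here is a
-- summary query (a sketch only, against the AuditLogOuts table above, and assuming
-- the Duration value is reported in microseconds, as per the SQL Server 2005 trace event):
SELECT
      LoginName
    , COUNT(*)                  AS LogOutCount
    , MAX(PostTime)             AS LastLogOut
    , AVG(Duration) / 1000000.0 AS AvgSessionSeconds -- assumes Duration is in microseconds
FROM AuditLogOuts
GROUP BY LoginName
ORDER BY LogOutCount DESC;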


-- Rollback - remove the event notification, then drop the database.
/*
DROP EVENT NOTIFICATION Notification_Audit_LogOut ON SERVER
GO
USE master
GO
DROP DATABASE AuditLogOuts
GO
*/

Special thanks to Cristian Lefter’s article for the original framework for using Service Broker.

May 27, 2008
 

I was tuning a SQL Server 2000 query this afternoon which had a function inside the WHERE clause.  The query was well indexed, and the performance problem was due entirely to the WHERE clause (there were also six other functions being called in the SELECT clause, but as no rows were being returned, they weren’t a problem).  The function in the WHERE clause also called another function, so with 4,500 rows to be processed in the WHERE clause, that’s 9,000 function calls.

In SQL Server 2005, the engine would cache those calls, so the function would only need to be compiled/looked up in the proc cache once per query.  However, in SQL Server 2000, this is not the case, and the function is compiled/looked up every time it’s called – in this case, 9,000 function recompilations.

There are a number of things that can be tried to improve this performance – such as removing the function calls and replacing them with in-line SQL code (depending on the complexity of the function).  The downside to this is that the query becomes less maintainable, which is probably the reason a function was added in the first place!  Another option is to evaluate the function’s logic once for the entire set of possible input values (such as in a view or temporary table) and then join this onto the row set, restricting it to the rows you want.  This minimises function calls, but may not be as flexible as needed.
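
As a rough illustration of that second approach (the table and function names below are made up for the example; dbo.fn_GetRegion stands in for the nested function from the real query), the difference looks something like this:

-- Original form: the scalar function (which itself calls another
-- function) runs once for every candidate row
SELECT o.OrderID, o.CustomerID
FROM dbo.Orders o
WHERE dbo.fn_GetRegion(o.CustomerID) = 'Victoria';

-- Alternative: evaluate the function once per distinct input into a
-- work table, then join onto it and filter on the stored result
SELECT c.CustomerID, dbo.fn_GetRegion(c.CustomerID) AS Region
INTO #CustomerRegion
FROM dbo.Customers c;

SELECT o.OrderID, o.CustomerID
FROM dbo.Orders o
JOIN #CustomerRegion r ON r.CustomerID = o.CustomerID
WHERE r.Region = 'Victoria';

With 4,500 rows to filter but far fewer distinct inputs, the number of function calls drops accordingly.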

The final option is to upgrade to SQL Server 2005.

TIP: When tuning queries with a function call, try running the query without the function call and compare the duration.  This allows you to quickly see if the function call is responsible for the performance issue, or if it’s something else.
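
For example (again with made-up object names), a quick way to make that comparison is to run both versions with timing turned on and compare the elapsed times:

SET STATISTICS TIME ON;

-- With the function call in the WHERE clause
SELECT COUNT(*)
FROM dbo.Orders o
WHERE dbo.fn_GetRegion(o.CustomerID) = 'Victoria';

-- Baseline: the same query with the function call removed
SELECT COUNT(*)
FROM dbo.Orders o;

SET STATISTICS TIME OFF;

If the two durations are close, the function isn’t the bottleneck and the problem lies elsewhere.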

May 25, 2008
 

On Friday night, Phil and I went to Swinburne University’s Centre for Advanced Internet Architectures (CAIA) to act as guinea pigs for Warren’s PhD.  Essentially, he is researching how to monitor and manage large networks by using a 3D game engine.  The engine being used is Quake3, as it is open source and runs on all major platforms.  The tool he has created is called L3DGEWorld.  A video is available that shows how a port scan can show up in the 3D world.

The exciting part of the usability test consisted of five scenarios which Phil and I had to analyse to figure out what was going on.  We flew around a virtual world containing items that represented laptops, routers, greynet hosts, and VoIP phones.  As each device reported activity, its visual representation would change: more active devices would be larger and spin faster, whereas broken devices would bounce to get attention.  By flying around the virtual world, Phil and I were able to direct each other’s attention to things that looked out of place.

We had a beer and a discussion about how this could be useful in the SQL world, and decided that it really comes into its own when you have a large amount of data to look at visually.  For example, a graphical representation of sp_who2 would come in handy, so you could see exactly which processes were affecting the server.  However, a simple list ordered correctly could give you the same information.

But it’s a lot more fun blowing away an offending process with a nailgun.

May 21, 2008
 

I went to the Melbourne SQL Server User Group  meeting last night to hear Darren Gosbell give a lightning fast overview of the new and improved BI features in SQL Server 2008.  I haven’t had too much to do with the BI side of SQL Server (with the exception of Reporting Services), but it was an excellent presentation, and very informative.

Darren broke the presentation up into three sections – highlighting the performance enhancements of the SSRS, SSIS and SSAS engines, the changes to the designers and wizards, and the new disaster recovery features.

SSIS benefits from an automatic threading model that will detect if a second thread can be started to help process rows.  The example was one million rows to be processed in a "complex" manner.  SSIS in 2005 would process the one million rows in batches, while 2008 detected that the complex operation was taking time, and would start an additional thread to make better use of the available CPU.  This resulted in the task taking half the time it did in 2005.

SSRS has had massive architectural changes, with the removal of IIS as a requirement.  Report Manager has been combined with the web services, and memory is now managed in a much more efficient way, preventing out-of-control memory conditions.  SSRS also has the ability to spool pages immediately, and can output to Word documents.  Additionally, reports can now contain a "tablix", which is a combination of a table and a matrix.

All up, an excellent presentation.  I’m looking forward to getting my hands on the Reporting Services side of things!