Improving SQL Service Broker Performance

A while back I was talking to a Microsoft Support Engineer, and he mentioned that in a high-load Service Broker environment such as ours, an impressive performance improvement can be achieved by reusing Service Broker conversations.

Creating and closing a new conversation for every message costs roughly 4x as much as reusing an existing conversation, and the performance gain on the receiving side is roughly 10x. Remus Rusanu talks more about the numbers and shows a solution for reusing conversations in his blog post Reusing Conversations.

I liked Remus’s solution, but the issue I had with it was that I didn’t want a different conversation for each SPID. With that approach I would end up with hundreds of open conversations, and I’d need a job to close them. Within our application any number of events can trigger a Service Broker message, and we usually have a few hundred threads logging in and out of the database at any one time.

That meant taking Remus’s solution and making it more flexible before moving it into our environment. The solution I’ve come up with uses a single conversation for each process within our application, and at random intervals those conversations are closed and new ones are started.

First, let me give you a little background on our Service Broker setup. I’ve got two queues and two services, and each service is tied to its own queue. I’ve got a single message type which is used to send the message from the source to the destination.

My Message Type is called MT_ObjectDelete
My Contract is called CT_ObjectDelete
My Queues are called Q_ObjectDelete_Source and Q_ObjectDelete_Destination
My Services are called Svc_ObjectDelete_Source and Svc_ObjectDelete_Destination
I also have another Message Type called MT_ConversationSwitch which is also part of the CT_ObjectDelete contract.
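
For reference, the objects above can be created with something like the following DDL. This is a sketch rather than our exact scripts; in particular, the VALIDATION settings on the message types are assumptions.

-- Sketch of the Service Broker objects described above.  The VALIDATION
-- options and which side may send each type are assumptions and may
-- differ from our production setup.
CREATE MESSAGE TYPE [MT_ObjectDelete]       VALIDATION = WELL_FORMED_XML;
CREATE MESSAGE TYPE [MT_ConversationSwitch] VALIDATION = EMPTY;

CREATE CONTRACT [CT_ObjectDelete]
(
    [MT_ObjectDelete]       SENT BY INITIATOR,
    [MT_ConversationSwitch] SENT BY ANY
);

CREATE QUEUE [Q_ObjectDelete_Source];
CREATE QUEUE [Q_ObjectDelete_Destination];

CREATE SERVICE [Svc_ObjectDelete_Source]
    ON QUEUE [Q_ObjectDelete_Source] ([CT_ObjectDelete]);
CREATE SERVICE [Svc_ObjectDelete_Destination]
    ON QUEUE [Q_ObjectDelete_Destination] ([CT_ObjectDelete]);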

To store the conversation handles which I’ll be generating, I have an existing table called Setting which holds a variety of application settings. I’ve added new rows to this table to hold the conversation handles. The lookup name in the Setting table is SSB_Session_Delete, and I use a function called GetSystemSettingValue to look into the Setting table and get the value (a sketch of that function appears after the sending procedure below). My stored procedure for sending data looks like this.

CREATE PROCEDURE [dbo].[SendDelete]
    @FileName varchar(1024)
AS
BEGIN
    DECLARE @DialogHandle uniqueidentifier,
            @msg XML;

    /* Build the message body from the file name.  The exact shape of the
       XML is application specific; this is just a placeholder. */
    SET @msg = (SELECT @FileName AS [FileName] FOR XML PATH('ObjectDelete'), TYPE);

    IF @msg IS NOT NULL
    BEGIN
        /* Look up the conversation handle that is currently being reused. */
        SET @DialogHandle = CAST(dbo.GetSystemSettingValue('SSB_Session_Delete') AS uniqueidentifier);

        /* Roughly one call in a hundred, or when no handle has been stored
           yet, switch to a new conversation. */
        IF CAST(RAND() * 100 AS INT) = 0 OR @DialogHandle IS NULL
        BEGIN
            /* Tell the receiving procedure to end the old conversation. */
            IF @DialogHandle IS NOT NULL
            BEGIN
                SEND ON CONVERSATION @DialogHandle
                    MESSAGE TYPE [MT_ConversationSwitch];
            END;

            BEGIN DIALOG CONVERSATION @DialogHandle
                FROM SERVICE [Svc_ObjectDelete_Source]
                TO SERVICE 'Svc_ObjectDelete_Destination'
                ON CONTRACT [CT_ObjectDelete];

            UPDATE Setting
                SET DefaultValue = @DialogHandle
                WHERE SettingName = 'SSB_Session_Delete';
        END;

        SEND ON CONVERSATION @DialogHandle
            MESSAGE TYPE [MT_ObjectDelete]
            (@msg);

        /* If another thread switched conversations underneath us, flag this
           conversation so the receiving procedure ends it. */
        IF @DialogHandle <> CAST(dbo.GetSystemSettingValue('SSB_Session_Delete') AS uniqueidentifier)
        BEGIN
            SEND ON CONVERSATION @DialogHandle
                MESSAGE TYPE [MT_ConversationSwitch];
        END;
    END
END
GO

As you can see from the code, I do a lookup to get the existing conversation handle. If no handle exists, I create a new conversation and store its handle in my Setting table. If CAST(RAND()*100 AS INT) returns 0, I first send a message of the MT_ConversationSwitch type on the existing conversation, which triggers logic in my receiving procedure telling it to end that conversation; I then create the new conversation and store its handle.

I then send the message on that conversation handle. Finally, I check the Setting table one more time to ensure that the handle I’m using still matches the table. If it does not, I send an MT_ConversationSwitch message on the conversation so it gets closed, as I assume another thread has been running the same block of code and I don’t want to leave this conversation hanging out there forever.
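
For completeness, here is a minimal sketch of what GetSystemSettingValue looks like, assuming the Setting table has just a SettingName and a DefaultValue column. The real table and function in our application carry a few more columns and options.

-- Minimal sketch of the lookup function; column names and data types
-- are assumptions based on how the function is used above.
CREATE FUNCTION [dbo].[GetSystemSettingValue]
(
    @SettingName nvarchar(128)
)
RETURNS nvarchar(256)
AS
BEGIN
    DECLARE @Value nvarchar(256);

    SELECT @Value = DefaultValue
    FROM dbo.Setting
    WHERE SettingName = @SettingName;

    RETURN @Value;
END
GO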

My receive procedure is fairly basic.

CREATE PROCEDURE [dbo].[util_ProcessDelMonitorData]
    @MsgToRead INT = 1000   -- not used by this version of the procedure
AS
BEGIN
    DECLARE @message_type_name sysname;
    DECLARE @message_body VARBINARY(max);
    DECLARE @conversation_handle UNIQUEIDENTIFIER;

    DECLARE @msgTable TABLE
    (
        message_body VARBINARY(max),
        [conversation_handle] UNIQUEIDENTIFIER
    );

    WHILE 1 = 1
    BEGIN
        WAITFOR (RECEIVE TOP (1)
                     @message_body = message_body,
                     @conversation_handle = [conversation_handle],
                     @message_type_name = message_type_name
                 FROM [Q_ObjectDelete_Destination]), TIMEOUT 1000;

        /* Nothing came back before the timeout, so stop looping and
           process what has been collected. */
        IF @conversation_handle IS NULL
        BEGIN
            BREAK;
        END;

        /* The sender is switching conversations, so end this one. */
        IF @message_type_name = 'MT_ConversationSwitch'
        BEGIN
            END CONVERSATION @conversation_handle;
        END;

        IF @message_body IS NOT NULL
        BEGIN
            INSERT INTO @msgTable (message_body, [conversation_handle])
            VALUES (@message_body, @conversation_handle);
        END;

        SET @conversation_handle = NULL;
        SET @message_type_name = NULL;
        SET @message_body = NULL;
    END

    /* Business logic happens here */
END
GO

As you can see, I do a standard receive of one message at a time. If the RECEIVE times out and no conversation handle comes back, I break out of the loop and process the data which I have logged into the @msgTable table variable.

If the message_type_name is the MT_ConversationSwitch message type I do an END CONVERSATION on the conversation.

If there is data in the @message_body variable I dump the value into the table variable for later processing by the business logic.

There is also an activated procedure on the sending queue which simply does an END CONVERSATION on each message that it receives.
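
That activation procedure isn’t shown above, but it only takes a few lines. Here is a minimal sketch; the procedure name util_EndSourceConversations and the activation settings are assumptions rather than the exact code from our environment.

-- Sketch of an activated procedure for the sending queue.  It ends the
-- local side of the conversation for every message that arrives
-- (EndDialog, Error, and so on).
CREATE PROCEDURE [dbo].[util_EndSourceConversations]
AS
BEGIN
    DECLARE @conversation_handle UNIQUEIDENTIFIER;

    WHILE 1 = 1
    BEGIN
        WAITFOR (RECEIVE TOP (1)
                     @conversation_handle = [conversation_handle]
                 FROM [Q_ObjectDelete_Source]), TIMEOUT 1000;

        IF @conversation_handle IS NULL
        BEGIN
            BREAK;
        END;

        END CONVERSATION @conversation_handle;

        SET @conversation_handle = NULL;
    END
END
GO

-- Attach the procedure to the source queue; the reader count and
-- EXECUTE AS setting are assumptions.
ALTER QUEUE [Q_ObjectDelete_Source]
    WITH ACTIVATION
    (
        STATUS = ON,
        PROCEDURE_NAME = [dbo].[util_EndSourceConversations],
        MAX_QUEUE_READERS = 1,
        EXECUTE AS OWNER
    );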

Hopefully you will find this example helpful. If you have any questions on it please post them and I’ll get a response to you as quickly as I can.

Denny
