Madeira Team

Service Broker Sample: Multi-Threading

Introduction

Service Broker is a cool feature that has been around since SQL Server 2005. Yes, SQL Server 2012 just came out, so it’s hardly “new” anymore. But in reality, many DBAs and DB users are still not aware of this feature, don’t know how to utilize it, or are afraid to do so because they’re not familiar with it.

Using this short series of posts, I’m hoping to ease the minds of such people.

In short, Service Broker is an asynchronous queuing engine that works inside SQL Server. Using this engine we can send and receive binary or XML messages between queues. When a message arrives in a queue, Service Broker executes an “activation procedure” that you specify, which is then responsible for handling the messages in that queue.
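To make this concrete, here is a minimal sketch of a queue with an activation procedure attached. The object names dbo.MyQueue and dbo.MyHandler are made up for illustration (and the handler procedure must already exist); they are not part of this sample:

Transact-SQL

-- a queue that automatically runs dbo.MyHandler whenever a message arrives
CREATE QUEUE dbo.MyQueue
WITH STATUS = ON,
     ACTIVATION
     (
         STATUS = ON,
         PROCEDURE_NAME = dbo.MyHandler,  -- the "activation procedure"
         MAX_QUEUE_READERS = 5,           -- up to 5 concurrent reader instances
         EXECUTE AS OWNER
     );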

Since we have a lot to cover in our sample, I won’t go into too much detail introducing Service Broker. But Microsoft has an excellent and detailed chapter about Service Broker available right here: http://msdn.microsoft.com/en-us/library/bb522893.aspx

Sample 1: Multi-Threading

I’ve encountered, more than once, a scenario in which an application needs to execute several queries that are independent of each other, and then group their results together somehow. The problem is that each of these queries takes a while to execute.

One might wonder: if only we could use some kind of multi-threading mechanism to execute them in parallel, then the total running time would be that of the slowest query alone, rather than the sum of all of them.

Well, it just so happens that our first sample will show how you can implement “multi-threading” in SQL Server using Service Broker queues!

The Design

First, I’ll explain the general idea of the design:

Since Service Broker only works with binary or XML messages, we’ll use XML to send a “query request”, and wait for a “query response”.

Specifically, we’ll send a message containing some kind of a dynamic query. But that dynamic query must return its result in XML form using an output parameter. Then, Service Broker will execute the query, retrieve the result from the output parameter, and send the result back as the response.

After that, it’s up to us to gather all the responses together and do whatever it is we wanted to do with them.

Also, for the sake of our sample, we’ll need to simulate long execution times. We’ll do this simply by adding a WAITFOR DELAY command into our dynamic queries which will cause the query to wait for a random number of seconds (between 10 and 30).

The Scripts

Available below is a ZIP file with the two scripts we’ll use:

One is the installation script, responsible for creating all the objects we require – the database, the Service Broker procedures, endpoints and queues. In short, everything you need to implement multi-threading in SQL Server.

The other is a sample of how to use this mechanism, using simple queries and a WAITFOR DELAY command to simulate long execution times.



The scripts are commented to help you understand what’s going on. But we’ll go over them bit by bit just in case.

The Installation Script

In order to explain the installation script, I’ll list all the objects that we’re creating and explain the purpose of each object. More detailed information about the implementation can be found in the script itself.


SB_PQ_Test:

That’s the database we’ll be using to create all the necessary objects.


SB_PQ_ServiceBrokerLogs:

This is a logging table which will be used to log the start and end of each query execution, and any errors that may occur.
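The exact schema is in the installation script; as a rough idea only (the column names here are hypothetical), such a log table might look like this:

Transact-SQL

-- hypothetical schema, for illustration only; see the installation script for the real definition
CREATE TABLE SB_PQ_ServiceBrokerLogs
(
    LogID       bigint IDENTITY(1,1) NOT NULL PRIMARY KEY,
    LogDate     datetime NOT NULL DEFAULT (GETDATE()),
    SPID        int NOT NULL,
    LogMessage  nvarchar(max) NULL,  -- e.g. "execution started", "execution finished", error text
    ErrorNumber int NULL
);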


SB_PQ_ExecuteDynamicQuery:

This is the procedure which will actually execute the dynamic queries. It builds the dynamic command, uses sp_executesql to execute it, and returns the result as an output parameter. The name of the output parameter itself can be dynamic.
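The core trick here is sp_executesql with an OUTPUT parameter whose name is embedded in the dynamic text. The following is only a simplified illustration of that technique, not the procedure’s actual code:

Transact-SQL

DECLARE @Result    xml,
        @ParamName sysname = N'SB_PQ_Result',  -- the dynamic output parameter name
        @SQL       nvarchar(max),
        @ParamsDef nvarchar(max);

-- the dynamic batch assigns its XML result into the named output parameter
SET @SQL = N'SET @' + @ParamName + N' = (SELECT GETDATE() AS RunTime FOR XML RAW);';

-- the parameter definition string uses the same dynamic name
SET @ParamsDef = N'@' + @ParamName + N' xml OUTPUT';

-- the value is captured positionally into @Result
EXEC sp_executesql @SQL, @ParamsDef, @Result OUTPUT;

SELECT @Result AS QueryResult;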


SB_PQ_HandleQueue:

This is the activation procedure that Service Broker executes when it creates a “reader instance”, or in other words, when there’s something in the request queue.

This procedure is the most important part, because it’s responsible for retrieving messages from the request queue, executing the query (using SB_PQ_ExecuteDynamicQuery), and handling any errors that may occur along the way.
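Conceptually, activation procedures like this one typically follow the same pattern: loop, receive a message, process it inside TRY/CATCH, and send the result back on the same conversation. The following is only a rough, generic skeleton of that pattern (it echoes the request back instead of calling SB_PQ_ExecuteDynamicQuery and ignores system message types); it is not the actual procedure:

Transact-SQL

-- simplified skeleton of an activation procedure (illustration only)
DECLARE @Handle uniqueidentifier, @Body xml, @Result xml;

WHILE 1 = 1
BEGIN
    WAITFOR
    (
        RECEIVE TOP (1)
            @Handle = conversation_handle,
            @Body   = CAST(message_body AS xml)
        FROM SB_PQ_Request_Queue
    ), TIMEOUT 1000;

    IF @@ROWCOUNT = 0 BREAK;  -- queue is empty, let this reader instance exit

    BEGIN TRY
        -- the real procedure executes the request here via SB_PQ_ExecuteDynamicQuery
        SET @Result = @Body;  -- placeholder: echo the request back
        SEND ON CONVERSATION @Handle MESSAGE TYPE [//SB_PQ/Message] (@Result);
    END TRY
    BEGIN CATCH
        -- the real procedure logs the error to SB_PQ_ServiceBrokerLogs;
        -- here we just end the conversation with an error
        END CONVERSATION @Handle WITH ERROR = 50000 DESCRIPTION = 'Query execution failed';
    END CATCH
END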

SB_PQ_Request_Queue:

This is the queue where all query requests will be waiting to be processed. Its activation procedure is SB_PQ_HandleQueue.


SB_PQ_Response_Queue:

This is the queue into which query responses will be sent. It doesn’t have an activation procedure, because query response extraction is done manually by the client process.


[//SB_PQ/Message]:

This is a simple message type with “Well Formed XML” validation. We’ll be using it to validate our messages.
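Its definition is essentially a one-liner; roughly:

Transact-SQL

-- messages of this type must contain well-formed XML
CREATE MESSAGE TYPE [//SB_PQ/Message]
VALIDATION = WELL_FORMED_XML;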


[//SB_PQ/Contract]:

This is a contract defining that both target and initiator must send messages of type [//SB_PQ/Message].
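In Service Broker terms, “both sides may send this message type” is expressed with SENT BY ANY, so the contract presumably looks something like this:

Transact-SQL

-- either side of the conversation may send [//SB_PQ/Message] messages
CREATE CONTRACT [//SB_PQ/Contract]
(
    [//SB_PQ/Message] SENT BY ANY
);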


[//SB_PQ/ProcessReceivingService]:

This is the service endpoint acting as the “address” for query requests. We will be sending messages to this service. The queue it uses to receive messages is SB_PQ_Request_Queue.


[//SB_PQ/ProcessStartingService]:

This is the service endpoint acting as the “address” for query responses. We will be sending messages from this service, and the queue it uses to receive messages is SB_PQ_Response_Queue.
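Putting the two together, the service definitions presumably look along these lines:

Transact-SQL

-- the "address" query requests are sent to, backed by the request queue
CREATE SERVICE [//SB_PQ/ProcessReceivingService]
ON QUEUE SB_PQ_Request_Queue ([//SB_PQ/Contract]);

-- the "address" query responses come back to, backed by the response queue
CREATE SERVICE [//SB_PQ/ProcessStartingService]
ON QUEUE SB_PQ_Response_Queue ([//SB_PQ/Contract]);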


SB_PQ_Start_Query:

This is the public procedure that will be used to start an execution thread, by sending a query request through [//SB_PQ/ProcessStartingService].

You can think of it as an “API” simplifying our use of all the above objects.

It receives as parameters the SQL command that should be executed, the name of the output parameter (without the @), and the conversation group ID.


SB_PQ_Get_Response_One:

This is the public procedure that will be used to retrieve the results of our queries. You may also think of it as part of the “API” simplifying our use of the above objects. It receives as a parameter the conversation group ID to be used, and returns the first result it finds as an output parameter. This procedure is also responsible for ending the dialog conversations which are no longer needed.
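The portion of the sample script quoted in this post stops before the collection phase, but a call to this procedure would look roughly like this (the parameter order and names are my assumption, for illustration only):

Transact-SQL

DECLARE @ConvGroup uniqueidentifier = NEWID(),  -- the same group ID that was passed to SB_PQ_Start_Query
        @Result    xml;

-- assumed signature: conversation group ID in, first available result out
EXEC SB_PQ_Get_Response_One @ConvGroup, @Result OUTPUT;

SELECT @Result AS ThreadResult;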

Whew, that was a long list! Please note that since Service Broker works asynchronously in the background, error handling is extremely important. Notice all the error handling that I implemented, especially in the SB_PQ_HandleQueue procedure.

The Sample Script

Now we can really get down to business. I’ll go over the important parts in the script and explain each of them.

REMEMBER! This is only the sample script; it is not an integral part of our “API”, but rather an example of how to use it:

Transact-SQL

-- random workload simulation settings
SET @MinDelay = 10
SET @MaxDelay = 30
SET @ConvGroup = NEWID(); -- the messages will be grouped in a specific conversation group
SET @NumOfSegments = 3; -- number of "threads" to use
-- create several segments
SET @Segment = 1;
WHILE @Segment <= @NumOfSegments
BEGIN

In our sample, we create several “segments” (or threads), each of which will have a random delay between 10 and 30 seconds and execute a simple query returning its details. This simulates a situation of “multiple queries”, where each “parallel query” is represented by a “segment”. In the script above we define the minimum and maximum delay in seconds, set the conversation group ID by which our threads will be grouped, set the total number of threads (3 in our case), and start looping over the threads.

Transact-SQL

-- random delay between @MinDelay and @MaxDelay seconds to simulate long execution time
SET @Delay = '00:00:' + CONVERT(varchar(8), ROUND(RAND() * (@MaxDelay - @MinDelay), 0) + @MinDelay)

The above script simply prepares the delay representation by generating a random number of seconds as we defined earlier (between 10 and 30).

Transact-SQL

-- build our dynamic SQL command. note the use of XML as the result.
SET @SQL = N'
	WAITFOR DELAY ''' + @Delay + ''';
	SET @SB_PQ_Result =
	(
		SELECT
			Segment = ' + CONVERT(nvarchar(max), @Segment) + N',
			Delay = ''' + @Delay + N''',
			StartDate = GETDATE(),
			Name = QUOTENAME(name),
			object_id, type, modify_date
		FROM
			SB_PQ_Test.sys.tables AS Tab
		FOR XML AUTO, ELEMENTS
	);
';

In the script above we create the dynamic SQL command which will be executed by the current thread. Note the use of XML AUTO, ELEMENTS to format the results in XML and put them in the output parameter.

Also note the use of WAITFOR DELAY at the start to simulate long execution time.
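With FOR XML AUTO, ELEMENTS, each row of the query comes back as a <Tab> element (named after the table alias) with one child element per column, so each response will contain something along these lines (the values shown here are purely illustrative):

<Tab>
  <Segment>1</Segment>
  <Delay>00:00:17</Delay>
  <StartDate>2012-05-01T10:00:00</StartDate>
  ...
</Tab>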

Transact-SQL

-- Send request to queue
EXEC SB_PQ_Start_Query @SQL, @OutputParam, @ConvGroup;

The above script executes the API procedure SB_PQ_Start_Query with the SQL command we prepared, the name of the output parameter, and the conversation group ID. This will put the message in the request queue and continue executing without waiting for a response (the magic of asynchronous processing!).
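Under the hood, sending a request like this maps onto the standard Service Broker BEGIN DIALOG / SEND pair. The actual implementation is in the installation script; the following is only a rough illustration of the pattern the procedure wraps (the <Request> wrapper element is made up for this illustration):

Transact-SQL

-- rough illustration of the dialog-and-send pattern (not the procedure's actual code)
DECLARE @Handle     uniqueidentifier,
        @ConvGroup  uniqueidentifier = NEWID(),
        @RequestXml xml = N'<Request>WAITFOR DELAY ''00:00:10''; ...</Request>';  -- made-up wrapper

BEGIN DIALOG CONVERSATION @Handle
    FROM SERVICE [//SB_PQ/ProcessStartingService]
    TO SERVICE   '//SB_PQ/ProcessReceivingService'
    ON CONTRACT  [//SB_PQ/Contract]
    WITH RELATED_CONVERSATION_GROUP = @ConvGroup, ENCRYPTION = OFF;

-- the SEND returns immediately; the activation procedure picks the message up asynchronously
SEND ON CONVERSATION @Handle
    MESSAGE TYPE [//SB_PQ/Message] (@RequestXml);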

Transact-SQL

-- increment segment index
SET @Segment = @Segment + 1;
END

The above script increases the segment (thread) index so we can continue to the next one.

Transact-SQL