Monday, August 23, 2021

Performance tuning a stored procedure with user defined function calls in SQL Server

UPDATE: It turns out that there are easier ways of doing this. Take a look at this post instead of following the below.

*******************************************

Have you ever tried to do performance tuning on a stored procedure that has user defined function calls? It can be very challenging. This is mainly because one of the most basic tools of performance tuning—checking the number of logical reads using Set Statistics IO—is just not accurate in this scenario. The logical reads performed by any user defined functions called during a stored procedure are not reported by Set Statistics IO, so the output doesn't reflect their true cost.
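To see what this under-reporting looks like, here's a minimal sketch. The table and function names (`dbo.Orders`, `dbo.ufn_CustomerName`, `dbo.Customers`) are hypothetical, standing in for any scalar UDF that reads a table internally:

```sql
SET STATISTICS IO ON;

-- The scalar UDF reads dbo.Customers internally on every row
SELECT o.OrderID,
       dbo.ufn_CustomerName(o.CustomerID) AS CustomerName
FROM dbo.Orders AS o;

-- The Messages pane reports logical reads for dbo.Orders only;
-- the reads dbo.ufn_CustomerName performs against dbo.Customers
-- never appear in the Statistics IO output.
```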

Before I get into the details of how to work around this, let's just get this out of the way—yes, user defined functions have a very mixed reputation because of performance issues. Functions in the Select or Order By clause are likely to be okay. But functions in the Where clause make the predicate non-sargable and can cause huge performance issues.
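As a quick illustration of the Where clause problem, here's a hedged sketch with hypothetical names (`dbo.Orders`, `dbo.ufn_OrderYear`). Wrapping the column in a UDF forces the function to run for every row and blocks index seeks, while inlining the logic as a range predicate lets the optimizer use an index on `OrderDate`:

```sql
-- Non-sargable: the UDF runs once per row, so no index seek is possible
SELECT OrderID
FROM dbo.Orders
WHERE dbo.ufn_OrderYear(OrderDate) = 2020;

-- Sargable rewrite: same logic expressed as a range predicate
SELECT OrderID
FROM dbo.Orders
WHERE OrderDate >= '20200101'
  AND OrderDate <  '20210101';
```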

Yet you may be working on a system that has hundreds of user defined functions scattered everywhere, including the Where clause, within hundreds of stored procedures. That's what I was recently faced with.

It's not possible to fix all these issues quickly, so you need to target the worst performers and figure out how to improve them. I use Query Store (usually the Top Resource Consuming Queries report) to find the worst performers in general. But once you have a stored procedure targeted for performance improvement, how do you get the full performance picture if it contains user defined functions?

Here's a process you can use to get data that accurately reflects performance (including logical reads), even when there are user defined functions inside your stored procedure. The key is to use the built-in reporting in Query Store, not Set Statistics IO.

Step 1: Purge Query Store data

Make sure Query Store is turned on for your database. Information on how to do this is easy to find online. After ensuring it's turned on, click on the Purge Query Data button. (This is, of course, assuming you're not working on a production server). This will give you a blank slate in the Query Store.
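If you'd rather do this step in T-SQL than through the SSMS UI, the equivalent commands look like this (`YourDatabase` is a placeholder for your database name):

```sql
-- Enable Query Store and make sure it's collecting data
ALTER DATABASE YourDatabase SET QUERY_STORE = ON;
ALTER DATABASE YourDatabase SET QUERY_STORE (OPERATION_MODE = READ_WRITE);

-- Equivalent of the Purge Query Data button: wipe all Query Store data
-- (again, only on a non-production server)
ALTER DATABASE YourDatabase SET QUERY_STORE CLEAR;
```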



Step 2: Run the stored procedure

Execute the stored procedure in a query window. This will populate the previously empty Query Store with just the performance metrics for this specific call.

Step 3: Check the Query Store: Top Resource Consuming Queries

Next, open the Top Resource Consuming Queries report in Query Store. By default it gives you the graph output, which doesn't show the underlying numbers. Instead, switch to the simple grid output, and copy and paste the results into a spreadsheet. I usually do this for both Logical Reads and CPU.
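If you prefer to pull the same numbers with a query instead of the report, a sketch against the Query Store catalog views looks like the following. Since the Query Store was purged in step 1, everything it contains belongs to the single stored procedure call:

```sql
-- Total logical reads and CPU per query captured since the purge.
-- avg_* columns are per-execution averages, so multiply by count_executions;
-- avg_cpu_time is reported in microseconds.
SELECT
    qt.query_sql_text,
    SUM(rs.count_executions)                           AS executions,
    SUM(rs.avg_logical_io_reads * rs.count_executions) AS total_logical_reads,
    SUM(rs.avg_cpu_time * rs.count_executions)         AS total_cpu_time_us
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q
    ON q.query_text_id = qt.query_text_id
JOIN sys.query_store_plan AS p
    ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs
    ON rs.plan_id = p.plan_id
GROUP BY qt.query_sql_text
ORDER BY total_logical_reads DESC;
```

Unlike Set Statistics IO, these runtime stats include the work done inside user defined function calls, which is the whole point of the exercise.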

Now what? When I first ran through this with my stored procedure, I found that the user defined function calls in the stored procedure were actually not as big of a problem as I had suspected. Instead of focusing on user defined functions, I targeted other areas of the stored procedure and got a very substantial performance improvement.

To validate the improvement, I ran through the same steps as above, but using the updated stored procedure. Comparing Logical Reads and CPU numbers between the old and new versions of the stored procedure gave me the results I was looking for. 
