Now that I'm using it, I'm blown away by how useful it is. And the truth is that few people know how to use it well. There's not much written about it yet; it takes quite some time for articles and blog posts (like this one) to appear that tell people when and how to use the new features.
Before Query Store was available, here's what I often did when I had to analyze long stored procedures for performance:
UPDATE dbo.HotelFact
SET SalesForceLastUpdateDate = ##SalesForce.LastUpdateDate
FROM dbo.HotelFact
INNER JOIN dbo.DimHotel
    ON DimHotel.HotelKey = HotelFact.HotelKey
INNER JOIN ##SalesForce
    ON ##SalesForce.SalesForceAccountNumber = DimHotel.SFAccountNumber
WHERE ISNULL(HotelFact.SalesForceLastUpdateDate, 0) <> ISNULL(##SalesForce.LastUpdateDate, 0);

-- Capture @@ROWCOUNT immediately; EXEC arguments must be constants or variables,
-- so it can't be passed to the procedure directly.
DECLARE @RowCount INT = @@ROWCOUNT;

EXEC MessageLog @DatabaseName, @ProcedureName, 'Info',
    'Update records in HotelFact with SalesForce updates', @RowCount;
Notice the call at the end to MessageLog? After every significant statement, I would call it, and it logged the duration to a Message table. I'd then check the Message table to see which steps took the longest.
That was a hassle. It made the stored procedure harder to read, and of course, if the stored procedure didn't already use MessageLog, you had to retrofit it.
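If you're curious what that looked like, here's a minimal sketch of that kind of logging setup. The real MessageLog and Message table had more to them; the names and columns here are just illustrative.
-- Minimal, illustrative sketch of a logging table and procedure like MessageLog.
-- Names and columns are placeholders, not the real versions.
CREATE TABLE dbo.[Message] (
    MessageId     INT IDENTITY(1, 1) PRIMARY KEY,
    DatabaseName  SYSNAME,
    ProcedureName SYSNAME,
    Severity      VARCHAR(10),
    MessageText   VARCHAR(500),
    RowsAffected  INT,
    LoggedAt      DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
CREATE PROCEDURE dbo.MessageLog
    @DatabaseName  SYSNAME,
    @ProcedureName SYSNAME,
    @Severity      VARCHAR(10),
    @MessageText   VARCHAR(500),
    @RowsAffected  INT
AS
BEGIN
    SET NOCOUNT ON;
    -- The duration of each step falls out of the gap between consecutive LoggedAt values.
    INSERT dbo.[Message] (DatabaseName, ProcedureName, Severity, MessageText, RowsAffected)
    VALUES (@DatabaseName, @ProcedureName, @Severity, @MessageText, @RowsAffected);
END;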
But Query Store data blows this away. It gives you all the information you need to troubleshoot stored procedure performance at the statement level, with any metric, such as logical reads or CPU.
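If you haven't enabled it yet, turning it on takes one statement. Here's a minimal sketch with reasonable starting options, assuming SQL Server 2016 or later; the database name is a placeholder, and you'll want to tune the options for your workload.
-- Sketch: enable Query Store with reasonable starting options (SQL Server 2016+).
-- [YourDatabase] is a placeholder.
ALTER DATABASE [YourDatabase]
SET QUERY_STORE = ON
    (OPERATION_MODE = READ_WRITE,
     QUERY_CAPTURE_MODE = AUTO,
     INTERVAL_LENGTH_MINUTES = 60);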
Say you have a stored procedure that's hundreds of lines long and does dozens of reads and writes. And it's slow. You look at it and groan, imagining what a pain it will be just to figure out where you need to focus your attention.
No need to worry. Once you have Query Store turned on, you can analyze it in moments. This query gave me, for the specific procedure that was causing the problem, all the query plans from the last 24 hours, ordered by CPU time.
-- Query plan details on one object, perhaps a stored procedure that has a high duration
-- In this case, getting details on SE_SaveBuffer
-- Note: AVG() over the per-interval avg_* columns is a simple average across
-- intervals, not weighted by execution counts.
SELECT qsq.query_id
    ,qsp.plan_id
    ,OBJECT_NAME(qsq.object_id) AS ObjectName
    ,SUM(rs.count_executions) AS TotalExecutions
    ,AVG(rs.avg_duration) AS Avg_Duration
    ,AVG(rs.avg_cpu_time) AS Avg_CPU
    ,AVG(rs.avg_logical_io_reads) AS Avg_LogicalReads
    ,MIN(qst.query_sql_text) AS Query
FROM sys.query_store_query qsq
JOIN sys.query_store_query_text qst
    ON qsq.query_text_id = qst.query_text_id
JOIN sys.query_store_plan qsp
    ON qsq.query_id = qsp.query_id
JOIN sys.query_store_runtime_stats rs
    ON qsp.plan_id = rs.plan_id
JOIN sys.query_store_runtime_stats_interval rsi
    -- not filtered on here, but handy for restricting to specific intervals
    ON rs.runtime_stats_interval_id = rsi.runtime_stats_interval_id
WHERE qsq.object_id = OBJECT_ID(N'dbo.SE_SaveBuffer')
    AND rs.last_execution_time > DATEADD(HOUR, -24, GETUTCDATE())
    AND rs.execution_type = 0 -- regular (successfully completed) executions only
GROUP BY qsq.query_id
    ,qsp.plan_id
    ,OBJECT_NAME(qsq.object_id)
ORDER BY AVG(rs.avg_cpu_time) DESC;
Once I had this, I knew exactly where to look: at the top of the list, at the statements consuming the most CPU time. That's where the analysis needs to occur. The following query lets you see the query plan for a specific statement. You need to update it with your own plan_id.
SELECT ConvertedPlan = TRY_CONVERT(XML, [query_plan])
    ,*
FROM sys.query_store_plan
WHERE plan_id IN (1830037); -- replace with your plan_id
At this point in my troubleshooting, it was easy. The worst-performing statement, the one that consumed most of the CPU time, had a missing index. Adding that index improved the overall performance dramatically.
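For illustration, the fix looked something like the statement below. The table, column, and index names here are hypothetical placeholders; the real definition came straight from the missing-index suggestion in the plan.
-- Hypothetical example only; table, column, and index names are placeholders.
-- The actual definition came from the missing-index hint in the query plan.
CREATE NONCLUSTERED INDEX IX_SomeTable_FilterColumn
ON dbo.SomeTable (FilterColumn)
INCLUDE (LookupColumn);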
So there you have it—a new tool that can make performance troubleshooting of long-running stored procedures a piece of cake.
I haven't gone into the details of how to actually set up Query Store. The Microsoft documentation is great, and I also like the articles by Erin Stellato (https://sqlperformance.com/author/erinsqlskills-com).