I read this awesome post by Joe Sack, titled “Detecting Cardinality Estimate Issues with sys.dm_exec_query_stats”.
I was amazed by the beauty of his detection query, ran it on my environment, and discovered some interesting things.
Then I revised it a little to reveal more information and filter out more irrelevant data.
You can find the revised query below. For queries where the difference between the estimated and actual number of rows is greater than 30,000, it returns the batch text, the text of the specific problematic query, the problematic query’s plan, the last execution time, the last elapsed time, the estimated number of rows from the execution plan, and the last row count the query returned.
[code lang="sql"]
-- Based on a post by Joe Sack:
-- http://www.sqlskills.com/blogs/joe/detecting-cardinality-estimate-issues-with-sys-dm_exec_query_stats/
SELECT TOP 100 *
FROM
(
    SELECT
        t.text AS [batch_text],
        SUBSTRING(t.text, statement_start_offset / 2 + 1,
            ((CASE WHEN statement_end_offset = -1
                   THEN DATALENGTH(t.text)
                   ELSE statement_end_offset
              END - statement_start_offset) / 2) + 1) AS [query_text],
        CAST(p.[query_plan] AS XML) AS [query_plan],
        s.[last_execution_time],
        s.[last_elapsed_time],
        CAST(p.[query_plan] AS XML).value('(//@EstimateRows)[1]', 'VARCHAR(128)') AS [estimated_rows],
        s.[last_rows]
    FROM sys.[dm_exec_query_stats] AS [s]
    CROSS APPLY sys.[dm_exec_sql_text](sql_handle) AS [t]
    CROSS APPLY sys.[dm_exec_text_query_plan](plan_handle, statement_start_offset, statement_end_offset) AS [p]
) AS InnerQuery
WHERE TRY_CONVERT(DECIMAL, InnerQuery.[estimated_rows]) IS NOT NULL
  AND ABS(CONVERT(DECIMAL, InnerQuery.[estimated_rows]) - [last_rows]) > 30000
[/code]
Three notes about the query:
As Joe stated in his post, the query detects inaccurate estimates only at the root level of the plan. It won’t identify such inaccuracies that occur deeper in the plan.
Try to limit the query output like I did, as it can take some time to run.
The query will work only on SQL Server 2012 and later.
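Regarding the root-level limitation: the root EstimateRows attribute is only one of many estimates in the plan, one per operator. As a minimal sketch (not part of the original detection query), you can shred every RelOp node of a cached plan to see where each estimate sits. Cached plans don’t carry per-operator actual row counts, so this can’t compute the discrepancy by itself, but it shows the full estimate tree:

[code lang="sql"]
-- Sketch: list the estimated rows for every operator in each cached
-- statement plan, not just the root. The showplan XML elements live in
-- a namespace, so declare it; attributes like @EstimateRows do not.
;WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT TOP 100
    op.value('@PhysicalOp', 'VARCHAR(128)') AS [operator],
    op.value('@EstimateRows', 'FLOAT')      AS [estimated_rows]
FROM sys.[dm_exec_query_stats] AS [s]
CROSS APPLY sys.[dm_exec_text_query_plan](plan_handle, statement_start_offset, statement_end_offset) AS [p]
CROSS APPLY (SELECT CAST(p.[query_plan] AS XML) AS [x]) AS [q]
CROSS APPLY q.[x].nodes('//RelOp') AS n(op);
[/code]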
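The SQL Server 2012 floor comes from TRY_CONVERT, which was introduced in that version. As a rough sketch only, the final filter could be rewritten for older builds using ISNUMERIC; note that ISNUMERIC is a weaker guard than TRY_CONVERT (it accepts strings such as ‘$1’ that still fail a numeric convert), and other columns the query relies on may also be unavailable on older versions:

[code lang="sql"]
-- Sketch: a pre-2012 approximation of the TRY_CONVERT filter.
-- Converting to FLOAT rather than DECIMAL also tolerates values in
-- scientific notation, which ISNUMERIC accepts.
WHERE ISNUMERIC(InnerQuery.[estimated_rows]) = 1
  AND ABS(CONVERT(FLOAT, InnerQuery.[estimated_rows]) - [last_rows]) > 30000
[/code]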
Thank you, Joe, for posting this.