Performance degradation of some queries after updating statistics

Heisenberg 261 Reputation points
2022-06-27T19:30:26.757+00:00

We have a big table that holds around 350 GB of data; this is an OLTP application database. Recently we updated statistics on all the indexes on this table using 70% sampling, and we left the system statistics (the ones starting with _WA) as they were. After this, we are seeing some queries performing slowly compared to before the statistics update. What could be the reason, and how can I fix this issue?

I was under the impression that updating statistics should be better for all queries. Can the slow performance be attributed to not updating the system statistics (the ones starting with _WA)?
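For reference, the update we ran was along these lines (the table and index names here are placeholders, not our real ones):

    -- Sampled statistics update on one index of the table (70% sample, as described above);
    -- this was repeated for each index, and the auto-created _WA_Sys_* statistics were left alone
    UPDATE STATISTICS dbo.BigTable IX_BigTable_SomeColumn WITH SAMPLE 70 PERCENT;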


4 answers

  1. Tom Phillips 17,741 Reputation points
    2022-06-27T19:54:01.78+00:00

    You need to look at the query plan for the poorly executing query.

    Sampling at 70% may have caused the query to use a different plan/index. You could try a FULLSCAN and see whether that fixes your issue.
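    For example, to rebuild the statistics from a full scan (the table name is a placeholder for yours):

        -- Read every row of the table instead of a sample when building the statistics
        UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;

    Then capture the actual execution plan again and compare it to the one from before.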


  2. Erland Sommarskog 117.1K Reputation points MVP
    2022-06-27T21:55:02.913+00:00

    Keep in mind that the optimizer makes estimates of what the best plan is. More accurate statistics will give you a better plan in most cases, but statistics are still statistics and will not give a complete description of the data. So sometimes less accurate statistics, just by chance, give a better result than more accurate statistics.

    To answer what happened in your specific case(s), we would need to see the actual execution plans from before and after, and we would probably also need to see the histograms from before and after. But even then, without knowledge of the data, it can be difficult to give a good answer.

    Then again, it may be better to look at the query and the indexes as things are now. It is not unlikely that there is some flaw in one of them, and that you were just lucky before you updated statistics.

    Finally, a 70% sample rate is quite pointless. I vaguely recall that very high sample rates take more resources than FULLSCAN.
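    If you want to inspect a histogram yourself, something like this works (the table and statistics names are placeholders):

        -- Show the histogram steps for one statistics object on the table
        DBCC SHOW_STATISTICS (N'dbo.BigTable', N'IX_BigTable_SomeColumn') WITH HISTOGRAM;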


  3. Heisenberg 261 Reputation points
    2022-06-28T02:42:50.45+00:00

    Thank you both for your inputs. @Erland Sommarskog, I am attaching both plans here (215516-old-plan.xml and 215573-new-plan-1.xml); when you get time, it would be great if you could take a look. Also, if 70% sampling is too high, what is the best sampling rate? If I understand correctly, FULLSCAN is equal to 100% sampling, isn't it? So shouldn't FULLSCAN take more resources than 70%?


  4. YufeiShao-msft 7,141 Reputation points
    2022-06-28T09:00:35.687+00:00

    Hi @Heisenberg,

    Regarding the sample rate: the point of any sample is to create a good histogram, so the challenge is whether to run a FULLSCAN to try to create the best possible histogram, or to sample a smaller percentage to minimize the performance impact of the update.

    You can see the conclusions in this article:
    https://sqlperformance.com/2013/04/t-sql-queries/update-statistics-duration

    You need to know your data well enough; sometimes a sampled update takes longer than a FULLSCAN, depending on the table size and the resources available.

    To understand query performance, you should correlate this data with the information you are capturing about performance problems.
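    As a sketch of the two alternatives (the table name and percentage are only examples):

        -- Cheaper, approximate: build the histogram from a sample of the rows
        UPDATE STATISTICS dbo.BigTable WITH SAMPLE 10 PERCENT;

        -- More expensive, exact: read the whole table
        UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;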



