SQL Server: How to determine parameter sniffing problem and how to handle it

Sudip Bhatt 2,271 Reputation points
2021-02-07T19:28:16.877+00:00

I have seen a few articles on parameter sniffing. They suggest using the RECOMPILE option in the stored procedure or the OPTIMIZE FOR hint:

SELECT SalesOrderDetailID, OrderQty
FROM Sales.SalesOrderDetail
WHERE ProductID = @ProductID
OPTION (OPTIMIZE FOR (@ProductID=945));

Please tell me, is it the best approach to use the OPTIMIZE FOR hint, where I need to hard-code a value against the parameter?
Here it is hard-coded like (OPTIMIZE FOR (@ProductID=945)).

If I have millions of different ProductIDs, then how can I mention multiple ProductIDs in the OPTIMIZE FOR hint?

Thanks

Transact-SQL

Accepted answer
  1. MelissaMa-MSFT 24,176 Reputation points
    2021-02-08T06:08:55.217+00:00

    Hi @Sudip Bhatt ,

    The parameter sniffing problem does not occur frequently. It only occurs when the data distribution in some tables is very uneven, or when the parameter values supplied by users vary widely.

    For example, consider a stored procedure that queries data for a period of time. If most users only query one day of data, SQL Server caches that kind of execution plan, and the users who want to query a whole year of data run the risk of hitting the parameter sniffing problem. Conversely, if most users need to query a year of data but SQL Server happens to have cached a plan compiled for a one-day query, then most users will run into the parameter sniffing problem.
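
    Just to illustrate that scenario, here is a minimal sketch of such a procedure; the table and column names (dbo.Orders, OrderDate, and so on) are hypothetical:

    -- Hypothetical date-range procedure; the first caller's range decides
    -- the cached plan that every later caller reuses.
    CREATE OR ALTER PROCEDURE dbo.GetOrders
        @StartDate date,
        @EndDate   date
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT OrderID, CustomerID, OrderDate, TotalDue
        FROM dbo.Orders
        WHERE OrderDate >= @StartDate
          AND OrderDate <  DATEADD(DAY, 1, @EndDate);
    END;
    GO

    -- If this call comes first, the plan is optimized for a 1-day range ...
    EXEC dbo.GetOrders @StartDate = '2021-02-01', @EndDate = '2021-02-01';
    -- ... and this year-long call then reuses that (possibly unsuitable) plan.
    EXEC dbo.GetOrders @StartDate = '2020-01-01', @EndDate = '2020-12-31';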

    There are several solutions to deal with parameter sniffing, as below (minimal sketches of some of these options follow the list):

    1. Use EXEC() to run dynamic SQL statements.
    2. Use local variables.
    3. Use a query hint in the statement to influence the execution plan.
      A: WITH RECOMPILE / OPTION (RECOMPILE)
      B: Specify the join operation ({ LOOP | MERGE | HASH } JOIN) (not recommended)
      C: OPTIMIZE FOR ( @variable_name = literal_constant [ ,...n ] )
      D: Plan Guide
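
    Minimal sketches of options 1 and 3A, applied to the query from the question (the procedure names are made up for illustration):

    -- Option 3A: OPTION (RECOMPILE) compiles a fresh plan for every
    -- execution, using the actual @ProductID value each time.
    CREATE OR ALTER PROCEDURE dbo.GetOrderQty_Recompile
        @ProductID int
    AS
        SELECT SalesOrderDetailID, OrderQty
        FROM Sales.SalesOrderDetail
        WHERE ProductID = @ProductID
        OPTION (RECOMPILE);
    GO

    -- Option 1: EXEC() of dynamic SQL; every distinct literal gets its own
    -- plan (beware of plan-cache bloat, and never concatenate unvalidated
    -- string input).
    CREATE OR ALTER PROCEDURE dbo.GetOrderQty_DynamicSQL
        @ProductID int
    AS
        DECLARE @sql nvarchar(max) =
            N'SELECT SalesOrderDetailID, OrderQty
              FROM Sales.SalesOrderDetail
              WHERE ProductID = ' + CAST(@ProductID AS nvarchar(20)) + N';';
        EXEC (@sql);
    GO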

    OPTIMIZE FOR:

    After confirming a parameter sniffing problem, you will usually find that the execution plans generated for some parameter values vary greatly between fast and slow, while the plans generated for other values perform similarly.

    Just as an example, in a join statement, when the variable @i is 50,000 the best plan uses Nested Loops and takes just over 10 milliseconds, while the Hash Join plan takes just over 300 milliseconds. But when the variable equals 75124, the Hash Join plan takes just over 500 milliseconds, while Nested Loops takes more than 4,000 milliseconds.

    In absolute terms, users generally do not mind waiting a few hundred milliseconds, but after a few seconds it is easy to receive complaints. So Hash Join is the relatively "safe" execution plan here: if SQL Server always builds the plan as if the value were 75124, most queries will be on the safe side.

    option (optimize for (@i = 75124))  
    

    But as mentioned in this article, this solution is not always the best one; it depends on the actual situation.

    This solution is good when:

    • The distribution of data seldom changes.
    • You have development resources dedicated to tuning queries.
    • You have resources available to review data distribution.

    This solution is not good when:

    • Data distribution can change rapidly.
    • Data distribution change is not predictable.
    • There are limited or no resources for tuning.

    Best regards
    Melissa


    If the answer is helpful, please click "Accept Answer" and upvote it.
    Note: Please follow the steps in our documentation to enable e-mail notifications if you want to receive the related email notification for this thread.

    3 people found this answer helpful.

3 additional answers

  1. Erland Sommarskog 101.4K Reputation points MVP
    2021-02-07T19:47:37.947+00:00

    There is no short simple answer to parameter sniffing problems. For one thing, you need to have an understanding of why parameter sniffing is a problem to you.

    What I can say is that the snippet you have above is rarely a solution.

    I have a longer article on my web site, "Slow in the Application, Fast in SSMS?", where I discuss parameter sniffing. Part of the article tries to straighten out a particularly confusing situation. But I also give some examples - several drawn from real life - of parameter sniffing and possible ways to address them.

    However, in the end, you will have to make the analysis for your case. There is no simple one-size-fits-all solution.

    1 person found this answer helpful.

  2. Dan Guzman 9,211 Reputation points
    2021-02-07T22:14:30.443+00:00

    "If I have millions of different ProductIDs, then how can I mention multiple ProductIDs in the OPTIMIZE FOR hint?"

    Adding to what Erland said, there is never a need to specify more than one value for a given parameter with the OPTIMIZE FOR hint. The hint ignores the actual parameter value during compilation and uses the hint value instead to generate the plan. If a list of values were permitted, that could result in multiple (different) plans and no clear winner as to which one to use for all executions. The least costly plan for one value could be very expensive for another, which is essentially why parameter sniffing is a challenge.
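
    A small sketch of that point, assuming a procedure built around the query from the question (the procedure name is made up): whatever value the caller passes, the plan is compiled as if @ProductID were 945; only the returned rows reflect the real value.

    CREATE OR ALTER PROCEDURE dbo.GetOrderQty_OptimizeFor
        @ProductID int
    AS
        SELECT SalesOrderDetailID, OrderQty
        FROM Sales.SalesOrderDetail
        WHERE ProductID = @ProductID               -- runtime filter uses the real value
        OPTION (OPTIMIZE FOR (@ProductID = 945));  -- plan is costed as if the value were 945
    GO

    EXEC dbo.GetOrderQty_OptimizeFor @ProductID = 870;  -- same plan shape as for 945
    EXEC dbo.GetOrderQty_OptimizeFor @ProductID = 945;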

    1 person found this answer helpful.

  3. Kaplun, Brian 1 Reputation point
    2022-08-30T10:08:31.953+00:00

    My input on this... I came across this thread while saving some pages for teaching others. I agree with everyone above, especially that there are multiple methods and types of cases. The data we work on can vary a lot in number of rows, and developers mostly test with small datasets. However, we can have tables start at a few hundred rows and, within a short time, be in the millions and keep growing. When the execution plan is first created it is often built for a small dataset, and as time goes forward and the data grows, slower and slower we go.

    The practice I started following was to always copy my procedure parameters into local variables (sketched below). It adds a little time to development, but in the long run it helps keep the issue from occurring when you do not have people around who know the symptoms and would otherwise never find the issue. Now, if you absolutely know the data boundaries and how the sniffing may occur, then apply the method that best fits. However, in my world it is not easy to determine, and I have seen a good bit of parameter sniffing issues over the years. The reasons for that are known, but they are not something that can be resolved easily.
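
    A minimal sketch of that local-variable pattern, using the query from the original question (the procedure name is made up):

    CREATE OR ALTER PROCEDURE dbo.GetOrderQty_LocalVar
        @ProductID int
    AS
        -- Copy the parameter into a local variable. The optimizer cannot
        -- sniff a local variable's value, so it builds the plan from the
        -- average density of ProductID instead of the first caller's value.
        DECLARE @LocalProductID int = @ProductID;

        SELECT SalesOrderDetailID, OrderQty
        FROM Sales.SalesOrderDetail
        WHERE ProductID = @LocalProductID;
    GO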