Define performance benchmarks and success criteria


A performance benchmark is a metric or point of reference that gives evidence that the solution being built during implementation can meet the business performance objectives and constraints.

Performance benchmarks confirm that the solution can process the targeted transaction and user volumes within an acceptable duration or response time, starting from a specific data volume.

Performance benchmarks answer questions that are related to handling real-life workloads and thousands of concurrent users. They also answer questions about performance and scalability in the years after go-live, performance during rollouts to other countries after the first go-live, and so on.
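As an illustration of what "acceptable response time at a targeted volume" means in practice, the following is a minimal sketch of a benchmark check in Python. It runs a scenario repeatedly and compares the 95th-percentile response time against a target; `run_scenario` and the target value are placeholder assumptions, not part of any product API.

```python
import statistics
import time

def run_scenario():
    # Placeholder workload; in a real benchmark this would execute
    # the targeted business transaction against the benchmark environment.
    time.sleep(0.01)

def benchmark(scenario, iterations=20, target_p95_seconds=0.5):
    """Time the scenario repeatedly and check p95 against the target."""
    durations = []
    for _ in range(iterations):
        start = time.perf_counter()
        scenario()
        durations.append(time.perf_counter() - start)
    # statistics.quantiles with n=100 returns 99 cut points; index 94
    # is the 95th percentile.
    p95 = statistics.quantiles(durations, n=100)[94]
    return {"p95": p95, "passed": p95 <= target_p95_seconds}

result = benchmark(run_scenario)
print(f"p95={result['p95']:.3f}s passed={result['passed']}")
```

A real benchmark would drive concurrent users and start from a representative data volume, but the pass/fail structure is the same: measure, then compare against the agreed target.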

Develop a performance tuning process to meet performance objectives

Performance testing is an iterative approach and requires a defined process with a life cycle and clear steps. Some tests need to run in a loop until the required result is achieved. Make sure that you're clear on the performance goals and that you prioritize the tuning scenarios.

The typical performance tuning process includes the following steps:

  1. Narrow it down - This step is the first for each scenario. Find out where you lose the most time, and then focus your efforts on that point. For example, validate whether few or many calls exist, whether the process is running or waiting, and so on.
  2. Troubleshoot - Analyze why that part of the process is slow. The cause could be configuration, looping, row-by-row operations, or resource contention, such as locking or single threading.
  3. Create a fix - Consider the lead time for Microsoft or partner/provider hotfixes. You might be able to fix the issue through an extension.
  4. Evaluate - Validate that the performance goal has been met.
  5. Test the new solution.
  6. Repeat the process or deploy the solution.
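The iterative measure-fix-evaluate loop above can be sketched as follows. The `measure` and `apply_fix` callables are hypothetical placeholders standing in for the real analysis and fix steps; the sketch only shows the control flow of the process.

```python
def tune(measure, apply_fix, goal_seconds, max_iterations=5):
    """Repeat measure -> troubleshoot/fix -> evaluate until the goal is met."""
    for iteration in range(1, max_iterations + 1):
        duration = measure()            # narrow it down: measure where time goes
        if duration <= goal_seconds:    # evaluate: has the goal been met?
            return iteration, duration  # deploy the solution
        apply_fix()                     # troubleshoot and create a fix
    raise RuntimeError("Performance goal not met within the iteration budget")

# Toy usage: assume each "fix" halves the measured duration.
state = {"duration": 8.0}
iterations, final = tune(
    measure=lambda: state["duration"],
    apply_fix=lambda: state.update(duration=state["duration"] / 2),
    goal_seconds=2.0,
)
print(iterations, final)  # 8.0 -> 4.0 -> 2.0: goal met on iteration 3
```

The iteration cap mirrors real projects: if the goal isn't reached within the planned tuning iterations, the scenario is escalated rather than looped indefinitely.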

Sample performance benchmark activities RACI

Part of the strategy definition is to define roles and responsibilities. The following table shows sample performance benchmark activities and how responsibility for them is divided between the customer and the implementation partner.

RACI abbreviations:

  • R - Responsible
  • A - Accountable
  • C - Consulted
  • I - Informed
| Activity | Partner (sample) | Customer (sample) |
|---|---|---|
| Define the target/projected business goals | I | A, R |
| Define the detailed benchmark scenarios | R, I, C | A |
| Take task recordings and document the reproduction steps | I | A, R |
| Provide the environment artifacts (code build and database to use) | I | A, R |
| Build the benchmark environment | R | A |
| Create test scripts and data scripts | R | A |
| Run the performance benchmark | R | A |
| Deliver the performance benchmark report | R | A |
| If bugs occur in the standard solution, open a support request to Microsoft | C | A, R |

Performance benchmark outcomes

Performance benchmarks confirm that the solution performs the critical business scenarios as expected. Key benchmark deliverables include the performance benchmark report, the issues detected and fixed in each iteration, and the optimizations performed in each iteration.