

Test Dashboard (Agile)

By using the Test dashboard, you can monitor test activities, report on progress, find gaps in test coverage, and identify test areas that may require additional investigation. This dashboard displays five reports that provide information about testing that has occurred in the most recent four weeks.

Note

You access dashboards through your team project portal. You can access the Test dashboard only if that portal has been enabled and is provisioned to use Microsoft Office SharePoint Server 2007. For more information, see Dashboards (Agile) or Access a Team Project Portal and Process Guidance.

In this topic

  • Data That Appears in the Dashboard

  • Required Activities for Tracking Testing

  • Monitoring Test Progress

  • Determining Gaps in Testing

  • Monitoring Test Failures and Regressions

  • Customizing the Test Dashboard

You can use this dashboard to answer the following questions:

  • Is the authoring of Test Cases on track?

  • Has the team defined Test Cases for all User Stories?

  • What are the proportions of Test Cases that are passing, failing, and blocked?

  • Do test failure metrics indicate a problem that requires further investigation?

  • What is the status of last night's build?

  • What are the most recent check-ins?

Required Permissions

To view the dashboard, you must be assigned or belong to a group that has been assigned the Read permission in SharePoint Products for the team project. To modify, copy, or customize a dashboard, you must be assigned or belong to a group that has been assigned the Members permission in SharePoint Products for the team project. For more information, see Add Users to Team Projects.

To modify a report in Office Excel, you must be a member of the TfsWarehouseDataReaders security role in SQL Server Analysis Services and you must be assigned or belong to a group that has been assigned the Members permission in SharePoint Products for the team project. For more information, see Grant Access to the Databases of the Data Warehouse for Visual Studio ALM.

To view a work item, you must be a member of the Readers group or your View work items in this node permission must be set to Allow. To create or modify a work item, you must be a member of the Contributors group or your Edit work items in this node permission must be set to Allow. For more information, see Managing Permissions.

Data That Appears in the Test Dashboard

You can use the Test dashboard to understand how well the team is progressing in testing the User Stories. Specifically, this dashboard displays the Web parts that the following illustration shows and the following table describes.

Web Parts for Test Progress Dashboard

Note

The Test Plan Progress, Test Case Readiness, User Story Test Status, and Test Activity reports are available only when the team creates test plans and runs tests by using Test Runner and Microsoft Test Manager. For information about how to define test suites and test plans, see Organizing Test Cases Using Test Suites.

The burndown, progress, and trend reports (Step 1 through Step 5) do not appear when the server that hosts Analysis Services for the team project is unavailable.

Web part

Data displayed

Related topic

Step 1

Stacked area graph of the test results of all Test Cases grouped into their most recent recorded outcome during the past four weeks. Outcomes include Never Run, Blocked, Failed, and Passed.

Test Plan Progress Excel Report

Test Plan Progress Report

Step 2

Stacked area graph that shows how many Test Cases have been in the Design or Ready state for the most recent four weeks.

Test Case Readiness Excel Report

Test Case Readiness Report

Step 3

Horizontal bar chart that shows the count of test results for each combination of Test Case and test configuration that is defined for each User Story. The chart groups the test results according to their most recent test run, where the options are Passed (green), Failed (red), Blocked (purple), or Not Run (gray).

User Story Test Status Excel Report

User Story Test Status Excel Report (Agile)

Step 4

Line chart that shows the cumulative count of all results run for all manual Test Cases during the most recent four weeks.

Test Activity Excel Report

Test Activity Excel Report

Step 5

Stacked area graph that shows the cumulative count of all failed outcome results for Test Cases, sorted by failure type, during the most recent four weeks. Failure types include Regression, New Issue, and Known Issue.

Failure Analysis Excel Report

Failure Analysis Excel Report

Step 6

List of upcoming events. This list is derived from a SharePoint Web part.

Import Events Web part

Not applicable

Step 7

Count of active, resolved, and closed work items. You can open the list of work items by clicking each number. This list is derived from a Team Web Access Web part.

Project Work Items Web part

Work Items and Workflow (Agile)

Step 9

List of recent builds and their build status. You can view more details by clicking a specific build. This list is derived from a Team Web Access Web part.

Recent Builds Web part

Legend: Build in Progress, Build Not Started, Build Succeeded, Build Failed, Build Stopped, Build Partially Succeeded

Managing and Reporting on Builds

Step 10

List of the most recent check-ins. You can view more details by clicking a specific check-in. This list is derived from a Team Web Access Web part.

Recent Checkins Web part

Develop Code and Manage Pending Changes

Required Activities for Tracking Testing

For the reports in the Test dashboard to be useful and accurate, the team must perform the following activities:

  • Define Test Cases and User Stories, and create Tested By links from Test Cases to User Stories.

  • Define Test Plans, and assign Test Cases to Test Plans. For more information, see Defining Your Testing Effort Using Test Plans.

  • For manual tests, mark the results of each validation step in the Test Case as passed or failed.

    Important

    Testers must mark each validation step with a status. The overall result for a Test Case reflects the status of all the test steps that the tester marked. Therefore, a Test Case has a status of failed if the tester marked any validation step as failed or left it unmarked.

    For automated tests, each Test Case is automatically marked as passed or failed.

  • (Optional) To support filtering, assign Iteration and Area paths to each Test Case.
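The pass/fail rule in the Important note above can be sketched in a few lines. This is an illustration only: the function name and the representation of step outcomes as strings (with `None` for an unmarked step) are assumptions, not part of the product.

```python
def overall_result(step_outcomes):
    """Compute a manual Test Case outcome from its validation steps.

    A Test Case passes only if every validation step was explicitly
    marked as passed; a step that failed or was left unmarked (None)
    causes the whole Test Case to be recorded as failed.
    """
    if all(outcome == "Passed" for outcome in step_outcomes):
        return "Passed"
    return "Failed"

# A single unmarked validation step fails the Test Case.
print(overall_result(["Passed", None, "Passed"]))  # Failed
print(overall_result(["Passed", "Passed"]))        # Passed
```

The rule is deliberately strict: an unmarked validation step is treated the same as a failed one, which is why testers must record a status for every validation step.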

Monitoring Test Progress

You can use the first three reports in the Test Dashboard to monitor test progress and answer the questions in the following table.

Report

Questions answered

Notes

Test Case Readiness

  • How many Test Cases has the test team defined?

  • How many Test Cases are ready to run today?

  • How many Test Cases must the team still write and review?

  • Does the overall number of Test Cases appear to be sufficient for the number of User Stories that the team is implementing?

  • What percentage of Test Cases can the test team run today?

  • Will the team be able to prepare all the Test Cases by the end of the iteration?

  • Healthy progress shows a steady increase in the number of Test Cases that the team is designing and moving to the ready state.

  • Unhealthy progress shows that no or few Test Cases are ready to be run.

    When all test cases remain in a design state for a long time, an issue may block progress. You might want to investigate the cause of the blockage.

  • A gap in testing may develop if the number of Test Cases does not appear sufficient.

    The number of Test Cases that are defined for a project should be equal to or larger than the number of User Stories that the team is implementing.

Test Plan Progress

  • How many Test Cases are passing?

  • How many Test Cases are failing?

  • How many Test Cases are blocked?

  • How many Test Cases have never run?

  • What percentage of Test Cases are passing across all Test Plans?

  • How much testing has the team completed?

  • Is the team likely to finish the testing on time?

  • As the development cycle progresses, more Test Cases should pass, and fewer should remain in the other states.

  • Unhealthy progress occurs when too many Test Cases fail. Depending on where you are in the product cycle, you might investigate why so many test cases are failing.

  • If the number of Test Cases that are failing or never run is flat, you might want to investigate the specific causes that affect each area.

User Story Test Status

  • Are Test Cases being run for each User Story?

  • If Test Cases are blocked or not being run, does the team understand the blocking issues and are they being addressed?

  • Healthy progress shows most Test Cases for each User Story are passing.

  • Unhealthy progress is indicated when too many Test Cases for a specific User Story are in the Never Run, Blocked, or Failed state. You might want to investigate the causes that keep the Test Cases that are defined for a User Story from passing.
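The Test Plan Progress report groups each Test Case by its most recent recorded outcome. That grouping can be sketched as follows; the tuple layout of the exported results and the sample data are hypothetical, used only to show the "latest outcome wins" logic.

```python
from collections import Counter
from datetime import date

# Hypothetical export: (test_case_id, run_date, outcome).
# A Test Case that was never run has no run date.
results = [
    (1, date(2010, 5, 3), "Failed"),
    (1, date(2010, 5, 10), "Passed"),   # later run supersedes the failure
    (2, date(2010, 5, 4), "Blocked"),
    (3, None, "Never Run"),
]

# Keep only the most recent result for each Test Case.
latest = {}
for case_id, run_date, outcome in results:
    prev = latest.get(case_id)
    if prev is None or (run_date and (prev[0] is None or run_date > prev[0])):
        latest[case_id] = (run_date, outcome)

# Tally the latest outcomes, as one day's slice of the stacked area graph.
tally = Counter(outcome for _, outcome in latest.values())
print(dict(tally))  # {'Passed': 1, 'Blocked': 1, 'Never Run': 1}
```

Repeating this tally for each day of the four-week window produces the series that the stacked area graph plots.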

Determining Gaps in Testing

You can use the User Story Test Status report to determine whether tests are covering all the code and to answer the following questions:

  • Which User Stories have a low overall count of Test Cases?

  • Which User Stories have a high overall count of Test Cases that are blocked or have never been run?

  • Does the Test Case coverage for each User Story meet expectations?

  • Which User Stories have a high rate of test failures?

  • What is the average number of Test Cases that are defined for each User Story?
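The coverage questions above reduce to counting the Tested By links from Test Cases to each User Story. A minimal sketch, assuming the links have been exported as a mapping (the story names, the data shape, and the gap threshold of 2 are all illustrative):

```python
# Hypothetical export of Tested By links: User Story -> linked Test Case IDs.
tested_by = {
    "Story A": [101, 102, 103],
    "Story B": [104],
    "Story C": [],
}

counts = {story: len(cases) for story, cases in tested_by.items()}

# Average number of Test Cases defined per User Story.
average = sum(counts.values()) / len(counts)

# Flag stories with suspiciously few Test Cases; the threshold is a team choice.
gaps = [story for story, n in counts.items() if n < 2]

print(round(average, 2))  # 1.33
print(gaps)               # ['Story B', 'Story C']
```

Stories that appear in the `gaps` list are candidates for additional Test Case authoring, matching the guideline that the number of Test Cases should be at least as large as the number of User Stories.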

Monitoring Test Failures and Regressions

By monitoring test failures, you can identify and address problems in the code early. You can use the last two reports in the Test Dashboard to gain better insight into the number of tests that are failing.

Report

Questions answered

Notes

Manual Test Activity

  • Is the number of tests that the team has never run decreasing?

  • Is the team minimizing the overall number of blocked tests?

  • Are fewer tests failing over time?

  • Are more tests passing?

  • Does the test activity contain spikes that you cannot account for?

The Manual Test Activity report indicates the results of each Test Case run for each test configuration and for all test plans. Spikes may be early indicators of problems in either the test activity or the quality of the code that the team is checking in.

You might want to check the metrics for recent builds, bug status, and code churn to determine whether any of them can help explain the changes.

Test Failure Analysis

  • How many tests are regressing?

  • Is the team keeping the overall number of regressions or test failures within expected ranges or team goals?

  • Is the team addressing both newly identified issues and known issues in a timely manner?

A healthy Test Failure Analysis report shows moderate numbers of new issues, known issues, and regressions. If any spikes occur in these areas, the team might need to investigate further. Spikes may indicate problems in either the test activity or the quality of code that the team is checking in.

Also, you might want to check the metrics for recent builds, bug status, and code churn to determine whether any of them can help explain the changes.
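The three failure types that the Test Failure Analysis report charts can be distinguished with a simple rule of thumb: a failure of a test that previously passed is a regression; a failure already tracked by a linked bug is a known issue; anything else is a new issue. The sketch below encodes that heuristic; it is a plausible classification, not the product's exact algorithm.

```python
def classify_failure(previous_outcomes, has_linked_bug):
    """Assign a failed test result to one of the failure types shown in
    the Failure Analysis report. Heuristic sketch only."""
    if "Passed" in previous_outcomes:
        return "Regression"   # the test used to pass and now fails
    if has_linked_bug:
        return "Known Issue"  # the failure is already tracked by a bug
    return "New Issue"        # first time this failure has been seen

print(classify_failure(["Passed", "Failed"], has_linked_bug=False))  # Regression
print(classify_failure([], has_linked_bug=True))                     # Known Issue
print(classify_failure(["Failed"], has_linked_bug=False))            # New Issue
```

A spike in the Regression series under this rule means tests that were once green are failing again, which is usually the first category worth investigating.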

Customizing the Test Dashboard

You can customize the Test dashboard in the following ways:

  • Change the filters of each report in Office Excel to focus on specific product areas or iterations. 

  • Filter the Manual Test Activity report in Office Excel for specific test plans or on Test Cases that are either manual or automated.

  • Add existing Excel reports such as Bug Status, Code Churn, and Code Coverage to the dashboard.

  • Create and add reports in Office Excel that show progress by specific members of the team. For an example, see Bugs by Assignment Excel Report.

For more information about how to work with and customize reports in Office Excel, see the documentation on the Microsoft Web site.
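Filtering a report by Area or Iteration selects a subtree of the hierarchical, backslash-separated path (for example, `MyProject\UI\Login` falls under `MyProject\UI`). That prefix match can be sketched as follows; the project and area names are made up for the example.

```python
def under_path(item_path, filter_path):
    """True if a work item's Area or Iteration path equals filter_path
    or falls anywhere beneath it in the path hierarchy."""
    item = item_path.split("\\")
    prefix = filter_path.split("\\")
    return item[:len(prefix)] == prefix

# Hypothetical Test Cases with their Area paths.
test_cases = [
    ("TC 1", r"MyProject\UI"),
    ("TC 2", r"MyProject\UI\Login"),
    ("TC 3", r"MyProject\Services"),
]

filtered = [name for name, area in test_cases
            if under_path(area, r"MyProject\UI")]
print(filtered)  # ['TC 1', 'TC 2']
```

Splitting on path segments rather than testing a raw string prefix avoids false matches such as `MyProject\UIHelpers` being counted under `MyProject\UI`.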

See Also

Concepts

Defining Your Testing Effort Using Test Plans

Running Manual Tests Using Test Runner

Running Automated Tests

Test Case

User Story (Agile)

Test Case Readiness Report

Test Plan Progress Report

Dashboards (Agile)

Artifacts (Agile)