
VS ALM provides four different ways to surface reporting capabilities:

  • Work Item Queries (WIQ) - Work Item Queries can be created and saved for reuse.  They are easy to create, and can answer the majority of questions that users have (a minimal WIQL sketch appears after this list).
  • Excel Reporting - Work Item Queries can be exported to Excel for more advanced analysis of the query results.  Different views of the data, along with charts/graphs, can be created using standard Excel functionality.  Creating Excel Reports requires some expertise in Excel, but the resulting file can be saved and its data automatically refreshed from TFS.
  • SSRS (SQL Server Reporting Services) - This is the most advanced reporting option.  SSRS reports require a developer to create, but they provide the flexibility to report on and analyze any data stored in TFS in almost any way imaginable.
  • TFS Web Access (TWA) - TFS Web Access provides some basic charts/graphs that can be viewed in the browser (Burndown, Cumulative Flow, Velocity, etc.).  In addition, web-based charts/graphs can be generated from the results of Work Item Queries.  This is less flexible than the Excel option, but also more user-friendly and usable by any TFS user.
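
To make the WIQ shorthand used throughout this page concrete, here is a minimal sketch of a flat query in WIQL (the Work Item Query Language that Work Item Queries are built on).  [CoreLink.FoundInPhase] is an assumed reference name for the custom "Found In Phase" field described under Data Requirements, not a confirmed one.  Note that WIQL has no GROUP BY clause, so wherever the shorthand below says GROUP BY, the grouping would happen afterward in Excel or in the rendered report.

  SELECT [System.Id], [System.Title], [System.AreaPath]
  FROM WorkItems
  WHERE [System.WorkItemType] = 'Bug'
    AND [CoreLink.FoundInPhase] = 'Production'
  ORDER BY [System.AreaPath]

Here Area Path stands in for the owning team, a common TFS convention.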


Reports are designed to answer specific questions.  In this section we'll attempt to list the questions that CoreLink users may need to answer, and the various reports that are necessary to provide answers.  This list of questions/roles/reports should also help drive the various custom fields that are added to the Work Items.  Fields should only be added if they support one or more of the questions that CoreLink has deemed important.


Roles

  • Project Manager [PM]
  • Release Coordinator [RM]
  • Stable Team Tester [TST]
  • Stable Team Developer [DEV]
  • Enterprise Test [ET]


Bugs

How many bugs are we finding in production? Broken down by stable teams? [ET, PM]

WIQ - Bugs WHERE FoundIn = Production GROUP BY Team


How many bugs are being caught during Implementation testing? [PM, TST]

WIQ - Bugs WHERE FoundIn = Implementation


How many bugs are being caught during release testing? Broken down by stable teams? [PM, TST, RM]

WIQ - Bugs WHERE FoundIn = Release GROUP BY Team


How many bugs are related to new functionality vs regressions? [ET, TST]

WIQ - Bugs GROUP BY Regression(Y/N)


How many bugs are being found by manual tests vs automated tests? [ET]

WIQ - Bugs CONTAINS LINK TO (Test Cases WHERE AutomatedTestType = CodedUI)
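
This one calls for a link query rather than a flat query.  A rough WIQL sketch, assuming bugs filed from test runs carry the built-in Tested By link to their Test Case, and that the automation filter maps to the Test Case's built-in Automated Test Type field (the exact value to match depends on how the automation is registered):

  SELECT [System.Id], [System.Title]
  FROM WorkItemLinks
  WHERE [Source].[System.WorkItemType] = 'Bug'
    AND [System.Links.LinkType] = 'Microsoft.VSTS.Common.TestedBy-Forward'
    AND [Target].[System.WorkItemType] = 'Test Case'
    AND [Target].[Microsoft.VSTS.TCM.AutomatedTestType] = 'CodedUI'
  MODE (MustContain)

Bugs without such a link (or linked only to manual Test Cases) would make up the manual side of the comparison.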


Is our bug find-rate (aka quality) going up or down over time? [ET]

Excel - Graph of bugs found per month over last 12 months


What bugs have been found recently? Do they have test cases? [PM, TST, ET]

WIQ - Bugs WHERE DateFound > (@Today - 30), Show linked test case(s)
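
The date filter maps directly onto WIQL's built-in @Today macro.  A minimal sketch, assuming the find date is the built-in Created Date (a dedicated DateFound custom field, if added, would substitute here):

  SELECT [System.Id], [System.Title]
  FROM WorkItems
  WHERE [System.WorkItemType] = 'Bug'
    AND [System.CreatedDate] >= @Today - 30

Surfacing the linked Test Case(s) alongside each bug would mean running this as a direct-links query (FROM WorkItemLinks), as in the sketch above.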




Work Management

Is the team on track to meet the sprint goal/commitments? [PM]

Web Access / Reports - Burndown Graph(s)


Sample TFSWA Burndown


How much work should we commit to for the next sprint? [PM]

Web Access / Reports - Velocity Chart

TFSWA - Velocity


Is anybody on the team overloaded/underloaded? [PM]

Web Access - Work Assigned Chart

TFSWA - Work Assigned


Which user stories are ready for release? [PM, RM]

WIQ - User Stories WHERE (State = Resolved OR State = Closed) AND Linked Release = 123.456
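
"Linked Release" implies a link query against the proposed Release Work Item (see Data Requirements).  A rough sketch, assuming stories are tied to a Release through a plain Related link and that the release number is the Release Work Item's Title; both assumptions would be settled during customization:

  SELECT [System.Id], [System.Title], [System.State]
  FROM WorkItemLinks
  WHERE [Source].[System.WorkItemType] = 'User Story'
    AND ([Source].[System.State] = 'Resolved' OR [Source].[System.State] = 'Closed')
    AND [System.Links.LinkType] = 'System.LinkTypes.Related'
    AND [Target].[System.WorkItemType] = 'Release'
    AND [Target].[System.Title] = '123.456'
  MODE (MustContain)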


Which user stories are in progress? [PM, RM]

WIQ - User Stories WHERE State = Active


What is the progress/status of all in-progress user stories? [PM]

Web Access - Task Board

SSRS - Stories Overview

TFSWA - Task Board

Stories Overview Report - Graphic


What is the testing status of all the User Stories in a specific release? [PM, RM, TST]

  • Number of Test Cases authored
  • Number of Test Cases passing/failing/pending

SSRS - Stories Overview (Filtered by Release)


Are user stories being implemented with zero or too few tests? [PM, TST, ET]

WIQ - User Stories WHERE State = Closed AND No Linked Test Cases

SSRS - Stories Overview (Filtered by Release)
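
The "no linked Test Cases" condition is the interesting part: a flat query cannot express it, but a link query run with MODE (DoesNotContain) can.  A rough WIQL sketch, again assuming the Tested By link type connects stories to their Test Cases:

  SELECT [System.Id], [System.Title]
  FROM WorkItemLinks
  WHERE [Source].[System.WorkItemType] = 'User Story'
    AND [Source].[System.State] = 'Closed'
    AND [System.Links.LinkType] = 'Microsoft.VSTS.Common.TestedBy-Forward'
    AND [Target].[System.WorkItemType] = 'Test Case'
  MODE (DoesNotContain)

This only catches the zero-test stories; judging "too few" tests would fall to the Stories Overview report or an Excel count of linked Test Cases per story.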


What are the cycle times/lead time? Primarily from Dev Complete to Released. [PM, ET]

SSRS - Custom Report (Cycle Time Report)


Test Planning

Which tests are the most/least stable? [ET]

SSRS - Custom Report (Regression Test Planning)


Which tests are the most/least frequently run? [ET]

SSRS - Custom Report (Regression Test Planning)


Which tests are the most/least fragile (discover the most Defects)? [ET]

SSRS - Custom Report (Regression Test Planning)


Which tests haven't been run in the longest time? [ET]

SSRS - Custom Report (Regression Test Planning)


What is the progress/status of the Release Testing efforts (burndown/burnup against test plan)? [ET, RM]

SSRS - Test Plan Progress

Test Plan Progress Report


What tests are related to a specific screen, interp, or Domain/Sub-Domain? [ET, TST]

WIQ - Test Cases WHERE Screen = XYZ123


General Testing Maturity

(Once automated tests are run nightly) How often are the teams deploying code and breaking automated tests? Do they "get back to green" quickly? [DEV, ET]

SSRS - Build Success Over Time

Build Success Over Time


How many automated tests do we have? How is this growing over time? [ET]

SSRS - Custom (Test Automation Trends)


How many Manual Test Cases at the various quality levels do we have? How are they growing over time? [ET]

SSRS - Custom (Test Case Trends)


How many total tests are being run every week/month? What about every release testing phase? [ET, RM]

SSRS - Custom (Test Results By Release)


By Role

Project Manager

How many bugs are we finding in production? Broken down by stable teams?

How many bugs are being caught during Implementation testing?

How many bugs are being caught during release testing? Broken down by stable teams?

What bugs have been found recently? Do they have test cases?

Is the team on track to meet the sprint goal/commitments?

How much work should we commit to for the next sprint?

Is anybody on the team overloaded/underloaded?

Which user stories are ready for release?

Which user stories are in progress?

What is the progress/status of all in-progress user stories?

What is the testing status of all the ready-for-release user stories?

  • Number of Test Cases authored
  • Number of Test Cases passing/failing/pending

What is the testing status of all the in-progress user stories?

Are user stories being implemented with zero or too few tests?

What are the cycle times/lead time? Primarily from Dev Complete to Released.


Release Coordinator

Which user stories are ready for release?

Which user stories are in progress?

What is the testing status of all the ready-for-release user stories?

  • Number of Test Cases authored
  • Number of Test Cases passing/failing/pending

What is the progress/status of the Release Testing efforts (burndown/burnup against test plan)?

How many total tests are being run every week/month? What about every release testing phase?


Stable Team Tester

How many bugs are being caught during Implementation testing?

How many bugs are being caught during release testing? Broken down by stable teams?

How many bugs are related to new functionality vs regressions?

What bugs have been found recently? Do they have test cases?

What is the testing status of all the ready-for-release user stories?

  • Number of Test Cases authored
  • Number of Test Cases passing/failing/pending

What is the testing status of all the in-progress user stories?

Are user stories being implemented with zero or too few tests?

What tests are related to a specific screen or interp?


Stable Team Developer

(Once automated tests are run nightly) How often are the teams deploying code and breaking automated tests? Do they "get back to green" quickly?


Enterprise Test

How many bugs are we finding in production? Broken down by stable teams?

How many bugs are related to new functionality vs regressions?

How many bugs are being found by manual tests vs automated tests?

Is our bug find-rate (aka quality) going up or down over time?

What bugs have been found recently? Do they have test cases?

Are user stories being implemented with zero or too few tests?

What are the cycle times/lead time? Primarily from Dev Complete to Released.


Which tests are the most/least stable?

Which tests are the most/least frequently run?

Which tests are the most/least fragile (discover the most bugs)?

Which tests haven't been run in the longest time?

What is the progress/status of the Release Testing efforts (burndown/burnup against test plan)?

What tests are related to a specific screen or interp?


(Once automated tests are run nightly) How often are the teams deploying code and breaking automated tests? Do they "get back to green" quickly?

How many automated tests do we have? How is this growing over time?

How many Manual Test Cases at the various quality levels do we have? How are they growing over time?

How many total tests are being run every week/month? What about every release testing phase?



Data Requirements

To support all of these reports/questions, specific pieces of data need to be collected.  This includes:

  • Test Cases must be linked to a User Story which they test
    • Note: A Test Case may be linked to more than one User Story
    • Note: There may be some non-functional or general purpose tests created that aren't linked to a User Story.  However, the vast majority of functional tests created by Stable Teams should be linked to the relevant User Stories.
  • A Release should be tracked using a TFS Work Item
  • User Stories should be linked to the Release Work Item
    • Note: It will need to be decided whether a User Story can span multiple Releases or not.  This has significant implications for Reporting.
    • Option #1 is every User Story has One Release.  If a User Story needs to span Releases, multiple User Stories are created.
    • Option #2 is a User Story may be linked to multiple Releases, and the Reports/Queries will need to be updated to deal with this additional complexity.
  • Bugs should be tracked as TFS Work Items with the following important pieces of metadata (a query sketch follows this list)
    • Team that owns the Bug
    • Found In Phase (Implementation Testing, Release Testing, Production)
    • Regression (Yes/No)
    • WI Link to Test Case if found as the result of a Test Run
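
Put together, this bug metadata would let a single flat query answer most of the bug questions above.  A sketch using assumed reference names for the custom fields (the real names would be fixed during Work Item customization) and using Area Path as a stand-in for the owning team:

  SELECT [System.Id], [System.Title], [System.AreaPath]
  FROM WorkItems
  WHERE [System.WorkItemType] = 'Bug'
    AND [CoreLink.FoundInPhase] = 'Release Testing'
    AND [CoreLink.Regression] = 'Yes'
  ORDER BY [System.AreaPath]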

Custom Report Development

Based on the above list, the following report customization tasks would need to be completed:

  • New Excel Custom Report - Bugs Found Over Time
  • Customize SSRS Report - Extend the default Stories Overview Report with additional filtering capabilities (by Release)
  • New SSRS Report - Cycle Time Report
  • New SSRS Report - Regression Test Planning Report
    • Fragile Tests (tests with lots of linked defects and/or test run failures)
    • Infrequently Run Tests
    • Stable Tests (tests where the test case doesn't frequently change)
  • New SSRS Report - Test Automation Trends
    • Show the total number of automated tests over time
  • New SSRS Report - Test Case Trends
    • Show the total number of Test Cases in the various quality tiers over time (cumulative flow diagram)
  • New SSRS Report - Test Results By Release
    • Show the total number of Test Results run as part of each Release Test Plan
    • Probably need to differentiate by type of Release (Hotfix, Minor, Major)