Use the dimension pages (Productivity, Predictability, Responsiveness, and Quality) to get a more detailed view of your team's progress. Each dimension represents a portion of your team's overall performance and has several charts, based on the metrics you specified in your applied scorecard.
The first chart is a percentile comparison of your team against all other companies that use CA Agile Central. To see all the individual metrics for a dimension, select the Balance – Show All Metrics scorecard or configure your scorecard to display the desired metrics. Hover over any data point on the charts for point-in-time details.
The charts of the individual metrics compare your team's metrics against your workspace average. Hover over the pie chart icon to see how much a particular metric contributes to the overall score for that dimension. If there is no pie chart icon, that metric does not contribute to your overall score and is shown for viewing purposes only.
Select the Drill Down arrow at the bottom of a metric chart to view details such as the individual work items (and their statuses) that contributed to the metric's score and the equation used to derive the data. The drill down is only available when viewing by month.
On metrics that depend on team size, you can choose to view all full-time equivalent (FTE) team members, and optionally export this list of FTEs to a .csv spreadsheet. Additionally, raw metrics allow you to export the drill down's data to a .csv spreadsheet.
Measure the amount of work your team completes in a given time period (throughput), based on either the number of items or the points accepted. You can also include other metrics in this percentile score. The higher your productivity score, the better. Understanding what drives your team's productivity score helps you identify where to improve.
Throughput is normalized for team size so you can accurately compare teams of different sizes. Only user stories contribute to your team's productivity score for built-in scorecards; however, you can still view defects and features in the drill down charts.
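The exact scoring formula isn't published, but the team-size normalization described above can be sketched as accepted items divided by FTE count. The function and field names below are illustrative, not CA Agile Central's actual implementation.

```python
# Hypothetical sketch: normalizing throughput for team size so teams of
# different sizes can be compared on equal footing.

def normalized_throughput(accepted_items: int, fte_count: float) -> float:
    """Accepted work items per full-time-equivalent (FTE) team member."""
    if fte_count <= 0:
        raise ValueError("team must have at least one FTE")
    return accepted_items / fte_count

# A 4-person team accepting 12 stories scores the same as an
# 8-person team accepting 24 stories.
print(normalized_throughput(12, 4))  # 3.0
print(normalized_throughput(24, 8))  # 3.0
```

This is why a small team is not automatically penalized against a large one: both are measured per FTE, not by raw item count.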
Measure how consistently your team finishes work over time, based on the mean and standard deviation of throughput during a three-month timeframe (variability of throughput). You can also include variability of velocity or time in process in this percentile score. The more consistently your team performs, the more predictable it is.
A high predictability score indicates your team consistently delivers the same throughput or velocity. A low Coefficient of Variation (CoV) means a high rate of predictability.
Since the metrics are based on a three-month timeframe, the chart typically has a curve to it. Only user stories are used to derive the predictability score.
Measure how quickly your team can deliver functionality after it is requested, based on time-in-process of user stories. Responsiveness is central to Agile: a high responsiveness score indicates a faster time-to-market and a communicative relationship with customers.
Time in Process (TiP) is the time it takes a work item to go from In Progress to Accepted. The median TiP (P50) of all work items completed in a given timeframe is the default value used for TiP in the Balanced scorecard.
Weekends, holidays, and non-work hours are not included in the TiP calculation.
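Putting the two rules above together (median of completed items, non-work days excluded), a simplified sketch might look like the following. It counts weekdays only; the real calculation also excludes holidays and non-work hours, which this sketch omits.

```python
# Hypothetical sketch: median Time in Process (TiP) over a timeframe,
# counting weekdays only. Illustrative, not the product's actual logic.
from datetime import date, timedelta
from statistics import median

def weekday_tip(start: date, end: date) -> int:
    """Weekdays elapsed between In Progress (start) and Accepted (end)."""
    days = 0
    d = start
    while d < end:
        if d.weekday() < 5:  # Monday-Friday only
            days += 1
        d += timedelta(days=1)
    return days

# Median (P50) TiP of all items completed in the timeframe.
tips = [
    weekday_tip(date(2023, 5, 1), date(2023, 5, 4)),   # 3 weekdays
    weekday_tip(date(2023, 5, 5), date(2023, 5, 9)),   # Fri -> Tue spans a weekend: 2
    weekday_tip(date(2023, 5, 8), date(2023, 5, 17)),  # 7 weekdays
]
print(median(tips))  # 3
```

Using the median rather than the mean keeps one unusually slow item from dominating the responsiveness score.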
Measure your team’s ability to prevent defects from occurring and to quickly resolve any that do occur, based on defect density. You can also include cumulative defect aging in this percentile score. A high Quality score can indicate that few defects are introduced into the project, that defects are resolved quickly, and that less time and fewer resources are spent on rework.
Defect density (normalized) is the number of defects per team member, both released and unreleased, including all priorities and environments (test, production, and so on). Released defect density represents only defects in the production environment. If you use the Environment field on your defects consistently, the released defect density metric will be more accurate.
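The distinction between overall and released defect density can be sketched as a simple filter on an Environment field. The field and value names below are assumptions for illustration, not the product's actual schema.

```python
# Hypothetical sketch: defect density normalized per team member. A defect
# is assumed "released" when its Environment field is set to "Production".

def defect_density(defects: list[dict], fte_count: float,
                   released_only: bool = False) -> float:
    """Defects per FTE team member, optionally production-only."""
    if released_only:
        defects = [d for d in defects if d.get("Environment") == "Production"]
    return len(defects) / fte_count

team_defects = [
    {"Environment": "Production"},
    {"Environment": "Test"},
    {"Environment": "Production"},
    {},  # Environment not set: excluded from released density
]
print(defect_density(team_defects, fte_count=4))                      # 1.0
print(defect_density(team_defects, fte_count=4, released_only=True))  # 0.5
```

Note how the defect with no Environment value drops out of the released figure; this is why inconsistent use of the field skews the released defect density metric.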