Integrate Pivotal Tracker with Jama Analyze and pull over more than a dozen recipes that will automatically update each day across all of your Tracker projects.
Combine your Pivotal Tracker data with qualitative metrics from your team using Jama Analyze Team Polls to get the full picture of your team’s performance.
Connecting to Pivotal Tracker
To set up Jama Analyze for Pivotal Tracker, you'll first need an existing account in Pivotal Tracker. On the Pivotal Tracker connector page—which can be accessed from the Integrations page—click Connect Pivotal Tracker and enter your Pivotal Tracker API token.
Your initial import may take several minutes, depending on the size of your projects. You will receive an email notification when your import is complete.
Please note, each project that you have access to in Pivotal Tracker will be part of the initial import. If you want to include only certain projects, we recommend creating a new user in Pivotal Tracker, assigning them to the projects you wish to import, and then using the API token associated with that account.
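If you want to check which projects a token can see before importing, you can query Pivotal Tracker's REST API (v5) directly, which authenticates requests with an `X-TrackerToken` header. A minimal sketch using only the Python standard library; the helper name is illustrative and not part of Jama Analyze:

```python
import urllib.request

API_ROOT = "https://www.pivotaltracker.com/services/v5"

def tracker_request(path, token):
    """Build an authenticated request for the Pivotal Tracker v5 API.

    Pivotal Tracker authenticates API calls with the X-TrackerToken
    header, so any token you paste into Jama Analyze can be sanity-
    checked the same way.
    """
    req = urllib.request.Request(API_ROOT + path)
    req.add_header("X-TrackerToken", token)
    return req

# To list the projects this token's user can access (the same set the
# initial import walks through), you would fetch:
#   urllib.request.urlopen(tracker_request("/projects", "your-api-token"))
```

Listing `/projects` with the dedicated user's token is a quick way to confirm the import will cover exactly the projects you intended.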
Adding recipes to your dashboards
Once your import is complete, return to the Pivotal Tracker page in Jama Analyze, where you can click through the recipe list to preview all of the pre-defined recipes.
To add recipes to your dashboards or directory, check the box next to each recipe you wish to add, then click the "Add recipes to dashboard" link to the right of the recipe list.
Selecting Recommended Dashboards
To help you get started, we have created a few recommended sets of metrics to add to your dashboard. Above the list of metrics, you'll see a dropdown where you can quickly pre-select development, product management, quality assurance, or backlog management focused metrics.
Once selected, simply add them to an existing or new dashboard as described above.
Switching between projects
If your Pivotal Tracker account has multiple projects, you will see a toggle that lets you switch between projects and view data for each one.
A breakdown of Pivotal Tracker recipes
Backlog burn rate
This Jama Analyze custom metric calculates how many sprints' worth of stories are estimated in your backlog, based on your current velocity. It’s a highly valuable KPI that you won’t find in your Pivotal Tracker analytics.
Backlog burn rate goes beyond velocity and burn down charts to show how ready your product team is at any given moment.
It’s not just a warning sign of falling behind; it can help you shape and communicate the timing of new features.
Learn more about this metric and other best practices for your team dashboard with the School of Little Data.
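Conceptually, backlog burn rate divides the estimated points sitting in your backlog by your current velocity. A minimal sketch of that calculation; the function name and inputs are illustrative, not Jama Analyze's actual implementation:

```python
def backlog_burn_rate(backlog_point_estimates, velocity):
    """Approximate how many sprints' worth of estimated stories are in
    the backlog, given the team's current velocity (points per sprint).

    backlog_point_estimates: point values of the estimated backlog
    stories (unestimated stories contribute nothing here).
    """
    if velocity <= 0:
        raise ValueError("velocity must be positive")
    return sum(backlog_point_estimates) / velocity

# e.g. 34 estimated points in the backlog at a velocity of 10 points
# per sprint leaves roughly 3.4 sprints of ready work.
```

A falling burn rate warns you that the team will soon run out of groomed stories; a steadily high one suggests the backlog can absorb new feature requests without immediate re-planning.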
These recipes will help your team better understand how accurately you are delivering stories throughout a sprint:
Cycle Times by Story Status
This Recipe tracks the median amount of time that tickets spend in each story state during the reported sprint. Adding cycle time recipes to your Pivotal Tracker Dashboard will help round out your ability to plan your roadmap.
You should expect teams with shorter cycle times to have a higher throughput and teams with consistent cycle times across many issues to be more predictable in delivering work.
Using this metric allows you to spot the results of changes in the team's processes almost immediately, so you can react and make adjustments right away.
The end goal should be a regular, short cycle time, regardless of the type of work.
When you track this metric, you should watch out for increasing or erratic cycle times.
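The per-state median can be computed directly from state-transition records. A sketch under assumed inputs (the data shape below is hypothetical, not the format Jama Analyze uses internally):

```python
from statistics import median

def median_time_in_state(transitions):
    """Median days spent in each story state.

    transitions: list of (state, days_in_state) pairs, one entry per
    ticket per state it passed through during the sprint.
    """
    by_state = {}
    for state, days in transitions:
        by_state.setdefault(state, []).append(days)
    return {state: median(days) for state, days in by_state.items()}
```

Because the Recipe uses the median rather than the mean, a single stuck ticket won't drag the whole state's number up, which is exactly why erratic medians are worth investigating.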
Cycle Times by Story Type
This metric shows the median time it took your stories to go from Started to Accepted during the reported sprint. The graph will display the cycle time for each one of the story types: bugs, features, and chores.
Compare cycle times to velocity and throughput in your Pivotal Tracker Dashboard. Finding the correlations between these three metrics will provide a much more reliable estimate of your team’s productivity, and will help to make a better roadmap forecast.
You can use this metric to identify stories with a cycle time exceptionally longer than the project's average.
For instance, if a story is delivered and then rejected several times, this could mean that the development team and the product team don't have a shared understanding of how the feature should work.
Introducing a new technology with a steep learning curve could also be a reason for a long cycle time. You can add this qualitative input to the Recipe in the expanded view.
Just click the Recipe card and add a comment to help everyone on your team understand the reasons behind the numbers.
This Recipe is a daily snapshot showing what state your points are in for the current iteration, so you can track how your sprint progresses over time.
A good sprint is one where stories are delivered and accepted at a steady pace, and only a few stories are started at a time. This means that you should expect to see a consistent climb of green columns, and small columns of yellow, blue and purple. Beware of red columns!
Irregularities or plateaus can help you identify blockers or trouble spots in your sprint.
This Recipe tracks the total number of features, bugs, and chores that have been accepted during the previous iteration.
In the extended view of this recipe, you'll see additional information like the mean, standard deviation, and volatility. To be reliable, throughput requires you to have similar-sized stories in your backlog. If your stories are not of a comparable size, your standard deviation will be higher and the range of your completion date estimates will be larger.
Like velocity, throughput is a team-specific number and cannot be compared across teams.
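The summary statistics in the extended view can be reproduced from a per-iteration acceptance count. A sketch under one common definition of volatility (the coefficient of variation, stdev / mean); the extended view may compute it differently:

```python
from statistics import mean, stdev

def throughput_stats(accepted_per_iteration):
    """Summarize throughput across iterations.

    accepted_per_iteration: number of stories accepted in each
    iteration. Volatility here is stdev / mean; a high value means
    similar-looking backlogs can finish on very different dates.
    """
    m = mean(accepted_per_iteration)
    s = stdev(accepted_per_iteration)
    return {"mean": m, "stdev": s, "volatility": s / m}
```

As the article notes, the standard deviation only stays meaningful when backlog stories are of comparable size; mixed story sizes inflate it and widen your completion-date estimates.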
Velocity is a measurement of how fast your team completes stories. Pivotal Tracker stories are assigned point values instead of due dates, so to calculate velocity, we average the number of points completed over the past few iterations.
Velocity measures the effort required to deliver value. Tracking how this metric evolves over time will reveal how consistent the team’s estimates are from sprint to sprint.
You can expect fluctuations in velocity within a reasonable range. Be alert if these fluctuations vary widely for more than one or two iterations. This behavior could mean that your estimates need to be reviewed.
Velocity is a simple but powerful metric, but tracked in isolation it can be misleading. Track both velocity and throughput on your Pivotal Tracker Dashboard. By comparing these metrics, you'll be able to identify the causes of a deviation in velocity. For example, if in one sprint you deliver twice as many bugs as you normally do, you’ll do less feature work.
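The averaging described above can be sketched in a few lines. The window size is illustrative; Pivotal Tracker lets you configure how many recent iterations feed the average:

```python
def velocity(points_per_iteration, window=3):
    """Average points accepted over the last `window` iterations.

    Pivotal Tracker derives velocity from recent iterations rather
    than a single sprint's total, which smooths out one-off spikes.
    """
    recent = points_per_iteration[-window:]
    return sum(recent) / len(recent)

# e.g. with the last three iterations at 10, 12, and 14 points,
# velocity is 12 points per iteration.
```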
Story Cycle/Lead/Reaction Time
- Story Cycle Time: Track the 6-month rolling average of the time (in days) it takes for stories to go from Started to Accepted.
- Story Lead Time: Use this recipe to monitor the 6-month moving average of the time (in days) it takes for stories to go from Created to Accepted.
- Story Reaction Time: Display in your Team KPI Dashboard the 6-month rolling average of the time (in days) it takes for stories to go from Created to Started.
Bug Cycle/Lead/Reaction Time
- Bug Cycle Time: This Recipe shows the 6-month rolling average of the time (in days) it takes for bugs to go from Started to Accepted.
- Bug Lead Time: Monitor the 6-month moving average of the time (in days) it takes for bugs to go from Created to Accepted.
- Bug Reaction Time: Use this Recipe to track the 6-month moving average of the time (in days) it takes for bugs to go from Created to Started.
Chore Cycle/Lead/Reaction Time
- Chore Cycle Time: Track the 6-month rolling average of the time (in days) it takes for chores to go from Started to Accepted.
- Chore Lead Time: This Recipe shows the 6-month moving average of the time (in days) it takes for chores to go from Created to Accepted.
- Chore Reaction Time: Monitor the 6-month rolling average of the time (in days) it takes for chores to go from Created to Started.
Feature Cycle/Lead/Reaction Time
- Feature Cycle Time: Track the 6-month moving average of the time (in days) it takes for features to go from Started to Accepted.
- Feature Lead Time: Use this Recipe to display the 6-month rolling average of the time (in days) it takes for features to go from Created to Accepted.
- Feature Reaction Time: Monitor the 6-month moving average of the time (in days) it takes for features to go from Created to Started.
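All of these Recipes share the same shape: a rolling average over items completed in the trailing six months. A sketch under assumed inputs (the data format and the 182-day window are illustrative):

```python
from datetime import date, timedelta

def rolling_average_days(durations, as_of, window_days=182):
    """6-month rolling average for cycle/lead/reaction time recipes.

    durations: list of (completed_on, days) pairs, e.g. the date a
    story was Accepted and the days it took from Started to Accepted.
    Only items completed within the trailing window are averaged.
    """
    cutoff = as_of - timedelta(days=window_days)
    in_window = [days for completed_on, days in durations
                 if completed_on >= cutoff]
    return sum(in_window) / len(in_window) if in_window else None
```

Swapping which two timestamps produce `days` (Created→Started, Started→Accepted, Created→Accepted) yields the reaction, cycle, and lead time variants respectively.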
This Recipe is a 6-month rolling total of the number of bugs and features that were rejected after their status was set to Delivered.
Delivering a quality product and keeping the customer happy is likely the ultimate shared goal of everyone in your company. Track this metric while keeping Defect Removal Efficiency in check to make sure you are appropriately testing your work at all stages and to help you communicate product quality organization-wide.
This is another Jama Analyze custom metric. We calculate this with your Pivotal Tracker data to display the 6-month rolling average of the time (in days) it takes for features and bugs to go from Delivered to Accepted. Add this recipe to your Team KPI Dashboard to track possible bottlenecks in your product team.
A long review time can be frustrating for developers, who deliver their work on time but see the deployment delayed. It can also affect your customer success if you are blocking the release of a requested feature and ultimately compromise the product roadmap.
If you see spikes in this graph, start a conversation with the product owner to determine the cause of the delay: Is the product team understaffed? Are they having trouble reviewing the stories? Is the staging environment experiencing issues?
This Recipe tracks the current number of stories that have been set to Delivered but have not yet been accepted. Like QA review time, you should expect steady behavior in this metric, since spikes can indicate bottlenecks.
The current number of tracked bugs / chores / features across the current iteration, the Backlog, and the Icebox. You can either track all the stories or split by type.
The current number of stories that have been started but have not yet been set to Delivered. Jama Analyze automatically calculates this Recipe from your Pivotal Tracker data, turning your data into actionable metrics.
Add this metric to your dashboard to track how much work your team is doing at any given moment.
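Counting work in progress from raw story data is straightforward. A sketch assuming the story objects carry a `current_state` field as the Pivotal Tracker API returns them; the helper name is ours:

```python
# In Pivotal Tracker a story moves through unstarted -> started ->
# finished -> delivered -> accepted, so "started but not yet
# Delivered" covers the started and finished states.
IN_PROGRESS_STATES = {"started", "finished"}

def work_in_progress(stories):
    """Count stories that have been started but not yet Delivered.

    stories: list of dicts with a "current_state" key, mirroring the
    story objects the Pivotal Tracker API returns.
    """
    return sum(1 for s in stories if s["current_state"] in IN_PROGRESS_STATES)
```

A persistently high count relative to your throughput suggests the team is juggling too many stories at once rather than finishing them.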