How to Do Effective Capacity Planning on the Scrum Team
One major factor that drags the focus factor down is people being allocated to multiple projects, where the overhead of task switching comes into play. The focus factor lies in the range 0 to 1. When a team adopts agile as a new technique, there will frequently be a large backlog of stories that need to be estimated all at once. The smallest user story is one story point, and the biggest one is 21 story points.
What is Planning Poker?
The reason an exponential scale is used comes from information theory. The information we obtain from an estimate grows much more slowly than the precision of the estimate; in fact it grows as a logarithmic function. This is why larger items carry higher uncertainty. Determining the optimal base for the exponential scale is difficult in practice; the base corresponding to the Fibonacci scale may or may not be optimal.
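As a sketch of this idea (the base-2 scale here is my own illustration, not something from the article): on an exponential scale every step up multiplies the estimate by the same factor, so the gap between adjacent buckets is always the same *fraction* of the estimate, and the number of buckets needed to cover a range grows only logarithmically.

```python
import math

# Sketch: on an exponential scale with base b, adjacent buckets b**k and
# b**(k+1) differ by a constant *relative* amount (b - 1), however large
# the estimate gets.
b = 2.0
buckets = [b**k for k in range(6)]          # 1, 2, 4, 8, 16, 32
relative_gaps = [(hi - lo) / lo for lo, hi in zip(buckets, buckets[1:])]
print(relative_gaps)                         # every gap is (b - 1) = 1.0

# Equivalently, the number of distinguishable buckets needed to cover
# estimates from 1 up to N grows only logarithmically with N:
N = 1000
print(math.ceil(math.log(N, b)))             # about 10 buckets cover 1..1000
```

This is the "constant relative error" property: whether the item is 2 points or 32 points, the next bucket is one doubling away.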
Here is a more detailed explanation of the mathematical justification: out of the first six numbers on the Fibonacci scale, four are prime. This limits the options for breaking a task down equally into smaller tasks so that multiple people can work on it in parallel. Splitting that way could foster the misconception that a task's speed scales proportionally with the number of people working on it.
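A quick way to see this (a toy check of my own, not from the article): most of the small planning-poker Fibonacci values simply do not divide evenly among two, three, or four people.

```python
# Toy check: which of the first six planning-poker Fibonacci values
# split evenly among teams of 2, 3, or 4 people?
fib = [1, 2, 3, 5, 8, 13]
for n in fib:
    even_splits = [k for k in (2, 3, 4) if n % k == 0]
    print(n, "->", even_splits)
# 5 and 13 (both prime) admit no even split at all, so a "just divide this
# story among three devs" shortcut forces a genuine re-estimate instead.
```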
The Fibonacci sequence in fact forces one to re-estimate the smaller tasks one by one. According to this agile blog, I think it's also because Fibonacci numbers add an air of legitimacy. You definitely want something exponential, so that you can express any quantity of time with a constant relative error; the precision of your estimate is very likely to be proportional to the estimate itself. Now, why Fibonacci instead of 1, 2, 4, 8? My guess is that it's because Fibonacci grows more slowly. It is difficult to accurately estimate large units of work, and it is easy to get bogged down in hours-versus-days discussions if your numbers are too "realistic".
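To make the "grows slower" point concrete (my own illustration): consecutive Fibonacci numbers differ by a factor that settles near the golden ratio, about 1.618, while a doubling scale always jumps by a factor of exactly 2.

```python
# Compare step sizes: Fibonacci vs plain doubling.
fib = [1, 2, 3, 5, 8, 13, 21, 34, 55]
fib_ratios = [b / a for a, b in zip(fib, fib[1:])]
print([round(r, 3) for r in fib_ratios])   # settles toward ~1.618

powers = [2**k for k in range(9)]
pow_ratios = [b / a for a, b in zip(powers, powers[1:])]
print(set(pow_ratios))                      # always exactly {2.0}
```

The gentler ~1.618 step gives the scale more usable buckets in the range where most stories live, while still staying exponential.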
I like the explanation at http: As we add up all the uncertainties, we become less sure of what the hours actually should be. I would happily take an estimate of 13 hours for a task that seems twice as large as one I've previously estimated at 5 hours. Why is the Fibonacci series used in agile planning poker?
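The 13-versus-twice-5 intuition can be sketched like this (numbers and rounding rule are my illustration, under the assumption that you always round a naive estimate *up* to the next Fibonacci bucket):

```python
import bisect

# Sketch: a task that looks twice as large as a 5-hour task is naively
# 10 hours, but rounding *up* to the next Fibonacci bucket bakes in a
# margin for the extra uncertainty of the larger item.
buckets = [1, 2, 3, 5, 8, 13, 21]
naive = 2 * 5                                # 10 hours, naively
estimate = buckets[bisect.bisect_left(buckets, naive)]
print(estimate)                              # 13
print((estimate - naive) / naive)            # 0.3 -> ~30% uncertainty margin
```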
Probably just because the Fibonacci sequence is "cool". Another challenge I find with the FF (focus factor) approach, and with estimating a story based on the hours of the tasks within it: the team thinks about the hours and the tasks, and most of the time agrees with the estimate they have just produced.
Fair enough: with the hours estimated for each task, we have a scientific approach. Developers are pragmatic people and tend to follow the scientific reasoning of estimated hours rather than their gut feeling, especially at the end of a long meeting when everybody wants to leave for lunch.
The latter approach is crucial, because there are numerous tasks that will be forgotten, or that are not directly related to building the feature but are nonetheless useful: some troubleshooting, issues with a framework, a last-minute fix to the deployment automation, testing the feature locally, testing it on the integration environment, a walkthrough with the BA, and so forth. Most developers focus on the coding time, not the testing and deployment time.
As you said, adding a buffer is not a good idea either. I like the term focus factor, but as you are using it there is no relation to real data, which, by the way, can be readily available. So as you use it, I would call it a fudge factor: your instructions are to adjust it until things work out. You provide some excellent and reasonable rationales for adjusting it up or down, but they are not grounded in real data. Real data is available, so why not use it? Here are two ways. First, add up the hours the team actually spent on sprint work; divide this by the potential maximum and you have your focus factor. Second, add up the hours lost to interruptions and other non-sprint work; subtract this from the potential maximum, divide the remainder by the potential maximum, and you have your focus factor. The first choice may or may not be more accurate, but the second is less threatening to team members, and it may be accurate enough.
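The two calculations can be sketched like this (team size, sprint length, and the logged hours are all my own illustrative numbers):

```python
# Way 1: divide hours actually spent on sprint work by the potential maximum.
# Way 2: subtract hours *lost* (meetings, interruptions, support) from the
#        potential maximum, then divide the remainder by the maximum.
potential_max = 6 * 8 * 10        # 6 devs * 8 h/day * 10-day sprint = 480 h

focused_hours = 312               # way 1: hours logged against sprint tasks
ff_direct = focused_hours / potential_max

lost_hours = 168                  # way 2: hours logged to everything else
ff_indirect = (potential_max - lost_hours) / potential_max

print(ff_direct, ff_indirect)     # both 0.65 when the two logs are consistent
```

When the team logs both kinds of hours completely, the two ways agree; in practice way 2 only requires tracking the disruptions, which is why it feels less like surveillance.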