Recently I was asked by one of GigaSpaces' customers about
our experience with Scrum. I thought it would benefit others as well if I published it
here. I have removed the customer's name and contact details, but this is what he wrote:
In addition to the technology, at the time we were impressed with the development processes you had put in place and in particular your implementation of Scrum.
I'm putting together a proposal for adoption of Scrum inside my organization for a new project we are kicking off. Your input on a few Scrum-related questions, and on how Scrum is implemented at GigaSpaces, would be appreciated.
- Requirements gathering and capture; Do you still have traditional R-Specs from a product management group or do you receive stories/backlog items? Does Sprint introduce any new issues with requirements capture and tracking?
- QA (who does it, when it gets done): My initial assumption and reading was QA were an active part of the Sprint and the Sprint does not complete until QA is done. I see some teams suggesting QA is external to the Sprint. How have you approached this? Does it raise integration issues? Do you have a separate integration/acceptance phase?
- Sprint (who is involved, how they are managed): Somewhat related to the above, do you re-form teams every sprint depending on content and expertise? What about management issues and the role of the traditional team leader?
- Managing a customer focused version release plan : How do you satisfy marketing requirements to publish plans for future release content and timelines?
- What works well?
- What doesn't (or didn't) work well?
Thanks in advance for any time you're able to spend on this.
I'll do my best to articulate the lessons we learned.
Requirements gathering and capture; Do you still have traditional
R-Specs from a product management group or do you receive
stories/backlog items? Does Sprint introduce any new issues with
requirements capture and tracking?
We use user stories to capture requirements. A person on the team, usually one with
field interaction, is designated as the "Customer Representative"; we call this role the "User Rep". The person playing this role is responsible for conveying the requirements to the feature team. The User Rep takes part in defining the user stories and is the one who "accepts" the feature once it is complete.
Initially, we captured all the stories in a single product backlog that I was responsible for. Now that we have introduced a PM team, we manage those stories and the backlog in a single JIRA project.
I must admit that it's easier to manage a single Excel product backlog; however, this does not scale. So the best option I've found so far is to capture future releases in a JIRA project, and manage the current release in a single spreadsheet.
Sprints create a challenge, in requirements gathering and in general, as your team may tend to focus on the current sprint and lose sight of the bigger picture of the project/product. However, this can be mitigated when there are at least three stakeholders who maintain a long-term view: the "product owner", the "system architect" and the "release manager".
QA (who does it, when it gets done): My initial assumption and
reading was QA were an active part of the Sprint and the Sprint does
not complete until QA is done. I see some teams suggesting QA is
external to the Sprint. How have you approached this? Does it raise
integration issues? Do you have a separate integration/acceptance phase?
There are several approaches and we’ve tried many of them. Our current approach is that QA is part of a sprint. Every sprint deliverable has defined release criteria. In our case it is considered “beta quality”. This means, for example, that there are no regressions in any of the automated tests. One of the keys to success here is to make sure most tests are automated and the coverage of the automated tests is as wide as possible. If you can’t do that up-front, then you need to have several sprints along the way that focus solely on system-wide tests.
One of the areas I found that Scrum books do not address is the transition to Scrum. It takes time and discipline to automate many of the manual QA tasks. This means, at least initially, that it is not practical to execute a full QA cycle in each sprint. To overcome this, we have done the following:
- All new content is completed and tested during the sprint, including
unit, integration and sanity testing
- We invested heavily in automation and coverage increase
- The project plan leaves time for system-wide testing coverage at the end of the release cycle
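To make the "no regressions in the automated tests" release criterion concrete, here is a minimal sketch of the kind of automated sanity suite a sprint gate might run. The component under test (a simple in-memory key/value store) and all names are illustrative assumptions, not GigaSpaces APIs:

```python
# Hypothetical sanity suite sketch; the KeyValueStore stand-in is
# illustrative only, not a real GigaSpaces component.

class KeyValueStore:
    """Stand-in for the component under test."""
    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data.get(key)


def run_sanity_suite():
    """Run the regression checks; return the number of failures."""
    store = KeyValueStore()
    failures = 0

    # Regression check 1: a written entry must be readable back.
    store.put("k1", "v1")
    if store.get("k1") != "v1":
        failures += 1

    # Regression check 2: overwriting a key must return the new value.
    store.put("k1", "v2")
    if store.get("k1") != "v2":
        failures += 1

    return failures


if __name__ == "__main__":
    failures = run_sanity_suite()
    # A non-zero failure count fails the build, so a regression
    # blocks the sprint from being declared "beta quality".
    print("SANITY PASS" if failures == 0 else f"SANITY FAIL: {failures}")
```

Wiring such a suite into the nightly build is what turns "no regressions" from a manual checklist into an enforceable sprint release criterion.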
Sprint and Structure
Sprint (who is involved, how they are managed) : Somewhat related to
above, do you reform teams every sprint depending on content and
expertise? What about management issues and the role of the traditional team leader?
We started with two teams and a "Scrum of Scrums" managed by myself. Now we have five teams and a Scrum of Scrums. The team leads act as Scrum Masters. Not all of them "master Scrum"; however, having a single rhythm for the entire project makes the whole team practice Scrum.
We started with four-week sprints and gradually transitioned to two-week sprints. The fundamental change team leaders have to go through is shifting to facilitation and review. This is a different style of leadership.
The expectation from the feature team is accountability: accountability for delivering content on time and for quality and completeness. No politics, and no “time stealing” by intentionally underestimating task durations to ensure “favorite tasks” are in the plan.
Plans and Roadmaps
Managing a customer focused version release plan : How do you
satisfy marketing requirements to publish plans for future release
content and timelines?
This is a real pain point you're touching on. Although all of us are in the software business, when it comes to customer/supplier relationships, we forget our own pains and ask for magic. Well, if this is how the game is played, then we're playing along.
What I'm trying to say is that there is a sufficiently high-level product backlog that is constantly maintained by Product Management and serves as the basis for the roadmap. We update this plan periodically (every four weeks) and keep an updated "uncommitted" product roadmap. The committed part is a release that has been planned and is being actively worked on, like the 6.1 release of GigaSpaces XAP, which we are working on now.
What works well?
In general, Scrum works very well for us. I believe we have improved our practices dramatically just by retrospecting on every sprint and fixing the most immediate and critical process and tools problems.
I recommend initially applying Scrum by the book, without alterations, and adapting it based on periodic feedback from the team.
Finally, it's important to remember that Scrum is a mindset.
I hope this information was helpful to the readers of this blog. If anyone has more questions about our experiences at GigaSpaces, please post a comment on this blog and I will be happy to answer.