Agile Panel Discussion
Questions Remaining at End of the Panel Discussion
1. What kind of business lines do you support and business products/software do you create?
Lawrence Ludlow: Intelliware develops custom software for the Financial Services, e-Health, Services and Not-For-Profit and Regulatory sectors.
Declan Whelan: I am an independent consultant and I consult with clients that deliver these types of software:
Commercial business applications (automotive, aerospace, pre-fab construction)
Enterprise software (logistics)
Financial (retail banking)
Startup in semantic search
Medical (device manufacturers)
Embedded firmware (consumer electronics)
Alistair McKinnell: I've worked with teams to deliver advanced supply chain planning applications, workflow automation for pharmacists, and a registry for patients with chronic diseases.
2. What metrics do you use to report project status? How do you know if the project is in trouble?
Declan Whelan: Good agile teams will select metrics that are most important in their context. For some organizations time to market may be much more important than quality, and so the key relevant metrics will be different. So, make sure you understand the business value in your organization and select metrics that inform how you are doing against those values.
The best "general" metric is stakeholder happiness. If your stakeholders are pleased with the team output then you are likely in good shape.
Teams will know they are in trouble if they consistently do not meet their sprint objectives. Many teams will use burn-down charts to track progress and estimate release dates based on how quickly they are burning through the product backlog. Because this is measured every sprint, agile teams usually have a sense of whether they are in trouble after three iterations or so.
Deborah Hartmann wrote an excellent paper on agile project metrics. She suggests that metrics should be chosen to measure and improve some aspect of the process. Once the desired correction has been achieved then she suggests discontinuing the metric collection. This is because there is a cost to gathering and reporting metrics. The team should be focused on delivering value rather than generating metrics.
That said many teams use these metrics and reporting mechanisms:
- Sprint/iteration burn down to track team velocity
- Release burn down charts
- Number of open defects
- Software metrics
- Cyclomatic complexity
- Code coverage
- Size (LOC)
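The release burn-down idea above can be sketched in a few lines of code; the backlog size and per-sprint point totals here are invented purely for illustration:

```python
import math

def projected_sprints_remaining(backlog_points, completed_per_sprint):
    """Estimate sprints left from the average velocity of recent sprints."""
    velocity = sum(completed_per_sprint) / len(completed_per_sprint)
    remaining = backlog_points - sum(completed_per_sprint)
    # Round up: a partially filled sprint still costs a full sprint.
    return math.ceil(remaining / velocity)

# A hypothetical 100-point backlog after three sprints of 12, 10 and 14 points:
# velocity is 12, 64 points remain, so roughly 6 sprints are left.
print(projected_sprints_remaining(100, [12, 10, 14]))
```

Plotting the remaining points each sprint gives the burn-down chart; the projection above is just the slope of that line extended to zero.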
Alistair McKinnell: Working software at the end of each iteration is the fundamental metric. A project is in trouble when it has no sense of rhythm.
When a team is working well with Agile, every iteration is much like the next. There should be a steady pace. There should be no surprises. Customers should see working software at the end of each iteration. That is what I mean by rhythm.
3. Do Agile projects generally have exit criteria for each scrum/iteration? What are the exit criteria?
Lawrence Ludlow: Key elements of Agile development include frequent builds of the application and delivery to a customer environment. An iteration is generally considered complete when the completed functionality has been delivered to and signed off by the customer.
Declan Whelan: The exit criterion is simply the end of the sprint/iteration. The work is time-boxed. Agile teams will defer any incomplete work to the next sprint rather than extending the sprint. Doing this focuses teams on making commitments that they can meet and being disciplined on completing work. Also, in Scrum the team may choose to terminate the sprint early if the sprint objectives cannot be met.
Alistair McKinnell: I believe that the notion of an exit criterion is embedded in the definition of both a Sprint and an Iteration. Completing a Sprint or an Iteration is meaningless unless the work is done. Yet so many teams have failed with Agile because they couldn't make the end of a Sprint or Iteration crisp. That is, they couldn't define what it means to be done. I've worked with developers who think that done means: no more typing for me to do.
Extreme Programming practitioners have coined the phrase Done Done to firm up what it means to be done. Done Done means that everything is done and that the software is potentially shippable.
4. How does test automation fit into an agile environment?
Declan Whelan: The answer below depicts how a high-performing agile team works. Many teams do not do all of the automated testing I suggest. I encourage you to look at the Agile Testing matrix in my presentation as it identifies, at a high level, the types of tests and where automation can help.
Developer Tests (Q1)
Developers write automated unit tests and use these tests to drive design and refactoring. These tests are included in automated builds so they also serve as regression tests.
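As a sketch of that test-first workflow (the `add_money` example is hypothetical, not from the panel), the developer writes the failing test first, then just enough code to make it pass, and the same test runs in every build as a regression check:

```python
import unittest

def add_money(a_cents, b_cents):
    """Production code, written only to make the test below pass."""
    return a_cents + b_cents

class AddMoneyTest(unittest.TestCase):
    def test_adds_two_amounts(self):
        # Written first; it fails until add_money exists and is correct.
        self.assertEqual(add_money(250, 175), 425)

# Run the suite programmatically, as an automated build would.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(AddMoneyTest)
unittest.TextTestRunner().run(suite)
```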
Story Tests (Q2)
Customers (usually with help from the team) write story (functional) tests that are often automated through tools like Fit, FitNesse, Robot etc. These do *not* typically test through the user interface and instead test against the domain or business objects in the system. These tests are included in automated builds so they too serve as regression tests.
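A minimal sketch of the idea, assuming an invented `Order` domain object and discount rule; a Fit/FitNesse table would express the same input/expected-output examples declaratively, while the test still bypasses the UI entirely:

```python
class Order:
    """Hypothetical domain object exercised directly by the story test."""

    def __init__(self, subtotal):
        self.subtotal = subtotal

    def total(self):
        # Invented business rule: orders over $100 get a 10% discount.
        if self.subtotal > 100:
            return round(self.subtotal * 0.9, 2)
        return self.subtotal

# Customer-readable examples (subtotal, expected total), like rows of a Fit table.
examples = [(50, 50), (100, 100), (200, 180.0)]
for subtotal, expected in examples:
    assert Order(subtotal).total() == expected
```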
UI Tests (Q3)
Finally, if the product has a UI then additional automated UI testing should be done. However, the intent here is not to functionally test the business logic because this is better accomplished with functional tests. Instead, these tests make sure that the UI is correctly wired to the business logic. Automating testing of installers may also be beneficial.
Para-functional Tests (Q4)
There are many para-functional tests that can be automated such as stress tests, performance tests etc.
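As one small illustration (the function and its one-second budget are assumptions, not from the panel), an automated performance check can be as simple as a timed call with an assertion that fails the build when the budget is exceeded:

```python
import time

def hot_function(n):
    """Stand-in for a performance-sensitive operation."""
    return sum(i * i for i in range(n))

start = time.perf_counter()
hot_function(100_000)
elapsed = time.perf_counter() - start

# Hypothetical budget: fail loudly if the call takes longer than one second.
assert elapsed < 1.0, f"performance budget exceeded: {elapsed:.3f}s"
```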
For many organizations that have invested heavily in automated testing, the challenge often lies with the tools and the mindset behind the test automation. The agile mindset is to use tests as the success criteria and therefore as the driver for outstanding work. Regression testing is an important but secondary purpose. Therefore the tests need to be written before development and as the work is being done, which precludes many UI-centric tools that require a completed UI before test scripts can be generated.
It turns out that unit tests are the cheapest and easiest tests to write so most agile teams have lots of these. Some teams use unit test technology for story tests for this reason. Typical UI automated tests are in my opinion the most expensive and brittle of the tests so I try to minimize them. I encourage teams to replace them with story tests and exploratory testing wherever possible.
Alistair McKinnell: Test automation is essential. Manual testing is essential. Both kinds of testing can go horribly wrong. Your team may lack essential test automation technique. Watch out! You may find that the costs of automation exceed the benefits.
Developer Tests, also known as Unit Tests, are a win whether or not a project is Agile. My experience is that most teams already have or can learn sufficient skill to make this practice work for them.
Automated Developer Tests are essential because they
- allow pairs to communicate and to stay focused;
- support merciless refactoring;
- enable emergent design;
- define a necessary condition for release readiness.
Two books that I like are Test Driven Development: By Example and xUnit Test Patterns: Refactoring Test Code.
Customer Tests, also known as Acceptance Tests, may not be worth automating. My experience is that most teams do not have the skill to automate Customer Tests, and so the ongoing costs outweigh the benefits. Brian Marick has a detailed post that explores this possibility.
5. In the Agile iteration, when should we start to automate the scripts? (i.e. right after the requirements are ready or after the developer finishes the code?)
Lawrence Ludlow: Functionality is built on a Story by Story basis, a Story representing a small chunk of functionality that is valuable to the customer. Automated tests are coded as part of developing a Story. For a Story to be considered complete the tests must be completed and must run successfully.
Declan Whelan: Agile teams normally speak about story tests which really are the requirements. I think of these tests as executable specifications.
They are often written in a way that supports automation (see Fit, FitNesse etc.). Initially they would not be automated; the team, usually the developers, writes the "fixture" code that links the story tests to the business logic under test. So the automation is hooked in as the story is being completed.
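A rough sketch of such fixture glue, with invented names: the fixture is a thin adapter that a tool like Fit would call, and it simply delegates to the business logic under test:

```python
class DiscountLogic:
    """Hypothetical business logic under test."""

    def price_after_discount(self, price, percent):
        return price * (100 - percent) / 100

class DiscountFixture:
    """Glue code: translates a story-test row into a call on the logic."""

    def __init__(self):
        self.logic = DiscountLogic()

    def check(self, price, percent, expected):
        return self.logic.price_after_discount(price, percent) == expected

# One example row: $200 at 10% off should come to $180.
fixture = DiscountFixture()
assert fixture.check(200, 10, 180)
```

The story test stays readable to the customer; only the fixture knows how the system is actually invoked.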
As I described in question 4 above, I try to minimize automated UI testing, but when it is necessary it is typically written after the code is done, in my experience. I will typically do a record and playback and, once I am satisfied with the test, refactor it into a more maintainable test.
Alistair McKinnell: The beauty of automation is that ambiguity is chased away. The pain of automation is that it can't really be done until there is working code. Also, premature attention to speculative solution details may lead to overly constraining the eventual implementation and to wasted automation efforts. The key is to require precision prior to implementation. Brian Marick has a detailed post that explores this possibility.
6. How many days should be used to develop the automated scripts and execute the script (in general)?
Lawrence Ludlow: The time required to code automated tests for a Story can vary widely from hours to days depending on the size of the Story and the complexity involved.
Declan Whelan: The answer is... "it depends". Agile methods tend to be not too prescriptive about such things. The amount of time is really context dependent.
In my experience, good agile testers tend to focus away from scripting. Instead they focus on the story tests which define how the system should work and on exploratory testing for everything else. There is really little traditional UI-centric automated scripting on agile teams.
7. Do Agile team members need to be dedicated to one 'sprint' or can they work effectively being multi-tasked?
Declan Whelan: You should really strive for dedicated team members. There is a heavy hidden price for multi-tasking. Here are some strategies that some teams follow if this is a challenge in their environment:
- Time-box work done outside their "core" team.
- Dedicate team members to a sprint. So a team member that must work on, say, 2 teams might end up alternating sprints with each team. Some teams make their sprints really short to support this. I have even heard of 2-day sprints!
- Consider forming a larger team that takes responsibility for more than one product.
The suggestions above may or may not work in your context. If this is a challenge for your organization, work with stakeholders to see whether one of these suggestions, or perhaps an idea from your own team members, may help reduce the negative impacts.
Alistair McKinnell: I would say that it is an implicit belief of Agile that you want your team members to achieve a state of flow and that multi-tasking works against this. Any organization that wants to succeed with Agile must come to understand How Two Hours Can Waste Two Weeks.
8. How are personality conflicts addressed in an agile environment?
Lawrence Ludlow: At Intelliware teams work together in project rooms. Personality conflicts are resolved in the room by team members. Sometimes others such as Senior Management or Human Resources may also become involved, but this is very rare.
Declan Whelan: Agile methods don't prescribe specific conflict resolution strategies. However, agile methods tend to bring such conflicts to the surface because of the need for more face-to-face contact and teamwork. And there are some practices that can help:
Use a project charter to gain clarity and consensus on what is important to the team. Often conflict is a symptom of some underlying differences in assumptions about what is important.
If possible, have people volunteer to be on agile teams. This is not always possible but if you can you will likely have motivated individuals that will be willing to work through personality conflicts as they arise.
Agile teams should have a retrospective at the end of each sprint/iteration and reflect on what is going well and what is not. This is a good time to let such conflicts be aired in a non-blaming but direct manner. Teams are encouraged to highlight specific things that are impeding them rather than pointing fingers. For example, "design discussions are taking too long" would be preferable to "Jerry is a long-winded blabbermouth". I have found that regular retrospectives build trust and understanding on teams, which can often serve as a foundation for dealing with personality conflicts.
Many teams will have a ScrumMaster or coach who is able to objectively observe destructive behaviour and may be able to facilitate resolving it.
Sometimes the best solution may be for someone to exit from the team ... and the sooner you can identify this, the better! If you have personality conflicts poisoning your team it is important to correct this as soon as possible. If the team is unable to resolve it then seek professional help from outside the team.
Alistair McKinnell: The short answer is: just like on any other project.
Now, I have noticed that on Agile projects there are many opportunities for people to get to know each other. For example, I always advocate the practice of Team Lunch: once a week the team should make time to have lunch together.
Consider this principle from the Agile Manifesto: the most efficient and effective method of conveying information to and within a development team is face-to-face conversation. A team following this principle should get to know each other.
I've noticed that once a team gets to know each other by solving work related tasks together, team members are better equipped to resolve personality conflicts. Of course, YMMV.
The Extreme Programming Planning Game, used for release and iteration planning, defines a structured environment with well defined rules for a team to collaborate while performing the conflict inducing tasks of estimating and making commitments. A team using the Planning Game will get to know each other.
Questions Addressed During Panel Discussion
1. Does size of project and duration add any constraints to use or not to use the Agile methodology? Example: My project has about 400 developers, complex application and duration of about 4 years.
Declan Whelan: Typical Scrum/XP teams should not be greater than 10 or 12 people. So for a program with 400 developers you need to divide and conquer with multiple agile teams. There is a Scrum of Scrums methodology for scaling agile practices up through a hierarchy of Scrum teams. You may also want to look at Crystal Orange from Alistair Cockburn, which is targeted at teams of about 40.
Also you may want to check out the work of Jutta Eckstein who wrote a book called Agile Software Development in the Large.
2. On an Agile project, how do you explain the value of fixing defects to developers for each iteration instead of solely focusing on development of the next set of new features?
Declan Whelan: As the team starts they should have a project charter that sets common goals around code quality, and that bar should be set high. I like to use a goal like "we will ensure that the number of defects decreases over time" and put in place ways to measure this.
So, if defects are not being addressed this will show up as the team reviews their sprint. And then as a team effort they need to figure out how to address this quality problem.
The team's work needs to be aligned with the business, and ensuring this is primarily the role of the product owner or customer. The team should only select work from what has been agreed upon during sprint planning. So no one should be working on new features without finishing the work for the current sprint.
In practice, developers on agile teams tend to produce higher quality code through the use of test-driven development practices. They learn how much easier and faster it is to work on code with well designed unit tests.
If a developer's view of technical debt and quality is not aligned with the rest of the organization then perhaps the effort should shift from getting them to work on defects to instead on finding them another company to work for.
3. Should defects be logged and tracked or fixed immediately?
Declan Whelan: Do the simplest thing that could possibly work and do more if it is not working.
For example, for defects found during a sprint the simplest thing would be to simply fix the bug, rebuild and move on. If this is not possible then the next simplest thing might be to write it up on an index card in plain view of the team. If you are a distributed team then it may make sense to enter the defect into an issue tracking system.
For defects outside a sprint I think the simplest thing would be to enter the defect into a defect tracking system because fixing the issue is not part of the current sprint.
The important thing is to decide on a mechanism early and adjust it if it proves either too unwieldy or too informal. My rule of thumb is to avoid writing formal bug reports if I can because of the overhead to enter, scope, prioritize, track and report on them. My personal experience is that it can be up to 4 hours of work per defect. This can be a tremendous cost over the lifetime of a project. Instead I would prefer to focus that effort on the software itself.
4. If you don't record the defects even when fixed immediately, haven't we lost a log of "lessons learned" data?
Declan Whelan: I don't think so. The learning comes not from the recording of the defect but from the root cause analysis. The goal is to have high quality software and if defects are continually tripping the team up then it will be painfully obvious. This will surface in retrospectives whether or not a bug is recorded and the learning and recovery steps will come out of those retrospectives. Now if defects kept cropping up time and time again then I would suggest that the team put in place a strategy to fix this and that strategy may well include recording defects found after check ins. But once this was corrected and the situation stabilized then I would suggest that the team stop tracking such defects formally.
There is perhaps another way to look at this. An agile team is focused on delivering high quality valuable work. So it is important to minimize defects delivered outside the team, and it is critical that these be managed well. But it is a team effort. What happens inside the team is their business. I believe the focus of the team should be on delivering working software. Focusing on writing formal defects against missed requirements, bugs etc. on work in progress will just divert attention away from more important work.
Finally, if I was working with a team and they really wanted to record defects during a sprint and the product owner was supportive of this then I would recommend that they do so.
5. Since Agile testing does not focus on the user interface, how can we ensure the quality of the UI?
Declan Whelan: Agile testing should focus on all aspects of the system including the UI. I recommend against automating functional testing through the UI. I suggest some UI automation testing that provides feedback that the UI still works. I would also strongly suggest UI session-based testing and if applicable usability testing.
6. How do you manage teams in Agile? How is it different from the traditional way?
Declan Whelan: The big difference is that you don't manage teams. The team manages itself, and managers help the team by identifying and solving problems that the team can't solve by itself. Managers also serve as communication conduits from the team to senior management and vice versa.
7. Does Agile development support system documentation and are there challenges trying to transition coding to new development teams?
Declan Whelan: Agile practices do not address this directly; they suggest doing the minimum documentation that you can. In lean terms, system documentation is considered "waste", but in some cases it may be necessary waste.
The intent here is to focus attention on high quality code rather than on documenting poor quality code. So, if the team has bandwidth, there is higher value in improving code rather than writing detailed design documents, because a cleaner code base makes transitions much smoother. Great documentation on crappy code helps, but not very much in the long run.
However, I do recommend having some minimum high level architecture documentation that serves as a guide for new development. And teams should decide what documentation artifacts they feel are important to help them deliver new software. If the team is co-located this could be as simple as taking snapshots of whiteboards and archiving them.
I think it is important to shift the thinking from "moving to agile" to "moving to delivering better business value". Agility is only a means to an end and that end needs to be clearly articulated. If you are interested in making this transition you may want to read a white paper I wrote recently on this: http://dpwhelan.com/services/transitions.htm.
For documentation work I suggest you ask "Who is going to read this?" and "When are they going to read it?" and only do work that is going to be read by some stakeholder right away. When there is documentation I suggest treating it as a "story" or "task" so it becomes planned work. Alistair Cockburn recommends doing documentation as a "parallel and resource competing thread of the project" (Agile Software Development 2nd Edition, pg. 221).
8. What is the cost associated for moving to Agile?
Declan Whelan: This is really context dependent. The costs for a small start-up will be significantly less than a large multi-national development team.