I organised a workshop on Specification by Example yesterday at the Progressive .NET mini-conference in London, demonstrating how realistic examples are a very effective tool for flushing out incorrectly assumed rules and pointing to the real business rules behind software requirements.
This was the first time I had done something like this at a public event. The exercise is one of the central parts of my Specification by Example workshop, but I had always presented it on-site at companies, where we work on the domain of that particular company and all the participants know quite a bit about it. At a public event that is a challenge, as participants work for different companies and on different domains. I chose the Blackjack game as the domain because it is one of the most popular casino games. Not many people go to casinos, though, so I expected that there would be five or six people in the room who knew how to play and could act as domain experts or customer representatives. Blackjack rules are fairly simple, so an hour of demonstrating with playing cards would be enough to introduce the rules to people with no previous exposure to the game. It was a bit of a challenge to buy ten packs of playing cards on a Sunday evening where I live, but I think it was worth it.
The experiment
To simulate a situation where a customer points the development team at a competitor's site and asks them to copy some functionality, I went to a popular online blackjack site and copied the game rules. The interesting thing about these rules is that, although they fit on a single A4 page, they contained quite a few inconsistencies and functional gaps. For example, one rule stated that the house always wins if the dealer has a Blackjack, and another stated that the player gets his money back if the player and the dealer have the same cards, so the case where both the player and the dealer have a Blackjack was ambiguous. Some edge cases were not properly explained, which left a lot of room for ambiguity and misunderstanding. One particular rule dealing with the value of the Ace card was very uncommon, as if it were intended to give the house more of an advantage over players. I'm pretty sure it was not implemented like that even on the original site, but I was interested in finding out whether the workshop participants would complain about it.
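To make that conflict concrete, here is a minimal sketch in Python of the two rules taken literally. The function names and the payout convention (winnings per unit staked) are my own illustration, not anything from the original requirements sheet.

```python
# A minimal sketch of the two conflicting rules, taken literally.
# Rule names and the payout convention are illustrative only.

def rule_house_wins_on_dealer_blackjack(player_blackjack, dealer_blackjack):
    """'The house always wins if the dealer has a Blackjack.'"""
    if dealer_blackjack:
        return -1  # player loses the stake
    return None    # rule does not apply

def rule_push_on_equal_hands(player_blackjack, dealer_blackjack):
    """'The player gets his money back if both have the same cards.'"""
    if player_blackjack and dealer_blackjack:
        return 0   # stake returned
    return None    # rule does not apply

# The ambiguous case: both the player and the dealer have a Blackjack.
print(rule_house_wins_on_dealer_blackjack(True, True))  # -1: player loses
print(rule_push_on_equal_hands(True, True))             #  0: push
```

The two rules give contradictory outcomes for the same hand, which is exactly the kind of gap that a single concrete example flushes out immediately, while the prose rules looked perfectly reasonable read one at a time.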
The workshop
There were seven people in the room who were blackjack players, so I asked them to act as domain experts and the others to form teams around them. Teams were then given the task of demonstrating chosen blackjack payout rules using a pack of playing cards and realistic examples. During the first hour I mostly let the people in the teams explain the rules to each other. During the second hour I asked them to focus on specifying the financial outcome rules with realistic examples and demonstrating individual rules from the requirements document. If they found any inconsistencies or missing functionality, the domain expert on the team was supposed to resolve the issue and, in particular, write the resolution down as an example. At the end, one representative from each team – not the domain expert – presented the examples and answered the following three questions:
- Did you discover any functionality missing from the requirements?
- Did you discover any conflicting rules?
- Was there anything you scrapped and decided not to do?
The results
Although none of these people had any previous experience with example-driven specification workshops, the examples that all teams produced were pretty good, which shows that it is fairly easy to get started with specification workshops. Most of the examples ended up in the form "with this hand of the player and this hand of the dealer, the outcome is this". One team used examples to define domain terms such as "bust" and "blackjack" and then used those terms in the examples demonstrating the game outcome. The other teams only used examples of the game outcome, defining the domain terms implicitly through them. In specification workshops on real projects, we often end up cleaning up the examples to produce the specification (I call this distilling the specification in my book Bridging the Communication Gap), and that step would have given people a chance to refine and focus their examples.
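For readers who have not seen such examples written down, here is a rough sketch of what a distilled example table might look like once it becomes an executable specification. Everything in it is hypothetical: the concrete hands, the payouts, and the payout function are my own illustration rather than actual workshop output, and I have assumed that blackjack against blackjack is a push, which is the usual casino rule rather than anything the ambiguous requirements sheet said.

```python
# A hypothetical sketch of a distilled example table in the
# "player hand / dealer hand / outcome" form the teams used.

def hand_value(cards):
    """Best blackjack value of a hand; aces count as 11 or 1."""
    total = sum(11 if c == "A" else 10 if c in ("K", "Q", "J") else int(c)
                for c in cards)
    aces = cards.count("A")
    while total > 21 and aces:
        total -= 10  # demote an ace from 11 to 1
        aces -= 1
    return total

def is_blackjack(cards):
    return len(cards) == 2 and hand_value(cards) == 21

def payout(player, dealer):
    """Winnings per unit staked. Assumes blackjack vs blackjack is a
    push - one possible resolution of the ambiguous rules."""
    if is_blackjack(player) and is_blackjack(dealer):
        return 0.0
    if is_blackjack(player):
        return 1.5   # blackjack pays 3:2
    if is_blackjack(dealer):
        return -1.0
    p, d = hand_value(player), hand_value(dealer)
    if p > 21:
        return -1.0  # player busts and loses regardless of the dealer
    if d > 21 or p > d:
        return 1.0
    return 0.0 if p == d else -1.0

EXAMPLES = [
    # (player hand,     dealer hand,  expected payout)
    (["A", "K"],        ["9", "8"],    1.5),  # player blackjack
    (["10", "9"],       ["10", "8"],   1.0),  # higher total wins
    (["10", "6", "9"],  ["10", "7"],  -1.0),  # player busts
    (["10", "8"],       ["10", "8"],   0.0),  # equal totals push
    (["A", "K"],        ["A", "10"],   0.0),  # the ambiguous case, resolved
]

for player, dealer, expected in EXAMPLES:
    assert payout(player, dealer) == expected, (player, dealer, expected)
print("all examples pass")
```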
One of the teams used the rules on the paper just as a guideline: they worked from the Blackjack rules that their domain expert demonstrated during the workshop and ignored the details on the paper, which is an interesting example of a team ignoring written requirements and going with their own understanding of the matter. A domain expert in another team pulled out a card with Blackjack rules and card-counting strategies from somewhere, and the team used that to clear up any questions. Two teams decided to ignore the offending rule about Ace values and simply specified the Ace value as in a normal Blackjack game, which is an example of a domain expert overruling a requirement that makes no sense. One team had not reached that rule during the workshop at all, so they did not notice it. All the teams discovered almost all the inconsistencies and functional gaps, and one team even noticed something that I had missed while preparing the workshop. It turned out that one of the designated domain experts wasn't really a Blackjack player, so that team spent a lot of time discussing cases that are irrelevant to the game, such as the dealer splitting cards. That might seem logical from the perspective of someone who values symmetry (and developers typically do), so this demonstrated what can happen when developers try to implement acceptance testing on their own, without customers or domain experts.
Measuring the effectiveness
I asked for a quick show of hands to count who thought that discussing realistic examples had helped flush out inconsistencies and functional gaps. All the workshop participants except one raised their hands, and the one who did not said that he had not heard the question, then added that he agreed as well.
I also ran a quick feedback exercise. Feedback exercises are a very effective way of measuring shared understanding of the domain, working on the same principle as planning poker: team members write down answers to a question that requires understanding of the domain, typically a difficult edge case, and then compare the results. Differences in answers point to differences in understanding. Before the workshop, I had selected three particularly difficult edge cases that were not explained precisely on the requirements sheet. After the workshop, I asked all the participants to write down the outcome in those three cases and then compare their answers with the other members of their team. Different teams would probably have different answers, because their domain experts had made different decisions, but people on the same team should have the same answers if they shared an understanding of the domain. After they had compared the results, I asked people to raise their hands if they had different answers within the team – only one team raised their hands. The explanation was that they had not yet had time to discuss the particular rule that the edge case demonstrated.
Six out of seven teams developed a shared understanding good enough to come up with the same answers to really difficult edge cases after a quick workshop, even though most people on each team had no previous exposure to the target domain. For me, the experiment was a very effective demonstration of how specification workshops and Specification by Example improve clarity on software projects and give us a much better foundation for implementation and testing than traditional requirements.