Concordion: Agile Acceptance Testing with free-form text

I finally had some time to take a look at Concordion, an acceptance testing tool that I've heard about at several conferences. Concordion is an interesting alternative to FIT. It is developed by David Peterson and released under the Apache open-source license. Similar to FIT, Concordion uses HTML documents as an executable specification and requires some glue code (fixtures) to connect the executable elements of that specification to the domain code. Unlike FIT, Concordion does not require the specification to be in any particular format — you can write examples as normal sentences, without any restrictions.

Concordion is really simple. Its instrumentation only allows programmers to set global test variables, execute fixture methods and compare actual results with expected values. Programmers use special HTML element attributes to mark words or phrases that serve as test inputs or are compared to test results. Web browsers simply ignore unknown element attributes, so Concordion test instrumentation is effectively invisible to people who are not interested in test automation. For example, here is an HTML document that I've used as a simple test:

<html xmlns:concordion="http://www.concordion.org/2007/concordion">
<body>
<h1>Free delivery</h1>
<ul>

 <li concordion:execute="#offer = checkFreeDelivery(#type, #books)">
 Free delivery <span concordion:assertEquals="#offer">is</span> 
 offered to a <b concordion:set="#type">VIP</b>
 customer with <b concordion:set="#books">10</b> books in the cart. 
 </li>

 <li concordion:execute="#offer = checkFreeDelivery(#type, #books)">
  It <b concordion:assertEquals="#offer">is not</b> offered to 
  a <b concordion:set="#type">regular</b> customer 
  with <b concordion:set="#books">10</b> books</li>

 <li concordion:execute="#offer = checkFreeDelivery(#type, #books)">
  It <b concordion:assertEquals="#offer">is not</b> offered to 
  a <b concordion:set="#type">VIP</b> customer with 
  only <b concordion:set="#books">9</b> books.</li>

</ul>
</body>
</html>

The concordion:execute command on the LI element specifies that the offer variable is set by executing the checkFreeDelivery method, passing the type and books variables set in the element text. The concordion:set command sets a variable from the inner text of an element. The concordion:assertEquals command takes the inner text of an element and compares it with the result of a method call or the current value of a variable. For repetitive specifications and calculation rules, Concordion also supports attributes for tables, similar to the FIT ColumnFixture.
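To illustrate the table support (this is only a sketch, not part of the original test document, and it assumes the same concordion namespace declaration as above), the same free-delivery rules could be written as a table that reuses the checkFreeDelivery fixture method: the execute command goes on the table element, and the set and assertEquals commands go on the header cells so that they apply to every row.

<table concordion:execute="#offer = checkFreeDelivery(#type, #books)">
 <tr>
  <th concordion:set="#type">Customer type</th>
  <th concordion:set="#books">Books in cart</th>
  <th concordion:assertEquals="#offer">Free delivery?</th>
 </tr>
 <tr><td>VIP</td><td>10</td><td>is</td></tr>
 <tr><td>regular</td><td>10</td><td>is not</td></tr>
 <tr><td>VIP</td><td>9</td><td>is not</td></tr>
</table>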

At the moment, Concordion only supports Java fixtures — it actually works as a JUnit extension. This provides direct integration with JUnit test runners, making it much easier to execute Concordion tests from popular development environments and to integrate them into continuous build systems. It also shortens the learning curve required to start using Concordion. The fixture is simply a JUnit test class. It should be named the same as the HTML file (in this case the file was ConcordionTest.html, so the class is ConcordionTest.java) and declare the methods that are used by concordion:execute.

package conctest;

import org.concordion.integration.junit4.ConcordionRunner;
import org.junit.runner.RunWith;

@RunWith(ConcordionRunner.class)
public class ConcordionTest {

    public boolean freeDelivery(String type, int books) {
        if (books < 9) return false;
        if (!"VIP".equals(type)) return false;
        return true;
    }

    public String checkFreeDelivery(String type, int books) {
        return freeDelivery(type, books) ? "is" : "is not";
    }
}

My first impression is that Concordion's fixture model does not depend on inheritance, so it is a lot simpler to learn than FIT, and Concordion fixtures are somewhat more flexible to write than FIT fixtures. On the other hand, Concordion lacks the extensibility and the powerful model of type adapters and cell handlers that enable FIT to bind domain objects and business services directly to the specification.
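For comparison, here is a rough sketch of what an equivalent FIT fixture could look like (the class name and the rule implementation are only illustrative): it has to extend the ColumnFixture base class, and FIT binds the table columns by name to its public fields and calculation methods.

package conctest;

import fit.ColumnFixture;

// Sketch of an equivalent FIT fixture for comparison: unlike the Concordion fixture,
// it has to inherit from a framework base class. FIT maps table columns by name onto
// the public fields (inputs) and the offer() method (the calculated result column).
public class FreeDeliveryFixture extends ColumnFixture {
    public String type;
    public int books;

    public String offer() {
        return (books >= 10 && "VIP".equals(type)) ? "is" : "is not";
    }
}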

When the JUnit test is executed, Concordion runs through the HTML files and executes the commands, checking the expected outputs against the actual results in assertEquals. The JUnit test run will tell you whether all the tests passed or whether there were failures, and Concordion also saves the results as HTML in the system temporary folder, making it easier to see what actually went wrong. The example above intentionally has an error — here is the screenshot of the result:

Concordion does not have a test management tool and relies completely on the programmer's IDE to manipulate and execute tests. I have mixed feelings about this. Although file management within an IDE makes it much easier to keep tests in the same version control system as the domain code, I miss the ability to re-use parts of the test specification, such as the common set-ups and test components with macro variables that are available in FitNesse. It also puts tests completely under the control of programmers.

Concordion takes some ideas that have evolved as best practices for using FIT/FitNesse and makes them very explicit, actively discouraging other ways of working. For example, acceptance tests have to be stored in the same version control system as the code, and there are no pre-built test building blocks that would encourage scripting. On the other hand, while the FIT/FitNesse community is moving firmly towards using domain objects directly in acceptance tests and reducing the amount of glue code, to eliminate translation between layers and promote domain-driven design, the Concordion model is stuck requiring you to create an explicit test method in the fixture for every verification.

The thing that really worries me is that the HTML files are stored next to the Java classes and that developers need to add non-standard HTML attributes to the test page. This smells to me of a hand-over of tests from business people to developers at some point, which is a practice I don't approve of at all. In my opinion, tests should be shared by the whole team, not handed over down the pipeline. I guess this can be avoided with some discipline and some research into tools that will reliably preserve the non-standard HTML attributes. But I would still like to see a proper test management tool for business people to use.

I've been hearing a lot about Concordion recently, mostly mentioned by people as an interesting tool to look into. Safari has no books about it, and all I could find through Google and Technorati blog searches are reports of people coming across the tool and doing some basic things with it, such as this post. I would be really interested in hearing stories from people who have real-world experience with this tool to share. What happens six months later — are tests easier to manage than with FitNesse, and has it turned out to be a good enough tool to talk to business people? Is my fear of hand-over real, or is it not a problem at all?


6 thoughts on “Concordion: Agile Acceptance Testing with free-form text”

  1. I’ve used both FIT and Concordion. I have a couple of comments:

    “On the other hand, while FIT/FitNesse community is moving firmly towards using domain objects directly in acceptance tests and reducing the amount of glue code, to eliminate translation between layers and promote domain-driven design, Concordion model is stuck in requiring you to create an explicit test method for every verification in the fixture”

    I find that to be a very good thing. Coupling domain models to FIT documents becomes a huge drag on refactoring.

    Also, the way that an external user or business expert thinks about the domain and the way that a system *implements* the domain so that it can simulate it are quite different. For example a business expert will usually model the domain by categorisation while a system implements the domain by composing objects that collaborate. Trying to use the same model for both aspects of the system makes for a very brittle system with an unnatural architecture in my experience. FIT fixtures act as a good translation layer that maps between how the business expert thinks about the domain and how the system implements the domain.

    “This smells to me of a hand-over of tests from business people to developers at some point, which is a practice that I don’t approve at all.”

    Again, from experience I can say that if this does happen it has nothing to do with Concordion; it’s an organisational problem. Developers and business experts should collaborate on writing the documents, just as with FIT. And, again just as with FIT, the developers should help the business experts format and organise the documents so that the test fixtures pick them up.

    The test documents do not have to sit alongside the Java code. They just have to be named and collected into directories so that they can be found as resources at runtime when the root directory containing the documents is added to the Java class-path.

    One other plus point: Concordion has an excellent web site. A lot of the tips on that site apply just as well to FIT.

  2. Hey Gojko,

    I made a .NET port of Concordion that is in an alpha stage right now.

    It does not have a restriction on where the specs are in relation to the fixtures (you can specify this with attributes, but I'm moving that code into a config file).

    I believe that with OGNL in the backend you can make calls to methods on domain objects from the fixtures if you like, at least with no more work than would be required of FitNesse right now. While it is a nice idea to keep domain objects in the FitNesse tests, we found the scripting very hard to maintain in the long run when code is refactored, and the wiki model is a real pain to work into continuous integration.

  3. Having trialed both Concordion and FitNesse I have recommended that my organisation use Concordion. The reasons for this have been:

    – FitNesse essentially results in test scripts, and we felt this would cause too much pain when refactoring was required, whereas Concordion promotes doing all the scripting in the fixture (Java) code, thus making code reuse and refactoring simpler.

    – Concordion’s integration with JUnit makes it very simple to hook into a Continuous Integration system.

    – The ‘active’ specifications that result from Concordion are more suitable (and more readable) for developing with the business than the wiki pages in FitNesse. This also lends itself more to an Acceptance Test Driven Development approach.

    – Source control is considerably easier with Concordion as you’re only dealing with HTML and Java files and writing of both is done with an IDE.

    We have yet to employ Concordion in earnest, but that is about to change as the first iteration of the new project is due to start soon. So it will be interesting to see how well it performs in a full real-world environment.

  4. We have been successfully using Concordion together with TestNG on a recent project for more than half a year now. Knowing FitNesse from another project, I can agree with all the points Gareth listed before. Writing the specs is done by testers, developers and business people together before implementing a feature. Because the specs are deployed immediately and display the current state of development and testing to all stakeholders, there is no “hand-over” at all. It enables the product owner to continuously review how a story gets implemented and tested. Our reviews at the end of each sprint have been very short since then.

    “…while the FIT/FitNesse community is moving firmly towards using domain objects directly in acceptance tests and reducing the amount of glue code…”

    Our tests do not address any domain object or service directly. All domain dependencies are wrapped by a Testing API. This way we minimize refactoring and maintenance troubles. Because the Testing API is written in Java (or Groovy), you can have as many includes, as much reuse, type adapters or whatever you need at this level. And you keep these technology needs apart from the testing needs. So the tests stay very readable, maintainable and stable – even if your domain changes.

    “…are tests easier to manage than with FitNesse…”

    Definitely: Yes

    “…has it turned out to be a good enough tool to talk to business people?”

    Yes

    “Is my fear of hand-over real, or is it not a problem at all?”

    It’s a matter of your approach – as always. To us Concordion is the perfect tool to adopt the Agile Testing approach we want to use.

  5. To me, if your testers cannot code they have no business writing automated acceptance tests. Automation is engineering, plain and simple.

    You might be able to get away with writing a handful of tests, but a suite of 500-1000 acceptance tests needs to be maintainable otherwise it will die. This requires coding skill.

    In our team the testers take responsibility for defining the acceptance criteria with the customer. They read use cases/stories and convert them into Concordion specs.

    A developer can then work with the tester to automate, if necessary, depending on the tester's coding ability. The essence of the spec (the user's intentions) is unaffected, so there is no handover. I think the previous commenter is right – the throw-it-over-the-wall approach can apply to any of the ATDD frameworks.

    If anything, because testers may rely more on developers, this encourages collaboration in my team.

    Tying your acceptance tests to the implementation of a system smells terrible to me. Your users’ intentions may be stable but the domain objects in your system that implement those intentions may need to be refactored many times. This kind of coupling makes your acceptance tests brittle and unmaintainable.

    To me, ATDD is about coding discipline with a very customer-friendly, intention-based “view”. Concordion is, for me, the simplest and most direct framework for achieving this.

    I talk more about why I prefer Concordion over FitNesse here

    http://www.dish2dish.com/confluence/display/NPB/2010/02/14/Why+I+prefer+Concordion+to+FitNesse

  6. I've encountered several organisations that have struggled to get FIT adopted by the business, and have struggled even more with the disconnect caused by the specification not being versioned alongside the code and the “unnatural” method of capturing the tests.

    However, when I've introduced Concordion it has been exceptionally well received. This has worked particularly well when shipping executable acceptance tests to be delivered by outsourcing companies: it allows the onsite team to specify the acceptance criteria, and it lets the senior developers and analysts onsite design the interfaces and business service calls they expect to be called, in order to create a failing acceptance test.

    By far the biggest problem I’ve encountered with Concordion adoption (but negligible compared to FIT) has been developers trying to take ownership of the specifications, but as mentioned in other comments this is an organisational issue rather than a tool issue.

    I've also found it extremely easy to adapt Concordion to meet my particular needs. Being JUnit based, you can integrate easily with enterprise tools for sourcing and reporting. This has often proved much more onerous with the other tools I've encountered…

    I would finally commend David on his excellent diagram of the story delivery lifecycle (http://www.concordion.org/memo/StoryDeliveryLifecycle.pdf). Too often developers pick up a task, think they know what to do, and never speak to the customer until it's done. I am constantly prompting people to seek feedback at every opportunity: about to write an acceptance test? Speak to the customer. Got a failing acceptance test in place? Check with the customer. Got a single acceptance test passing? Demonstrate to the customer…

    http://www.agileinsider.org/2010/04/natural-language-automated-acceptance-testing/
    http://www.agileinsider.org/concordion-plus/
