
What It Takes to Test a Vaadin Release

By Henrik Paul · Jul 4, 2011, 12:09 PM

"Open source" and "agile" can sometimes be understood as synonymous with "off-the-cuff". After all, those open source developers are all just groups of hobbyists who commit code willy-nilly, only for the sake of belonging to something bigger, right?

Well, we at Vaadin have something to say about that. Every release we do goes through a rigorous regimen of tests. For example, the 6.6.3.nightly-20110629-c19566 version ran through 2823 tests before we accepted it for release. We're constantly adding more tests, so that number won't stay the same for long.

What Does That Actually Mean, Then?

On the browser-y side of things, we currently support Internet Explorer, Chrome, Safari, Opera and Firefox. Since every version behaves a bit differently, even when it's the same browser with a different version number, we face at least 11 distinct browser behavior patterns that we support. On top of that, we support 19 different servlet container versions, spread across Tomcat, Jetty, JBoss, GlassFish, Liferay, GateIn, eXo, WebLogic and Google App Engine.
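
To give a feel for what that matrix means in practice, here's a minimal sketch in Java that expands browsers and containers into individual test targets. The browser and container names and counts below are illustrative placeholders, not our actual configuration.

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical sketch: expanding a browser/container support matrix
    // into individual test targets. All names here are illustrative only.
    public class TestMatrix {

        public static void main(String[] args) {
            String[] browsers = { "IE6", "IE7", "IE8", "Firefox 3.6",
                    "Firefox 4", "Chrome", "Safari 5", "Opera 11" };
            String[] containers = { "Tomcat 6", "Jetty 6", "GlassFish 3",
                    "Liferay 6" };

            List<String> targets = new ArrayList<String>();
            for (String browser : browsers) {
                for (String container : containers) {
                    targets.add(browser + " on " + container);
                }
            }
            // Every target on this list gets the same suite run against it.
            System.out.println(targets.size() + " combinations to cover");
        }
    }

Every combination on that list gets the same tests run against it, which is exactly why the numbers add up so quickly.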

All those combinations are meaningless unless we have actual tests to run on them; we have a respectable load of server-side JUnit tests and many integration tests for each of the mentioned servlet containers. Add all the TestBench tests we run on top of those, and we arrive at the 2823 tests mentioned above, plus over 8000 automated screenshot comparisons for pixel-perfectness. Each night, our server rack starts to churn: a build process is fired off for each of the branches we are developing on, and each build runs for about two hours, involving all the ruthless and rigorous testing we have built.
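
To illustrate what one of those screenshot comparisons boils down to, here's a deliberately crude sketch using plain JUnit and Selenium WebDriver as a stand-in rather than the actual TestBench API: it grabs a screenshot and compares it byte-for-byte against a stored reference image. The URL, file paths and class names are made up for the example, and a real comparison tolerates rendering differences far more gracefully.

    import static org.junit.Assert.assertArrayEquals;

    import java.nio.file.Files;
    import java.nio.file.Paths;

    import org.junit.After;
    import org.junit.Before;
    import org.junit.Test;
    import org.openqa.selenium.OutputType;
    import org.openqa.selenium.TakesScreenshot;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;

    // Illustrative only: a crude stand-in for an automated screenshot
    // comparison, checking the captured image byte-for-byte against a
    // stored reference.
    public class ScreenshotRegressionTest {

        private WebDriver driver;

        @Before
        public void setUp() {
            driver = new FirefoxDriver();
        }

        @Test
        public void pageLooksPixelPerfect() throws Exception {
            // The URL and reference image path are placeholders.
            driver.get("http://localhost:8080/sampler");
            byte[] actual = ((TakesScreenshot) driver)
                    .getScreenshotAs(OutputType.BYTES);
            byte[] reference = Files.readAllBytes(
                    Paths.get("reference-screenshots/sampler.png"));
            // A real comparison would tolerate rendering noise; this one does not.
            assertArrayEquals(reference, actual);
        }

        @After
        public void tearDown() {
            driver.quit();
        }
    }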

This was only for the core Vaadin Framework, by the way. Feel free to add over a thousand tests on top of all that for our Pro Add-ons.

Running all of these tests takes a good amount of hardware and a working architecture. All of this runs on two rack-mounted servers, divided into 14 virtual machines that run the TestBench tests: 12 Windows XP and two Windows 7 machines. On top of that, we have a virtual machine that runs all the required integration tests, manages the TestBench hub and hosts a crucial TeamCity installation that takes care of everything required to produce all the builds we need: nightlies, maintenance/minor/major releases, custom builds and all the various Vaadin Pro Add-ons.
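
As a rough illustration of how a test ends up on one of those virtual machines, the sketch below asks a central hub for a session on a node with a specific browser and operating system. It uses a plain Selenium-style RemoteWebDriver as a stand-in rather than the actual TestBench setup, and the hub URL, application URL and capabilities are invented for the example.

    import java.net.URL;

    import org.openqa.selenium.Platform;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.remote.DesiredCapabilities;
    import org.openqa.selenium.remote.RemoteWebDriver;

    // Hypothetical sketch: a central hub farms the session out to a node
    // that has the requested browser and OS. All addresses are made up.
    public class HubExample {

        public static void main(String[] args) throws Exception {
            DesiredCapabilities capabilities =
                    DesiredCapabilities.internetExplorer();
            capabilities.setPlatform(Platform.XP);

            WebDriver driver = new RemoteWebDriver(
                    new URL("http://build-hub.example.com:4444/wd/hub"),
                    capabilities);
            try {
                driver.get("http://localhost:8080/myapp");
                System.out.println(driver.getTitle());
            } finally {
                driver.quit();
            }
        }
    }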

Day-to-Day Work Process

OK, we have established that we have some awesome hardware and an architecture to churn through everything we need. But that alone is not enough: a great engine is worth nothing without a great driver. Instead of just fixing stuff ad hoc, we fill a sprint by gathering a nice bunch of tickets from our Trac. Then we read them through, make sure we understand what is involved in fixing them, and fix them properly.

Depending on the task at hand, a set of automated tests is usually written for the ticket before the work itself. The tests will initially fail, so the developer knows he's testing the right thing. Once the ticket is done, the automated tests show a green light, so the developer knows he has fixed the right thing. These tests are then included among the thousands of existing tests, making sure that the issue stays fixed.
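
Such a ticket-driven test can be as small as the JUnit sketch below. The class and the HypotheticalLabel component are entirely made up for illustration, but the shape is the point: the test is written first, fails against the unfixed code, and then stays in the suite to guard against regressions.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    // Hypothetical example of a ticket-driven regression test: written
    // before the fix (it fails), kept in the suite forever after (it must
    // stay green). The component below is a made-up stand-in so the
    // example is self-contained.
    public class TicketRegressionTest {

        @Test
        public void captionIsTrimmedBeforeRendering() {
            HypotheticalLabel label = new HypotheticalLabel("  Hello  ");
            // Before the fix this returned "  Hello  " and the test failed.
            assertEquals("Hello", label.getRenderedCaption());
        }

        // Minimal stand-in class, not a real Vaadin component.
        private static class HypotheticalLabel {
            private final String caption;

            HypotheticalLabel(String caption) {
                this.caption = caption;
            }

            String getRenderedCaption() {
                return caption.trim();
            }
        }
    }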

Once the developer believes the problem is correctly solved, the fix is reviewed by another person on the development team. Thanks to our open communication and friendly atmosphere, mistakes, wrong assumptions and, most importantly, dirty fixes and bad code are kept out of the Vaadin code base. All committed code meets our high standards, even if someone is having a bad day.

Willy-Nilly Schmilly

When we ship something, we're dedicated to making it best in class, bar none. Whenever there's a bug in our code, we make it something of a personal mission to fix it and make sure we never do anything like that again.

Coding is what we love to do, and we take it very seriously. We don't want bugs to spoil the fun, not for you nor for us.

Henrik Paul
Henrik Paul works as a Scrum Master in Vaadin's product development. He has been working at Vaadin since 2008 on basically anything and everything, except sales or administration. He's one of those annoying guys who is never satisfied with the status quo and constantly questions established practices.