In part 1 I introduced Atlassian’s agile process. In this part I detail how some of the key eXtreme Programming (XP) practices are… well, put into practice in our Sydney development group. Update: Part 3 and part 4 available.
The most controversial and misunderstood XP practice is pairing. We love it: we get benefits in code review, knowledge transfer and enhanced team gelling. Pretty much all feature work is paired; however, bug fixing, spiking and specification writing are rarely paired, as we find pairing doesn’t add as much value for these activities.
Actually, some Atlassians are allergic to pairing, particularly on the smaller teams of 2-4 developers. However, the vast majority of Atlassian developers are pro-pairing.
Incidentally, it is my observation that, for the strict purpose of code review, pairing is not as good as a traditional post hoc review process, mostly because of groupthink, so most teams also do some separate code review. The attentive among you may even have noticed we liked the Pairon so much we bought the company.
Stand Up Meetings
Every day each team holds a 10-minute status and commitment meeting, on our feet. We find this serves the purpose a weekly status meeting would, but with far more engagement and more timely information exchange. We pass around a token which represents the right to speak. In the JIRA team we have an AFL football, which also serves as a suitable weapon against the ill-attentive. The Confluence team have a fluffy koala, which serves as a suitable embarrassment.
Testing
We have many thousands of JUnit unit tests on each product. In addition we have jWebUnit functional tests, which hold a web client conversation with a fully integrated web app on a real database and a real app server. An example of one such test in JIRA is performing a bulk edit on all the issues returned as the result of a search. There are also Selenium tests which drive a real web browser. These are great for testing the increasing amount of sexy AJAX magic in our web apps. We also use Clover to check the coverage of our tests.
Automated unit tests are so essential and of such general benefit that I don’t think of them as an aspect of XP. Of course, they’re essential to doing XP, but then I’d argue they’re essential to doing development in much the same way as bug tracking is. Forgive me for being so bold, but if you’re not writing unit tests and running them all the time, then, to be frank, I question the quality of your software.
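To make the distinction concrete, here is a minimal sketch of the kind of unit test I mean. The class, helper method and issue keys are hypothetical, and plain assertions stand in for the real JUnit API (@Test methods with assertEquals) so the example is self-contained:

```java
// Hypothetical sketch of a unit test. A real suite would use JUnit's
// @Test annotation and Assert.assertEquals; plain Java keeps this runnable
// on its own.
public class IssueKeyTest {

    // Hypothetical helper under test: pulls the numeric part out of an
    // issue key such as "JRA-123".
    static int issueNumber(String key) {
        int dash = key.indexOf('-');
        if (dash < 0) {
            throw new IllegalArgumentException("not an issue key: " + key);
        }
        return Integer.parseInt(key.substring(dash + 1));
    }

    public static void main(String[] args) {
        // In JUnit these checks would live in separate @Test methods.
        if (issueNumber("JRA-123") != 123) {
            throw new AssertionError("expected 123");
        }
        if (issueNumber("CONF-7") != 7) {
            throw new AssertionError("expected 7");
        }
        System.out.println("all tests passed");
    }
}
```

The point is not the trivial logic but the habit: every behaviour has a fast, automated check that runs on every build.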
The Planning Game
Each weekly iteration starts with a planning game: a round of estimation poker that produces a schedule of estimated tasks. Each fine-grained task is tracked on a card in addition to appropriate tracking in JIRA. The cards go up on a prominent wall that broadcasts our progress through the iteration. As tasks are completed, the cards are checked off and moved into the done part of the card wall. Detailed status and extra information are tracked in JIRA and Confluence, but information is also shared in conversations, which are encouraged as a legitimate means to capture and disseminate details. We also collect detailed data on estimate accuracy and schedule volatility.
In the interests of helping you maintain your blog reading agility, I’ve decided to cut things off here until the next installment where I cover further XP practices and wherein I fulfil my promise to shoe-horn sociopaths into the topic.