One year ago, Atlassian announced its 20% Time Experiment. Originally a 6-month trial, we actually let it run for 12 months before reviewing the results and determining its fate.
In line with Atlassian’s innate openness, I’m happy to share our findings.

Atlassian’s 20% Results at a Glance

1 Year

48 Projects*

(25 In Progress, 16 Incorporated into products**, 7 retired)

248 Days of Effort* = “1.1% Time”

34 People*

8 Developer Blog Posts

* The true figures are higher, but not everyone updated their statistics, so we don’t know exactly
** More precisely, 20% Time “incubated” the feature, which was then finished in the normal development cycle

The biggest surprise was that we delivered only “1.1% Time”, based upon the number of developers involved. While people had worried that we would spend too much time on 20% projects, it turned out that we hardly invested much time at all.
It’s worth mentioning that these statistics are approximate. We didn’t mandate strict record-keeping, and we don’t have reliable timesheet data since 20% Time got mixed in with our ShipIt projects. Therefore, the actual time spent could be 2 or 3 times this figure, which is still very low.
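For the curious, the “1.1% Time” figure is easy to reproduce as a back-of-envelope calculation. The developer headcount and working-days values below are assumptions for illustration only; the post itself states only the 248 recorded days of effort.

```python
# Back-of-envelope sketch of the "1.1% Time" calculation.
# Only effort_days (248) comes from the post; the headcount and
# working days per year are hypothetical assumptions.
effort_days = 248            # recorded 20% effort over the year
developers = 100             # assumed company-wide developer headcount
working_days_per_year = 220  # assumed working days per developer

available_days = developers * working_days_per_year
percent_time = effort_days / available_days * 100
print(f"{percent_time:.1f}% Time")
```

With these assumed inputs the calculation lands at roughly 1.1%; a different headcount would shift the result, which is part of why the post treats the figure as approximate.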
However, we do know that projects ranged in time from 1 to 18 days of effort and that we created some shit-hot improvements to JIRA, Confluence and Bamboo. There were also some projects focussed on internal improvements and Open Source efforts.

How did it work?

Developers could choose their own project. They then negotiated with their Team Leader to get time for their 20% project in amongst normal development activities, including their rotations through Support. Some developers took time in multiple small blocks. Some locked themselves away for longer periods with signs warding off disturbance. Some teams even trialled a “20% Week” in between release cycles. (This one has generated mixed reactions — it was great to make the time available, but it’s not always possible to schedule innovation on somebody else’s timetable.)
Each project was listed on our internal Confluence wiki, with manual tracking of ‘days expended’. Projects that got incorporated into products were moved to a virtual ‘Hall of Fame’ while other projects were retired. A great many projects are still ongoing.
Interestingly, the projects that were incorporated into products actually consumed very little 20% Time. Once a concept was proven, it was typically adopted by a Product Manager and incorporated into the normal stream of development. The bulk of the work on the feature (adding functionality, testing, perfecting) was therefore done in normal development time. As a result, Hall of Fame projects typically consumed only 1 to 5 days of 20% Time.

What problems did we experience?

As part of the 20% Time trial, we surveyed developers to gather feedback. Far and away, the biggest problem was scheduling time for 20% work. As one person put it, “Getting 20% time is incredibly difficult amongst all the pressure to deliver new features and bug fixes.”
Atlassian has frequent product releases, so it is very hard for teams to schedule ‘down time’. Small teams in particular found it hard to afford time away from core product development. This wasn’t due to Team Leaders being harsh. It was often due to developers not wanting to increase the workload on their peers while they did 20% work. They like the products they are developing and are proud of their efforts. However, they don’t want to be seen as enjoying a privilege while others carry the workload.
Another problem was accurate tracking of 20% effort. We have tight controls around Annual Leave, so why not around 20% Time? The answer probably lies within the Atlassian culture. One of our core values is to “build with heart and balance”, and there is a shared feeling of trust that people know whether they are providing valuable work. We track time in major buckets (e.g. new product development vs. maintenance), but there is no “big brother” breathing down the necks of developers who can’t account for every minute of their day.

Are we continuing with 20% Time?

To find that out, you’ll have to wait for our next 20% Time blog post. Let’s just say that this review has led us to reconsider the goals of 20% Time and to question how we can determine whether it is of value to the company. Stay tuned!