Agile software development at Atlassian

When I think of Agile, I tend to think of the awesomeness it can bring me as a performance engineer. I see short cycles where the code is almost always stable and features are delivered incrementally. I get to watch the performance of the system evolve over time. Unfortunately, this means I’m testing software all the time. That’s fine though – none of this is by hand – this kind of repetitive task is perfect for automation.

It’s clear that automating testing is important – automated tests free you up to investigate problems while your build system handles the repetitive work, and when your build system does the heavy lifting it’s easier to test more regularly. More regular testing means you find problems sooner and go home on time.

In this blog, I’m going to go over how to take your JMeter performance tests and put them into a Maven build which you can run on your continuous integration server as often as needed. I’ll introduce you to the Chronos Maven plugin, which runs JMeter in a repeatable way and has reporting features that extract useful information from the performance test.

[Image: pfe-history-response-summary-fixedloadtest.png – an example Chronos report graph showing the response summary history for the fixed load test]

Prerequisites:

For this blog, you’ll need:

  • A working knowledge of Maven
  • Working JMeter Performance tests
  • A Maven repository set up for retrieving artifacts and deploying new ones.

Atlassian has a well maintained Maven infrastructure – if you’re just getting started, I’d suggest you read Sherali Karimov’s great guide on setting up Maven infrastructure here.

The JMeter tests used in this blog are fairly complete; however, this is not necessary for your first pass at getting the automation running. It’s more important to get started and then improve what you have as you need more features.

When you are writing your JMeter tests, I would suggest keeping the information about the data you’re going to be testing separate from the testing logic. By this, I mean keep the “flow” of your test in JMeter but use CSV files to define which URLs or options should be used when testing. For example, instead of hardcoding a list of pages to sample, put the list into a CSV file and load it with JMeter’s “CSV Data Set Config” element. This way, you can change the CSV files to test a different set of pages or data in your application.
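To make this concrete, here’s a rough sketch of the relevant fragment of a .jmx file when the page list comes from a CSV file (the element and property names are JMeter’s; the file, property and variable names here are illustrative):

<CSVDataSet guiclass="TestBeanGUI" testclass="CSVDataSet" testname="Page paths" enabled="true">
  <!-- resource.dir is passed in from outside; pagePaths.csv holds one URL path per line -->
  <stringProp name="filename">${__P(resource.dir)}/pagePaths.csv</stringProp>
  <stringProp name="variableNames">pagePath</stringProp>
  <stringProp name="delimiter">,</stringProp>
  <boolProp name="recycle">true</boolProp>
</CSVDataSet>

An HTTP Request sampler can then use ${pagePath} as its path, so changing what gets tested is a matter of editing the CSV, not the test plan.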

Here are the main steps I’m going to cover:

  1. Create a performance test Maven artefact
    1. Write a Maven pom.xml for the performance tests
    2. Package & deploy the tests to the Maven repository
  2. Write a Maven pom.xml to run the performance tests against an application
  3. Run the tests
  4. Automate the tests with Bamboo


[Video: George Barnett discusses performance testing inside Atlassian]

Create a performance test artefact

In order to use your performance tests in an automated environment, you need to get them into Maven. Below are the components for a sample artefact, starting with the directory structure to use.

./pom.xml
./src
./src/main
./src/main/assembly
./src/main/jmeter
./src/main/resources

There are 3 directories in the src/main tree:

  • assembly – holds the public-distribution.xml, which describes how to package the artefact
  • jmeter – the JMeter performance tests
  • resources – the CSV resource files

In this test artefact, the CSV resource files which are distributed should contain the bare minimum of information needed to run the test – the actual data you will use will be stored in another project. If your application has a demonstration dataset, it’s a good idea to reference that data in the CSV files. Don’t leave the files blank – a simple example structure inside makes it easier for others.

$ cat pagePaths.csv
/path/to/page.html

Write a pom.xml for the JMeter tests

Resource: pfe-perftest-pom.xml

To deploy this project using Maven you’ll need to write the pom.xml. I’ve attached a sample pom.xml to this post, but here are some of the things you need to take care of:

Parent Pom:

<parent>
  <groupId>com.atlassian.pom</groupId>
  <artifactId>atlassian-public-pom</artifactId>
  <version>14</version>
</parent>

Atlassian’s performance tests are public. For this reason, I include the Atlassian Public pom as a parent. This ensures that this project inherits the correct settings and will be deployed to the correct place.

Artifact Id and Version:

<groupId>com.atlassian.performance.fisheye</groupId>
<artifactId>performance-test</artifactId>
<packaging>pom</packaging>
<name>FishEye Performance Tests</name>
<version>2.0-SNAPSHOT</version>

To keep things easy, set the version of the pom to match whatever software you’re testing. When the software you’re testing releases, bump the version to follow it (don’t forget to tag the previous release!).

The build section of the pom.xml has two plugins:

Resource: pfe-public-distribution.xml

  1. Maven assembly plugin. This uses the public-distribution.xml configuration file to package the tests and the sample dataset into .zip and .tar.gz archives. It specifies which files and directories should be included in the generated archive (sketched below).
  2. Maven clean plugin. This plugin is set up to remove any .jtl files from the results directory to prevent leaking test results into the archive. .jtl is the extension JMeter gives to the output files in which Listeners save result data during a test run. A sketch of this configuration also follows.
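For reference, a minimal public-distribution.xml might look like this – a sketch only, the attached resource is authoritative and the directory mappings here are illustrative:

<assembly>
  <id>public</id>
  <formats>
    <format>zip</format>
    <format>tar.gz</format>
  </formats>
  <fileSets>
    <!-- ship the JMeter test plans and the sample CSV resources -->
    <fileSet>
      <directory>src/main/jmeter</directory>
      <outputDirectory>/</outputDirectory>
    </fileSet>
    <fileSet>
      <directory>src/main/resources</directory>
      <outputDirectory>/resources</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>

The clean plugin configuration that keeps .jtl files out of the archive is along these lines (the results directory name is an assumption):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-clean-plugin</artifactId>
  <configuration>
    <filesets>
      <fileset>
        <!-- remove stray JMeter result files before packaging -->
        <directory>src/main/jmeter/results</directory>
        <includes>
          <include>**/*.jtl</include>
        </includes>
      </fileset>
    </filesets>
  </configuration>
</plugin>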

Packaging the JMeter test

If everything is in place you should now be able to use Maven to package and deploy the JMeter tests to your Maven repository. This is the first step to being able to include them in a repeatable test.

First, check that the artefact generated is correct:

spoem:trunk gbarnett$ mvn verify
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Building Performance Tests
[INFO]    task-segment: [verify]
[INFO] ------------------------------------------------------------------------
[INFO] [site:attach-descriptor]
[INFO] [assembly:attached {execution: public}]
[INFO] Reading assembly descriptor: /Users/gbarnett/fisheye/performance-test/trunk/src/main/assembly/public-distribution.xml
[INFO] Building zip: /Users/gbarnett/fisheye/performance-test/trunk/target/performance-test-2.0-SNAPSHOT.zip
[INFO] Building tar : /Users/gbarnett/fisheye/performance-test/trunk/target/performance-test-2.0-SNAPSHOT.tar.gz
...

Check the resulting ZIP files to ensure that they’re packaged correctly and that all the files you need are there. For an example of what a working artefact looks like, have a look at the FishEye performance test artefacts here.

Once you’re happy with this, you can deploy to the remote repository.

spoem:trunk gbarnett$ mvn deploy
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Building Performance Tests
[INFO]    task-segment: [deploy]
[INFO] ------------------------------------------------------------------------
[INFO] [site:attach-descriptor]
[INFO] [assembly:attached {execution: public}]
[INFO] Reading assembly descriptor: /Users/gbarnett/fisheye/performance-test/trunk/src/main/assembly/public-distribution.xml
[INFO] Building zip: /Users/gbarnett/fisheye/performance-test/trunk/target/performance-test-2.0-SNAPSHOT.zip
[INFO] Building tar : /Users/gbarnett/fisheye/performance-test/trunk/target/performance-test-2.0-SNAPSHOT.tar.gz
[INFO] [install:install]
..
[INFO] [deploy:deploy]
..
[INFO] Retrieving previous build number from atlassian-m2-snapshot-repository
Uploading: https://maven.atlassian.com/public-snapshot/com/atlassian/performance/fisheye/performance-test/2.0-SNAPSHOT/performance-test-2.0-20080929.031627-1.zip
[INFO] Retrieving previous build number from atlassian-m2-snapshot-repository
Uploading: https://maven.atlassian.com/public-snapshot/com/atlassian/performance/fisheye/performance-test/2.0-SNAPSHOT/performance-test-2.0-20080929.031627-1.tar.gz
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 41 seconds
[INFO] Final Memory: 14M/762M
[INFO] ------------------------------------------------------------------------

If you see this it means you have successfully deployed your JMeter tests to your Maven repository. If the build fails, go over the setup carefully to check everything is in place. If you need more debugging information from Maven, run the above commands with the -X switch to enable extra logging.

Once the build is successful and your artefacts are deployed to your Maven repository, they can be included in your automated performance test. In the next part, I’ll create a project which will run the performance tests against your application. This Maven build can then be automated.

Directory structure for the automated test

The structure for the automated test is slightly different as you can see below.  Create a new project in your SCM and create the directories listed.

The main difference is that we’re using the test directory instead of main, because the purpose of this project is to run tests rather than to build an artefact.

./pom.xml
./src
./src/test
./src/test/resources
./src/test/resources/jprofilerconfig.xml
./src/test/resources/dataset-sanity1
./src/test/resources/dataset-sanity1/config.xml
./src/test/resources/dataset-sanity1/repository
./src/test/resources/dataset-sanity1/repository/filePaths.csv
./src/test/resources/dataset-sanity1/repository/repoNames.csv
./src/test/resources/dataset-sanity1/repository/repoPaths.csv
./src/test/resources/dataset-sanity1/search
./src/test/resources/dataset-sanity1/search/complexTerms.csv
./src/test/resources/dataset-sanity1/search/eyeqlTerms.csv
./src/test/resources/dataset-sanity1/search/simpleTerms.csv

Write a pom.xml for the automated test

Resource: pfe-automation-pom.xml

It’s best to start out by working out what actions (and in which order) you will need to take to test your application. In this example, here are the steps the test will perform:

  1. Prepare the FishEye data directory using the chosen data to load
  2. Prepare the JProfiler config
  3. Download and extract the FishEye application
  4. Start FishEye
  5. Run a JMeter test using the defined CSV data
  6. Stop FishEye
  7. Run reporting on the JMeter outputs and generate a site directory for upload

It’s important to define what order this work should occur in before starting, so you know what you are aiming for. I’m not going to annotate the whole pom.xml since it is several hundred lines long; I’ll go over the important parts which you may not have seen before – I would suggest opening a copy locally before proceeding.
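To make the sequencing concrete, here’s roughly how those steps bind to the Maven lifecycle in the attached pom (a sketch – check the pom for the exact phase bindings):

  • pre-integration-test (and earlier phases) – prepare the dataset and JProfiler config, unpack and start FishEye
  • integration-test – run the JMeter test (chronos:jmeter)
  • after the test – stop FishEye, save history and check the results (exec:exec, chronos:savehistory, chronos:check)
  • site – generate the Chronos reports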

POM: Define build profiles

There will be many tasks to perform and settings to choose when running the test. Some of these tasks will need to happen every time (for example, setting up a database) and others only when you are testing something specific (such as attaching a code profiler). Maven provides the concept of Profiles (not to be confused with code profiling) which can be used to add plugins to, or remove them from, your test harness.

In this example, I will set up two profiles. By default, the configuration they contain is not used, but when we do want it, it can be activated with a simple command-line switch (-P).

  • profile – runs JProfiler before starting the build to gather CPU snapshots
  • dataset-sanity1 – controls the backend data to be loaded into FishEye and the data CSVs to be passed to JMeter

POM: Import Dependencies

<dependency>
  <groupId>com.atlassian.performance.fisheye</groupId>
  <artifactId>performance-test</artifactId>
  <type>zip</type>
  <version>${project.version}</version>
</dependency>

The JMeter performance test artefact we packaged earlier is a dependency of this test. Using the maven-dependency-plugin, this artefact will be extracted into the target/test-classes/ directory for use.

The maven-dependency-plugin is also used to download and extract the application that’s going to be tested.
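As a sketch, the execution that unpacks the application looks something like this – the phase and the FishEye coordinates are assumptions, so check the attached pom for the real values:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>30_unpack-fisheye-zip</id>
      <phase>process-test-resources</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <!-- illustrative coordinates for the application under test -->
            <groupId>com.atlassian.fisheye</groupId>
            <artifactId>fisheye</artifactId>
            <version>${fisheye.version}</version>
            <type>zip</type>
          </artifactItem>
        </artifactItems>
        <outputDirectory>${project.build.directory}</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>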

The Maven exec plugin is used to start FishEye using the exec:exec goal. FishEye is started at the pre-integration-test phase so it’s ready for use when the JMeter goals run at the integration-test phase. If your application runs inside an application server such as Tomcat or Jetty, you should use the Maven Cargo plugin to start and stop your application.
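The start execution is a plain exec:exec call, roughly as follows – the executable path is an assumption, since it depends on how the unpacked FishEye archive is laid out:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>60_start-fisheye</id>
      <phase>pre-integration-test</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <!-- illustrative: start FishEye via its control script -->
        <executable>${project.build.directory}/fisheye/bin/run.sh</executable>
      </configuration>
    </execution>
  </executions>
</plugin>

A matching execution (85_stop-fisheye in the build log later in this post) stops FishEye once the JMeter goals have finished.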

POM: Plugins – JMeter configuration

The JMeter test is run by the Maven Chronos plugin, which is responsible for launching JMeter. There are a few settings which you need to pay attention to:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>chronos-maven-plugin</artifactId>
  <version>1.0</version>
  <configuration>
    <jmeterhome>${jmeter.home}</jmeterhome>
    <historydir>${history.directory}</historydir>
    <heap>756m</heap>
    ...

jmeterhome

The base directory where JMeter is installed – the JMeter startup script should be at bin/jmeter under this directory. It’s worth noting that the plugin does not start JMeter via the startup script but rather invokes the jars directly.

historydir

This variable should be a unique directory for the build. Since you’re eventually going to have several profiles with different datasets and JVM versions, the directory name should take these into account. I use the following format:

history.directory / product / product.version-java.version-data.version

For example, a FishEye 2.0 run on Java 1.6 against the sanity1 dataset would land in something like ${history.directory}/fisheye/2.0-1.6.0-sanity1 (a hypothetical path, purely to illustrate the naming).

heap

It’s important to give JMeter enough memory. This is especially true if your test uses multiple Listeners. As you will see later when I cover reporting, the default report includes a graph showing garbage collection. This graph shows JMeter’s own garbage collection, but it’s good to keep an eye on it. If JMeter spends too much time doing GC then the test results will be skewed and useless.

The settings in the attached pom.xml include some tweaks to garbage collection: the new generation size and the survivor ratio.

POM: Plugins – JMeter System Properties

These variables are initially set as global configuration items but are then overridden in the profile for the dataset used. They are passed to JMeter using the -Dvariable.name=value command-line switch.

<sysproperties>
  <property>
    <name>script.runtime</name>
    <value>${script.fixedload.runtime}</value>
  </property>
  <property>
    <name>script.base</name>
    <value>${project.build.directory}/performanceTest</value>
  </property>
  <property>
    <name>resource.dir</name>
    <value>${dataset.location}</value>
  </property>
</sysproperties>

JMeter accepts external variable input using the __P function, e.g. ${__P(script.runtime,1800)} – the second argument is the default value used when the property isn’t set.

Using sysproperties in the pom, it’s possible to override variables inside the JMeter test. The most useful application is changing the location of the resource directory which contains the CSV files defining which pages to sample. In the above example, the dataset.location variable is set by the chosen profile in the pom and passed to JMeter as resource.dir, allowing the resource directory used for testing to be swapped without touching the .jmx file.

POM: Plugins – JMeter test executions

<executions>
  <execution>
    <id>71_fixedload-test</id>
    <configuration>
      <dataid>fixedloadtest</dataid>
      <input>${project.build.directory}/performanceTest/jmeter-test-fixedload.jmx</input>
    </configuration>
    <goals>
      <goal>jmeter</goal>
      <goal>check</goal>
      <goal>savehistory</goal>
    </goals>
  </execution>
</executions>

The sysproperties shown earlier are applied to all executions of the plugin. There are, however, two configuration variables above which must be unique for each execution:

dataid

This provides a unique name by which the test data output will be known. It is used to name files saved in the reports later, and the savehistory goal uses it to save the data for this execution.

input

This is the .jmx performance test script to run. 

Chronos Build Goals

  • jmeter – performs the test using the .jmx file specified in each execution
  • check – parses the output logfile and checks the samples against a ruleset in the pom.xml (not shown) to decide whether to fail the build
  • savehistory – saves the history of the build to the history directory in order to build up long-term data

POM: Reporting

<reporting>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>chronos-maven-plugin</artifactId>
      <version>1.0-atlassian-1</version>
      <configuration>
        <historydir>${history.directory}</historydir>
      </configuration>
      <reportSets>
        <reportSet>
          <id>fixedloadreport</id>
          <configuration>
            <dataid>fixedloadtest</dataid>
            <reportid>jmeter-fixedload-report</reportid>
            <title>JMeter Fixed Load Test Report</title>
            <description>
              <![CDATA[Fixed Load Test Report]]>
            </description>
          </configuration>
          <reports>
            <report>report</report>
            <report>historyreport</report>
          </reports>
        </reportSet>
      </reportSets>
    </plugin>
  </plugins>
</reporting>

The reporting lifecycle produces graphs, one of which I’ve shown as an example at the top of the page. You should include one reportSet for each execution defined in the build part of the Chronos plugin. The dataid of the reportSet should match the dataid of the relevant test execution.

POM: Profiles

In order to control the dataset, I’ve added a profile for each piece of data I will be loading into FishEye. At the moment I have only one, but with profiles making another will be easy (see the sketch after the snippet below). Most of the profile is taken up with setting variables which are used elsewhere in the pom. For example, the dataset.location variable is passed into JMeter using a sysproperty to set the resource base directory.

<profile>
  <id>dataset-sanity1</id>
  <properties>
    <repository.zipfile>${performance.datadir}/sanity1.zip</repository.zipfile>
    <repository.location>${repository.basedir}/sanity1</repository.location>
    <dataset.location>${dataset.basedir}/dataset-sanity1</dataset.location>
    <dataset.version>sanity1</dataset.version>
    <fisheye.config>${dataset.location}/config.xml</fisheye.config>
  </properties>
</profile>
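Adding a second dataset is then just a matter of another profile with its own values – the names here are hypothetical:

<profile>
  <id>dataset-large1</id>
  <properties>
    <repository.zipfile>${performance.datadir}/large1.zip</repository.zipfile>
    <repository.location>${repository.basedir}/large1</repository.location>
    <dataset.location>${dataset.basedir}/dataset-large1</dataset.location>
    <dataset.version>large1</dataset.version>
    <fisheye.config>${dataset.location}/config.xml</fisheye.config>
  </properties>
</profile>

Running with -P dataset-large1 would then load the alternate data and pass its CSVs to JMeter.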

Run the Tests!

Once your pom is complete, you can run the test using the mvn command. Don’t forget to include any profiles you need as part of the command:

$ mvn verify site -P dataset-sanity1
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Building FishEye Performance Test Runner
[INFO]    task-segment: [clean, site]
[INFO] ------------------------------------------------------------------------
[INFO] Preparing chronos:report
[INFO] [dependency:unpack {execution: 30_unpack-fisheye-zip}]..
[INFO] [exec:exec {execution: 60_start-fisheye}]
..
[INFO] [chronos:jmeter {execution: 71_fixedload-test}]
[INFO] Excuting test /Users/gbarnett/fisheye/automated-perftest/trunk/target/performanceTest/jmeter-test-fixedload.jmx
[INFO] Created the tree successfully using /Users/gbarnett/fisheye/automated-perftest/trunk/target/performanceTest/jmeter-test-fixedload.jmx
[INFO] Starting the test @ Fri Oct 03 12:32:53 EST 2008 (1223001173145)
[INFO] Display Summary Results During Run +   783 in 184.3s =    4.2/s Avg:    22 Min:     1 Max:  1377 Err:     0 (0.00%)
[INFO] Display Summary Results During Run +  2616 in 180.0s =   14.5/s Avg:    23 Min:     1 Max:   565 Err:     0 (0.00%)
..
[INFO] Display Summary Results During Run +   448 in  25.6s =   17.5/s Avg:    41 Min:     1 Max:   861 Err:     0 (0.00%)
[INFO] Display Summary Results During Run = 29095 in 1829.9s =   15.9/s Avg:    29 Min:     1 Max:  1377 Err:     0 (0.00%)
[INFO] Tidying up ...    @ Fri Oct 03 13:03:25 EST 2008 (1223003005682)
[INFO] ... end of run
..
[INFO] [exec:exec {execution: 85_stop-fisheye}]
[INFO] [chronos:savehistory {execution: 71_fixedload-test}]
[INFO] [chronos:check {execution: 71_fixedload-test}]
..
[INFO] Generating "jmeter-fixedload-report" report.
[INFO]   tests: 10
[INFO]   jmeter samples: 29095
[INFO]  generating charts.....
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 31 minutes 39 seconds
[INFO] Finished at: Fri Oct 03 13:04:01 EST 2008
[INFO] Final Memory: 55M/746M
[INFO] ------------------------------------------------------------------------

The test has now run successfully. In the target directory you’ll find a site directory which contains the HTML and images that make up the build report. This will be uploaded to the URL specified in the distributionManagement section of the pom.
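If you haven’t configured an upload target yet, it goes in a distributionManagement section along these lines – the id and URL are placeholders:

<distributionManagement>
  <site>
    <id>performance-reports</id>
    <url>scp://reports.example.com/var/www/performance/fisheye</url>
  </site>
</distributionManagement>

Running mvn site:deploy publishes the generated site to that location.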

Automate the tests using Bamboo

Bamboo is a continuous integration server made by Atlassian and is perfect for running the tests regularly.

For information on creating a new build plan in Bamboo, see the document Creating a build plan in the online documentation for Bamboo. Point the plan’s Maven builder at the same goals and profile used above (verify site -P dataset-sanity1) and schedule it to run as often as you need.

In closing..

With this information, you can start to make a real difference to your development cycle. You’ll find performance regressions faster and be on your way to code bliss. If you’re interested in some of the processes you can use to better integrate this cycle, see Andrew Prentice’s blog on improving the QA cycle in You don’t have to be Agile to be agile.
