In June 2013 Ninefold tasked reinteractive with producing a competitive performance analysis between Ninefold and two of its major competitors.
This presented reinteractive with a challenge: we sign up our customers on multiple providers depending on each client's needs, and historically, competitive performance benchmarks have been among the most misunderstood and skewed benchmarks available.
To Ninefold's credit, however, their requirements for the benchmark included that it be open source and that ANYONE could reproduce it on any Ruby on Rails based platform. This requirement gave reinteractive confidence that we could produce something useful for the community, not just something tailored for a marketing department.
The requirements provided from Ninefold were:
- establish the actual performance of the Ninefold platform
- compare Ninefold's performance against competitors
- provide insights into areas for performance improvements
- measure the performance impact of any changes to the Ninefold platform
- provide a repeatable code based way for anyone to run the tests
What was needed was a simple, repeatable method of testing the Ninefold platform and comparable competitor platforms. Testing competitor platforms raised some challenges, as we had to make sure the tests were as "comparable" as possible.
To this end reinteractive produced the following:
- Reference Application: A simple reference application, with preloaded data and images. This same application can then be deployed across many PAAS providers using seed data to ensure it is the same.
- Repeatable Test Tool: A load testing script that simulates real user load and can be repeated on any PAAS platform by simply changing the base URL.
- Comparable Platforms: A way to identify "comparable" platforms, as every PAAS provider has different price points.
- Github Repository: A public Github repository that provides step by step, indexed instructions on how to repeat the entire test on three different platforms.
The application chosen was a default "demo" install of the Spree Commerce platform, loaded with sample data. This produces a complex Rails application with sessions, shopping carts, search capability and indexed data.
Spree was chosen because it is a complex Rails application with a known set of sample data that can be loaded identically on each platform being tested.
Repeatable Test Tool
To produce a repeatable test plan, reinteractive decided to use JMeter and the flood.io load testing platform to do repeatable, scripted testing.
The first target was to develop a script that steps through the application and simulates a real user.
Once written using the ruby-jmeter gem's DSL, the script could be run on demand through the flood.io API.
The advantage of these test scripts is that, once written, they can be rerun at will, at any time, by any user, on any platform, simply by changing the URL the script points to.
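The retargeting idea can be sketched in plain Ruby. This is not the actual test plan (which was written in the ruby-jmeter DSL); the step names and Spree-style paths below are hypothetical, chosen only to show how defining the user journey once and swapping the base URL makes the same test repeatable against any platform.

```ruby
# Hypothetical user journey through a Spree-like store. In the real test
# plan, each step would be a ruby-jmeter `visit` in the JMeter script.
JOURNEY = [
  { name: 'Home',         path: '/' },
  { name: 'Search',       path: '/products?keywords=bag' },
  { name: 'View product', path: '/products/ruby-on-rails-bag' },
  { name: 'Add to cart',  path: '/orders/populate' }
].freeze

# Expand the journey into full URLs for a given platform under test.
# Changing `base_url` is the ONLY thing needed to retarget the test.
def journey_for(base_url)
  JOURNEY.map { |step| step.merge(url: File.join(base_url, step[:path])) }
end

# The same journey, pointed at two hypothetical deployments:
platform_a = journey_for('http://spree-demo.platform-a.test')
platform_b = journey_for('http://spree-demo.platform-b.test')
```

Because the journey is data, every platform runs an identical sequence of requests, which is what makes the resulting numbers comparable.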
Deciding how to measure "comparable platforms" was the hardest part of this exercise, but in the end reinteractive suggested that Ninefold go with the only comparison that really matters: dollars. That is, how much customer happiness (measured as response time performance) can be purchased for X dollars per month.
To simplify matters, three base price points were chosen:
- $50 per month - the price for a small company website or active blog
- $175 per month - higher capacity small company website
- $575 per month - production high traffic website
Obviously there can be more expensive configurations, but for this performance analysis the above provided a good set of measuring sticks.
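The dollars-based comparison reduces to a simple question: at a fixed monthly spend, which platform responds fastest? A minimal sketch, using entirely hypothetical response-time figures (the real numbers are in the published results):

```ruby
# Hypothetical median response times (ms) at each monthly price point.
# Lower is better; platform names and figures are illustrative only.
RESULTS = {
  'Platform A' => { 50 => 820, 175 => 410, 575 => 190 },
  'Platform B' => { 50 => 990, 175 => 530, 575 => 240 }
}.freeze

# Rank platforms by response time at a given monthly spend.
def ranking_at(results, price)
  results.sort_by { |_platform, times| times[price] }
end

ranking_at(RESULTS, 175).each do |platform, times|
  puts format('%-10s %4d ms at $175/month', platform, times[175])
end
```

Fixing the spend and comparing response times sidesteps the usual benchmark trap of comparing instance types that are not actually equivalent across providers.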
The end product of this was two Github repositories, which together contain everything you need to spin up your own performance test and repeat the results we found.
The results of this performance analysis were quite dramatic and were published online on the Ninefold Performance website for everyone to view.