Adventures in Scala-based functional testing

We are in the process of creating a financial calculation library in Scala for one of our applications, and if there is one thing that is “really easy” for calculations, it is testing them… </sarcasm>

Everybody likes to demonstrate simple examples of testing mathematical formulas using tools like JCheck/ScalaCheck or other testing frameworks. The test code always looks pretty, but unfortunately, it is never so simple in practice. Our tests have dozens, if not hundreds, of numbers going in and a few distilled numbers coming out.

As a “simple” example, if you want to calculate the Jensen’s Alpha of your USD position in British Petroleum, then you are going to need prices for BP, FX rates for GBp to USD, the risk-free rate, a beta for the stock, and index prices (for your market return)… and that is for only one position. Jensen’s Alpha is a much more useful metric over a whole portfolio…
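For reference, the single-position form of Jensen’s Alpha is just the actual return less the CAPM-expected return. A rough Scala sketch (the parameter names are illustrative, not our library’s API) shows how many independently sourced inputs feed even that one number:

// Jensen's Alpha for a single position: actual return minus the CAPM-expected return.
// Every argument has to come from test data (prices, FX rates, index levels...),
// which is what makes the fixtures so bulky.
def jensensAlpha(positionReturn: Double,
                 riskFreeRate: Double,
                 beta: Double,
                 marketReturn: Double): Double =
  positionReturn - (riskFreeRate + beta * (marketReturn - riskFreeRate))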

We have yet to find a truly convenient way to model all of this information using any of the popular functional testing tools like Fitnesse or Cucumber. They just seem to end up with more work for us to do (in creating fixtures or step definitions), usually with less readable results in the end. Consequently, we decided to create a test script format that made our BAs, QA folk (and developers) happy, and to go from there.

As a group, we decided on a JSON format that presented the data and its hierarchies in a form that all could easily read and understand, something like:

{ 
  "positions": [ ...position objects here... ],
  "period": { "start": "2009-01-01", "end": "2009-03-31" }
  "prices": [ ...price objects... ],
  "fxRates": [ ...fx rate objects... ],
  "verification": {
    "averageDuration" : 42,
    "portfolioBeta" : 0.85
  }
}

For JSON in Scala, you don’t have to look any further than the scala.util.parsing.json package built into Scala itself. The simple parser gives back a List or Map (of List/Map/Double/String…) that mirrors the JSON itself. Unfortunately, that output forced us to write brittle (and casting-filled) transformation code from the Map[String, Any] and List[Any] into the traits needed for our calculations. Small changes to the format, like turning an object into an array, would cause painful run-time errors.
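To make that concrete, here is a minimal, REPL-style sketch (with illustrative field names) of what extraction from the built-in parser’s output tends to look like:

import scala.util.parsing.json.JSON

// Everything comes back as Any, so each step below is a cast that only
// fails at run time if the shape of the script drifts.
val jsonText = """{ "verification": { "portfolioBeta": 0.85 } }"""
val root          = JSON.parseFull(jsonText).get.asInstanceOf[Map[String, Any]]
val verification  = root("verification").asInstanceOf[Map[String, Any]]
val portfolioBeta = verification("portfolioBeta").asInstanceOf[Double]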

After some frantic googling, we stumbled across the lift-json project, a library extracted from the original Lift project. Lift-json has the very pleasant notion of parsing JSON into a series of case classes, allowing us to create case classes implementing the traits for our calculation library. Our implementation also leveraged the excellent ScalaTest library and its ShouldMatchers syntax to map our final verifications into a Map[String, Double], where the String was the field name on the calculation result and the Double was the result value.
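A cut-down sketch of the lift-json approach (these case classes are simplified stand-ins for our real calculation traits):

import net.liftweb.json._

case class Verification(averageDuration: Double, portfolioBeta: Double)
case class TestScript(verification: Verification)

implicit val formats = DefaultFormats

// parse gives back a JValue; extract maps it straight onto the case classes
val script = parse("""{ "verification": { "averageDuration": 42.0, "portfolioBeta": 0.85 } }""")
               .extract[TestScript]

script.verification.portfolioBeta   // 0.85, with no casting in sight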

Excellent! Boilerplate collection parsing and casting begone!

This structure worked well, until we ran into a problem with our case-class-mapped JSON schema. In our verification section, we wanted to assert calculation results that were more complex, so a simple Map[String, Double] would not do. For example:

{ ...
  "verification": {
    "averageDuration" : 42,
    "portfolioBeta" : 0.85,
    "totalReturn" : [ { amount: 1000, currency: "USD"}, 
                      { amount: 800, currency: "EUR" } ]
  }
}

This meant that we either needed to complicate our JSON schema considerably (by adding new verification sections or much more complex verification types) or try again. We were very committed to making this test script concept work, but we didn’t want to be hindered by another trip-up like this. So, one of the guys suggested a Scala DSL instead, and he ran with creating a Fluent Interface for a test case.

In the end, he produced something like this:

class TestCase1 extends CalcTestCase {
  period from "2009-01-01 00:00:00" to "2009-03-31 23:59:59"
  
  position("openDate" -> "2009-01-15 10:45:00",
           /* ... more position details... */)
  position(...)
  // more positions, prices, fxRates, etc...

  verify("averageDuration" -> 42,
         "portfolioBeta"   -> 0.85,
         "totalReturn"     -> is(1000 in "USD", 800 in "EUR") )
}

Using Scala as our test script language gave us huge wins. We could choose to write in a more fluent style where it suited: period from "2009-01-01 00:00:00" to "2009-03-31 23:59:59" or 1000 in "USD". We could also still fall back to String-keyed value maps for our verifications, making it easy to write failing tests (but not tests that fail to compile…). Using Scala as the test language (with our DSL on top) seems like the choice we should have made all along.
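For what it’s worth, those fluent pieces need nothing exotic. A rough sketch of one way to wire them up (the names here are illustrative, not our actual test harness) looks like this:

object CalcDslSketch {
  case class Money(amount: BigDecimal, currency: String)
  case class Period(start: String, end: String)

  // 1000 in "USD": an implicit conversion gives Int an `in` method
  class MoneyBuilder(amount: Int) {
    def in(currency: String): Money = Money(BigDecimal(amount), currency)
  }
  implicit def intToMoneyBuilder(amount: Int): MoneyBuilder = new MoneyBuilder(amount)

  // period from "..." to "...": each call returns the next step of a small builder
  class PeriodFrom(start: String) {
    def to(end: String): Period = Period(start, end)
  }
  object period {
    def from(start: String): PeriodFrom = new PeriodFrom(start)
  }
}

Mixing conversions like these into a base test trait is what lets a test case read almost like the old JSON scripts while still being plain, compiled Scala.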
