Projects and Clients

Here is a selection of the bespoke projects that we've worked on - the aim is to give a taste of what we can do for you. If any of these requirements, tools or applications sound useful or if any of the issues, complications or stumbling blocks sound familiar, then do get in touch and we can discuss how we might work together.

Above all, we find computers interesting and we're continually amazed by what they can do - often more easily than you might expect. So if you have an unusual requirement (or even a dull requirement - the dull ones often have an interesting inner core, screaming to be let out) and you don't know where to turn, then do let us know.



Re-working an oil field production model

This client was investigating the effect of oil production levels and macroeconomic indicators on the profit margin of an oil field. They had built a model in Excel but they weren't entirely happy with it. First, they weren't convinced that the model as it stood was giving the correct results. Second, they wanted to add a sensitivity analysis of the inputs but they were having trouble fitting it in. They asked us to make the model clearer, check its calculations and add this extra analysis.

Beau Derrick

We started by reworking the model so it conformed to design standards - for this we used a simplified form of the FAST standard (FAST being a well-established design framework for financial models). An important part of this design is separating the inputs, calculations and outputs into their own layers so the flow of data through the model is clear. With that transparent flow in place, it was much simpler to verify the results. And having done that, it was quite straightforward to add a Monte Carlo module for running a sensitivity analysis.
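To give a flavour of the Monte Carlo side - in Python rather than the client's Excel/VBA, and with every figure and distribution invented for illustration - a sensitivity analysis boils down to drawing each uncertain input from a distribution, running the model many times over and summarising the spread of the results:

```python
import random

# Illustrative only: a toy profit calculation, not the client's model.
def profit(production_kbbl, oil_price, opex_per_bbl):
    return production_kbbl * 1000 * (oil_price - opex_per_bbl)

def monte_carlo(n_trials=10_000, seed=42):
    rng = random.Random(seed)
    results = []
    for _ in range(n_trials):
        # Draw each input from an assumed distribution (all figures invented).
        production = rng.gauss(500, 50)   # thousand barrels
        price = rng.gauss(70, 10)         # $/bbl
        opex = rng.gauss(25, 3)           # $/bbl
        results.append(profit(production, price, opex))
    results.sort()
    return {
        "mean": sum(results) / n_trials,
        "p10": results[int(0.10 * n_trials)],   # pessimistic case
        "p90": results[int(0.90 * n_trials)],   # optimistic case
    }
```

The point of keeping this in its own module, fed only by the input layer, is that the sensitivity analysis never reaches into the calculation cells - it just varies the inputs and reads the outputs.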

And with a clear flow of data through the system, from input to output, we were able to export the data and run further analysis using the R statistical language.




Importing a morass of survey data for analysis and reporting

This client had around 100 sets of market survey data in 100 spreadsheets. But they all differed in little ways that ruled out a simple import process. All this data was to be imported, analysed and compiled into numerous reports. The statistical analysis and reporting was fairly straightforward - the problem was the import.

Sidney Harris miracle cartoon

The client had developed an import and analysis tool for the previous year's survey, but while that survey was similar to this new one, it was sufficiently different that the tool needed a big update. And the previous tool used a black-box "holistic" approach to software development that made it difficult to modify, and even to understand. We were asked to create a tool in Excel/VBA that would work with the new survey and, importantly, could be re-used the following year without too much effort.

The single biggest improvement came from separating the import, the analysis and the reporting into three separate layers with clear boundaries between them. The rest just fell into place. The import stage used a system of fuzzy matching probes to find the individual data items, which were then added to a simple relational-style database within the Excel model itself. These probes were parameterized so they could be updated without changes to the software itself. (The principles were similar to those I had used in the past to pull data from a hospital's main database into a separate maternity database. Medical data is, rather obviously, more critical than survey data so the mechanism I had developed employed numerous checks and audit trails. You can guess that this was before APIs became common.) Now with the data in an easy-to-access format, the analysis layer could interrogate it easily, and an interactive reporting layer sat on top of all this, happily using the clearly formatted data.
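The probe idea can be sketched in a few lines of Python (the real tool was Excel/VBA, and the labels, fields and thresholds below are invented examples): each probe is data, not code, so next year's survey needs a new parameter set rather than a rewrite.

```python
from difflib import SequenceMatcher

# Each probe says what to look for and how close a match must be.
# Held as data so they can be updated without touching the software itself.
PROBES = [
    {"field": "respondent_age", "label": "Age of respondent", "threshold": 0.8},
    {"field": "region", "label": "Region / area", "threshold": 0.7},
]

def similarity(a, b):
    # 0.0 = nothing in common, 1.0 = identical (after normalising case).
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def locate_fields(header_cells):
    """Map each probe to its best-matching header cell, if good enough."""
    found = {}
    for probe in PROBES:
        scores = [(similarity(probe["label"], cell), i)
                  for i, cell in enumerate(header_cells)]
        best_score, best_col = max(scores)
        if best_score >= probe["threshold"]:
            found[probe["field"]] = best_col
    return found
```

A spreadsheet headed "Age of Respondent", "Region/Area", "Notes" would be located correctly despite the differences in spacing, punctuation and case - which is exactly the sort of little variation that ruled out a naive import.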

The clarity of the design led to a few extras that weren't expected by the client. For example, an audit log showing exactly where every single piece of data had come from within the mass of input data wasn't so much an added extra as a natural part of the design.




Creating an interactive tool for identifying cost items with dubious descriptions

This time a large organisation was looking into the real cost of running their IT business applications. Their ERP platform did, on the face of it, hold exactly this information - that's what ERP systems are for. But the items had manually-entered descriptions and many of these descriptions weren't clear, so it wasn't always obvious where all this money was going. And there was quite a lot of it. Fixing this required a mix of human judgement and machine searching. We were asked to quickly create an easy-to-use interactive tool that would do most of the grunt work of finding possible matches and then provide a simple mechanism for a user to make a definite identification or flag the item up for investigation.

Queueing in Rome
Queueing in Rome on a very hot day.
At what point do you move forward into
the heat to prevent queue-jumping?
Is this queueing or is it poker?
Compelling stuff.

We created a bespoke tool with an Excel/VBA front end connected to a SQL database that contained the purchasing data and a dump of the CMDB (that's the Configuration Management Database, which should hold info on everything technical). With a few clever data analysis techniques the tool would find matches between the purchased item and the actual applications. The interactive graphical interface (yes, Excel is surprisingly good at interactive graphical interfaces) allowed the user to choose from among likely applications or flag the item up for further checking. And the system would learn as it went along so it got better with time.
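The "learns as it goes" part can be sketched roughly like this - in Python rather than the actual Excel/VBA and SQL, with invented names throughout: candidate applications are scored by the words they share with the cost item's description, and any word the user confirms as good evidence gets a boost next time round.

```python
from collections import Counter

# Tokens the user has confirmed as good evidence; grows with every
# accepted match, so the ranking improves with use.
confirmed_tokens = Counter()

def tokens(text):
    return set(text.lower().replace("-", " ").split())

def score(description, app_name):
    shared = tokens(description) & tokens(app_name)
    # A shared token counts once, plus extra for each past confirmation.
    return sum(1 + confirmed_tokens[tok] for tok in shared)

def rank_candidates(description, app_names):
    """Most likely application first, for the user to accept or flag."""
    return sorted(app_names, key=lambda a: score(description, a), reverse=True)

def confirm(description, app_name):
    """Called when the user makes a definite identification."""
    confirmed_tokens.update(tokens(description) & tokens(app_name))
```

The real tool used rather more than word overlap, but the shape is the same: the machine does the grunt work of ranking, the human makes the call, and each call feeds back into the ranking.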

In the end a few thousand unknown costs were married up to their real homes and the client was happy knowing where all that money was really going.

This was one of those nice one-off jobs where a bespoke tool was needed for a very specific purpose and could be discarded when it was finished. Perhaps too often a vendor will attempt to sell a solution that will "grow with the business" when really all that's needed is a one-off tool to get the job done.




Reverse engineering a large-scale gas contract

A client wanted to estimate the future gas prices coming from an international-scale gas contract. They had identified a likely set of price drivers - wholesale gas and oil prices, consumer price indices, macroeconomic indicators, and so on. They also had an idea of how these drivers should be combined to give the contract price. They now needed an efficient way of obtaining the values of these potential price drivers from various data sources, and then determining which of all the possible permutations gave the closest match to the actual contract prices. This would be done by comparing the actual historical prices against their corresponding estimates and finding the overall closest.

We put together a transparent, interactive, easy-to-use model in Excel/VBA that could select from a menu of candidate cost drivers and ways of combining them, generate a history of estimated prices and compare this against the known price history. With a little automated analysis and a clear graphical output the client identified the best combination of drivers and were then able to anticipate price changes. And the clarity of the model gave them confidence in the results.
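The heart of the search can be sketched in a few lines - Python rather than the actual Excel/VBA, and every price, driver and combination rule below is invented: try each candidate combination of drivers, score it against the known price history, and keep the best.

```python
from itertools import combinations

# Invented example data: four quarters of actual contract prices and
# three candidate driver series over the same period.
actual_prices = [20.0, 21.0, 23.0, 22.0]
drivers = {
    "oil":  [40.0, 42.0, 46.0, 44.0],
    "cpi":  [100.0, 101.0, 102.0, 103.0],
    "coal": [60.0, 58.0, 61.0, 59.0],
}

def estimate(names, t):
    # A deliberately naive combination rule (scaled average); the real
    # model let the user pick the rule from a menu.
    return 0.5 * sum(drivers[n][t] for n in names) / len(names)

def sum_sq_error(names):
    # How far this combination's history strays from the actual prices.
    return sum((estimate(names, t) - p) ** 2
               for t, p in enumerate(actual_prices))

def best_combination():
    candidates = [c for r in (1, 2, 3)
                  for c in combinations(sorted(drivers), r)]
    return min(candidates, key=sum_sq_error)
```

With these made-up numbers the oil series alone tracks the contract price exactly, so the search picks it out; on real data the interesting part was the clear graphical comparison that let the client see *why* a combination won.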




Spurious ring tones

This was a curious little request. The client had a sound recording that contained some brief, but annoying, spurious tones - was there some way of removing them?

The Conversation
This used to be so much harder.

This was a fairly straightforward task using the signal processing libraries available for the Python programming language. In the past I would have written the whole thing in C, but it really is so much easier these days. I used the discrete Fourier transform to identify the unwanted frequencies and then created a suitable multi-band band-reject filter to remove them. The result was the original sound, barely changed, with just the faintest hint of that rogue iPhone ring tone.
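A toy reconstruction of the idea - with invented frequencies, and using NumPy's FFT as a crude frequency-domain band-reject rather than the actual filter code: build a test signal, find the offending tone with a DFT, and knock out a narrow band around it.

```python
import numpy as np

fs = 8000                       # sample rate, Hz (invented for the example)
t = np.arange(fs) / fs          # one second of audio
wanted = np.sin(2 * np.pi * 440 * t)           # the "real" content
spurious = 0.5 * np.sin(2 * np.pi * 1750 * t)  # the rogue ring tone
signal = wanted + spurious

# Identify the loudest component above, say, 1 kHz as the offender...
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
band = freqs > 1000
peak = freqs[band][np.argmax(np.abs(spectrum[band]))]

# ...then reject a narrow band around it (here +/- 20 Hz) and transform back.
spectrum[np.abs(freqs - peak) < 20] = 0
cleaned = np.fft.irfft(spectrum, n=len(signal))
```

A production job would use a proper band-reject filter design (and leave the rest of the spectrum untouched by windowing carefully), but the shape of the task is just this: transform, locate, remove, transform back.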

Numerical Recipes in C cover

As a side note, I'm still bewildered that Python, and many of its ridiculously useful software libraries, are completely free. We live in a world of rampant capitalism and yet there are industrial-quality software libraries for mathematics, statistics, artificial intelligence and, well, just stuff - all for free. If you're interested, you can download Python from any number of places - a well-regarded distribution is Anaconda. I was brought up on Numerical Recipes in C, which is still one of my Desert Island Discs books, but Python has taken all this to a new level.




Improving and enriching a financial model

This client had a financial model for forecasting the long-term costs and benefits of restructuring a part of its business. The model had been created by a large finance consultancy but it was rather tangled and, as a consequence, was difficult to follow and even harder to modify. Once again, this was the typical job of taking a model without a clear flow of data and giving it one. Once that solid foundation is in place, you can start building on it.

As ever, we separated the inputs, calculations and outputs into their own sections. Then we gave the model a consistent timeline structure, removed the garish colours, made it read left-to-right, top-to-bottom (yes, just like reading a book) and polished off some other rough edges.

Now that the model was clean and tidy it was easier to check whether it was in fact correct, which it was. But then we could start adding the nice-to-have functions. Of note was the Monte Carlo modelling on exchange rates, interest rates, time-lags in project completion and so on. The end result was a simpler, clearer, richer model.


