C#

Demo: Unit Testing in Visual Studio (C#)

Introduction

In this post we demonstrate how to perform unit testing in Microsoft Visual Studio. The demo is structured as follows:

  • create a console app with accompanying C# classes, and
  • perform unit testing on the methods used in the created classes.

Implementing unit tests is a key step in writing a program. The unit test project can be read as documentation of the functionality and expected outcomes of the code. It is good practice to test the methods used in a program to check that it delivers the expected results, and running the tests also picks up errors in the code so they can be corrected.
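As a minimal sketch of what such a test looks like, assuming the MSTest framework that ships with Visual Studio (the class, method and test names below are hypothetical, not taken from the demo):

using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical class under test, living in the console app project.
public class Calculator
{
    public int Add(int a, int b)
    {
        return a + b;
    }
}

// Corresponding test class, placed in the unit test project.
[TestClass]
public class CalculatorTests
{
    [TestMethod]
    public void Add_TwoPositiveNumbers_ReturnsSum()
    {
        var calculator = new Calculator();  // arrange
        int result = calculator.Add(2, 3);  // act
        Assert.AreEqual(5, result);         // assert: the expected outcome is documented here
    }
}

Reading the test name and its assertion is often enough to understand what the method is supposed to do, which is exactly the documenting role mentioned above.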

It often happens that a program needs to be extended during production. When the program is extended, changes to one method should not alter the functioning or outcome of the other methods. With unit tests in place, any error introduced by new code that changes the outcome of existing working code is picked up easily, so the existing code is validated again every time new code is added to the project. Unit testing therefore serves as an important quality check on a program throughout the production process. Continue reading

On the Design Board: From Single- to Multi-Curve

Introduction

Standard fixed-income applications make ever greater use of the multi-curve framework to price products and hedge risks. Whatever the reason for this, it is useful to know how to implement such a framework.

We have already talked about multi-curves in the past: here we gave a list of useful references, and here we illustrated the main features of risk metrics and sensitivity patterns. In this blog we describe how to design the multi-curve framework. We do not claim that this is the only way or the best way; it is one possible way, which nevertheless turned out to work quite well within our system and was easily integrated into our library.

The code snippets shown below have been developed in C# using Visual Studio. Continue reading
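To give a flavour of the kind of design discussed in the post, here is one possible sketch (the type names are illustrative, not the actual UDFinLib interfaces): a curve abstraction plus a container holding a single discount curve next to a dictionary of tenor curves.

using System;
using System.Collections.Generic;

// Illustrative curve abstraction: anything that can return a (pseudo-)discount factor.
public interface ICurve
{
    double DiscountFactor(DateTime date);
}

// One discount curve plus one curve per tenor, looked up by its tenor label.
public class MultiCurve
{
    private readonly ICurve discountCurve;
    private readonly Dictionary<string, ICurve> tenorCurves = new Dictionary<string, ICurve>();

    public MultiCurve(ICurve discountCurve)
    {
        this.discountCurve = discountCurve;
    }

    public ICurve Discount
    {
        get { return discountCurve; }
    }

    public void AddTenorCurve(string tenor, ICurve curve)
    {
        tenorCurves[tenor] = curve;   // e.g. "3M", "6M"
    }

    public ICurve Tenor(string tenor)
    {
        return tenorCurves[tenor];    // throws if the tenor curve was never added
    }
}

Pricing code can then ask the container for the discount curve and for the tenor curve matching the instrument at hand, which keeps single- and multi-curve pricing behind the same interface.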

ETL for Daily Liquidity Monitoring

Daily monitoring of liquidity has become a crucial job inside any bank. We implemented the Extract-Transform-Load (ETL) operations of the liquidity tools used by different trading desks in the modelling and risk-reporting departments of a large Dutch bank, which allows the bank to run its liquidity-monitoring tools daily.

Liquidity input data come from various sources and all have different formats: some are Excel files, while others are comma-separated text files. Moreover, date conventions are not standard and depend on external factors, such as Excel settings. In addition, all the different pieces of information have to be adjusted before they can be used; such adjustments include specific selection and join operations.

Our input tool has been developed in C#. We have created in-memory databases and used LINQ to perform the database operations. The code has been unit-tested, and log text files and Excel output files are created daily. The tool is accompanied by a self-contained user manual explaining the business logic, the configuration settings, the command-line arguments and the exceptions that may arise during execution.
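A hypothetical fragment in that spirit (the record types, field names and output format are invented for illustration, not the bank's actual schema) joins two in-memory tables and selects the records for the reporting date with LINQ:

using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;

// Illustrative in-memory records, as they might look after reading an Excel or CSV source.
public class Position { public string DeskId; public DateTime Date; public double Amount; }
public class Desk     { public string DeskId; public string Name; }

public static class LiquidityReport
{
    // Keep only the reporting date and join each position to its desk metadata.
    public static IEnumerable<string> Lines(IEnumerable<Position> positions,
                                            IEnumerable<Desk> desks,
                                            DateTime reportingDate)
    {
        return from p in positions
               where p.Date == reportingDate
               join d in desks on p.DeskId equals d.DeskId
               select string.Format(CultureInfo.InvariantCulture,
                                    "{0};{1:yyyy-MM-dd};{2}", d.Name, p.Date, p.Amount);
    }
}

Parsing the source dates with an explicit format and culture (for instance DateTime.ParseExact with CultureInfo.InvariantCulture) is one way to keep the non-standard date conventions mentioned above out of the rest of the pipeline.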

Multi-Curve - Useful References

We have recently started the project of including the multi-curve framework in UDFinLib, our own financial library. The topic is delicate, as it involves research and implementation at once.

After the 2007-2008 world financial crisis it became clear that the classical single-curve framework used until then was not appropriate for valuing products and hedging portfolio positions. All of a sudden credit risk was an everyday topic, collateral margins exploded, and the previously small spreads between different-tenor swaps (OIS vs Libor, 3M-tenor vs 6M-tenor) could not be neglected anymore. Single-curve building, which treated instruments with different tenors in the same way, had to be upgraded to multi-curve building.

In a nutshell, the multi-curve framework amounts to constructing one discount curve and many tenor curves. The discount curve is typically built from OIS instruments, which are the best approximation of the risk-free rate. Each tenor curve is built from instruments with a homogeneous tenor; the most used tenors are 3M, 6M, 9M and 12M. Typically, the longer the tenor, the riskier the trade and hence the higher the corresponding rate.
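As a textbook illustration of how the curves are then used together (ignoring convexity and basis adjustments; names and signatures are ours), the present value of a single floating coupon takes its forward rate from the tenor curve and its discount factor from the OIS curve:

static class MultiCurvePricing
{
    // Simply-compounded forward rate implied by the tenor (pseudo-)discount factors
    // P_tenor(t, start) and P_tenor(t, end), for an accrual period with year fraction tau.
    public static double ForwardRate(double pTenorStart, double pTenorEnd, double tau)
    {
        return (pTenorStart / pTenorEnd - 1.0) / tau;
    }

    // Present value of one floating coupon: notional * tau * forward rate,
    // discounted with the OIS discount factor for the payment date.
    public static double FloatingCouponPv(double pTenorStart, double pTenorEnd,
                                          double pOisPayment, double tau, double notional)
    {
        return notional * tau * ForwardRate(pTenorStart, pTenorEnd, tau) * pOisPayment;
    }
}

In the old single-curve world the same curve would have supplied both the forward rate and the discount factor; in the multi-curve framework they come from two different curves.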

In this blog we give a non-exhaustive list of references that helped us both in understanding the multi-curve framework and in designing the process.

Articles

Books

Nice reading!

Wrapping C++ DLL for use in Excel / VBA using some pretty awesome open source projects

In this blog we explain how to wrap up managed or unmanaged C++ code and make its functionality available in Excel / VBA. The blog uses levmar as a specific example, but most steps are generic. We picked levmar because we are currently working on some short-rate lattice models that need to be calibrated to the interest rate market, and the calibration requires a robust solver for a nonlinear least-squares problem. To share our models and their implementation with the wider finance community we chose Excel / VBA for the implementation, which allows us to easily share our ideas and algorithms.

For most simple cases this worked wonderfully using a solver algorithm that was originally part of MINPACK, developed by Jorge More, Burt Garbow and Ken Hillstrom at Argonne National Laboratory; the algorithm we used was translated from Fortran to VBA by Vanna and shared on quantcode.com. This algorithm proved not to be stable enough for more complex problems, so we started looking for more stable implementations and found levmar by Lourakis, a package written in ANSI C. Here we explain how we took this code and built an xll add-in for Excel using:

Continue reading

Compiling LevmarSharp (Visual Studio 2010)

Prerequisites:

- Visual Studio 2010

- levmar 2.6 (http://users.ics.forth.gr/~lourakis/levmar/)

- levmarsharp (https://github.com/AvengerDr/LevmarSharp)


For a recent research project we needed to solve an optimization problem. Specifically, we were trying to reproduce the results in the paper “A Generalized Procedure for Building Trees for the Short Rate and its Application to Determining Market Implied Volatility Functions” by Hull and White. The paper describes how a lattice can be constructed and calibrated to the market. The calibration is essentially an optimization problem in which the difference between the discount factors (or interest rates) observed in the market and the discounts generated by the model is made as small as possible by varying the model parameters.
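As a sketch of the objective being minimised (variable names are ours and the lattice construction itself is not shown), the residuals are simply the differences between the market discount factors and the ones produced by the model for a given parameter vector:

using System;

static class LatticeCalibration
{
    // Sum of squared differences between market discount factors and the discount
    // factors produced by the lattice for a given parameter vector.
    // buildModelDiscounts stands in for the (not shown) tree construction.
    public static double Objective(double[] parameters,
                                   double[] marketDiscounts,
                                   Func<double[], double[]> buildModelDiscounts)
    {
        double[] modelDiscounts = buildModelDiscounts(parameters);
        double sum = 0.0;
        for (int i = 0; i < marketDiscounts.Length; i++)
        {
            double residual = marketDiscounts[i] - modelDiscounts[i];
            sum += residual * residual;
        }
        return sum;
    }
}

A Levenberg-Marquardt solver such as levmar works on the residual vector rather than on this scalar, but the quantity it drives down is exactly this sum of squares.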

Continue reading

Compiling Levmar using NMake (Visual Studio 2010)

Prerequisites:

- Visual Studio 2010, which comes with NMake

- levmar 2.6 (http://users.ics.forth.gr/~lourakis/levmar/)


For a recent research project we needed to solve an optimization problem, and we considered using levmar by Lourakis. Not having touched C or built code using Make for a while, it took some time to get everything set up and building. In this blog we describe the steps needed. Should you run into trouble, please consult the troubleshooting section at the end of this post. If you are interested in using levmar in C#, check out this blog post.

Continue reading

UD Fin Lib

Ugly Duckling Finance is currently working on its financial library, UDFinLib. UDFinLib will appear soon and will be advertised on this blog and website. Everybody interested is therefore invited to come back later, when it will be ready to use. In this blog post I would like to preview some of the features of the library.

The library comes in two parts: the core and the Excel Add-in. Continue reading

Automation of FTP process

For a large Dutch bank we created a detailed plan plus a working prototype to replace the manual daily funds transfer pricing (FTP) process for the mortgages domain with an automated system.

The resulting prototype achieved automation using the Windows scheduler, Visual Basic scripts and Excel (with macro code). This prototype was the first step towards automating the process and greatly reduced the workload for the business team charged with producing the daily prices. From an IT perspective this first round of automation was not enough: the prototype did not fulfil the straight-through processing required by IT, as it still involved manual steps. In the second phase of the project we investigated which steps could be taken to achieve straight-through processing, and we suggested migrating the solution to a more technically robust environment using C#, SSIS packages and SQL Server.