Christopher Cundy investigates the strengths, weaknesses and applicability of two technologies – cloud and GPUs – that are helping improve the speed of actuarial modelling

Accelerating the performance of actuarial modelling calculations can be achieved via improvements to both software and hardware. This article focuses on the latter, and specifically on the use of cloud computing and graphical processing units (GPUs).

The practical applications of cloud computing emerged in the early 2000s and it began to spread widely in actuarial circles a decade later. The idea that processing and data storage would no longer be handled by an in-house computing system, but by a third-party provider, provoked many concerns – some of which linger today – about data security and operational resilience.

But the sector was generally won over by the opportunities presented by the cloud, chiefly the potential cost savings from having a more flexible and scalable IT resource, and the way the cloud facilitates collaboration and rapid deployment of modelling tools.

GPUs, as their name suggests, contain specialised circuitry to perform the intensive calculations required to display computer graphics. They have been around since the early days of computing, but their performance has been driven forward in leaps and bounds by two trends: the mining of cryptocurrency; and the developments in artificial intelligence (AI) and machine learning.

The advantage of GPUs over the chips usually at the heart of a computer (CPUs) is that they are capable of parallelisation, i.e. solving problems in parallel rather than sequentially. This makes them well suited to certain actuarial tasks such as stochastic modelling. GPUs can be accessed in the cloud or installed in-house.
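
To make the idea concrete, here is a minimal sketch of a simplified stochastic projection, assuming an invented equity-style model: all scenarios are expressed as a single array calculation rather than a scenario-by-scenario loop, which is the style of workload that parallel hardware handles well. The function and parameter names are illustrative; NumPy runs this on a CPU, while array libraries such as CuPy or JAX can execute the same pattern of code on a GPU.

```python
import numpy as np

# Illustrative only: project an asset value to maturity under many
# stochastic scenarios at once, instead of looping scenario by scenario.
def simulate_terminal_values(n_scenarios=100_000, n_years=30,
                             drift=0.03, vol=0.15, s0=100.0, seed=42):
    rng = np.random.default_rng(seed)
    # One matrix of normal shocks: rows are scenarios, columns are years.
    shocks = rng.standard_normal((n_scenarios, n_years))
    # Lognormal annual returns, accumulated across years for every
    # scenario in parallel via whole-array operations.
    log_returns = (drift - 0.5 * vol ** 2) + vol * shocks
    return s0 * np.exp(log_returns.sum(axis=1))

terminal_values = simulate_terminal_values()
print(terminal_values.mean(), np.percentile(terminal_values, 0.5))
```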


Pros and cons

So where should an actuary turn if they want faster and more efficient processing?

Alexey Mashechkin, chair of lifelong learning for data science and AI at the Institute and Faculty of Actuaries (IFoA), says both cloud computing and GPU technologies can improve processing power and actuaries will benefit from processing larger data volumes with more modern algorithms.

“On the flip-side, these technologies require investment and organisations must be satisfied that their use and the extra cost is justified. Another risk is that of privacy and data leakage where internal data is being processed outside of the company’s IT landscape, as in the case of the cloud.”

Christo Muller, partner, IT services at MBE Consulting, says: “There’s clearly benefit technologically from the use of GPUs in terms of speed of certain types of calculations. Some vendors have specifically targeted that architecture and have proven there is a speed benefit. However, if the software is not architected for it then – just like with the cloud – it’s not going to bring the most benefits.”

“If an insurer has got a system today that is optimised for CPUs and they want to use GPUs, it’s not necessarily a flick of a switch, even if the vendor has added support for GPUs. Depending on the type of model, it can be a fundamentally different way of thinking about how you vectorise those calculations to obtain maximum use of GPUs. Clearly there’s a cost-benefit case to think about,” explains Andy Maclennan, vice president, product management, insurance risk at software vendor FIS.

“We will be building example models on which customers can layer their customisations, and also bring more of the product types that really perform well on GPUs into our standard libraries and support them going forward,” he adds.
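
As a rough illustration of the vectorisation point Maclennan raises, and using hypothetical policy data rather than any vendor’s model, the sketch below writes the same expected-claim calculation first as a per-policy loop and then as a single array expression – the latter being the shape of code that maps naturally onto GPU hardware.

```python
import numpy as np

# Hypothetical policy data: sums assured and one-year death probabilities.
sum_assured = np.array([50_000.0, 120_000.0, 80_000.0])
qx = np.array([0.0012, 0.0030, 0.0021])
discount = 1.0 / 1.02  # one year of discounting at 2%

# Loop-based form: one policy at a time, as a CPU-oriented model might do.
expected_loop = 0.0
for s, q in zip(sum_assured, qx):
    expected_loop += s * q * discount

# Vectorised form: the same result as a single array expression.
expected_vec = float(np.sum(sum_assured * qx) * discount)

assert abs(expected_loop - expected_vec) < 1e-9
```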

Muller notes there are potential downsides to GPUs when it comes to programming. “GPU architecture is highly complex in terms of coding. And although the vendors are abstracting that away from the users, it does still introduce some challenges for users when there are issues – which there always are.”

He continues: “In the more traditional CPU-based space, there is definitely more opportunity and ability to debug and fact-find. Of course it’s possible in the GPU space, but it’s highly complex and very specialised.”



Specific calculations

Cloud and GPU solutions both offer powerful means to process the data and carry out calculations needed for actuarial modelling purposes.

But as Iain Macintyre, head of risk and capital in the Insurance & Financial Services division of consultancy Hymans Robertson, explains: “The catch is that cloud or GPU alone are not sufficient, as it is likely that work will be required to adapt actuarial models to make the most of the additional power.

“The greater scope there is for parallel running across policies, model points, products, and/or simulations, the greater the value that can be obtained. So work is needed to identify and isolate those calculations and dimensions before migrating some of the computation to the cloud or GPU compute.”

GPUs are good at performing matrix-based calculations on a large amount of data. In terms of actuarial tasks, that means nested stochastic simulations used in capital modelling, best estimate liability projections, seriatim valuation and discounting cash flows under multiple economic scenarios.
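
As a sketch of what such a matrix-based calculation might look like, the snippet below discounts a set of liability cash flows across many economic scenarios in a single matrix operation; the scenario rates and cash-flow profile are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scenarios, n_years = 1_000, 40

# Invented inputs: a flat liability cash-flow profile repeated in every
# scenario, and scenario-specific short rates standing in for output from
# an economic scenario generator.
cash_flows = np.full((n_scenarios, n_years), 1_000.0)
short_rates = 0.02 + 0.01 * rng.standard_normal((n_scenarios, n_years))

# Cumulative discount factors per scenario, then the present value of the
# cash flows in every scenario via element-wise matrix operations.
discount_factors = np.cumprod(1.0 / (1.0 + short_rates), axis=1)
present_values = (cash_flows * discount_factors).sum(axis=1)

best_estimate = present_values.mean()  # average over scenarios
print(best_estimate)
```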

But CPUs tend to perform better when there are many different types of calculation required, or when there are a lot of dependencies to model.


Cost

Choosing between cloud and GPUs, or a combination of both, is not straightforward. FIS’s Maclennan says: “Comparing GPUs with CPUs on speed alone doesn’t provide the full picture – you really need to know which tool gives you the necessary speed for the lowest cost. Users can generally make runs go faster by adding more CPU cores, at least on the cloud. Some GPU cards are very expensive and there is a trade-off on cost. There is no point buying GPUs on-premise if they are going to sit idle for most of the year. We want to give clients the choice of which is best for their purpose: CPU, CPU+AVX, GPU or, in the future, even quantum computing.”

Some firms have uncovered significant performance improvements and cost savings in the transition to the cloud. But others have been surprised by the bills from cloud-based systems, as actuaries take advantage of more powerful and easily accessible tools to perform more analysis.


Conclusion

Regulatory requirements such as Solvency II and IFRS 17, as well as demands from the business to better understand risks and opportunities, have put pressure on firms to improve the speed and reduce the cost of actuarial modelling.

Cloud computing and GPUs offer potential routes to achieving this, but the software solutions must be appropriately tailored to the hardware. GPUs offer large potential gains, but only for a limited number of actuarial applications.

What’s the potential for quantum computing?

Quantum computing refers to a novel branch of computing that uses the principles of quantum mechanics to perform calculations – and promises a huge increase in processing speeds.

The hardware to perform quantum computing is best described as ‘in development’, but there are quantum simulators that have enabled actuaries to develop algorithms proving the potential for a rapid increase in processing speeds for various optimisation tasks – such as asset-liability management (ALM) and portfolio management – and value-at-risk calculations.

Consultant actuaries Tim Berry and James Sharpe’s paper, Asset–liability modelling in the quantum era, published in the British Actuarial Journal, describes a quantum approach to optimising the selection of a Solvency II matching adjustment (MA) portfolio.

Running on a traditional computer, their software solution took five to 10 minutes to complete a typical MA optimisation task. The actuaries were able to show that a quantum computer has the potential to solve MA optimisation tasks in a fraction of a second.

Muhammad Amjad’s paper, Quantum internal models for Solvency II and quantitative risk management, also published in the British Actuarial Journal, investigates how quantum computing could be used in the context of an insurer’s internal model.

He reported that the implementation of an internal model differs significantly between quantum and classical computing, due to fundamental differences in how each technology processes information, and that building a quantum model would be such a significant task that it might not be worthwhile.

But the quantum model’s advantages emerge when insurers are required to calculate the solvency capital requirement numerous times, for example to map out the multidimensional capital and risk landscape for understanding sensitivities to market and non-market risks, and for setting risk appetite.