LMU Spring Workshop on Finance, Stochastics, and Statistics

Dr. A-P. Perkkiö, Prof. Dr. F. Liebrich, Niklas Walter

Date and Time

  • 28.04.2023


  • At LMU Mathematics Institute, Theresienstraße 39-B (Room A 027)


Coffee and discussions
11:00 - 11:45  Gregor Svindland: Towards a Measurement of Pandemic Cyber Risk
11:45 - 12:30  Alessandro Doldi: Risk Sharing with Deep Neural Networks
Lunch break
14:00 - 14:45  Aleš Černý: Simplified Stochastic Calculus and its Applications with an Average Student in Mind
14:45 - 15:30  Sara Svaluto-Ferro: Signature-based models: Theory and calibration
Coffee break
16:00 - 16:45  Lauri Viitasaari: Flexible Integrated Functional Depth
16:45 - 17:30  Teemu Pennanen: Topological Duals of Locally Convex Function Spaces


Abstracts

Gregor Svindland: Towards a Measurement of Pandemic Cyber Risk

We consider the problem of modeling systemic cyber risks. Based on considerations regarding the dynamics of the spread of cyber risk, we present first steps towards a measurement of pandemic cyber risk.

Alessandro Doldi: Risk Sharing with Deep Neural Networks

We consider the problem of optimally sharing a financial position among agents with potentially different reference risk measures. The problem is equivalent to computing the infimal convolution of the risk measures and finding the so-called optimal allocations. We propose a neural-network-based framework to solve the problem, and we prove the convergence of the approximated infimal convolution, as well as of the approximated optimal allocations, to the corresponding theoretical values. We support our findings with several numerical experiments.
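For readers unfamiliar with the notion: the infimal convolution of risk measures ρ₁, …, ρₙ at a position X is defined in the standard way (the notation below is ours, not taken from the talk):

```latex
\Bigl(\square_{i=1}^{n}\rho_i\Bigr)(X)
  \;=\; \inf\Bigl\{\, \textstyle\sum_{i=1}^{n}\rho_i(X_i)
  \;:\; X_1 + \dots + X_n = X \,\Bigr\}.
```

An allocation (X₁, …, Xₙ) attaining the infimum is called optimal; it splits the total position so that the aggregate risk, as measured by the agents' individual risk measures, is minimal.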

Aleš Černý: Simplified Stochastic Calculus and its Applications with an Average Student in Mind

We introduce a simple way of recording and manipulating general stochastic processes without explicit reference to a probability measure. The resulting calculus makes it easy to capture a variety of predictable transformations of semimartingales, such as changes of variables, stochastic integrals, smooth variations, and their compositions. The new calculus is very effective when it comes to computing drifts and expected values that may involve a change of measure. In this talk we discuss some new examples that might appeal to an average MSc student.

Sara Svaluto-Ferro: Signature-based models: Theory and calibration

Universal classes of dynamic processes based on neural networks and signature methods have recently entered the area of stochastic modeling and Mathematical Finance. This has opened the door to robust and more data-driven model selection mechanisms, while first principles like no arbitrage still apply.

In the first part of the talk we focus on signature SDEs whose characteristics are linear functions of a primary underlying process, which can range from a (market-inferred) Brownian motion to a general multidimensional tractable stochastic process. The framework is universal in the sense that any classical model can be approximated arbitrarily well and that the model characteristics can be learned from all sources of available data by simple methods. Indeed, we derive formulas for the expected signature in terms of the expected signature of the primary underlying process.

In the second part we focus on a stochastic volatility model where the dynamics of the volatility are described by linear functions of the (time-extended) signature of a primary underlying process. Under the additional assumption that this primary process is of polynomial type, we obtain closed-form expressions for the squared VIX by exploiting the fact that the truncated signature of a polynomial process is again a polynomial process. Adding the Brownian motion driving the stock price to such a primary process then allows us to express both the log-price and the squared VIX as linear functions of the signature of the corresponding augmented process. For both SPX and VIX options we obtain highly accurate calibration results.
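As background, the signature of a (sufficiently regular) d-dimensional path X over [0, t] is the sequence of iterated integrals; this is the standard definition, stated here for orientation rather than taken from the abstract, and it is the object on which the linear model characteristics act:

```latex
\mathrm{Sig}(X)_{[0,t]}
  \;=\; \Bigl(\, 1,\;
    \int_{0}^{t} \mathrm{d}X_{s_1},\;
    \int_{0<s_1<s_2<t} \mathrm{d}X_{s_1}\otimes \mathrm{d}X_{s_2},\;
    \dots \Bigr).
```

Truncating the signature at a fixed level yields a finite-dimensional feature set, which is what makes learning linear coefficients on signature terms computationally tractable.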

Lauri Viitasaari: Flexible Integrated Functional Depth

With the tremendous increase in storage capacity and computing power, the data we face nowadays come in many different forms and are, in particular, often high-dimensional. This has led to the development of functional data analysis, in which observations (data points) are assumed to be functions, infinite-dimensional in nature. For example, such observations can represent cash-flow data, electricity demand, or the time output of a specific sensor, a typical situation in various industrial settings. However, in the context of functional data it is unclear how basic concepts such as typicality and/or outlyingness should be defined. Indeed, while in the finite-dimensional case the natural definition of typicality is centrality in location, in the case of functions the shape of the function plays a crucial role as well. In this talk we introduce a flexible non-parametric measure of typicality for functional data that is based on depth functions and that also takes the shape of the function into account. Theoretical properties and practical examples are discussed.
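As a toy illustration of the integrated-depth idea, here is a minimal sketch of the classical Fraiman–Muniz-style construction, which averages a pointwise univariate depth over the time grid. This is background for the talk, not the speaker's method: it captures only location centrality and ignores the shape of the curves, which is precisely the limitation the flexible depth in the talk addresses.

```python
import numpy as np

def integrated_depth(curves):
    """Fraiman-Muniz-style integrated depth.

    curves: (n, T) array of n curves sampled on a common grid of T points.
    Returns an array of n depth values; larger means more central.
    """
    n, T = curves.shape
    # Rank of each curve's value among all curves, separately at each time point.
    ranks = curves.argsort(axis=0).argsort(axis=0)
    # Mid-rank empirical distribution function value at each curve's point.
    F = (ranks + 0.5) / n
    # Pointwise univariate depth: 1 at the pointwise median, decreasing outwards.
    pointwise = 1.0 - np.abs(0.5 - F)
    # Integrated depth = average of the pointwise depths over the grid.
    return pointwise.mean(axis=1)

# Five constant curves at heights 0..4: the middle curve is the most central.
curves = np.tile(np.arange(5.0)[:, None], (1, 10))
depths = integrated_depth(curves)
```

Because the depth is averaged pointwise, a curve of atypical shape can still receive a high depth value if it stays near the pointwise median at each time, which motivates shape-aware refinements.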

Teemu Pennanen: Topological Duals of Locally Convex Function Spaces

We study topological duals of locally convex function spaces that are natural generalizations of Fréchet and Banach function spaces. The dual is identified with the direct sum of another function space, a space of purely finitely additive measures, and the annihilator of L∞. This allows for quick proofs of various classical as well as new duality results, e.g. in Lebesgue, Musielak–Orlicz, and Orlicz–Lorentz spaces, and in spaces associated with convex risk measures. Beyond Banach and Fréchet spaces, we obtain completeness and duality results in general paired spaces of random variables.
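Schematically, and using our own shorthand rather than the authors' notation, the identification described above reads:

```latex
\mathcal{X}^{*} \;\cong\; \mathcal{X}' \;\oplus\; \mathcal{M}_{\mathrm{pfa}} \;\oplus\; (L^{\infty})^{\perp},
```

where 𝒳' denotes the associated function space, 𝓜_pfa the purely finitely additive measures, and (L∞)^⊥ the annihilator of L∞ in the dual.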