## Svetlana Borovkova (Vrije Universiteit Amsterdam)

### Joint multifactor models for commodity forward curves under P and Q measures

S. Borovkova, S. Ladokhin

Inspired by the two- and three-factor forward curve models of Borovkova and Geman (2007), we re-introduce a multifactor forward curve model, jointly under the real-world and the risk-neutral probability measures. We specify the P- and Q-dynamics of the fundamental factors driving the forward curve, namely the average level of the forward curve and the stochastic convenience yield, and develop a joint calibration approach under both measures, based on Kalman filter methodology. The dynamics under the two measures are related by means of the market price of risk, which is allowed to be stochastic. We calibrate the model parameters on historical Brent futures prices with many maturities, typical of complex portfolios held by practitioners. The model copes remarkably well with both backwardation and contango regimes, and fits the observed forward curves with great accuracy. We show that the calibrated market price of risk is indeed stochastic, as is observed in interest rate markets. We demonstrate that the crude oil market price of risk is influenced by changes in energy supply and demand, the geopolitical situation and the flow of oil-related news.

## Michael Coulon (University of Sussex)

### Spread option pricing: implied volatility implied from implied correlation

Michael Coulon

Spread options are important derivative contracts in energy markets, closely linked to the valuation and operation of physical assets like power plants, refineries or storage facilities. While many techniques and approximations have been developed to price such options efficiently, very little attention has been paid to the key challenge of choosing the most appropriate volatility parameters, calibrated to vanilla derivatives.
Given observed implied volatility structures in each of the two legs of the spread, a so-called strike convention is required in order to make use of common approaches like Margrabe's formula and its many extensions. By means of Malliavin calculus, we construct an optimal linear strike convention for consistently pricing exchange options under stochastic volatility models. This convention allows us to minimize the difference between the model and implied correlations between the two underlying assets in the spread. Moreover, we show that this optimal convention does not depend on the specific stochastic volatility model, and can be linked to market observables instead. Numerical examples are given and demonstrate the strength of the approach in a variety of different settings.

## Mark Cummins (Dublin City University)

### Model Risk in Gas Storage Valuation: Joint Calibration-Estimation Risk Measurement

Greg Kiely, Mark Cummins and Bernard Murphy

We present a joint calibration-estimation risk measurement methodology, extending recent literature, which incorporates both market calibration and historical estimation risk within a meaningful distributional assessment of parameter risk. Extending the emerging literature on model risk issues in energy markets, we apply our technique to the problem of natural gas storage valuation, using a flexible multifactor Mean Reverting Variance Gamma model specification that is *both* forward curve consistent and calibrated to market-traded options. Realistic models of the natural gas forward curve cannot be calibrated to benchmark instruments alone, due to the lack of a liquid time-spread options market, so the correlation structure is typically estimated from historical data. We additionally devise an accessible model selection technique based on our distributional assessment of parameter risk. For a basic one-year 20in/20out storage contract, we show that the parameter risk of our two-factor Mean Reverting Variance Gamma model is higher than that of single-factor Mean Reverting Variance Gamma and Mean Reverting Jump-Diffusion benchmarks, with very different distributional characteristics. Formally pricing the parameter risk shows the model-based bid-ask spread to be over five times that of the benchmarks. The greater flexibility of the two-factor Mean Reverting Variance Gamma model in capturing more extrinsic value therefore comes at the cost of greater uncertainty. Our novel model selection technique shows, however, that this increased uncertainty is bearable, and we conclude that the two-factor Mean Reverting Variance Gamma model is an acceptable choice over its one-factor counterpart.
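The Kalman-filter calibration step described in the Borovkova-Ladokhin abstract can be sketched for a generic linear-Gaussian state-space model in which log futures prices are affine in two latent factors (level and convenience yield). The loadings, dynamics and noise levels below are illustrative placeholders, not the authors' specification:

```python
import numpy as np

# Minimal linear-Gaussian Kalman filter sketch (not the authors' exact model):
# latent state x_t = [level, convenience yield]; observations y_t = log futures
# prices at several maturities, assumed affine in x_t: y_t = d + Z x_t + noise.

rng = np.random.default_rng(0)

taus = np.array([0.25, 0.5, 1.0, 2.0])       # maturities (years), illustrative
Z = np.column_stack([np.ones(4), -taus])     # loadings: level minus yield * tau
d = np.zeros(4)

A = np.diag([1.0, 0.95])                     # state transition (AR(1) yield)
Q = np.diag([0.02**2, 0.05**2])              # state noise covariance
H = 0.01**2 * np.eye(4)                      # observation noise covariance

def kalman_filter(y):
    """Return filtered state means for an (n_obs x 4) observation array y."""
    x = np.zeros(2)              # prior mean
    P = np.eye(2)                # prior covariance
    out = []
    for yt in y:
        # predict
        x = A @ x
        P = A @ P @ A.T + Q
        # update
        S = Z @ P @ Z.T + H
        K = P @ Z.T @ np.linalg.inv(S)
        x = x + K @ (yt - d - Z @ x)
        P = (np.eye(2) - K @ Z) @ P
        out.append(x.copy())
    return np.array(out)

# simulate a path of the model and filter it back out
x_true = np.zeros(2)
ys = []
for _ in range(200):
    x_true = A @ x_true + rng.multivariate_normal(np.zeros(2), Q)
    ys.append(d + Z @ x_true + rng.multivariate_normal(np.zeros(4), H))
xs = kalman_filter(np.array(ys))
print(xs[-1])   # filtered [level, convenience yield] on the last date
```

In a calibration the same recursion also yields the prediction-error likelihood, which is then maximised over the model parameters.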

## Bruno Dupire (Bloomberg LP)

### Special Techniques for Special Events

Bruno Dupire, Head of Quant Research, Bloomberg L.P.

Most models assume a tame and homogeneous behavior of the underlying security price, but many situations call for specific treatment. For instance, earnings announcements for stocks, acquisitions, currencies pegged to a level or constrained to remain in a corridor, elections, referendums and FDA approvals for pharmaceutical companies provide cases that depart from the classical modeling paradigm. We review a number of such examples and study in detail the cases of Brexit and of the US and French elections. We clarify the links between risk-neutral densities and implied volatility skews, show how to compute the excess variance that the market attributes to special events, and propose a way to infer a bi-modal jump compatible with the risk-neutral densities for maturities just before and after the event. This process involves inverting a convolution kernel by regularization.

## Ernst Eberlein (Technical University of Munich)

### Multiple Curve Interest Rate Modelling Allowing For Negative Rates

Ernst Eberlein, University of Freiburg

A multiple curve forward process as well as a multiple curve forward rate model is developed. In both approaches, time-inhomogeneous Lévy processes are used as drivers. Negative interest rates are taken into account in a natural way. We derive valuation formulas for standard interest rate products such as caps, floors, swaptions and digital interest rate options. A number of calibration results is presented, where we also consider data in the setting of a two-price economy, thus exploiting explicitly bid and ask prices. This is joint work with Christoph Gerhart and Zorana Grbac.

## Paul Edge (EDP)
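The "excess variance" that the market attributes to a special event, mentioned in the Dupire abstract, can be backed out from total implied variances bracketing the event date, since implied variance is additive over disjoint periods. A minimal sketch with purely illustrative numbers:

```python
import math

# Back out the variance a market attributes to a scheduled event (e.g. an
# election) from implied vols expiring just before and just after the event.
# Total implied variance sigma^2 * T is additive over disjoint periods, so
#   event_var = sigma_after^2 * T_after - sigma_before^2 * T_before.

def event_variance(sigma_before, t_before, sigma_after, t_after):
    return sigma_after**2 * t_after - sigma_before**2 * t_before

def event_vol(sigma_before, t_before, sigma_after, t_after, horizon=1.0):
    """Annualised vol equivalent of the event variance over `horizon` years."""
    v = event_variance(sigma_before, t_before, sigma_after, t_after)
    return math.sqrt(max(v, 0.0) / horizon)

# maturities one week before vs one day after a referendum, say:
print(event_variance(0.10, 30 / 365, 0.18, 31 / 365))
```

A markedly positive value signals that the market concentrates extra variance on the event date, which is the starting point for inferring the bi-modal jump discussed in the talk.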

## Matthias Ehrhardt (Bergische Universität Wuppertal)

### High-Order Methods for Parabolic Equations in Multiple Space Dimensions for Option Pricing Problems

Christian Hendricks, Matthias Ehrhardt and Michael Günther
## Peter Forsyth (University of Waterloo)

### Monotone Schemes for Two Factor HJB Equations with Nonzero Correlation

Peter Forsyth, Cheriton School of Computer Science, University of Waterloo, Canada

In the case of two or more stochastic factors with a nonzero correlation between the factors, it is a non-trivial matter to construct a monotone discretization scheme for HJB equations. In this talk, we discuss the use of a wide-stencil method based on a local virtual grid rotation. Special care must be taken to ensure consistency near the boundaries of the computational domain. We use unconditionally stable, fully implicit timestepping. In order to generate accurate efficient frontiers, it is crucial to modify the usual linear interpolation technique which is needed for semi-Lagrangian timestepping. The nonlinear discretized algebraic equations are solved using a policy iteration algorithm. Numerical results are presented for an uncertain volatility option pricing problem and for asset allocation with stochastic volatility. We also demonstrate the use of a hybrid PDE-Monte Carlo approach.

## Karel in 't Hout (University of Antwerp)

### ADI schemes for valuing European-style options under the Bates model

Karel in 't Hout (U Antwerp, Belgium) and Jari Toivanen (Stanford U, USA, and U Jyväskylä, Finland)

In this talk we consider the adaptation of Alternating Direction Implicit (ADI) schemes for the numerical solution of the Bates partial integro-differential equation (PIDE) modelling the values of European-style options. ADI schemes, such as the Modified Craig-Sneyd scheme and the Hundsdorfer-Verwer scheme, are a prominent tool in computational finance for the numerical solution of multi-dimensional time-dependent partial differential equations (PDEs). We discuss the incorporation of the integral term stemming from the jump part in the underlying asset price model by Bates (1996). We study the stability and convergence of the obtained ADI-type schemes and present ample numerical experiments.
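The directional-splitting idea behind ADI schemes can be illustrated on the plain two-dimensional heat equation. This is a minimal Peaceman-Rachford sketch, not the Modified Craig-Sneyd or Hundsdorfer-Verwer schemes of the talk, and the Bates PIDE's drift, correlation and jump-integral terms are all omitted:

```python
import numpy as np

# Peaceman-Rachford ADI step for the 2-D heat equation u_t = u_xx + u_yy on
# the unit square with zero Dirichlet boundaries: each half-step is implicit
# in one direction only, so only tridiagonal (here: small dense) solves arise.

n = 49                      # interior grid points per direction
h = 1.0 / (n + 1)
dt = 1e-3
r = dt / (2 * h**2)

# 1-D second-difference matrix (Dirichlet boundaries)
L = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
M = np.eye(n) - r * L       # implicit operator, factorised once in practice

def adi_step(U):
    # half step implicit in x, explicit in y
    U_half = np.linalg.solve(M, U + r * U @ L.T)
    # half step implicit in y, explicit in x
    return np.linalg.solve(M, (U_half + r * L @ U_half).T).T

x = np.linspace(h, 1 - h, n)
U = np.outer(np.sin(np.pi * x), np.sin(np.pi * x))   # smooth initial datum
for _ in range(100):
    U = adi_step(U)
print(np.abs(U).max())      # heat dissipates: maximum decays toward zero
```

For this eigenfunction initial datum the decay rate can be checked against the exact factor exp(-2 pi^2 t), which is what makes the toy problem a convenient convergence test.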

## Andrey Itkin (New York University)

### Filling the gaps smoothly

Andrey Itkin, Tandon School of Engineering, NYU

Calibration of a local volatility model to a given set of option prices is a classical problem of mathematical finance. It has been considered in multiple papers where various solutions were proposed. In this paper, an extension of the approach proposed in Lipton and Sepp (2011) is developed by i) replacing a piecewise constant local variance construction with a piecewise linear one, and ii) allowing non-zero interest rates and dividend yields. Our approach remains analytically tractable: it combines the Laplace transform in time with an analytical solution of the resulting spatial equations in terms of Kummer's degenerate hypergeometric functions. It also preserves no-arbitrage.

## Yuri Kabanov (Université de Franche-Comté)

### Clearing in Financial Networks

Yuri Kabanov, Université Bourgogne Franche-Comté; Federal Research Center Informatics and Control; Lomonosov Moscow State University

Clearing of a financial system, i.e. of a network of interconnected banks, is a procedure of simultaneously repaying debts to reduce their total volume. The vector whose components are the repayments of each bank is called the clearing vector. In simple models considered by Eisenberg and Noe (2001) and, independently, by Suzuki (2002), it was shown that clearing to the minimal value of debts according to natural rules can be formulated as a fixed-point problem. The existence of its solutions, i.e. of clearing vectors, is rather straightforward and can be obtained by a direct reference to the Knaster-Tarski or Brouwer theorems. The uniqueness of clearing vectors is a more delicate problem, which was solved by Eisenberg and Noe using the graph structure of the financial network. We discuss the current state of the art of the theory and, in particular, algorithmic aspects of solving clearing equations in relation to those arising in the theory of optimal stopping.

## Juho Kanniainen (Tampere University of Technology)

### Big Data Problems and Techniques in Finance

Juho Kanniainen

People in modern societies are leaving behind a vast amount of data that can be analysed and exploited in new and unprecedented ways to understand and model financial markets for better risk management. The data sources include social media and news services, which are heterogeneous and unstructured, and electronic financial markets, which generate terabytes of structured ultra-high-frequency limit order book data each day. The resulting datasets are so large and complex that such "Big Data" is becoming difficult to process with current data management tools and methods. At the same time, this data could provide valuable information to validate financial strategies, manage risks, and make decisions. Big Data holds great promise for discovering subtle patterns and heterogeneities that cannot be detected with small-scale data; on the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, which require a new computational and statistical paradigm. To exploit this potential, banks and other financial institutions must be able to handle and process massive heterogeneous data sets in a fast and robust manner. Risk management, especially, can no longer rely on purely traditional approaches, which often make unrealistic assumptions for the sake of solvable theoretical models rather than agreement with empirical data. Nonetheless, according to the European Commission's communication, Europe is still at an early stage of adopting Big Data technologies and services compared to the USA. At the same time, financial research has been quite slow to embrace the data revolution. This talk elaborates on the opportunities and challenges of using data science methods and large data sets in finance-related industries and research from the following perspectives: How can financial big data be processed and analyzed? How can investor behavior be better understood with complex network techniques? Could information on news events be integrated into financial econometric methods? What are potential Big Data applications in finance and insurance?
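The clearing-vector fixed-point problem described in the Kabanov abstract can be solved by iterating the Eisenberg-Noe map from full payment downwards. The three-bank liability matrix and cash endowments below are a made-up toy example:

```python
import numpy as np

# Eisenberg-Noe style clearing: L[i, j] is the nominal liability of bank i to
# bank j, e is outside cash. Payments are pro rata and limited by resources.

L = np.array([[0.0, 2.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
e = np.array([0.2, 0.1, 0.1])

p_bar = L.sum(axis=1)                          # total nominal obligations
Pi = np.where(p_bar[:, None] > 0, L / p_bar[:, None], 0.0)  # relative liabilities

def clearing_vector(Pi, p_bar, e, tol=1e-12, max_iter=10_000):
    """Iterate phi(p) = min(p_bar, e + Pi^T p); monotone by Knaster-Tarski."""
    p = p_bar.copy()                           # start from full payment
    for _ in range(max_iter):
        p_new = np.minimum(p_bar, e + Pi.T @ p)
        if np.max(np.abs(p_new - p)) < tol:
            return p_new
        p = p_new
    return p

p_star = clearing_vector(Pi, p_bar, e)
print(p_star)   # bank 1 pays in full; banks 0 and 2 default partially
```

Starting the iteration from `p_bar` yields the greatest clearing vector; starting from zero yields the least one, and uniqueness questions arise exactly when the two differ.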

## Choi-Hong Lai (University of Greenwich)

### A realistic hedging strategy at discrete temporal steps

C.-H. Lai

Option pricing models [1] generally require the assumption that stock prices are described by continuous-time stochastic processes. Although time-continuous trading is easy to conceive theoretically, it is practically impossible to execute in real markets. One reason is that real markets are not perfectly liquid, and purchasing or selling any amount of an asset would change the asset price drastically. A realistic hedging strategy needs to consider trading that happens at discrete instants of time. This presentation focuses on the impact of temporal discretisation on the prices of bonds and options on bonds. Numerical examples will be used to demonstrate the impact of the hedging frequency on such prices. The study is limited to one-factor interest rate models and the prices of interest rate derivatives. The technique relies on a Taylor expansion and certain fundamental elements of probability theory to derive closed-form solutions for the prices of zero-coupon bonds and European call options on bonds. An application to Vasicek-type dynamics [2] is included, with numerical simulations validating the method.

[1] F. Black and M. Scholes (1973). The Pricing of Options and Corporate Liabilities. Journal of Political Economy, 81, 637-654.
[2] O. Vasicek (1977). An equilibrium characterization of the term structure. Journal of Financial Economics, 5, 177-188.

## Attilio Meucci (ARPM)

### FinTech Education: Leveraging Technology to Communicate and Understand Quantitative Finance

Attilio Meucci

We illustrate innovative e-learning concepts that have been implemented in the ARPM Lab:

- Say it: true math, as opposed to verbose descriptions
- See it: (voiced-over) simulations, as opposed to lecture recordings
- Do it: interactive cloud-based computing for on-the-fly replication
- Frame it: cross-linking; multi-media interconnectivity; spoke-to-hub architecture
- Share it: social media link landing

## Marek Musiela (Oxford-Man Institute)

### Arbitrage, Information and Investment

Marek Musiela

Intuitively, information should play an important role in investment. A better 'informed' portfolio manager should, in principle, make better investment decisions than one who is less 'informed'. One would also think that there should be a minimal amount of information a portfolio manager should use; otherwise one could rely on a random choice. In this talk, we assume that the minimal information (filtration) on which portfolio decisions are based is the filtration generated by the prices of assets. However, portfolio managers may use many other sources of information, not only the prices. For example, they may use analysts' recommendations for stocks, sector-specific 'non-tradable' information, market indexes, non-farm payroll figures, GDP growth, etc. That is why it is quite natural to assume that the filtration they use is larger than the filtration generated by the prices. In the first part of this talk, we analyse an example with two sources of information on which decisions are based. The first, say S, describes the evolution of the stock price, which follows a binomial process and hence can only move up or down. The second, say Y, is also binomial and represents information that in principle cannot be traded directly. There is no restriction on the way the two sources of information are linked. In the second part, we consider more general models of price evolution, where the filtration used by the better 'informed' manager is significantly larger than the one generated by the prices. We consider only discrete-time models; we explain, however, links with the well-developed continuous-time literature.
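The closed-form zero-coupon bond prices under Vasicek dynamics, which serve as the benchmark in the Lai abstract, take the standard affine form P = A(tau) exp(-B(tau) r). The parameter values in this sketch are illustrative only:

```python
import math

# Closed-form Vasicek zero-coupon bond price, the kind of benchmark against
# which a discrete-hedging scheme can be checked.
# Short-rate dynamics: dr = kappa * (theta - r) dt + sigma dW.

def vasicek_zcb(r, tau, kappa, theta, sigma):
    """Price at time t of a bond paying 1 at time t + tau."""
    B = (1.0 - math.exp(-kappa * tau)) / kappa
    A = math.exp((theta - sigma**2 / (2 * kappa**2)) * (B - tau)
                 - sigma**2 * B**2 / (4 * kappa))
    return A * math.exp(-B * r)

print(vasicek_zcb(r=0.03, tau=5.0, kappa=0.5, theta=0.04, sigma=0.01))
```

As tau goes to zero the price tends to 1, which is a convenient sanity check on any implementation.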

## Cornelis Oosterlee (Delft University of Technology)

## Olivier Pironneau (Université Pierre et Marie Curie)

### The Parareal Algorithm for American Options

Olivier Pironneau

For risk assessment, banks have to compute a very large number of American options every day. GPU cards provide a cheap road to parallel computation, with a reduction of CPU time of at least an order of magnitude. However, algorithms for American contracts are quite difficult to parallelize. This talk provides a description of the parareal method applied to American contracts, with the objective of parallel computing. We decompose the time interval to maturity (0,T) into subintervals, each allocated to one processor; on each subinterval the Longstaff-Schwartz Monte Carlo method is used. Matching is done by a multi-level algorithm. A proof of convergence will be given, and a numerical section will assess the performance. Comparisons with other numerical methods will also be given. This work is in cooperation with Gilles Pagès and is part of the PhD thesis of Guillaume Sall.

## Christoph Reisinger (University of Oxford)

### Calibration of Local-Stochastic and Path-Dependent Volatility Models to Vanilla and No-Touch Options

Christoph Reisinger

We begin by reviewing the state of the art of the calibration of volatility models to vanilla and barrier-type option prices. We then propose a novel and generic calibration framework for both vanilla and barrier options for a large class of continuous semimartingale models. The method builds upon a forward partial integro-differential equation (PIDE) which allows computation of up-and-out call prices for the complete set of strikes, barriers and maturities, using a Markovian projection of the variance process. Our work is a direct extension of the calibration literature on vanilla options with the Dupire partial differential equation (PDE). We derive a necessary and sufficient condition for exact calibration to up-and-out call options and provide a step-by-step procedure to calibrate a local maximum volatility model (LMV), a local maximum stochastic volatility model (LMSV), and a new Heston-type local stochastic volatility model with local vol-of-vol (LSV-LVV). While the LMV model is calibrated directly with the aforementioned forward PIDE, both the LMSV and LSV-LVV calibrations use a two-dimensional particle method to estimate conditional expectations in the calibration condition. We present numerical results for these models on a set of EURUSD market data for vanilla and no-touch options. Finally, we extend the main Markovian projection formula to handle stochastic rates and discuss how the algorithms can be adapted.
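The Longstaff-Schwartz method used on each subinterval of the parareal decomposition in the Pironneau abstract can be sketched for a Bermudan-style put under Black-Scholes. The parameters and the quadratic regression basis are illustrative choices, not a reconstruction of the talk's implementation:

```python
import numpy as np

# Longstaff-Schwartz Monte Carlo for an American (Bermudan) put under
# Black-Scholes: regress realised discounted cashflows on the current stock
# price to estimate the continuation value, and exercise where intrinsic
# value exceeds it.

rng = np.random.default_rng(1)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n_steps, n_paths = 50, 20_000
dt = T / n_steps
disc = np.exp(-r * dt)

# simulate geometric Brownian motion paths
z = rng.standard_normal((n_paths, n_steps))
logret = (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
S = np.hstack([np.full((n_paths, 1), S0), S0 * np.exp(np.cumsum(logret, axis=1))])

payoff = lambda s: np.maximum(K - s, 0.0)
V = payoff(S[:, -1])                        # cashflow at maturity
for t in range(n_steps - 1, 0, -1):
    V *= disc                               # discount one step back
    itm = payoff(S[:, t]) > 0               # regress on in-the-money paths only
    x = S[itm, t]
    coef = np.polyfit(x, V[itm], 2)         # quadratic basis, as in the original paper
    cont = np.polyval(coef, x)              # estimated continuation value
    V[itm] = np.where(payoff(x) > cont, payoff(x), V[itm])
price = disc * V.mean()
print(round(price, 3))   # American put estimate (European value is about 5.57)
```

In the parareal setting, one such backward sweep runs on each time subinterval, and the coarse/fine matching iteration glues the subinterval solutions together.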

## Daniel Ševčovič (Comenius University)

### Riccati transformation method for solving the Hamilton-Jacobi-Bellman equation

Daniel Ševčovič, Comenius University in Bratislava, Slovakia

In this talk we present recent results on the application of the Riccati transformation to solving the evolutionary Hamilton-Jacobi-Bellman equation arising from the stochastic dynamic optimal allocation problem. It turns out that the fully nonlinear Hamilton-Jacobi-Bellman equation governing the evolution of the value function can be transformed into a quasi-linear parabolic equation, whose diffusion function is obtained as the value function of a certain parametric convex optimization problem. We will show that points of discontinuity of this value function can be identified with transitions and changes in the optimal portfolio composition. We prove the existence of classical solutions to the HJB equation. A numerical solution is then constructed by means of an implicit iterative finite volume approximation scheme. As an application, we present results of computing optimal strategies for a portfolio investment problem on the German DAX stock index. This is joint work with S. Kilianová.

## Qin Sheng (Baylor University)

### Why or Why Not? Adaptive Splitting Interactions for Multiphysics and Stochastic Volatilities

Qin Sheng, Department of Mathematics, Center for Astrophysics, Space Physics and Engineering Research, Baylor University, Texas, USA

Partial differential equations of different types are playing an increasingly important role in modeling numerous natural processes, including laser and ocean waves, fuel combustion and injection, automatic control, financial derivative transactions and options exchanges. In situations where singularities, nonlinearities, multiple dimensions or stochastic influences are present, the efficiency and effectiveness of existing computational procedures are challenged. Splitting and adaptive methods become appealing options due to their great simplicity in algorithmic structure and flexibility in applications. This presentation consists of three interactive components. First, we review traditional splitting concepts of the East and the West, as well as their exponential splitting modernizations. Second, we introduce the latest adaptive strategies for robotic computations targeting singular partial differential equations. Third, we explore solid theory and methods of adaptive splitting for realistic problems from industry involving stochastic influences or high-frequency waves. A stochastic Black-Scholes model will be discussed. Computational experiments will be given to illustrate our investigations and conclusions.

## Albert Shiryaev (Steklov Mathematical Institute)

### Optimal stopping times for the detection of financial drift bubbles

Albert Shiryaev

We consider Bachelier and Black-Scholes models of prices in which the drifts change value (increasing/decreasing) at a random time, which we assume to be uniformly distributed on an interval [a,b]. Our aim is to find stopping times that are close to the time of the disorder in the drifts of prices.
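The splitting concepts surveyed in the Sheng abstract can be illustrated on a linear system with non-commuting operators, where the first-order (Lie) and second-order (Strang) product formulas exhibit their expected convergence rates. The matrices here are a minimal made-up example, standing in for the one-dimensional pieces of a split multi-dimensional PDE:

```python
import numpy as np

# Operator splitting on u' = (A + B) u with non-commuting A, B:
#   Lie:    exp(A dt) exp(B dt)                -> first-order accurate
#   Strang: exp(A dt/2) exp(B dt) exp(A dt/2)  -> second-order accurate

def expm(M, terms=40):
    """Matrix exponential by truncated Taylor series (fine for small norms)."""
    out, term = np.eye(len(M)), np.eye(len(M))
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])   # note: A and B do not commute
u0 = np.array([1.0, 0.5])
t = 1.0

def split_solve(n, strang):
    dt = t / n
    u = u0.copy()
    for _ in range(n):
        if strang:
            u = expm(A * dt / 2) @ expm(B * dt) @ expm(A * dt / 2) @ u
        else:
            u = expm(A * dt) @ expm(B * dt) @ u
    return u

exact = expm((A + B) * t) @ u0
for n in (10, 20):
    lie = np.linalg.norm(split_solve(n, False) - exact)
    strang = np.linalg.norm(split_solve(n, True) - exact)
    print(n, lie, strang)   # halving dt roughly halves the Lie error, quarters Strang's
```

The same bookkeeping carries over to PDE splittings, with the matrix exponentials replaced by one-dimensional implicit solves.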

## Steven Shreve (Carnegie Mellon University)

### A Diffusion Model for Limit-Order Book Evolution

Steven E. Shreve, Carnegie Mellon University

With the movement away from the trading floor to electronic exchanges, and the accompanying substantial increase in the volume of order submission, has come the need for tractable mathematical models of the evolution of the limit-order book. The problem is inherently high-dimensional, and any realistic description of order flows must have them depend on the state of the limit-order book. Poisson process models for the evolution of the limit-order book have been proposed, but their analysis is either difficult or impossible. In this talk, we show how diffusion scaling of a simple Poisson model, inspired by queueing theory, can lead to a rich yet tractable diffusion model for the evolution of the limit-order book. We then show how to compute the probability of up and down price moves and the time between price changes in this model. This is joint work with Chris Almost, John Lehoczky and Xiaofeng Yu.

## Nizar Touzi (Ecole Polytechnique)

### Continuous-time Principal-Agent problem: a Stackelberg stochastic differential game

Nizar Touzi, Ecole Polytechnique, France

We provide a systematic method for solving general Principal-Agent problems with possibly infinite horizon. Our main result reduces such Stackelberg stochastic differential games to a standard stochastic control problem, which may be addressed by the standard tools of control theory. Our proofs rely on the backward stochastic differential equations approach to non-Markovian stochastic control and, more specifically, on the recent extensions to the second-order case. The infinite-horizon setting requires an extension of second-order BSDEs to the random-horizon setting.

## Carlos Vázquez (Universidade da Coruña)

### PDE models for optimal investment under uncertainty with environmental effects

Carlos Vázquez

In this work, the authors propose efficient numerical methods to solve mathematical models for different optimal investment problems with irreversible environmental effects. A relevant point is that both the benefits of the environment and those of the alternative project are uncertain. The cases of instantaneous and of progressive transformation of the environment are addressed. In the first case, an Augmented Lagrangian Active Set (ALAS) algorithm combined with finite element methods is proposed as a more efficient technique for the numerical solution of the obstacle problem associated with a degenerate elliptic PDE. In the second case, the mathematical model can be split into two subsequent steps: first, we solve numerically a set of parameter-dependent boundary value problems (the parameter being the level of progressive transformation); secondly, an evolutionary nonstandard obstacle problem is discretized, leading to an obstacle problem at each time step. Numerical solutions are validated against qualitative properties theoretically proven in the literature for different examples. This is joint work with Iñigo Arregui (University of A Coruña).
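The obstacle problems in the Vázquez abstract can be illustrated with projected SOR on a one-dimensional toy version; the ALAS algorithm of the talk is a different, more efficient method, and the tent-shaped obstacle and grid below are made up:

```python
import numpy as np

# Projected SOR for a 1-D obstacle problem: find u >= psi with -u'' >= 0,
# u(0) = u(1) = 0, and complementarity (-u'') (u - psi) = 0. The solution is
# the smallest concave function above the obstacle, a "tent" touching psi
# only at its peak.

n = 99                                   # interior points
h = 1.0 / (n + 1)
x = (np.arange(n) + 1) * h
psi = 0.2 - np.abs(x - 0.5)              # tent-shaped obstacle
u = np.maximum(psi, 0.0)                 # feasible starting guess
omega = 1.9                              # SOR relaxation parameter (0 < omega < 2)

for _ in range(5000):
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        gs = 0.5 * (left + right)        # Gauss-Seidel value for -u'' = 0
        u[i] = max(psi[i], (1 - omega) * u[i] + omega * gs)

print(u[n // 2])   # about 0.2 at x = 0.5, the single contact point
```

Projecting onto the constraint after each relaxation step is Cryer's classical method; active-set approaches such as ALAS instead update the contact region as a whole and typically converge in far fewer iterations.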

## Michael Wickens (University of York)

### Is asset-pricing pure data-mining? If so, what happened to theory?

Michael Wickens, University of York and Cardiff Business School

Whether one considers the pricing of bonds, equity or FOREX, theoretical asset-pricing models perform poorly. As a result, the finance industry relies on data-mining: statistical models chosen to fit the data, often with little theoretical justification. This is illustrated with popular examples of the pricing of each of these assets, including the CAPM, affine factor pricing models of bonds, and UIP. Is this because the theoretical models are poor, or because investors don't understand, and hence don't employ, the theories?