Challenge Data

Deep Hedging of an Equinox
by Natixis



Description


Dates

From Jan. 6, 2020 to Dec. 18, 2020


Challenge context

Deep Hedging of an Equinox

In this project, we construct optimal portfolios that hedge an option issued by the bank to an investment manager who wants to invest with a guarantee that his capital is maintained, and who is therefore willing to pay a premium that is as small as possible. The equinox is a long-term option that pays:

payoff = 1_{S_R \ge B} \, G + 1_{S_R < B} \, (S_T - K)^+

where

S_t is the value of the underlying asset (a stock index, for example) at time t, T is the maturity of the option, and K is typically the forward price of the underlying asset. The maturity of such an option is usually 10 years, but in this project it is reduced to 5 years to shorten the hedging horizon and therefore lower the complexity of the optimization.

Usually such an option is expensive, so we make it an autocall to lower its price. This means that at an intermediate date R < T, the bank recalls the option if and only if the value of the underlying is greater than a barrier B. If at the reset date R the value of the underlying is lower than the barrier, then the equinox becomes a European call option.

G is the rebate paid by the bank to the investor when the equinox is recalled.
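As a concrete reading of the payoff formula above, here is a minimal sketch in Python (the function name and the scalar interface are illustrative, not part of the challenge):

```python
def equinox_payoff(S_R, S_T, B, K, G):
    """Payoff of the equinox: if the underlying is at or above the
    barrier B at the reset date R, the option is recalled and pays the
    rebate G; otherwise it becomes a European call paying (S_T - K)^+."""
    if S_R >= B:
        return G
    return max(S_T - K, 0.0)
```

For instance, with S_R = 0.9, S_T = 1.2, B = K = 1 and G = 0.05, the option is not recalled and pays 0.2.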

The barrier is usually close to 100% of the spot value of the underlying at the origination date. If the underlying is above this barrier, the client receives a compensation, here 5%. Using a Black-Scholes model with a volatility of 20% for the underlying, such an option is sold at 6.1% of the notional amount, whereas an ordinary option is sold at 17%. This makes the equinox appealing: it is an interesting option for an investor to hold in a portfolio in order to capture the long-term performance of a market, especially if one thinks that the market may experience a medium-term low.

The Black-Scholes model (lognormality of the underlying process) used to value the option is a very crude approximation, for the following reasons:

  1. The Black-Scholes assumption is very crude. A more realistic model should include stochastic volatility with a stochastic correlation between the spot and the spot volatility.

  2. The dynamic hedging argument used to value the option is valid for continuous rehedging. In reality, hedging is done in discrete time. In this project we assume that it is done weekly, to make the computation feasible with limited computing resources.

  3. Transaction costs are not taken into account in the martingale-based pricing model. In reality transaction costs are unavoidable. Here we assume that they are equal to 0.01 percent of the value of the transaction.

  4. Vega hedging is needed with stochastic volatility models and is enforced by the risk control of the bank and the regulatory authorities.

These hedges are motivated by the fact that without dynamic hedging, whose goal is to reshape the risk distribution in the martingale-based approach to derivative valuation, the bank would need to post capital dedicated to bearing the risk of selling this option. Vega rehedging is assumed to happen only periodically.

Vega hedging is done by buying or selling the same European call option of maturity T and strike K.

For the sake of simplicity, vega hedging is done only at two dates: at origination of the deal (time t = 0) and at the reset date.

Vega trades are usually very expensive because of transaction costs, mainly the bid-offer spread.

This is why vega hedging is done only at discrete times.

It is important to understand that these call options come with their own delta hedging. Every week the combined position (equinox + vanilla option) is delta rehedged. These rehedges generate transaction costs.

In this project, we assume that the transaction cost associated with the vanilla options is 0.02 percent of the value of the option.

Hedging policy

In this project we assume that the risk control of the bank sets limits on the delta and vega exposures.

The delta limit is enforced weekly and the vega limit is enforced only at inception of the deal and at the reset date.

For the sake of simplicity, we assume that the risk limit for vega is 0, while we consider 3 risk limits for delta: L = 10%, L = 20% and L = 50%.

Therefore the only limit to be enforced is:

\left| \Delta_{equinox} + \Delta_{vanilla} \, n_{vegahedge} + H_{hedge} \right| < L

In this project, the participants are given the equinox delta, the vanilla delta and the vega hedge. The participants need to determine the number of underlying units H_hedge that must be traded to maximize a utility function described below.
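A minimal way to enforce the delta limit at a given date is to project a candidate H_hedge onto the admissible interval; a sketch (function and argument names are illustrative):

```python
def project_hedge(delta_equinox, delta_vanilla, n_vega_hedge, H_hedge, L):
    """Clip H_hedge so that
    |delta_equinox + delta_vanilla * n_vega_hedge + H_hedge| <= L.
    (At the boundary the constraint holds with equality; in practice
    one may clip slightly inside the limit to keep it strict.)"""
    book_delta = delta_equinox + delta_vanilla * n_vega_hedge
    lo, hi = -L - book_delta, L - book_delta
    return min(max(H_hedge, lo), hi)
```

For example, with a book delta of 0.4 and L = 0.1, any candidate hedge is projected into the interval [-0.5, -0.3].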

The risk control of the bank uses a Black-Scholes-type model to generate the deltas and vegas. The proxy for the implied volatility of the pricing model is the instantaneous volatility generated by a market model. This market model assumes a more realistic dynamics for the value of the underlying asset.

This is a major simplification that avoids any stochastic calculation, since all the necessary quantities are provided in the dataset.

The Market Model

Trajectories are generated by a stochastic model. A realistic model should have its drift controlled such that it is a martingale in the delta-neutral measure of the other assets. In the current market environment, this implies a negative drift in forward prices of 3% per year on average. This negative drift corresponds to the average dividend rate and the repo rate. The volatility in this model is stochastic, but its process does not need to be a martingale, because no tradable security pays it. It is therefore assumed here to be a mean-reverting process. Alternative models assume a term structure in it. We also assume that the instantaneous correlation between the underlying and the spot volatility is stochastic and follows a mean-reverting process.
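As an illustration of such a mean-reverting process (this is not the actual market model; the Euler scheme and all parameter values below are assumptions for the sketch):

```python
import numpy as np

def simulate_mean_reverting(x0, mean, speed, vol, n_steps, dt, rng=None):
    """Euler scheme for a mean-reverting process
    dx = speed * (mean - x) dt + vol dW,
    of the kind assumed here for the spot volatility and for the
    spot/volatility correlation (parameters are illustrative)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        x[k + 1] = (x[k] + speed * (mean - x[k]) * dt
                    + vol * np.sqrt(dt) * rng.standard_normal())
    return x
```

With vol = 0 the path decays deterministically toward the long-run mean, which is a quick sanity check on the discretization.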

Details about the model are given in the following document:

Click here to access the document

We assume that the initial value of the underlying is equal to 1. This facilitates the computations and makes life easier for the optimizers. We also provide initial values for the spot volatility and the instantaneous correlation. The negative value of the correlation reflects the frequently observed inverse relationship between the direction of the market and its volatility.

Theoretical considerations needed to solve the optimization described above

The project simulates the cost of hedging the equinox option along each trajectory (this cost is also referred to as the replication cost of the equinox):

CR = \Gamma(S_1, S_2, \dots, S_n) - \sum_{k=1}^n \delta_{k-1} (S_k - S_{k-1}) + \sum_{k=1}^n TrCost(\delta_k - \delta_{k-1}) + VegaHdg
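The replication cost along a single trajectory can be sketched as follows, assuming the hedge delta[k] is held over [t_k, t_{k+1}], transaction costs of 0.01% of the traded value, and the vega-hedge cost passed in as a precomputed constant (names are illustrative):

```python
import numpy as np

def replication_cost(payoff, S, delta, tr_cost_rate=0.0001, vega_hedge_cost=0.0):
    """Replication cost CR = payoff - hedging P&L + transaction costs
    + vega-hedge cost, for one trajectory.
    S: n+1 spot values; delta: n hedge ratios, delta[k] held on [t_k, t_{k+1}].
    tr_cost_rate = 0.01% of the traded value (assumption from the text)."""
    S = np.asarray(S, dtype=float)
    delta = np.asarray(delta, dtype=float)
    pnl = np.sum(delta * np.diff(S))              # sum_k delta_{k-1} (S_k - S_{k-1})
    trades = np.abs(np.diff(delta, prepend=0.0))  # shares traded at each date
    tc = tr_cost_rate * np.sum(trades * S[:-1])   # cost proportional to traded value
    return payoff - pnl + tc + vega_hedge_cost
```

A flat hedge on a round-trip path has zero P&L, so the cost reduces to the transaction costs plus the vega-hedge term.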

We then obtain a distribution of the hedging cost. The purpose of the project is to derive the optimal hedging strategy that satisfies the risk limits and maximizes the following utility function:

U(X) = -\frac{1}{\lambda} \log \, \mathbb{E}\left[\exp(-\lambda X)\right]

where X = -CR

and
λ relates to the trade-off between risk and return.

For the project,

λ = 1
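The expected-utility criterion above can be estimated on a sample of X = -CR values with a numerically stable log-sum-exp; a sketch with λ = 1 as in the project (the function name is illustrative):

```python
import numpy as np

def exp_utility(X, lam=1.0):
    """U(X) = -(1/lam) * log E[exp(-lam * X)], estimated on a sample.
    The max of -lam * X is factored out (log-sum-exp) to avoid overflow."""
    X = np.asarray(X, dtype=float)
    m = np.max(-lam * X)
    return -(m + np.log(np.mean(np.exp(-lam * X - m)))) / lam
```

For a constant sample the utility equals that constant, and for a risky sample it lies below the sample mean, reflecting risk aversion.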

References

The main reference for this topic is the paper: Hans Bühler, Lukas Gonon, Josef Teichmann, Ben Wood: Deep Hedging, arXiv:1802.03042.

We also advise consulting the book: Hans Föllmer, Alexander Schied: Stochastic Finance: An Introduction in Discrete Time, Walter de Gruyter, 4th edition.

A nice presentation of these matters can also be found here: SwissQuote 2018


Challenge goals

Goal of the Challenge

The goal is to determine 3 strategies associated with 3 different limits L :

  1. First case: L = 0.1

  2. Second case: L = 0.2

  3. Third case: L = 0.5

Submission

A submission consists, for each of the 3 limits and for each of the 5000 trajectories, of the optimal delta hedging strategy. For each limit, an average of the replication cost is computed. The score of your submission is the average of these 3 averages.

In case of a tie, an additional figure is computed: the expected shortfall. If two participants have the same score, the winner is the one with the lowest sum of the expected shortfalls at the 1% level.

A submission therefore consists of a CSV file made of 15000 × 260 real numbers, i.e. 5000 × 260 real numbers for each limit.
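The expected file can be assembled as follows; the stacking order of the three limits and the helper name are assumptions, to be checked against the platform's submission instructions:

```python
import numpy as np

def write_submission(strategies, path, n_traj=5000, n_steps=260):
    """strategies: dict mapping each limit L in (0.1, 0.2, 0.5) to an
    (n_traj, n_steps) array of weekly hedges. The three blocks are
    stacked into a single (3 * n_traj) x n_steps CSV (assumed order:
    all rows for L = 0.1, then L = 0.2, then L = 0.5)."""
    blocks = []
    for L in (0.1, 0.2, 0.5):
        b = np.asarray(strategies[L], dtype=float)
        if b.shape != (n_traj, n_steps):
            raise ValueError(f"strategy for L={L} has shape {b.shape}")
        blocks.append(b)
    np.savetxt(path, np.vstack(blocks), delimiter=",")
```

The shape check catches a transposed or truncated strategy array before an invalid file is produced.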

Constraints on the submission

The submission has to represent a delta hedge for the vega-hedged equinox. Therefore the following constraints should be enforced:

  1. For every trajectory, the sum of the deltas should be equal to the payoff of the equinox on that trajectory, which depends, of course, on whether the option is called.

  2. At each date, the overall delta is the sum of the delta of the option, the delta of the vega hedge and H_hedge. This sum should be less than the limit L in absolute value.


Data description

Structure of the data set

There are several datasets available. They can be found here:

Click here to access the datasets

The official dataset is X_Validation_new2.CSV.

The official dataset is a CSV file of 5000 lines. Each line includes the following information:

- 261 values of S_t: the spot price.

- 261 values of α_t: the instantaneous volatility of the underlying at time t, which also serves as the proxy for the implied volatility (cf. attached paper; not needed).

- 261 values of λ_t: the parameter of the instantaneous correlation process between the underlying and its volatility (cf. attached paper; not needed).

- 261 values of X_t: the autocall value at time t.

- 261 values of (∂X/∂S)_t: the delta of the autocall at time t.

- 261 values of (∂X/∂σ)_t: the vega of the autocall.

- 261 values of Z_t: the vanilla call option value at time t.

- 261 values of (∂Z/∂S)_t: the delta of the vanilla call option at time t.

- 261 values of (∂Z/∂σ)_t: the vega of the vanilla call option at time t.
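Given the row layout above, each line can be split into nine blocks of 261 values; a sketch (the column order and the absence of an index column are assumptions to verify against the actual file):

```python
import numpy as np

# Assumed block order within each row, matching the list above.
BLOCKS = ["S", "alpha", "lambda", "X", "dX_dS", "dX_dsigma",
          "Z", "dZ_dS", "dZ_dsigma"]

def load_dataset(path):
    """Load the CSV (one trajectory per line, 9 blocks of 261 columns)
    into a dict of (n_trajectories, 261) arrays."""
    raw = np.loadtxt(path, delimiter=",")
    return {name: raw[:, i * 261:(i + 1) * 261] for i, name in enumerate(BLOCKS)}
```

Slicing into named arrays up front avoids off-by-one column errors later in the hedging computation.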

If you run into overparametrization problems such as instabilities, it may be worthwhile to use additional trajectories to stabilize the optimization. The supplementary file contains a set of 25000 trajectories (together with the associated data described above) to help you stabilize the optimization.

The X_train dataset is the same as the X_test dataset, so training can be done with X_test alone. However, it is advisable to use the extended X_Supplementary_Training_new2.csv included in the supplementary files, in order to achieve better stability while optimizing.

The time series used in this dataset are not arbitrage-free. They are given under a historical measure instead of the delta-neutral measure. The benchmark, described below, does not generate any profit. Will you be smart enough to exploit those arbitrages and make the autocall business profitable while respecting the risk policy?


Benchmark description

Benchmark

There is a whole range of possible strategies that deliver the payoff. Among all those strategies, there is one that minimizes the expected replication cost, but its risk profile is costly according to the expected-utility measure.

This is the benchmark used for this project.

The average replication cost for the 3 limits is about 20%, which is enormous. This comes from the discretization of the hedge (which makes the replication less precise), the transaction costs, and the incorrect hedge ratios due to the use of the wrong model (Black-Scholes).

 



The challenge provider



Investment banking