The Black-Scholes analysis underpins much of modern finance. Today I am going to take an axe to one of its primary underpinnings. The Black-Scholes analysis depends, among other things, on the idea that volatility can be represented by a single number which is independent of the time scale over which you measure it.
Suppose that we have a random walk made up of random samples from a probability distribution, plus a general tendency to drift in some direction, i.e. $dS = \mu\,dt + \sigma\,dX$. Now since $dX$ samples randomly from a distribution with some variance, we can expect that variance to be time dependent. If the variance can be represented by a single parameter, then since variances of independent samples add,

$$\mathrm{Var}(X_1 + X_2) = \mathrm{Var}(X_1) + \mathrm{Var}(X_2),$$

it's clear that for a random walk, if we sample over a time that is twice as long (i.e., add two samples from the distribution $X$ together), then we have $\sigma^2(2\,dt) = 2\,\sigma^2(dt)$, and hence we must have $dX \propto \sqrt{dt}$, thus obtaining the scaling behind Ito's Lemma by inspection.
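This scaling is easy to check numerically. A minimal sketch, using standard normal steps to stand in for the distribution $X$, showing that summing two steps doubles the variance (so the standard deviation grows like $\sqrt{t}$):

```python
import numpy as np

rng = np.random.default_rng(42)

# Many independent random-walk steps: each is a sample from a fixed
# distribution (standard normal here, so variance 1 per step).
steps = rng.standard_normal((100_000, 2))

var_one_step = steps[:, 0].var()         # variance over one interval
var_two_steps = steps.sum(axis=1).var()  # variance over two intervals added

# Doubling the interval doubles the variance, i.e. dX ~ sqrt(dt).
print(var_two_steps / var_one_step)  # close to 2
```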
Now if the market were perfectly efficient, we would know that this must be the case, since if it is not, we can obtain an arbitrage opportunity in the following way. (Today we ignore all frictions and transaction costs.) Suppose that I have £1,000,000 in cash, and I decide to take a short position worth £1,000,000 in some index, say the FTSE100, while simultaneously taking a long position in the same index. The only difference is that I am going to rebalance the long position at the open and the close every day to be worth exactly £1,000,000, using my cash to buy or sell as needed, whereas I will rebalance the short position only every thirty days. The idea is that if volatility is greater on a day-to-day basis than on a thirty-day basis, then by buying low and selling high I can make an incremental profit while, in the main, being hedged against market moves. The open and close data for the FTSE100 is widely available. Ideally I would like to have the tick data so I could look at rebalancing on shorter timescales. Sadly that data appears to be very expensive :(.
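A minimal sketch of the bookkeeping for the two legs, run on synthetic prices (the real back-test substitutes actual FTSE100 open/close data; the drift and volatility used to generate prices here are made-up numbers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily prices standing in for the FTSE100.
n_days = 3_000
prices = 100 * np.exp(np.cumsum(rng.normal(0.0002, 0.01, n_days)))

notional = 1_000_000  # £1,000,000 per leg

# Long leg: rebalanced to exactly the notional every day, so each day
# it earns the one-day return on a fixed £1,000,000.
daily_ret = prices[1:] / prices[:-1] - 1
long_pnl = notional * daily_ret

# Short leg: rebalanced only every thirty days, so within each window
# it pays the return on the notional fixed at the window's start.
short_pnl = np.zeros_like(long_pnl)
for start in range(0, len(prices) - 1, 30):
    end = min(start + 30, len(prices) - 1)
    window_ret = prices[start + 1:end + 1] / prices[start] - 1
    # Daily P&L increments of a short held from the window start.
    increments = -notional * np.diff(np.concatenate(([0.0], window_ret)))
    short_pnl[start:end] = increments

cum_pnl = np.cumsum(long_pnl + short_pnl)
print(f"final P&L: £{cum_pnl[-1]:,.0f}")
```

The combined position profits when realised day-to-day volatility exceeds thirty-day volatility, since the daily rebalancing of the long leg systematically sells after rises and buys after falls.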
Anyway, here is the result of the strategy on the FTSE100 since 1984.
I find this graph fascinating. We are looking at the evolution of the market in action. In essence, from 2000 we see the emergence of a term structure, after which the strategy routinely makes money. Note that as we are very close to beta-neutral, this strategy can be levered up, and it would need to be, as the absolute returns are not exactly great. However, note that the volatility of the individual underlying instruments is likely to be much greater than that of the composite index, so the returns would be greater if we applied this strategy individually to each of the constituents. That would mean much more work for me to back-test it. 🙂
I suspect that the change in term structure is related to the rise of algorithmic trading. In essence, this strategy is the opposite of trend following. If you trade based on trends, you make the trends shorter and steeper and hence increase volatility. It's plausible that this results in higher volatility on short time scales compared to longer ones. An alternative explanation is that options trading has calmed longer-term volatility.
Another interesting feature of this term structure applies to options pricing. Since options are priced via Black-Scholes based on the volatility, and since measuring volatility on a monthly rather than a daily basis would give you a different measure of volatility, it will also give you a different price. This is already known in finance, where the implied volatility of longer-term options is often less than that of shorter-term options.
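To see how much the measurement horizon matters, here is the standard Black-Scholes call formula priced with two different volatilities. The specific numbers (25% annualised from daily returns, 18% from monthly returns, and the strike, rate, and expiry) are hypothetical, chosen only to illustrate the gap:

```python
import math

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    # Standard normal CDF via the error function.
    N = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

# Hypothetical: daily-measured vol annualises to 25%,
# monthly-measured vol annualises to 18%.
price_daily_vol = black_scholes_call(100, 100, 1.0, 0.03, 0.25)
price_monthly_vol = black_scholes_call(100, 100, 1.0, 0.03, 0.18)
print(round(price_daily_vol, 2), round(price_monthly_vol, 2))
```

Same option, same market, two materially different prices depending purely on the horizon over which volatility was estimated.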
Still, it's fascinating to me to watch how the evolution of the market and trading ideas has created an inefficiency which did not appear to exist in the past. Probably some quant-based hedge funds are already trading on this idea, and as it becomes more heavily traded it will disappear.