The VIX, Lehman Brothers, and Forecasting
Jeff Kearns and Michael Tsang of Bloomberg have an article out today (VIX Fails to Forecast S&P 500 Drop, Loses Followers) in which they contend that, largely because the VIX failed to predict the October losses in the S&P 500 index (SPX), the VIX is no longer considered an accurate gauge of future market activity.
One of Kearns and Tsang's central claims about the VIX's lack of effectiveness is as follows:
“On Sept. 11, less than a week before New York-based Lehman Brothers Holdings Inc. went bankrupt and four days after the government takeovers of Washington-based Fannie Mae and McLean, Virginia-based Freddie Mac, the VIX closed at 24.39. That meant traders bet the S&P 500 wouldn’t fluctuate more than 24.39 percent on an annualized basis, or about 7 percent in the next 30 days, and implied a range for the index of 1,161.11 to 1,336.99.
One month later, on Oct. 10, the S&P 500 closed at 899.22, or a record 23 percent lower than what the VIX predicted.”
As fellow blogger Don Fishback was quick to point out, the VIX calculation actually estimates one standard deviation of expected 30-day volatility in SPX options, expressed on an annualized basis. That standard deviation, of course, is meant to capture 68.3% of a Gaussian or normal distribution of prices. In fact, the distribution of VIX prices does not follow a normal distribution, but even if it did, the movements from September 11th to October 10th would not be that statistically improbable.
Some quick comments on the math involved here. Using a VIX of 24.39, converting that annualized volatility of 24.39% to a 30-day horizon yields a 30-day volatility of 7.04%. Two standard deviations translate into a 30-day move of +/- 14.08% and should cover about 95.5% of the normal distribution (I am assuming a normal distribution for the sake of mathematical simplicity). Three standard deviations increase the range of expected volatility to +/- 21.12% and should capture about 99.7% of the normal distribution. In fact, the 99.9% boundary for the normal distribution is 3.29 standard deviations and translates into an expected SPX move of 23.16%, about on par with what transpired. An event which falls outside of the 99.9% probability boundary happens about once every 1,000 instances. Rare indeed, but not unfathomable.
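For anyone who wants to check the numbers, here is a minimal Python sketch of the arithmetic above. The normal distribution and the sqrt(21/252) scaling are simplifying assumptions, and the SPX figure is just the midpoint of the range quoted by Bloomberg:

```python
# A minimal sketch of the arithmetic above, assuming a normal distribution and
# the common sqrt(21/252) scaling from annualized to ~30-day volatility.
import math

vix = 24.39      # annualized implied volatility on Sept. 11, in percent
spx = 1249.05    # midpoint of the implied range quoted by Bloomberg

vol_30d = vix * math.sqrt(21 / 252)   # ~7.04%
print(f"1 SD 30-day move: +/-{vol_30d:.2f}%")

for sd, coverage in [(1, 68.3), (2, 95.5), (3, 99.7), (3.29, 99.9)]:
    move = sd * vol_30d
    lo, hi = spx * (1 - move / 100), spx * (1 + move / 100)
    print(f"{sd} SD ({coverage}%): +/-{move:.2f}%  ->  {lo:,.2f} to {hi:,.2f}")
```

Running this reproduces the 1,161 to 1,337 one standard deviation range from the Bloomberg excerpt and the 23.16% move at the 3.29 standard deviation boundary.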
Of course, there are many ways to use the VIX to aid in market timing. Consider that the nature of the VIX calculation is such that the VIX acts as an unbounded oscillator whose values are derived from prices paid for options on the SPX. As with most oscillators, traders tend to treat extreme values as opportunities to bet on a reversion to the mean.
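As a hypothetical illustration of that mean-reversion idea (the 10-day lookback and 10% threshold below are arbitrary choices for the sketch, not a rule from this post), a trader might flag days when the VIX closes well above its trailing moving average:

```python
# Illustrative only: flag VIX closes that are "stretched" relative to their
# trailing simple moving average. Lookback and threshold are assumptions.
def stretched_days(closes, lookback=10, threshold=0.10):
    """Return the indexes where the close exceeds its trailing SMA by threshold."""
    flagged = []
    for i in range(lookback, len(closes)):
        sma = sum(closes[i - lookback:i]) / lookback
        if closes[i] > sma * (1 + threshold):
            flagged.append(i)
    return flagged
```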
A good deal of the difficulty in understanding the movements of the VIX is that the dominant patterns differ across time horizons. In the short term, the VIX commonly spikes and mean reverts. In the intermediate term, the VIX frequently establishes strong trends; and in the long term, it has a tendency to move in cycles of 2-4 years. For the month of September and most of the month of October, the VIX was in an uncharacteristically strong sustained uptrend.
Part of the reason for the sharp move in the VIX during September and October is that it is highly dependent upon macroeconomic and fundamental events that help to shape investor perceptions of uncertainty, risk and fear. In the week leading up to the Lehman Brothers bankruptcy, for instance, very few investors believed that the government was prepared to let Lehman fail. Additionally, in retrospect it seems as if those who did believe failure was an option did not comprehend the nature of the systemic reverberations that a Lehman bankruptcy would trigger.
The bottom line is that during the second week in September, the VIX was pricing in a very low probability of a Lehman Brothers bankruptcy. Perhaps more important, investors were also significantly underestimating the systemic threat posed by the dominoes a Lehman bankruptcy would topple.
Ultimately, it took a full six weeks of a steadily trending VIX for the market to fully price in the global systemic risks associated with the sequence of events that began with the Lehman Brothers bankruptcy.
Consider that the prices and implied volatilities of SPX options have to account not just for the probabilities associated with various future scenarios, but also for the magnitude of the impact of those scenarios on the stock market. For this reason, even while some of the probabilities may not have varied significantly from day to day during September and October, as the magnitude of the financial crisis was slowly revealed, the VIX continued to ratchet higher, and investors reacted to a rising VIX with increasing alarm.
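As a toy illustration of that probability-times-magnitude arithmetic (every scenario name and number below is invented for the sketch):

```python
# Toy illustration: option prices must weight each scenario by both its
# probability and its market impact. All scenarios and numbers are invented.
scenarios = {
    "orderly resolution":   (0.60, 0.05),  # (probability, |30-day SPX move|)
    "single large failure": (0.30, 0.15),
    "systemic cascade":     (0.10, 0.40),
}
expected_move = sum(p * m for p, m in scenarios.values())
print(f"probability-weighted expected move: {expected_move:.1%}")  # 11.5%

# Hold the probabilities fixed but double the cascade's magnitude, and the
# weighted move jumps, mirroring how the VIX kept ratcheting higher as the
# scale of the crisis was revealed.
scenarios["systemic cascade"] = (0.10, 0.80)
expected_move = sum(p * m for p, m in scenarios.values())
print(f"after magnitude revision: {expected_move:.1%}")  # 15.5%
```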
Getting back to the question posed by Kearns and Tsang, yes, the VIX underestimated future volatility back on September 11th. At that time, the prediction of a four-year flood was consistent with mainstream thinking. Very few observers anticipated the SPX falling below 800 by Thanksgiving.
As the year winds down, I will have more about what we learned about volatility in 2008 and what some of the implications are for 2009 and beyond.
8 comments:
I am pleased that there is this sceptical view. Fewer people understanding what to do with implied volatility and fewer making an attempt to understand it suits me fine.
Thanks for the insightful article! Forgive my math since I'm still learning, but help me understand something: since you are measuring 30-day volatility, shouldn't the unit for your 1-in-1,000 event be months instead of days? So instead of once every 4 years it should be once every 83 years? And if you are using rolling 30-day windows, then surely the market moved more than 3.29 standard deviations in the ensuing days after Sep 11? (i.e., T+1 to T+31, T+2 to T+32, etc.) If so, then this would imply that this event occurred more than once in every 4 years. TIA for the reply!
Douglas,
Sometimes I tell myself to trade more and blog less...
Anon,
Actually, I think you nailed the math on both counts (unit of time and rolling window instead of end-to-end calculation), so I went ahead and edited my post accordingly. My calculations didn't pass the sniff test, but I ignored that warning sign, even though these types of errors tend to happen when I rush to get something up in a short time frame.
Many of the other volatility measures I follow ended up concluding that September-October was a once-in-75-to-100-years event. In 2008, the VXO did not reach the same level that it did in 1987, but clearly recent events were not a once-every-four-years type of event.
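For concreteness, the corrected back-of-the-envelope conversion looks something like this, assuming ~252 trading days per year and non-overlapping 30-day windows:

```python
# Back-of-the-envelope: a 1-in-1,000 event measured over 30-day windows
# should be converted using non-overlapping windows, not trading days.
trading_days_per_year = 252
window = 21                                        # ~30 calendar days of trading
windows_per_year = trading_days_per_year / window  # 12 per year
print(1000 / windows_per_year)                     # ~83 years, not ~4
```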
Finally, I probably should have put more emphasis on the fact that the VIX does not approximate a Gaussian distribution, so it is dangerous to talk about standard deviations related to a Gaussian distribution when referring to the VIX.
Obviously I'm going to need to dig deeper into this after the holidays.
Cheers,
-Bill
nice post
Hi guys.
I am looking for a site where I can find at-the-money 30-day implied vols on the Nikkei 225 and Hang Seng indexes.
Essentially, I'm looking for a number I can observe for those Asian markets that is comparable to the VIX.
Can anyone help or make suggestions?
Anon,
The only Asian volatility index I am aware of is the India VIX.
For Japan and Hong Kong, you can come up with a reasonable approximation using the implied volatility of options on the EWJ and EWH ETFs. It's not perfect, but it might be close enough for your needs.
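If you need to back the implied volatility out of the option quotes yourself, something like the bisection sketch below will do it. All inputs here are hypothetical placeholders, not actual EWJ or EWH quotes:

```python
# Minimal sketch: back out Black-Scholes implied volatility from an option
# quote by bisection. Plug in ~30-day at-the-money quotes in practice.
import math

def bs_call(spot, strike, t, rate, vol):
    """Black-Scholes price of a European call (no dividends)."""
    d1 = (math.log(spot / strike) + (rate + 0.5 * vol ** 2) * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    cdf = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    return spot * cdf(d1) - strike * math.exp(-rate * t) * cdf(d2)

def implied_vol(price, spot, strike, t, rate, lo=0.01, hi=3.0):
    """Bisect for the vol that reproduces the quoted price."""
    for _ in range(60):  # call price is monotonic in vol, so bisection converges
        mid = (lo + hi) / 2
        if bs_call(spot, strike, t, rate, mid) < price:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical quote: a 30-day at-the-money call on a $10 ETF trading at $0.45.
print(f"{implied_vol(0.45, 10.0, 10.0, 30 / 365, 0.0):.1%}")  # roughly 39%
```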
Cheers,
-Bill
The standard deviation calculations mentioned here are called "calibration" by the Wall Street "quants" and are often used for valuation and risk management purposes. However, market timing using the VIX is much more complicated than calibration. Simply applying the mathematics to market timing is naive and misleading, as seen by the fact that many of these quants fail in the real markets.
The quants fail not because of VIX distortions or the math behind it, but due to factor volatility.
Their models use backward-looking data across many dimensions to make investment decisions, so the output is only as good as the data input. Once the inputs demonstrate greater volatility, the predictive value declines and the strategy lags until the data set becomes more consistent, with smaller ranges between the upper and lower boundary of the set.
Think of what is going on with earnings estimates, the ranges top to bottom and the speed with which they are getting marked down. Or the jobs number and the range of estimates.
Given the velocity of these adjustments and the ranges within which they travel, quant guys will struggle until the data set shows less volatility and becomes more consistent and "factor vol" comes down.
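A toy sketch of that input problem, with all numbers invented: a backward-looking estimate such as a trailing mean tracks a calm regime well, then lags once the inputs shift and widen out:

```python
# Invented data: a stable regime followed by a volatile, marked-down regime.
import random
random.seed(0)

calm = [random.gauss(1.0, 0.1) for _ in range(60)]
turbulent = [random.gauss(0.5, 0.6) for _ in range(30)]
series = calm + turbulent

window = 20
for label, i in [("calm regime", 59), ("just after the shift", 69)]:
    trailing = sum(series[i - window + 1:i + 1]) / window  # backward-looking estimate
    print(f"{label}: actual={series[i]:+.2f}, trailing-{window} estimate={trailing:+.2f}")
```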