On Friday, I used Bollinger Bands: Why 20 Days? as an excuse to begin my “evangelical crusade against rampant defaultism.” I did this mostly so I could make up a word (or so I thought, apparently others beat me to it), but also to warn against the tendency to favor the default settings in charting software in lieu of rigorously exploring the alternatives.
Using Bollinger Bands as my guinea pig indicator, on Friday I explored what happens when you tweak the default setting of 20 days. The other important Bollinger Band setting, of course, is the standard deviation multiplier. In Bollinger on Bollinger Bands, John Bollinger cites tests he conducted on various stocks, indices, currency pairs, and commodities, the results of which led him to conclude that a 2.0 standard deviation setting was quite robust across a variety of asset classes and time frames. Bollinger went on to recommend decreasing the multiplier to 1.9 for a 10-day window and increasing it to 2.1 when the time frame is extended to 50 days. The implication is that modifying the number of days is more likely to yield additional insight than adjusting the standard deviation setting.
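For readers who want to experiment along these lines, here is a minimal sketch of the standard Bollinger Band calculation with both the look-back window and the standard deviation multiplier exposed as parameters. The function name and the use of the population standard deviation over a simple moving average are my assumptions for illustration; the (window, multiplier) pairings in the comment reflect Bollinger's 1.9/10, 2.0/20, and 2.1/50 guidance mentioned above.

```python
from statistics import mean, pstdev

def bollinger_bands(prices, window=20, num_std=2.0):
    """Return (middle, upper, lower) band lists for a price series.

    Classic definition: middle band is a simple moving average; the
    upper/lower bands sit num_std standard deviations above/below it.
    Bollinger's suggested pairings: (10, 1.9), (20, 2.0), (50, 2.1).
    Entries before a full window are None.
    """
    middle, upper, lower = [], [], []
    for i in range(len(prices)):
        if i + 1 < window:
            middle.append(None)
            upper.append(None)
            lower.append(None)
            continue
        win = prices[i + 1 - window : i + 1]
        m = mean(win)                 # simple moving average
        s = pstdev(win)               # population standard deviation
        middle.append(m)
        upper.append(m + num_std * s)
        lower.append(m - num_std * s)
    return middle, upper, lower
```

Swapping `num_std` between, say, 1.5 and 2.5 on the same price series is all it takes to reproduce the kind of comparison discussed below.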
Undeterred, I have had a lot of fun experimenting with the standard deviation setting for VIX charts and a number of other charts. My takeaway: even if you do not discover any particularly revealing new information, experimenting with alternatives to the defaults gives you a better understanding of how the indicator works in default mode and why the bands behave the way they do.
In the VIX charts below, I have included the 2.5 standard deviation setting on the top, the default 2.0 standard deviation setting in the middle, and the 1.5 standard deviation setting on the bottom. As the graphics show, the standard deviation setting can be fine-tuned to adjust the frequency of signals generated by the indicator in a given time period. The exact settings should be largely a function of one’s preferred trading time horizon.
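The relationship between the multiplier and signal frequency can be checked numerically. The sketch below counts how often a close lands outside the bands at 1.5, 2.0, and 2.5 standard deviations; note the assumptions, which are mine and not from the charts above: "signal" simply means a close beyond a band, and the price series is a synthetic random walk standing in for the VIX.

```python
import random
from statistics import mean, pstdev

random.seed(7)
# Synthetic random-walk "closes" standing in for the VIX (illustration only).
prices = [20.0]
for _ in range(500):
    prices.append(max(5.0, prices[-1] + random.gauss(0, 0.8)))

def breach_count(prices, window=20, num_std=2.0):
    """Count closes outside the upper or lower Bollinger Band."""
    breaches = 0
    for i in range(window - 1, len(prices)):
        win = prices[i + 1 - window : i + 1]
        m, s = mean(win), pstdev(win)
        if prices[i] > m + num_std * s or prices[i] < m - num_std * s:
            breaches += 1
    return breaches

for k in (1.5, 2.0, 2.5):
    print(f"{k} std devs: {breach_count(prices, num_std=k)} band breaches")
```

By construction, widening the bands can only remove breaches, never add them, so the counts fall as the multiplier rises; that is the lever a trader pulls when matching the indicator to a preferred time horizon.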