Mean Reversion and ROC – PJ Sutherland

One of the key parts here, in my view, is the magnitude of the price change in the short term, and you can measure that in a number of ways. One way I measure it is quite simply with the rate of change. The optimal parameter I found works really well is just four days, but anything from 1 to 10 days, even 20 days, should work. Over a 4-day period, a relatively short period of time, we're looking to measure how far a stock has moved from its close 4 days ago. If, for instance, a stock has lost 5 percent in 4 days, that's pretty steep, but it's not nearly as steep as a stock that has lost 20 percent in 4 days. A stock that has lost 20 percent is much more likely to reverse, and so it is the magnitude, in my view, that determines (a) the success of a mean reversion trade, and (b) the likely payoff.

Now, the difficulty with rate of change, of course, is that it's unbounded, so a super-volatile stock will have very different rate-of-change readings over a 4-day period than one that's less volatile. The same price series compared to itself will also vary: in a bear market, because of the high volatility, rate-of-change readings will be a lot more extreme than in bull markets. So one thing we need to do is normalize the rate of change, so we get a reading that means precisely the same thing to us in any environment and regardless of the stock we choose. There are any number of ways to do it. I do it in a proprietary fashion, but one simple way would be to compare the existing reading to the values prior to it. You could, for instance, look back over the last 100 days, 200 days, 300 days, or whatever the case may be, and see where the current reading ranks relative to that history. That would be one way to normalize it, but there are obviously other ways, and we use quite a different technique.
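The simple percentile-rank normalization described above can be sketched as follows. Note this is a minimal illustration, not the author's proprietary technique: the 4-day lookback and 100-day ranking window are example parameters taken from the text.

```python
def rate_of_change(closes, period=4):
    """Percent change of each close versus the close `period` bars earlier."""
    return [
        (closes[i] - closes[i - period]) / closes[i - period] * 100.0
        for i in range(period, len(closes))
    ]

def percentile_rank(values, window=100):
    """Rank each reading against the prior `window` readings, on a 0-100 scale."""
    ranks = []
    for i in range(window, len(values)):
        history = values[i - window:i]
        below = sum(1 for v in history if v <= values[i])
        ranks.append(100.0 * below / window)
    return ranks
```

A rank near 0 marks an unusually sharp decline for that particular stock in that particular environment, which is the point of the normalization: the same number means the same thing across instruments and regimes.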
But once I started including and focusing on the magnitude of price change, that’s really where I started to see phenomenal test results

I found that focusing exclusively on the long side when we're in a bull market is quite an effective approach; on the short side, I'm quite happy to short in any environment.

Just coming back to your previous question, one of the things I do in the more subjective approach that I run in our lighter version is what I call 'big picture analysis'; I've found it very effective to monitor the trends in the global environment. Quite simply, I assess the 37 exchanges in the world and apply a 200-day moving average to each, and then I look at the percentage of those indices trading above or below it. If more than 70 percent are above the 200-day moving average, that's a great bull environment, and we're looking to be aggressive on the long side with mean reversion; when it deteriorates, we would obviously take contrary positions.

Then, specifically in terms of risk preference, I run an indicator I've developed that I call the synthetic volatility index. One of the reasons I developed it is that I use the VIX Index quite a bit in our US trading for all sorts of things (position sizing, regime change), and I found it to be very, very effective. The problem with the VIX, of course, is that you can't easily apply it to any exchange in the world; it's quite a complex instrument made up of options, etc. So I came up with an indicator that, when applied to the S&P 500, is correlated with the VIX above 0.9, I think about 0.92 to 0.93, so strongly, strongly correlated with the VIX, which means we can now apply it to any index in the world. And the indicator is actually very, very simple: I take, if I recall, the 1-day true range divided by the close and average that out over 20 days, and that reading is correlated with the VIX. Then, to determine whether volatility is rising or falling, I apply an additional moving average to that reading. If the first reading is above the moving average, we're in a rising-volatility environment, and vice versa.
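A sketch of the synthetic volatility index as described, with the 20-day averaging window from the text and an illustrative 10-day smoothing window for the rising/falling test. The author says "if I recall," so treat the exact construction as approximate.

```python
def true_range(high, low, prev_close):
    """Standard 1-day true range: the widest of the three gap measures."""
    return max(high - low, abs(high - prev_close), abs(low - prev_close))

def synthetic_vix(highs, lows, closes, window=20):
    """Rolling 20-day average of (1-day true range / close), per the text."""
    tr_pct = [
        true_range(highs[i], lows[i], closes[i - 1]) / closes[i]
        for i in range(1, len(closes))
    ]
    return [
        sum(tr_pct[i - window:i]) / window
        for i in range(window, len(tr_pct) + 1)
    ]

def volatility_rising(svix, smooth=10):
    """True when the latest reading sits above its own moving average."""
    if len(svix) < smooth:
        return False
    return svix[-1] > sum(svix[-smooth:]) / smooth
```

Because this only needs high, low, and close series, it can be computed for any index in the world, which is exactly the advantage over the options-based VIX that the author describes.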
In the top-down approach that I apply in the light version, we apply this across those 37 exchanges. Obviously, if I've got 70 percent of exchanges in bull runs, I'm hoping to see that coupled with low volatility. The minute I see that these two indicators are not moving in lockstep, that naturally becomes a concern.

But we did find that our strongest returns come in uncertain environments. The key reason is the inefficiencies that exist because mean reversion is driven by human emotion: if a stock is moving $0.01 a day, everyone is quite happy, but when a stock is moving $10 a day, we start to see a lot of emotion. That's where traders tend to push price to temporary, unsustainable extremes, creating the opportunities we look to exploit in uncertain environments, which would also explain why we see great returns in bear markets.

When we are switching environments, one trick is naturally to tighten up the parameters. Going into a bear market, you might require readings below 5, making it really, really tight, whereas in a bull run you would perhaps consider readings below 50. But in essence, the emotional barometers are based on rate of change.
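Using the example cutoffs from the text (below 5 in bears, below 50 in bulls), the regime-dependent entry filter might look like this. The function name and the assumption that the normalized rate of change is a 0-100 percentile-style reading are illustrative.

```python
def long_entry_signal(normalized_roc, bull_market):
    """Enter long on weakness; demand a far more extreme reading in bear markets.

    `normalized_roc` is assumed to be a 0-100 percentile-style reading,
    where low values mean an unusually sharp short-term decline.
    """
    threshold = 50 if bull_market else 5
    return normalized_roc < threshold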

I started to iterate across what I call the mean reversion curve, since all the strategies along it are tremendously robust, and I had entries across that entire entry curve and exits on the exit side. Instead of allocating to a single set of parameters, we've gone and built an algorithm that does it automatically for clients in QuantLab: you literally click a button and it puts together 30 or 40 strategies that are diversified across the entire curve. Instead of making what would otherwise be a 100 percent allocation to a single parameter combination, each might be 5 percent, 3 percent, whatever the case may be. Since we've deployed that, we've seen some really phenomenal performance. We're now achieving average results through time that are much less susceptible to the effects of luck; in fact, if you do it correctly with a large enough account, you almost completely eliminate luck. One of the big payoffs is that you start to see performance that very closely resembles that of a backtest.
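The diversification across the curve can be sketched as a simple parameter grid with equal weights. The grid values and the equal weighting are assumptions for illustration; the text does not describe how QuantLab actually constructs or weights its 30-40 strategies.

```python
def build_parameter_grid(entry_thresholds, exit_thresholds):
    """All entry/exit threshold combinations, each given an equal capital weight."""
    combos = [
        {"entry": e, "exit": x}
        for e in entry_thresholds
        for x in exit_thresholds
    ]
    weight = 1.0 / len(combos)
    for combo in combos:
        combo["weight"] = weight
    return combos
```

For example, six entry thresholds crossed with six exit thresholds yields 36 strategies at roughly 2.8 percent of capital each, in the spirit of the "30 or 40 strategies" at a few percent apiece described above; the averaging across the grid is what damps the luck of any single parameter choice.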

