 Model Refactoring
 General Approach
 Market Assumptions
 Safety Nets
 Steps
 Results
 Bear Market Case Study
 Conclusion
This is my first blog post on a method in quantitative investing: an investigation of the capital asset pricing model (CAPM), a widely used equation that estimates an asset's expected rate of return given its beta.
This is also the algorithm I have been forward-testing since December 2018.
A visualization of the backtesting results of employing a strategy using CAPM will be provided at the end of this post.
In later posts, I will also talk about the arbitrage pricing theory, Fama-MacBeth regression, and various other techniques employed by hedge funds, such as statistical arbitrage.
Model refactoring
The traditional equation can be represented as follows:
$r_i = r_f + \beta_i(r_m - r_f)$

$r_i =$ Expected return on asset $i$
$r_f =$ Risk-free rate of return
$r_m =$ Market return
$\beta_i =$ Beta of asset $i$
For the statistician and quant, however, it is more practical to refactor the previous equation into the following form for least-squares regression analysis:
$r_i - r_f = \alpha_i + \beta_i(r_m - r_f) + \epsilon$

In this case, $\alpha_i$ is the intercept, $\beta_i$ is the slope, and $\epsilon$ is the residual (error) term. A graph visualization can be seen as follows:
With this data, we can move onto establishing our approach.
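As a concrete example, the least-squares fit above can be sketched with NumPy. The function name and inputs here are my own illustration, not code from the actual algorithm:

```python
import numpy as np

def estimate_alpha_beta(asset_returns, market_returns, risk_free=0.0):
    """Fit (r_i - r_f) = alpha + beta * (r_m - r_f) + eps by ordinary least squares."""
    y = np.asarray(asset_returns, dtype=float) - risk_free
    x = np.asarray(market_returns, dtype=float) - risk_free
    # Design matrix: a column of ones (whose coefficient is alpha, the
    # intercept) and the excess market return (whose coefficient is beta)
    X = np.column_stack([np.ones_like(x), x])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs[0], coeffs[1]  # (alpha, beta)
```

With exactly linear data such as `a = 0.002 + 1.2 * m`, the fit recovers alpha of 0.002 and beta of 1.2.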
General approach
Now that the general premises are established, we can move onto creating a strategy. Thinking on the broader scale, we can apply a strategy to the Dow Jones Industrial Average with some modifications to stock selection.
Based on observable trends, we notice that for the market in general, stocks that beat the market tend to continue beating it over prolonged periods of time.*
At the same time, learning to pare down risk before or during recessions is also important. As a result, our strategy will have the following guidelines:

Optimize returns on directional alpha over minimizing beta

Pick the top 5 stocks based on alpha from our regression analysis out of the DOW 30
 Regression is based on the past 15 trading days (3 weeks)


Pare down risk during recession
 Invest in bonds and gold (TLT and GLD)
 Reduce risk by decreasing holdings
 Hedge with long puts

Monthly executions
 Avoid large commission fees
 Less important to compete with HFT and hedge funds for intraday price execution
 Only a maximum of 5 trades a month, reducing commission fees

Selecting stocks outside of DOW
 New tech conglomerates not included in DOW (AMZN …)
 Defense companies, because the government will always be there to buy weapons (Raytheon, Lockheed Martin, etc.)
 Potential growth companies with strong fundamental valuations (TWLO, SHOP, TTD, etc.), though these were not included in the algorithm
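The selection rule above, ranking names by 15-day regression alpha and keeping the top 5, could be sketched like this. The helper name and data layout are assumptions of mine, not the actual algorithm's code:

```python
import numpy as np

def top_alpha_tickers(returns_by_ticker, market_returns, n=5, lookback=15):
    """Rank tickers by the intercept (alpha) of a least-squares fit against
    the market over the trailing `lookback` days, and keep the top `n`."""
    x = np.asarray(market_returns, dtype=float)[-lookback:]
    X = np.column_stack([np.ones_like(x), x])
    alphas = {}
    for ticker, rets in returns_by_ticker.items():
        y = np.asarray(rets, dtype=float)[-lookback:]
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        alphas[ticker] = coeffs[0]  # intercept = alpha
    return sorted(alphas, key=alphas.get, reverse=True)[:n]
```

In practice the dictionary would hold the Dow 30 (plus the extra names above), keyed by ticker.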
Market assumptions
With this strategy, one can reasonably assume that during a bull market the CAPM model will outperform the index by using monthly turnover to automatically rotate into high-alpha stocks. Taxes and commissions could be issues, depending on the broker and whether one uses an LLC to handle profits.
As momentum and trend-following strategies suggest, the underlying assumption is that stocks that beat the market will continue to beat the market. Mean reversion is just the opposite: if a stock price goes up, one should expect it to come back down. Both can be right, depending on the time frame.
Our strategy leans more heavily on trend-following than momentum, as we look at price action over fundamentals. In this case, a bull market heavily favors us, but we still need a plan to reduce drawdown risk during recessions.
Safety nets
Momentum strategies that focus solely on alpha without accounting for market volatility are bound to fail spectacularly during recessions, and especially during depressions.
Expect more than a 50% drawdown in account value if there are no checks for a recession (i.e. no hedging, no reduction of holdings, etc.). Of course, not everyone can time a recession exactly, but there are ways to make sure at least some safety nets are in place.
In the strategy implemented, we only look at moving averages to determine whether we may be entering a bear market. The weakness of this signal is a one-day market panic, which a moving average is too slow to catch, ultimately resulting in a failure to minimize recession risk.
Here are a few potential data points for detecting a recession:

Strength of the economy

Obtaining raw data to measure important metrics for determining the health of the economy
 Text mining, web scraping, NLP for news articles
 Inversion of the yield curve
 The debt cycle
 …


Simple technicals
 Price trading below a rolling 75+ day EMA/SMA (we will be using this)
 Finding potential tops in the market
 Meanvariance analysis
 Volume analysis
 …
For this strategy, being able to adjust the model and holdings based on the state of the economy is quite important to reduce drawdown and increase our Sharpe ratio.
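As a sketch of the simple technical filter we use, the rolling 75-day moving-average check might look like the following. The function name and inputs are illustrative assumptions:

```python
import numpy as np

def below_rolling_mean(prices, window=75):
    """True when the latest close is below the simple moving average of the
    last `window` closes (our bear-market criterion)."""
    prices = np.asarray(prices, dtype=float)
    if len(prices) < window:
        raise ValueError("need at least `window` prices")
    return prices[-1] < prices[-window:].mean()
```

A steady uptrend keeps the last close above its trailing mean (signal off), while a steady downtrend pulls it below (signal on).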
Steps
I won’t be publishing the algorithm here, but the general idea goes something like this:
Initialization / Data Preparation (at beginning):

Obtain
 Historical data
 Extract

Filter
 Provide a list of tickers
 Create a filtered list
Rebalancing / Scheduled Functions (low-frequency, monthly):

Analyze
 Regression
 Machine Learning
 Normalization

Select

Greeks
 Alpha, Beta, Pi, Theta, etc.

Optimization
 Grid search, random search, simulated annealing, RHC, genetic algorithms


Execute
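Putting the scheduled steps together, one possible shape for the monthly rebalance is sketched below. Everything here is my own illustration under stated assumptions: the 70/30 TLT/GLD defensive split in particular is a placeholder, not the actual algorithm's allocation:

```python
import numpy as np

def monthly_rebalance(returns_by_ticker, market_returns, risk_off,
                      n=5, lookback=15):
    """One scheduled run: return a dict of target portfolio weights.

    risk_off: True when the bear criterion (e.g. price below the rolling
    75-day SMA) is met, in which case we rotate into TLT and GLD.
    """
    if risk_off:
        return {"TLT": 0.7, "GLD": 0.3}  # illustrative defensive split
    # Analyze: regress each ticker's trailing returns against the market
    x = np.asarray(market_returns, dtype=float)[-lookback:]
    X = np.column_stack([np.ones_like(x), x])
    alphas = {}
    for ticker, rets in returns_by_ticker.items():
        coeffs, *_ = np.linalg.lstsq(
            X, np.asarray(rets, dtype=float)[-lookback:], rcond=None)
        alphas[ticker] = coeffs[0]
    # Select: keep the top-n alpha names, equal-weighted
    top = sorted(alphas, key=alphas.get, reverse=True)[:n]
    return {t: 1.0 / len(top) for t in top}
```

The Execute step would then submit orders to move the account from its current weights to these targets.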
Results
I executed this strategy on the QuantConnect platform and obtained the following results:
The returns are remarkable, but here’s the catch: this is just a backtest. To be truly confident that our algorithm works as we would like, there are still many factors to consider.
 Do not commit significant capital to an algorithm until its performance is consistent across the timeframes it is tested on.
 Test across different market periods (our algorithm did not perform too well from 2014 to 2016).
 Avoid tuning parameters to overfit the noise in the data.
 …
Not to mention, the maximum drawdown in this case was around 39%. This is pretty substantial: imagine if, during the next recession, 70m becomes 42m in the span of a month. It wouldn’t feel so good.
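For reference, maximum drawdown is the largest peak-to-trough decline of the equity curve as a fraction of the running peak; a quick sketch:

```python
import numpy as np

def max_drawdown(equity_curve):
    """Largest peak-to-trough decline, as a fraction of the running peak."""
    equity = np.asarray(equity_curve, dtype=float)
    running_peak = np.maximum.accumulate(equity)  # highest value seen so far
    return ((running_peak - equity) / running_peak).max()
```

The 70m-to-42m scenario above corresponds to `max_drawdown([70, 42])`, i.e. a 40% drawdown.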
Check out the trades here if you’d like. I will present a visualization of the trades at some point; if you would like to provide one for me, I’d gladly love to hear more about it.
Bear market case study
Luckily for us, since the algorithm switches to treasuries in the case of a recession, in the 2008 recession we were able to gain back the money lost and make more by capitalizing on the bear market and buying stocks at a discount, though this came only after months of painful losses.
A plausible solution would be to:
 Dynamically allocate holdings based on economic health

Instead of monthly executions, we check on the same day whether the bear criteria are met.
 Tradeoff: being unable to tell the difference between corrections and recessions, and higher trading commissions.
In my recent forward test during the near bear market we just experienced, I was able to achieve a 50% return from December 2018 to May 2019, mainly because the algorithm switched to TLT in November and essentially held it until February, achieving a 10% return over this period.
The algorithm was then able to buy stocks at a discount as well, which is precisely why we use a rolling 75-day mean to trigger our treasury-transfer or equity-holding signals. The defense stocks also played a large role, as they tend to have higher alphas than other stocks when the market is relatively weak.
Conclusion
There are still many ways we can improve the model and make better-informed decisions about how to play a recession. We could also develop a deep learning model with hyperparameters that dynamically allocates our holdings across specific stocks and drives stock selection.
I plan to use this strategy in my individual account and will post updates on how it performs in the next bear, sideways, or bull market. The 37% annualized return is too good to be true because of the biases listed above; I will already be delighted if it can average even 13% a year.