Jeff: Marc, can you tell us a little bit about your background as an analyst?
Marc: Fifteen years ago, I was an assistant on the trading desk. Traders would shout their orders to me and it was my job to execute those orders.
“Marc, buy me Intel at 95 3/8!”
“Sell Softee (Microsoft) at 3/16!”
“Where’s my fill on Cisco?! Forget it!!! Cancel! CANCEL!!!!”
After two years, I had gotten pretty good at “reading the tape” and had developed a feel for the direction of certain stocks and the market.
And over the next decade, I became skilled at reading financial statements and earned my credential as an NASD-licensed analyst for the most contrarian research firm on Wall Street. I also became an expert in technical analysis, serving as an officer for a prestigious technical analysis society on the West Coast.
Jeff: So how did you become involved in super-computing?
Marc: Some of the hedge funds that used my research were “quantitative trading” funds, meaning they used computer systems to determine which stocks were on the rise…
Many of them frustrated me greatly because they had all these resources at their fingertips and yet they still were getting it wrong.
They were so rigid in their formulas that they refused to consider outside ideas.
What I mean is, when using their expensive computer systems, they would look for obvious patterns and indicators instead of trying to uncover the unique ones that people didn’t know about.
I knew I could do better and I wanted the chance to prove it.
Jeff: Is that when you got involved with this project at The Oxford Club?
Marc: At first, I tried to go it alone.
Developing my own systems, I was able to generate a pretty decent win rate, but the gains just weren’t big enough. I knew I needed more resources to create a better system.
So when I was asked to join The Oxford Club’s secret “super-computing” project, I was awfully excited.
I was blown away by what they were working with. The platform could incorporate thousands of variables and crunch way more data than any other system I’d ever seen.
Most importantly, the system could do one thing that most others could not: analyze both fundamental and technical data. It could isolate stocks that not only had strong financial performance but were also likely to make a major move.
Fundamental analysts experimenting with the system would have no idea which technical variables could be used to improve performance and minimize risk.
Similarly, chartists would be clueless about whether earnings, sales, or cash flow growth would have an impact on their stocks.
But that’s where my training came in.
I threw myself into the project, testing out thousands of combinations of variables. I stayed up nights playing with my new toy. My wife said I was acting (and sometimes looked) like a mad scientist.
There were so many different criteria to try out. Some, I instinctively knew would work – increasing book value per share, high cash flow, insider buying.
But a few caught me by surprise.
Jeff: Can you give us some specifics?
Marc: I’ll tell you about two of them:
1) The first is one you’ve already talked about in this presentation called the “Sweet Spot.”
I was using a technical indicator at the time known as the Relative Strength Index or RSI. The RSI measures the speed and change of price movements.
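For readers who want to see the mechanics, here is a minimal sketch of the standard 14-period RSI calculation (Wilder's smoothing variant). This is the textbook formula only; the exact variant Marc's system used isn't specified in the interview.

```python
def rsi(closes, period=14):
    """Compute the Relative Strength Index for a list of closing prices.

    RSI = 100 - 100 / (1 + RS), where RS is the ratio of the average
    gain to the average loss over the lookback period.
    """
    if len(closes) < period + 1:
        raise ValueError("need at least period + 1 closing prices")

    # Split each bar-to-bar change into a gain or a loss component.
    gains, losses = [], []
    for prev, curr in zip(closes, closes[1:]):
        change = curr - prev
        gains.append(max(change, 0.0))
        losses.append(max(-change, 0.0))

    # Seed the averages with a simple mean over the first `period` bars.
    avg_gain = sum(gains[:period]) / period
    avg_loss = sum(losses[:period]) / period

    # Apply Wilder's smoothing for every subsequent bar.
    for g, l in zip(gains[period:], losses[period:]):
        avg_gain = (avg_gain * (period - 1) + g) / period
        avg_loss = (avg_loss * (period - 1) + l) / period

    if avg_loss == 0:
        return 100.0  # no losing bars: maximally "overbought"
    rs = avg_gain / avg_loss
    return 100.0 - 100.0 / (1.0 + rs)
```

A reading below 30 is conventionally labeled “oversold” and above 70 “overbought,” which is the rule of thumb Marc goes on to question.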
Most analysts will tell you to buy a stock when it’s “oversold.” But the computer helped me determine that this was in fact wrong.
Often, a stock is oversold because, quite frankly, it stinks. Investors sell it because the stock really is a dog and no one wants to own it. And they shouldn’t.
But what S.T.A.R.S. helped me see is that there was a “sweet spot” for selling activity… An area where the bad stocks separated from the good stocks. The bad stocks kept going lower while the good stocks suddenly shot back up.
By crunching the data, the computer could quickly flag which stocks were in fact strong companies trading at inexpensive valuations, giving me the chance to research them for myself.
2) The second one I call the “Magic Bullet.”
I have to admit with this second one, I discovered it totally by accident.
But sometimes mistakes result in surprising success…
The Post-it note, for instance, was the result of an attempt to make a strong adhesive. The glue was too weak for its intended purpose, but strong enough to stick small pieces of paper to other surfaces.
Silly Putty was a failed effort to make rubber during World War II.
And of course, penicillin was discovered after Sir Alexander Fleming noticed that a discarded Petri dish had mold growing on it that dissolved the bacteria around it.
My accidental discovery happened during the many hours that I spent tinkering with the system’s formulas.
While doing some refinement, I accidentally clicked on the wrong spot, asking the computer to search for a certain variable that I had never thought to include before.
And the difference in the results was staggering.
Prior to discovering the “Magic Bullet,” we ran a back-test to compare this early S.T.A.R.S. formula to the S&P 500 between April 2001 and March 2011 and found that S.T.A.R.S. outperformed the market by 584 percentage points.
Not bad right? Some people would be thrilled with that kind of outperformance.
But when I added this additional metric, the system out-gained the market by 1,568 percentage points!
Individual gains started rolling in: 325% in six months… 491% in five months… 284% in 12 months.
That’s the beauty of a tool like this. The supercomputer can help us understand what combinations of variables can produce the best results.
After spending thousands of hours working on the formula, I knew I had reached a breakthrough… which is why I’m creating a new publication to share my research with a few of our members.
Jeff: So how will all this work?