Technically Speaking, May 2015


Ethics is the lead story in this month’s magazine. The MTA is adopting a large body of knowledge related to all areas of finance through a licensing agreement with the CFA Institute. All finance professionals, whether they analyze fundamental, quantitative or technical data, share common goals (finding profitable opportunities) and share a common operating environment. Given all of the commonalities, it’s not surprising we share the same ethical requirements.

This new body of knowledge will not require members of the MTA to change anything they do professionally. The original MTA Code of Ethics was comprehensive and covered all of the important standards of professional behavior. The shortcoming was a lack of case studies and examples of how to apply the Code. This licensing agreement makes all of the CFAI’s Code and Standards, developed over several decades, available to MTA members.

For CMT candidates, this licensing agreement provides clear readings which will make studying for the exam a more efficient process.

To be clear, no changes are required of MTA members to meet the requirements of the new Code and Standards. One benefit is that there are now examples of how to apply ethics in everyday situations. Another is the clear material CMT candidates will have for studying ethics, so there will be no surprises on the exam. In short, we have found a risk-free opportunity to partner with the CFA Institute, and we are excited to begin the next stage of the MTA’s growth.

We also have articles related to the tools technicians use to analyze the markets and the techniques they apply to find profitable trading opportunities. As always, we welcome your feedback on what you would like to see in future issues of Technically Speaking. Please let us know by emailing us at

Michael Carr

What's Inside...

A Code of Ethics is one of the minimum requirements of a profession. Since its founding in 1974, MTA members...

Read More


One of the advantages of the MTA’s licensing agreement with the CFA Institute is the ability to provide detailed study...

Read More


Long-time MTA member James G. Birmingham passed away on April 18,...

Read More


Editor’s note: Tucker Balch was one of the presenters at the 2014 Annual Symposium. That presentation can be viewed in...

Read More


Editor’s note: Milton Ezrati was one of the presenters at the recent Annual Symposium. His presentation included some of the...

Read More


Editor’s note: longevity is rare in the investment newsletter business. It’s certainly not unheard of but it is rare. Tom...

Read More


Editor’s note: this was originally posted at and is reprinted here with permission. Inovancetech is offering the free use...

Read More


Editor’s note: The Chameleon Oscillator was detailed in the April 23 issue of Bloomberg Brief: Technical Strategies. This article is...

Read More


We all learn the basics of technical analysis differently. Some of us read the classics and think about how those...

Read More


Editor’s note: this article was originally published in Proactive Advisor Magazine and is reprinted here with permission.

A view into...

Read More


Point and Figure (P&F) is one of the oldest tools of technical analysis. It’s been used since at least the...

Read More


A previous blog discussed Wilt Chamberlain’s unsuccessful attempt at following up on his successful basketball career with a successful IPO....

Read More


Editor’s note: This was recently published at Alvarez Quant Trading and is reprinted here with permission. A spreadsheet with the...

Read More


Editor’s note: this article was originally published by the National Association of Active Investment Managers and is reprinted here with...

Read More


Editor’s note: this was published in late April and is presented as an example of long-term technical analysis.

Two weeks ago...

Read More


Editor’s note: This was published by Intalus at their educational blog. The code for implementing this strategy in Tradesignal is...

Read More


Technical analysis has been used by traders for centuries. But it has only been taught as a for-credit course at...

Read More


In this year’s Charles H. Dow Award paper, I described how the VIX Fix indicator could be applied as a...

Read More


A Code of Ethics is one of the minimum requirements of a profession. Since its founding in 1974, MTA members have been required to adhere to a comprehensive Code of Ethics. The Code consists of a single page that defines the highest standards of ethics. Although it is brief, the Code is comprehensive and has met the needs of the Association.

If there is a shortcoming in the Code, it was that it did not include a large number of examples or provide additional study material for CMT candidates. This made ethics one of the more challenging topics to prepare for. The Code was clear and comprehensive, but without examples, candidates often felt unprepared for the ethics portion of the exams.

To address this concern, the Ethics and Standards Committee set out to develop a detailed Body of Knowledge related to ethics. In their research, they realized the CFA Institute had already completed this initiative and had an extensive Body of Knowledge that applied to all finance professionals. Rather than investing Association resources in reproducing the material the CFAI had built over a number of years, the Committee recommended that the MTA license the CFAI material. The MTA Board of Directors approved this recommendation and CMT test candidates now have access to extensive test preparation materials. The new Code and Standards will be tested beginning with the October 2015 CMT test.  All members will be required to comply with the new Code and Standards beginning June 1, 2015.

A rapid transition is possible because the spirit of the current MTA Code of Ethics is preserved along with the additional clarity of the CFAI Code and Standards. The MTA has always promoted high ethical standards and adoption of the Code and Standards won’t change that. This change brings a comprehensive list of best practices and clear examples to our members. It has long been a goal of the MTA to develop this Body of Knowledge. Licensing the content from CFAI makes this information available to members immediately.

The MTA and CFA Institute believe a single Code and Standards benefits the global financial community by setting high standards of education, integrity, and professional excellence.

The CFAI Code of Ethics can be downloaded for free. The Code is explained in detail with examples, best practices and sample questions in the Standards of Practice Handbook. The Handbook can also be downloaded for free and will be a valuable resource for members even if they are not studying for the CMT.

Craig Johnson, CFA, CMT, is President of the MTA.


Craig W. Johnson, CMT, CFA

Craig W. Johnson, CMT, CFA is a Managing Director and Senior Technical Research Analyst currently directing Piper Sandler’s technical research group. Johnson joined Piper Jaffray in 1995 as an analyst in the firm’s private client research department. He offers frequent technical commentary...


One of the advantages of the MTA’s licensing agreement with the CFA Institute is the ability to provide detailed study material for the CMT exam. Clear examples for all parts of the Code and Standards are available. This will help candidates prepare for the exam.

Below are several examples of the types of questions that will appear on the October 2015 CMT exam. They are intended to demonstrate the question format and are not actual exam questions.

1.  William is the research analyst responsible for following Company X. He uses a relative strength analysis model. This model indicates the stock should be rated a weak “hold.” During lunch, however, William overhears a financial analyst from another firm whom he respects offer a conflicting opinion, based on Elliott Wave analysis, that the stock is a strong “buy.” Upon returning to his office, William releases a strong “buy” recommendation to the public and states the recommendation is based on his standard research techniques. William:

A. Violated the Standards by failing to distinguish facts from opinions in his recommendation.
B. Violated the Standards because he did not have a reasonable and adequate basis for his recommendation.
C. Was in full compliance with the Standards.

Answer to question 1: The correct answer is B. The question relates to Standard V(A) – Diligence and Reasonable Basis. The opinion of another financial analyst is not an adequate basis for William to change his recommendation. Answer C is therefore incorrect. Answer A is incorrect because, although it is necessary to distinguish facts from opinions, the question did not present a problem related to this point. Had William stated the opinion was due to an Elliott Wave analysis, there would not be a violation.

2.  An investment management firm has been hired by ABC Corporation to work on an additional public offering for the company. The firm’s technical analysis unit now has a “sell” recommendation on ABC, but the head of the investment banking department has asked the head of the technical analysis unit to change the recommendation from “sell” to “buy.” According to the Standards, the head of the technical analysis unit would be permitted to:

A. Increase the recommendation by no more than one increment (in this case, to a “hold” recommendation).
B. Place the company on a restricted list and give only factual information about the company.
C. Assign a new analyst to decide if the stock deserves a higher rating.

Answer to question 2: The correct answer is B. This question relates to Standard I(B) – Independence and Objectivity. When asked to change a recommendation on a company stock to gain business for the firm, the head of the technical analysis unit must refuse in order to maintain objectivity in making recommendations.  Answer A is incorrect because changing the recommendation in any manner that is contrary to the analyst’s opinion violates the duty to maintain independence and objectivity. Answer C is incorrect because assigning a new analyst will not eliminate the conflict of interest.

3. Which one of the following actions will help to ensure the fair treatment of brokerage firm clients when a new investment recommendation is made?

A. Informing all people in the firm in advance that a recommendation is to be disseminated.
B. Distributing recommendations to institutional clients prior to individual accounts.
C. Minimizing the time between the decision and the dissemination of a recommendation.

Answer to question 3: The correct answer is C. This question relates to Standard III(B) – Fair Dealing. The steps listed in C will help ensure the fair treatment of all clients. Answer A may have negative effects on the fair treatment of clients because the more people who know about a pending change, the greater the chance someone will inform clients before the official release. Answer B discriminates against some clients based on the size and class of their accounts and is a violation of the Standard.

4. Which of the following is a correct statement of a member’s duty under the Code and Standards?

A. In the absence of specific applicable law or other regulatory requirements, the Code and Standards govern the member’s actions.
B. A member is required to comply only with applicable local laws, rules, regulations, or customs, even though the Code and Standards may impose a higher degree of responsibility or a higher duty on the member.
C. A member who trades securities in a securities market where no applicable local laws or stock exchange rules regulate the use of material nonpublic information may take investment action based on material nonpublic information.

Answer to question 4: The correct answer is A. This question relates to Standard I(A) – Knowledge of the Law.  Members who practice in multiple jurisdictions might be subject to various laws and regulations. If the applicable law is stricter than the Code and Standards, members must adhere to the law; otherwise, members must adhere to the Code and Standards. Therefore, answer A is correct. Answer B is incorrect because members must adhere to the higher standard set by the Code and Standards if local laws are less strict. Answer C is incorrect because when no applicable law exists, members are required to adhere to the Code and Standards and the Code and Standards prohibit the use of material nonpublic information.

Watch for more ethics quizzes and questions that can be used to study for the new CMT exam in future issues of Technically Speaking.



Long-time MTA member James G. Birmingham passed away on April 18, 2015 at the age of 84. Jim had been a member of the MTA since 1974.

Two of the defining characteristics of Jim’s life were his love of his family and the markets. He spent 59 years working in the markets, yet his resume can be condensed into just three lines. Jim served in the Navy, worked briefly for a broker and then managed money through a hedge fund. Just three positions over six decades.

Jim was born in Boston on June 30, 1931 and graduated from Boston College in 1952.  From 1953 until 1956, he served as a Lieutenant JG in the United States Navy during the Korean Conflict. After leaving the Navy, Jim started as a broker in Boston with Proctor Cook and Company. As he recalled years later:

“I was a partner at a member firm called Proctor Cook at that time and we were kept alive by the fixed commission rules. We cleared through E.F. Hutton, and thus had the exclusive right to offer “The Granville Letter” in the Boston area.

We ran a small ad in the Boston Globe, and were overwhelmed with requests for Joe’s letter. Four brokers suddenly had almost 1000 leads. And they continued to pour in as long as Joe was hot. Institutions sent a runner over each morning to get the letter. Suddenly, we were “Institutional” brokers. We were experts because we knew how to use one of those revolutionary copying machines. The firm’s profits skyrocketed. And then, old Joe hit 50, his streak ended, and we had no more leads. So, cash in hand, I left to start a Hedge Fund, financed to a large extent by some of those leads that Joe led us to.”

Jim managed private investment partnerships until shortly before his death. He returned funds to his investors earlier this year and passed away unexpectedly shortly after that.

The fact that Jim managed these partnerships for decades demonstrates his success. Although he attended MTA Symposiums and Chapter meetings for many years, Jim did not discuss his performance which was described as “outstanding” by sources familiar with his work. It is believed he suffered just one down year in his career and he attributed the loss to trying something different.

Jim focused his energy on consistently finding winning investments. He understood fundamentals and technicals and took large positions when he knew he had an edge. He found success on both long trades (for example, Apple in recent years) and short trades (Broadcom in the 2000 bear market being one example).

One difference between Jim Birmingham and other money managers is that Jim understood what he wanted to do. He did not advertise his services or spend a great deal of time raising money, and he rejected opportunities to manage larger hedge funds. He was happy with where he was in the business and in life. His is an example of a life lived well: doing what he loved, being satisfied to do what he loved and being surrounded by those he loved.

Jim is survived by his wife, Carolyn, and son, Stephen.



Editor’s note: Tucker Balch was one of the presenters at the 2014 Annual Symposium. That presentation can be viewed in the Symposium archives. This was originally posted at The Augmented Trader blog.

“I’ve never seen a bad backtest” — Dimitris Melas, head of research at MSCI.

About backtests

A backtest is a simulation of a trading strategy used to evaluate how effective the strategy might have been if it had been traded historically. Backtesting is used by hedge funds and other researchers to test strategies before real capital is applied. Backtests are valuable because they enable quants to quickly test and reject trading strategy ideas.

All too often strategies look great in simulation but fail to live up to their promise in live trading. There are a number of reasons for these failures, some of which are beyond the control of a quant developer. But other failures are caused by common, insidious mistakes.

An overly optimistic backtest can cause a lot of pain. I’d like to help you avoid that pain by sharing nine of the most common pitfalls in trading strategy development and testing that can result in such backtests:

1. In-sample backtesting

Many strategies require refinement, or model training of some sort. As one example, a regression-based model that seeks to predict future prices might use recent data to build the model. It is perfectly fine to build a model in that manner, but it is not OK to test the model over that same time period. Such models are doomed to succeed.

Don’t trust them.

Solution: Best practices are to build procedures to prevent testing over the same data you train over. As a simple example you might use data from 2007 to train your model, but test over 2008-forward.

By the way, even though it could be called “out-of-sample” testing it is not a good practice to train over later data, say 2014, then test over earlier data, say 2008-2013. This may permit various forms of lookahead bias.
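The split can be enforced mechanically in the backtest harness. Below is a minimal Python sketch of the practice; the dates and observations are hypothetical:

```python
from datetime import date

# Hypothetical daily observations: (date, indicator_value, next_day_return)
observations = [
    (date(2007, 1, 15), 0.2, 0.011),
    (date(2007, 6, 1), -0.1, -0.020),
    (date(2008, 3, 10), 0.4, 0.030),
    (date(2009, 9, 21), 0.1, 0.004),
]

split = date(2008, 1, 1)

# Train only on data strictly before the split date...
train = [obs for obs in observations if obs[0] < split]
# ...and test only on data from the split date forward, never the reverse.
test = [obs for obs in observations if obs[0] >= split]

assert not set(train) & set(test)  # the two sets never overlap
```

Keeping the split date in one place, rather than slicing data ad hoc in each experiment, makes it much harder to accidentally train and test on the same period.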

2. Using survivor-biased data

Suppose I told you I have created a fantastic new blood pressure medicine, and that I had tested it using the following protocol:

  1. Randomly select 500 subjects
  2. Administer my drug to them every day for 5 years
  3. Measure their blood pressure each day

At the beginning of the study the average blood pressure of the participants was 160/110, at the end of the study the average BP was 120/80 (significantly lower and better).

Those look like great results, no? What if I told you that 58 of the subjects died during the study? Maybe it was the ones with the high blood pressure that died! This is clearly not an accurate study because it focused on the statistics of survivors at the end of the study.

This same sort of bias is present in backtests that use later lists of stocks (perhaps members of the S&P 500) as the basis for historical evaluations over earlier periods. A common example is to use the current S&P 500 as the universe of stocks for testing a strategy.

Why is this bad? See the two figures below for illustrative examples.

The green lines show historical performance of stocks that were members of the S&P 500 in 2012. Note that all of these stocks came out of the 2008/2009 downturn very nicely.

What really happened: If, instead, we use the members of the S&P 500 starting in 2008, we find that more than 10% of the listed companies failed.

In our work at Lucena Research, we see an annual 3% to 5% performance “improvement” with strategies using survivor biased data.

Solution: Find datasets that include historical members of indices, then use those lists to sample from for your strategies.
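One way to follow this solution is to key the trading universe off point-in-time membership lists rather than today's index. The sketch below uses invented tickers and years to illustrate the lookup:

```python
# Hypothetical point-in-time index membership: for each year, the universe
# as it existed at the START of that year, not today's survivor list.
membership = {
    2008: {"AAA", "BBB", "CCC", "DDD"},  # includes names that later failed
    2012: {"AAA", "CCC", "EEE"},         # survivors plus later additions
}

def universe_for(year, membership):
    """Return the most recent membership list dated at or before `year`."""
    valid_years = [y for y in membership if y <= year]
    return membership[max(valid_years)]

# A 2009 backtest draws from the 2008 list, eventual failures included,
# rather than from the survivor-biased 2012 list.
print(sorted(universe_for(2009, membership)))  # ['AAA', 'BBB', 'CCC', 'DDD']
```

Commercial datasets that track historical index constituents provide exactly this kind of dated membership table.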

3. Observing the close & other forms of lookahead bias

In this failure mode, the quant assumes he can observe market closing prices in order to compute an indicator, and then also trade at the close. As an example, one might use closing price/volume to calculate a technical factor used in the strategy, then trade based on that information.

This is a specific example of lookahead bias in which the strategy is allowed to peek a little bit into the future. In my work I have seen time and again that even a slight lookahead bias can provide fantastic (and false) returns.

Other examples of lookahead bias have to do with incorrect registration of data such as earnings reports or news: assuming, for instance, that one can trade on the same day earnings are announced even though earnings are usually announced after the close.

Solution: Don’t trade until the open of the next day after information becomes available.
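The rule is straightforward to encode: a signal computed from day t's close may only be filled at day t+1's open. A toy Python illustration with invented prices:

```python
closes = [100.0, 102.0, 101.0, 105.0]  # invented closing prices, days 0..3
opens = [99.5, 100.8, 101.9, 101.2]    # invented opening prices, days 0..3

# Signal for day t: "buy" if day t's close rose over day t-1's close.
signals = [None] + [closes[t] > closes[t - 1] for t in range(1, len(closes))]

# A signal known only after day t's close may first be filled at day t+1's
# open; pairing signals[t] with opens[t] instead would be lookahead bias.
trades = [(t + 1, opens[t + 1])
          for t in range(len(signals) - 1) if signals[t]]

print(trades)  # [(2, 101.9)]
```

Enforcing the one-day lag in the simulator itself, rather than trusting each strategy to do it, removes the most common source of accidental peeking.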

4. Ignoring market impact

The very act of trading affects price. Historical pricing data does not include your trades and is therefore not an accurate representation of the price you would get if you were trading.

Consider the chart below that describes the performance of a real strategy I helped develop. Consider the region A, the first part of the upwardly sloping orange line. This region was the performance of our backtest. The strategy had a Sharpe Ratio over 7.0! Based on the information we had up until that time (the end of A), it looked great so we started trading it.

When we began live trading we saw the real performance illustrated with the green “live” line in region B: essentially flat. The strategy was not working, so we halted trading it after a few weeks. After we stopped trading it, the strategy started performing well again in paper trading (region C, argh!).

Performance of a strategy that looked great in backtesting (region A). When traded live, it didn’t work well (region B).  When we stopped trading it, it went back to working well (region C).

How can this be? We thought perhaps that the error was in our predictive model, so we backtested again over the “live” area and the backtest showed that same flat area. The only difference between the nice 7.0 Sharpe Ratio sections and the flat section was that we were engaged in the market in the flat region.

What was going on? The answer, very simply, is that by participating in the market we were changing the prices to our disadvantage. We were not modeling market impact in our market simulation. Once we added that feature more accurately, our backtest appropriately showed a flat, no-return result for region A. If we had had that in the first place we probably would never have traded the strategy.

Solution: Be sure to anticipate that price will move against you at every trade. For trades that are a small part of overall volume, a rule of thumb is about 5 bps for S&P 500 stocks and up to 50 bps for more thinly traded stocks. It depends of course on how much of the market your strategy is seeking to trade.
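As a sketch, the rule of thumb above can be folded into a simulator's fill logic. The 5 and 50 bps rates are the ones quoted in the text; the function name and interface are illustrative:

```python
def fill_price(mid_price, side, liquid=True):
    """Apply a rule-of-thumb impact charge to a simulated fill:
    ~5 bps for liquid S&P 500 names, up to ~50 bps for thin names."""
    bps = 5 if liquid else 50
    impact = mid_price * bps / 10_000
    # The price always moves against you: pay up on buys, give up on sells.
    return mid_price + impact if side == "buy" else mid_price - impact

print(fill_price(100.0, "buy"))                 # about 100.05
print(fill_price(100.0, "sell", liquid=False))  # about 99.50
```

Even this crude adjustment, applied to every simulated trade, would have flattened the region A backtest described above.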

5. Buy $10M of a $1M company

Naïve backtesters will allow a strategy to buy or sell as much of an asset as it likes. This may provide a misleadingly optimistic backtest because large allocations to small companies are allowed.

There often is real alpha in thinly traded stocks, and data mining approaches are likely to find it. Consider for a moment why it seems there is alpha there. The reason is that the big hedge funds aren’t playing there because they can’t execute their strategy with illiquid assets. There are perhaps scraps of alpha to be collected by the little guy, but check to be sure you’re not assuming you can buy $10M of a $1M company.

Solution: Have your backtester limit the strategy’s trading to a percentage of the daily dollar volume of the equity.  Another alternative is to filter potential assets to a minimum daily dollar volume.
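A minimal sketch of the first alternative, capping each order at an assumed 1% participation rate (the stock and figures are invented):

```python
def max_shares(daily_volume, participation=0.01):
    """Cap an order at an assumed 1% of the day's share volume."""
    return int(daily_volume * participation)

# A thin stock trading 50,000 shares a day at $10 allows only 500 shares
# (about $5,000) per trading day at 1% participation, nowhere near $10M.
print(max_shares(50_000))  # 500
```

A backtester that applies this cap will show how many days a large position realistically takes to build, instead of pretending it fills instantly.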

6. Overfit the model

An overfit model is one that models in-sample data very well. It predicts the data so well that it is likely modeling noise rather than the underlying principle or relationship in the data that you are hoping it will discover.

Here’s a more formal definition of overfitting: As the degrees of freedom of the model increase, overfitting occurs when in-sample prediction error decreases and out-of-sample prediction error increases.

What do we mean by “degrees of freedom?” Degrees of freedom can take many forms, depending on the type of model being created: Number of factors used, number of parameters in a parameterized model and so on.

Degrees of freedom (X), versus error (Y). Overfitting is occurring in the region to the right of the yellow symbol as out of sample error increases.

Solution: Don’t repeatedly “tweak” and “refine” your model using in-sample data. And always compare in-sample error versus out-of-sample error.
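The definition can be made concrete with a toy comparison between a one-degree-of-freedom model (predict the training mean) and a maximally flexible model that memorizes its training data. The data points below are illustrative draws from a flat, noisy process:

```python
import statistics

def mse(preds, actual):
    """Mean squared prediction error."""
    return statistics.mean((p - a) ** 2 for p, a in zip(preds, actual))

# Illustrative in-sample and out-of-sample draws from the same noisy process.
train = [1.0, -1.0, 2.0, -2.0, 0.0]
test = [0.0, 1.0, -1.0, 2.0, -2.0]

# Simple model: one degree of freedom (predict the training mean).
simple = statistics.mean(train)
# Overfit model: maximal degrees of freedom (memorize every training point).
in_sample_simple = mse([simple] * len(train), train)   # 2.0
in_sample_overfit = mse(train, train)                  # 0.0, "perfect"
out_sample_simple = mse([simple] * len(test), test)    # 2.0
out_sample_overfit = mse(train, test)                  # 6.8, memorized noise

assert in_sample_overfit < in_sample_simple   # looks better in sample...
assert out_sample_overfit > out_sample_simple  # ...but worse out of sample
```

This is the crossing point in the figure: as degrees of freedom grow, in-sample error keeps falling while out-of-sample error turns up.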

7. Trust complex models

Complex models are often overfit models. Simple approaches that arise from a basic idea that makes intuitive sense lead to the best models. A strategy built from a handful of factors combined with simple rules is more likely to be robust and less sensitive to overfitting than a complex model with lots of factors.

Solution: Limit the number of factors considered by a model, and use simple logic to combine them.

8. Trust stateful strategy luck

A stateful strategy is one whose holdings over time depend on which day in history it was started. As an example, if the strategy rapidly accrues assets, it may be quickly fully invested and therefore miss later buying opportunities. If the strategy had started one day later, its holdings might be completely different.

Sometimes such strategies’ success varies widely if they are started on a different day. I’ve seen, for instance, a 50% difference in return for the same strategy started on two different days in the same week.

Solution: If your strategy is stateful, be sure to test it starting on many different days. Evaluate the variance of the results across those days. If it is large, you should be concerned.
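A sketch of that check, using a deliberately stateful toy strategy and an invented price series:

```python
def run_strategy(prices, start):
    """Toy stateful strategy: fully invest on the first day it runs and
    hold to the end, so the result depends entirely on the start day."""
    return prices[-1] / prices[start] - 1.0

prices = [100.0, 90.0, 110.0, 95.0, 120.0]  # invented price series

# Re-run the identical strategy from several different start days...
returns = [run_strategy(prices, s) for s in range(3)]
# ...then examine how widely the outcomes spread across start dates.
spread = max(returns) - min(returns)
print([round(r, 3) for r in returns], round(spread, 3))
```

Here shifting the start by a single day moves the total return from about 9% to about 33%, exactly the kind of start-date sensitivity the text warns about.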

9. Data mining fallacy

Even if you avoid all of the pitfalls listed above, if you generate and test enough strategies you’ll eventually find one that works very well in a backtest. However, such a strategy cannot be distinguished from a lucky random stock picker.

How can this pitfall be avoided? It can’t be avoided. However you can and should forward test before committing significant capital.

Solution: Forward test (paper trade) a strategy before committing capital.
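A quick simulation shows why the fallacy bites: generate enough strategies with no edge at all, and the best backtest among them will still look impressive.

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

def random_strategy_return():
    """One year of pure coin-flip trading: +/-1% on each of 252 days."""
    return sum(random.choice([-0.01, 0.01]) for _ in range(252))

# Backtest 1,000 strategies that, by construction, have no edge at all...
results = [random_strategy_return() for _ in range(1000)]

# ...and the best of them still looks like a market-beating discovery.
best = max(results)
print(round(best, 2))
```

Nothing about the winning strategy's backtest distinguishes it from skill, which is why only forward testing can tell you anything further.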


It is better to view backtesting as a method for rejecting strategies than as a method for validating strategies. One thing is for sure: If it doesn’t work in a backtest, it won’t work in real life. The converse is not true: Just because it works in a backtest does not mean you can expect it to work in live trading.

If you avoid the pitfalls listed above, your backtests stand a better chance of more accurately representing real life performance.


Tucker Balch, Ph.D.

Tucker Balch, Ph.D. is a former F-15 pilot, professor at Georgia Tech, and co-founder and CTO of Lucena Research, an investment software startup. His research focuses on topics that range from understanding social animal behavior to the challenges of applying Machine Learning...


Editor’s note: Milton Ezrati was one of the presenters at the recent Annual Symposium. His presentation included some of the information in his book which is reviewed here.

Thirty Tomorrows starts with well-known, bearish demographic facts and then demonstrates that the outlook is actually bullish. Milton Ezrati is an economist who does an excellent job summarizing the challenges the global economy faces. Any economic data series can be presented on a chart and, as Ezrati points out, used to develop a forecast, but the right side of the chart (the future) is uncertain and is likely to provide upside surprises. That’s an important point. Many forecasts consider downside surprises, but upside surprises, though usually just as likely to occur, are often overlooked.

The demographic data Ezrati focuses on can be summarized in the dependency ratio, a measure comparing the number of dependents (people under 15 years old or 65 and older) to the working-age population (those between 15 and 64 years old). The table below, adapted from Thirty Tomorrows, shows the problem:

The table shows the trend in the dependency ratio in developed and emerging economies. A rising dependency ratio indicates an increasing number of older and younger citizens dependent upon a shrinking pool of working-age citizens. Based solely on this data, it’s reasonable to assume that economic growth will slow and the standard of living could decline.
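The ratio itself is simple to compute; one common convention expresses dependents per 100 working-age people. The figures below are invented for illustration, not data from the book:

```python
def dependency_ratio(under_15, over_64, working_age):
    """Dependents (under 15 plus 65 and older) per 100 working-age people."""
    return 100 * (under_15 + over_64) / working_age

# Invented populations, in millions: as the old-age group grows and the
# working-age group shrinks, the ratio climbs.
print(dependency_ratio(under_15=60, over_64=40, working_age=200))  # 50.0
print(round(dependency_ratio(under_15=55, over_64=70, working_age=180), 1))
```

In the second, "older" population, each 100 workers support roughly 69 dependents rather than 50, which is the squeeze the demographic bears point to.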

This is often where the analysis of demographics ends. The saying that “demographics are destiny” is widely accepted as an absolute truth of investing. This statement leads to the belief that the stock market is doomed, as Federal Reserve economists explained in a 2011 paper called “Boomer Retirement: Headwinds for U.S. Equity Markets?”:

Historical data indicate a strong relationship between the age distribution of the U.S. population and stock market performance. A key demographic trend is the aging of the baby boom generation. As they reach retirement age, they are likely to shift from buying stocks to selling their equity holdings to finance retirement.  Statistical models suggest that this shift could be a factor holding down equity valuations over the next two decades.

Using historical data, the Fed economists were able to construct a market forecast that is bearish for the next decade.

Source: Federal Reserve Bank of San Francisco

Demographic arguments are logical and appealing, but as Ezrati explains in Thirty Tomorrows, demographics aren’t necessarily destiny. This economist’s view of the world is that the future is far from predetermined.

Economies and markets adapt over time. Markets are also efficient to at least some degree and that means they will almost certainly not follow the path projected by simple demographic models. Instead, markets will move in line with economic and policy changes that will be occurring in response to demographic changes.

The demographic shift seen in rising dependency ratios is important, but it is not sufficient to predict the future. As an economist, Ezrati points out that David Ricardo’s work from the 19th century describes one means of adapting.

Ricardo developed the concept of “comparative advantage” and argued international trade is mutually beneficial because it allows each country to specialize in the areas where it holds an advantage. As the labor pool in the United States and other developed markets declines, those countries should turn away from manufacturing low-value-added products and services. This will require nations to adopt sound trade policies and avoid protectionist tendencies. Ezrati is optimistic international leaders will rise to the occasion and the global economy will adapt to overcome the economic challenges.

Ezrati’s work is interesting and important to an understanding of the future investing environment. There will be broad global changes in market leadership. Demographic data and policy decisions will offer insights into the sustainability of trends as they emerge. Traders who understand these trends could be on the right side of decades-long trends, the type that defined the U.S. markets from 1982 to 2000.


Michael Carr, CMT

Mike Carr, who holds a Chartered Market Technician (CMT) designation, is a full-time trader and contributing editor for Banyan Hill Publishing, a leading investment newsletter service. He is an instructor at the New York Institute of Finance and a contributor to various additional...

Milton Ezrati

Milton Ezrati, a Partner at Lord Abbett as well as the firm’s Senior Economist and Market Strategist, is responsible for economic research and strategy, enabling clients to gain context and a further understanding of today’s global markets. Milton joined Lord Abbett in 2000 and...


Editor’s note: this was originally posted at and is reprinted here with permission. Inovancetech is offering the free use of an algorithmic trading platform to MTA members.

Artificial intelligence, machine learning, data mining and big data seem to get thrown around in everything from business intelligence to financial services.

Artificial intelligence is used where machine learning should be and machine learning is often confused with data mining. The goal of this post is to clarify these buzzwords, explore how they apply to trading and then explore an ideal subcategory of data mining for traders.

Clarifying the Buzzwords

Artificial intelligence, a subfield of computer science, has three main subcategories: machine learning, curated knowledge and reverse engineering the brain. Machine learning is a method of developing algorithms for recognizing patterns within data. Data mining, also a subfield of computer science, is the whole information discovery process: from preparing and cleaning data, to analysis, to post-processing and visualizing your results. Data mining uses techniques developed in machine learning (i.e., machine-learning algorithms) and statistics. Here is a diagram of the big picture:

The reason big data gets thrown into the mix is that data mining and machine learning often involve large and/or complex data sets where traditional data management and processing tools won’t work. For example, if you wanted to capture, curate, store, and analyze blog posts over the past 10 years for a sentiment indicator, traditional data management and processing tools probably wouldn’t cut it.

Applying the Buzzwords to Trading

When I make an investment decision I go through 3 steps.

1. Idea:
Our intuition tells us there is some relationship, or predictive power, between a few indicators and the price of an asset.

2. Analysis:
We analyze the relevant data, whether it be in a chart, a CSV file, R, etc.

3. Decision:
We trade. We make an investment decision, an educated bet, based on our analysis.

If we could improve any one of these three steps, we could improve our investment decisions. Step 1 is really up to you; there is no substitute for your intuition. Improving your order execution in step 3 is unlikely to improve the performance of your trade unless you are in the high-frequency space or need to break your orders down to minimize market impact. Step 2, the analysis step, is where we can make the biggest impact. How can we make our analysis as good as possible? This is where the buzzwords come in.

In short, a machine-learning algorithm is better than you or me in analyzing data and discovering valuable information.  Instead of scrolling through charts and creating pivot tables in Excel, we can use a machine-learning algorithm to do the work for us.

Let’s say I have an intuition that GOOG’s PEG ratio, the MACD and StockTwit sentiment have a relationship to Google’s stock price. I could spend a lot of time analyzing that data myself or I can use an algorithm developed in machine learning to mine for patterns for me. These algorithms (decision trees, support vector machines, Naive Bayes classifiers, etc.) uncover the relationship between the indicators I want to analyze and their effect on GOOG’s stock price. The results from the algorithm’s analysis are objective and mathematically supported. Here is a step-by-step tutorial in R using a Naive Bayes algorithm and a few technical indicators to predict the price of AAPL. You can download R for free and copy and paste the code to do it yourself! It will give you a good understanding of the overarching concepts and general process.
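To make the idea concrete without the R tutorial, here is a toy Gaussian Naive Bayes classifier written from scratch in Python. The indicator values and labels are invented for illustration; a real workflow would use actual indicator histories.

```python
import math

# Hypothetical training data: (indicator_1, indicator_2) -> next-day direction.
# Values are invented for illustration, not real market data.
samples = [
    ((0.8, 1.2), "up"), ((0.9, 1.0), "up"), ((0.7, 1.1), "up"),
    ((-0.6, -0.9), "down"), ((-0.8, -1.1), "down"), ((-0.5, -1.0), "down"),
]

def fit(samples):
    """Estimate per-class prior, means, and variances (Gaussian Naive Bayes)."""
    stats = {}
    for cls in {c for _, c in samples}:
        rows = [x for x, c in samples if c == cls]
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        varis = [sum((v - m) ** 2 for v in col) / n + 1e-6
                 for col, m in zip(zip(*rows), means)]
        stats[cls] = (n / len(samples), means, varis)
    return stats

def predict(stats, x):
    """Pick the class with the highest log-posterior under feature independence."""
    best, best_lp = None, -math.inf
    for cls, (prior, means, varis) in stats.items():
        lp = math.log(prior)
        for v, m, s2 in zip(x, means, varis):
            lp += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
        if lp > best_lp:
            best, best_lp = cls, lp
    return best

model = fit(samples)
print(predict(model, (0.75, 1.05)))   # a point near the "up" cluster
```

The same pattern scales up: replace the toy tuples with rows of indicator readings and the label with the realized price direction.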

One of the main reasons why these technologies are taking time to trickle down to the individual investor is because the results are difficult to interpret. For example, it is difficult to translate beta coefficients, decision matrices, and probability density functions to actionable trading logic, something that a trader without a background in machine learning or data mining is going to understand and be able to use. This is why I would like to introduce a specific subset of data mining that is perfect for the individual trader.

Data Mining in Expressive and Human-readable Form

Data mining is composed of 6 subcategories: anomaly detection, association rule learning, clustering, classification, regression, and summarization.

Data mining has wide application in the financial industry. For example, anomaly detection is used to detect insider trading and fraud, clustering can be used for portfolio optimization, and, as it turns out, association rule learning is a perfect fit for traders.

Association rule learning translates the complex output of a machine-learning algorithm into an expressive and human-readable form. You get an objective, mathematically supported analysis that is easy to understand, and most importantly, easy to apply to your own trading. You have also created a one-of-a-kind strategy, presented in a series of “if-then” statements, that is based on your intuition and fine-tuning. For example, a rule might read: if the RSI is below 30 and the MACD is positive, then buy.
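As a sketch of how such if-then rules are scored, the snippet below computes the support and confidence of one hypothetical rule over invented, already-discretized indicator data. Real association rule learners search many candidate rules in the same spirit.

```python
# Hypothetical discretized observations: each row records indicator states
# and whether price rose over the next bar. The data is invented.
rows = [
    {"rsi": "low", "macd": "pos", "up": True},
    {"rsi": "low", "macd": "pos", "up": True},
    {"rsi": "low", "macd": "neg", "up": False},
    {"rsi": "high", "macd": "pos", "up": False},
    {"rsi": "low", "macd": "pos", "up": True},
    {"rsi": "high", "macd": "neg", "up": False},
]

def rule_stats(rows, antecedent):
    """Support and confidence of the rule: antecedent -> price up."""
    matches = [r for r in rows if all(r[k] == v for k, v in antecedent.items())]
    support = len(matches) / len(rows)
    confidence = sum(r["up"] for r in matches) / len(matches) if matches else 0.0
    return support, confidence

ante = {"rsi": "low", "macd": "pos"}
sup, conf = rule_stats(rows, ante)
print(f"IF rsi is low AND macd is pos THEN buy "
      f"(support {sup:.2f}, confidence {conf:.2f})")
```

The printed rule is exactly the kind of plain-language output a trader can read, test, or code up directly.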

This is why we developed TRAIDE. You use your intuition to select from the inputs you want to analyze; whether it’s fundamental, technical, macroeconomic or sentimental indicators. You can then select an asset and timeframe just like you would in your trading platform. TRAIDE will select a machine learning algorithm to data mine your inputs for information and then display the results in interactive charts where you can manipulate the inputs to see how they interact with each other. In other words, TRAIDE shows you the valuable information within your data and displays it to you in interactive charts. You select the indicator values you want for your trading strategy and TRAIDE writes them out in plain English. You can print out the rules, code them yourself, test them over new data, or export a report.

Artificial intelligence, machine learning, data mining, and big data have attracted so much attention recently due to the advantages they can provide to a variety of industries. Within the financial industry, these buzzwords have wide applications. For traders, there is a particular subset of data mining that can be utilized to improve our trading. We are used to clear and concise trading rules. We are also used to analyzing charts and rows of data in Excel. To get the analytical capabilities of data mining and machine learning with a clear set of rules as the output, we can utilize association rule learning.


Justin Cahoon

Justin Cahoon is Chief Operating Officer of Inovancetech. The company offers TRAIDE, a platform designed to discover the underlying patterns in the assets and indicators you use to trade. Those patterns can be used to create your own reliable trading strategy. Justin...


Editor’s note: The Chameleon Oscillator was detailed in the April 23 issue of Bloomberg Brief: Technical Strategies. This is an extract of that article.

In Bloomberg Brief: Technical Strategies, Alex Cole illustrates how the Chameleon Oscillator can be used to spot oversold and overbought extremes. As an example, he showed the Canadian dollar, which he believes “has reversed its slide against the U.S. dollar, according to the Trend Chameleon.” That indicator, shown as a “paint bar study” in the chart above, has been pointing towards a potential reversal since April 15. In the chart, we can see the first red bar in the series occurs when prices break to the downside from a consolidation channel that developed after an extended rally.

The chart also shows the Chameleon Oscillator at the bottom. This indicator showed potential signs of trend exhaustion during the consolidation period.

Cole explained the indicator, which can be reproduced in almost any software environment:

The Chameleon Oscillator is similar in concept to the Trend Chameleon in that it is a blend of several indicators and changes colors based on the indicators’ combined readings.

The Chameleon Oscillator establishes a reading from -6 to 6 and is based on three indicators: Bollinger Bands, RSI and Stochastics. The indicator is colored based on how many overbought/oversold criteria are being met — it turns bright red when the most overbought signals are sent and bright green when most oversold, with varying shades in between.

Specific readings of the indicators are as follows:

    1. Bollinger Band readings focus on price relative to one and two standard deviations from the mean.
    2. RSI readings use a traditional 14-period lookback and a faster nine-period input.
    3. Stochastics readings include both the %D and %DS lines.
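A minimal sketch of how such a composite score could be assembled. The thresholds and component definitions here are illustrative assumptions, not the published Bloomberg settings.

```python
def chameleon_score(rsi14, rsi9, pct_k, pct_d, band1, band2):
    """Toy composite score in [-6, 6]: each of six components votes
    +1 (overbought), -1 (oversold), or 0 otherwise. Thresholds are
    illustrative, not the published Bloomberg settings."""
    score = 0
    # RSI components: classic 70/30 thresholds.
    for rsi in (rsi14, rsi9):
        score += 1 if rsi > 70 else (-1 if rsi < 30 else 0)
    # Stochastic components: 80/20 thresholds.
    for stoch in (pct_k, pct_d):
        score += 1 if stoch > 80 else (-1 if stoch < 20 else 0)
    # Bollinger components: price position relative to the 1 and 2
    # standard-deviation bands, passed in as +1 (above), -1 (below), 0 (inside).
    score += band1 + band2
    return score

# All six components stretched to the upside yields the maximum reading of 6.
print(chameleon_score(75, 82, 85, 83, 1, 1))
```

Plotting this score bar by bar gives an oscillator bounded at plus and minus six, mirroring the purple-oval extremes Cole highlights.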

In the chart of the Canadian dollar, as its price against the U.S. dollar weakened from September 2014 to January 2015, the oscillator found support several times at zero. Cole notes, “there is a tendency for this oscillator to bounce off zero when in an upward trend as the pullbacks in a strong trend are not enough to force it into negative territory.”

This can be seen in the chart as “the oscillator struggles to hold at zero and then in February it made a lower high that shows a clear divergence between the oscillator and price. It breaks below zero again and cannot rise back above, finding resistance at the zero line (short aqua line) before price finally breaks down out of the channel and the Trend Chameleon starts to paint bright red bars.”

On the chart, Cole uses a purple oval to highlight times when the oscillator reached 6, the highest possible value. These readings indicated the beginning and end of the trend. The first high reading came as the Canadian dollar was breaking above short-term resistance in December. Subsequent peaks show the trend was nearing its end.

Combined with other tools, the Chameleon Oscillator could be a valuable addition to any technical analyst’s tool box.

Alex Cole is a technical analysis specialist in Bloomberg’s analytics department. He can be contacted at


Alex Cole

Alex Cole is a Technical Analysis and Charting product specialist. With almost 20 years of financial market experience, Alex has led Technical Analysis and data visualization teams, frequently contributed to financial news/ industry organizations such as Bloomberg and the Chartered Market Technicians...


We all learn the basics of technical analysis differently. Some of us read the classics and think about how those ideas still apply to the current markets. Some of us find mentors who explain the important concepts to us. Some of us prefer traditional classrooms and others like self-paced video courses. No matter how we learn technical analysis, the next step is to apply the knowledge.

Kase on Technical Analysis is a one-stop training source. Cynthia covers the basics, puts them all together into a coherent trading strategy and then provides tests so you can apply the knowledge without risk in the markets.

The big question is who would benefit from this material. The answer is anyone wanting to learn about technical analysis, anyone studying for the CMT exam and anyone wanting to learn about the Kase suite of indicators or strategies. CMT test candidates might find this course to be an important source of information on “how to put it all together” especially if they don’t work at a firm where they can benefit from mentorship.

The course includes sections on:

  • Charts. This includes the types of charts used in technical analysis; how they are traditionally used; new ways of looking at price charts, gaps, geometric patterns, bar, and candlestick patterns; and some unique ways of applying these patterns to analysis and trading.
  • Indicators and Studies. These are mathematical algorithms of varying complexity that are used for entries, exits and risk management. Moving averages and the directional movement index and average directional index (DMI/ADX) are shown as entries or “stop and reverse” systems. Some special rules and patterns relating to DMI/ADX are addressed. Traditional momentum indicators such as oscillators, MACD, Stochastics, RSI and the Ultimate Oscillator are taught, along with how periodicity impacts indicator performance, momentum divergence, overbought and oversold signals, and how to use momentum to enter trades. For those wanting more advanced tools, the math behind Kase’s proprietary indicators, the Kase PeakOscillator and KaseCD, is discussed.
  • Stops and Risk Management. Various sorts of stops are discussed including fixed-value from entry, trailing fixed value, trailing range-based, and Kase’s DevStops and KaseX two-sided Dashes based on probability theory. Many chart examples show how stops fit into a trading system.
  • Trading Techniques. Most trading techniques involve combining indicators, stops and patterns in sensible ways so that the indicators confirm, correct, and augment one another. Additionally, the use of multiple time frame indicators simultaneously is discussed.
  • Swings, Waves, and Forecasting. Defining a swing, a wave, and a wave cycle, and how to draw and label them, is discussed. Using waves and wave targets calculated with Fibonacci numbers and retracements, as well as the number Phi, is explained, along with a step-by-step method for succinctly developing a market view with specific targets, expanded with real-life examples.

The videos are filled with charts and include a large number of examples. This is an excellent resource for those learning the basics of technical analysis or for seasoned analysts looking for new ideas.


Cynthia A. Kase, CMT, MFTA

Cynthia A. Kase, CMT, MFTA is the president of Kase and Company, Inc. CTA, founded in 1992. With a BS UMass and an ME Northeastern, both in chemical engineering, she worked in that field for 10 years before beginning her trading career at...

Michael Carr, CMT

Mike Carr, who holds a Chartered Market Technician (CMT) designation, is a full-time trader and contributing editor for Banyan Hill Publishing, a leading investment newsletter service. He is an instructor at the New York Institute of Finance and a contributor to various additional...


Editor’s note: this article was originally published in Proactive Advisor Magazine and is reprinted here with permission.

A view into how professional strategists might classify active strategies.

When someone is first exposed to actively managed strategies, they may think they have just set foot on another planet. They hear a new language that includes many foreign terms such as tactical asset allocation, statistical arbitrage, momentum, mean reversion, etc. The complexity can seem overwhelming when compared to the relative simplicity of passive strategies. There appear to be more flavors of active strategies than anyone could ever hope to make sense of.

This article attempts to chart this world through a simple classification scheme. To do that, I will discuss actively managed strategies through the lens of four vectors: time frame, asset classes, systematic factors, and use of discretion.  Through a general look at active strategy characteristics, advisors will get a glimpse into performance drivers and be able to enhance their understanding for effectively matching strategies to their clients.

To start off, let’s clarify the time frame vector. Active strategies, to give the extremes, can operate anywhere from ultra-high-frequency algorithms running on servers co-located at stock exchanges to strategies that may rebalance only once a year. For the most part, strategies offered by advisors will operate on the daily time frame and longer. While active managers do react as market conditions dictate, changes in allocations are typically made far less frequently than might be expected.

Just like passive strategies, active strategies seek exposure to certain asset classes like stocks of different size characteristics, treasuries, investment grade credit, high-yield credit, commodities, real estate, currency carry, volatility, or international versions of any or all of these. There are many more ways to slice the investment universe, but regardless of the particular asset classes sought, the idea is to capture the risk premiums from the exposure. In academia, this exposure is known as beta. The only difference between passive and active strategies when it comes to beta is that active strategies often attempt to capture the premiums while avoiding the risk.

How? The answer leads us to the third vector of systematic factors. In academia, these are known as alpha, or performance unexplainable by exposure to asset class risk factors. And according to efficient market theory, these should not exist.

When developing an active strategy there are basically two schools of thought. On one side are those that attempt to find some market inefficiency that becomes the basis of a truly unique strategy which holds the promise of outstanding risk-adjusted returns. Techniques such as statistical arbitrage, pattern recognition, and machine learning come from the “unique” school.

Although truly unique strategies may have great potential, the creator must navigate a sea of development biases and overcome a gauntlet of formidable competition commanding armies of Ph.D.s and almost unlimited computer power.  Because of these issues, performance of such strategies can be fleeting, or worse, just a mirage from data mining.  Unique strategies may also have a tough time during the due diligence process as vetting usually involves understanding the drivers of risk and return.

The other school of thought starts from a foundation of academic and practitioner research. The core tenet of the “research” school is to leverage market behaviors that are well-studied and likely to continue. In academia, these are known as anomalies. So what are they? This, too, is a source of debate, because depending on the criteria, there may be anywhere between a handful and several hundred.

Two recent studies make valiant attempts to uncover systematic factors identified in top-tier research using logical and stringent criteria:

  1. Performance has been consistent over many years and has survived numerous database revisions as well as extensive out-of-sample data.
  2. The factor has been vetted, replicated, and debated in top academic journals over many years.
  3. The factor works across multiple asset classes.
  4. Minor variations in definition/construction do not significantly impact performance.
    5. There is a credible reason to expect a persistent edge: either it is related to a human behavioral bias that is not likely to change, or to an institutional feature that cannot be easily changed.

Using these criteria they find that only the valuation, low volatility, illiquidity, and momentum factors are significant when adjusted for data snooping bias. However, because illiquid securities have very little capacity, that factor is not commonly capitalized on by professional managers.

Additionally, research indicates that the momentum factor can be logically subdivided into three separate factors (absolute, relative, and mean reversion), leaving five factors that are widely used. All of these factors are summarized in the following table.

An active strategy may use one or a multitude of these systematic factors. It is important to note that these are just high level concepts. The classification does not prescribe any specific trading rules, just general tendencies. For example, one trend-following system may use moving average crossovers whereas another uses rate-of-change of returns. They are constructed differently but capitalize on the same systematic factor.
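The point that differently constructed rules can capture the same systematic factor can be sketched in a few lines. The price series and parameters below are invented for illustration.

```python
# Two trend-following constructions over the same hypothetical price series:
# a moving-average crossover and a rate-of-change rule. Both end up long
# when the trend is up, illustrating two rules for one momentum factor.
prices = [100, 101, 103, 106, 110, 115, 114, 112, 109, 105, 101, 98]

def sma(series, n):
    """Simple moving average of the last n values."""
    return sum(series[-n:]) / n

def ma_crossover_long(series, fast=3, slow=6):
    """Long when the fast moving average is above the slow one."""
    return sma(series, fast) > sma(series, slow)

def roc_long(series, n=5):
    """Long when the n-bar rate of change is positive."""
    return series[-1] > series[-1 - n]

uptrend = prices[:6]     # rising segment of the series
downtrend = prices       # full series, which ends in a decline
print(ma_crossover_long(uptrend), roc_long(uptrend))
print(ma_crossover_long(downtrend), roc_long(downtrend))
```

Both rules agree on the trend in each segment even though their mechanics differ, which is the sense in which they load on the same factor.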

The last vector to discuss is the use of discretion, and here there is a continuum. On one end are the mechanical systems that are 100% rules-based, driven totally by programmed models. On the other end are the fully discretionary traders who use intuition and experience-based pattern recognition to make seat-of-the-pants trading decisions. Somewhere in the middle are the rules-based traders that sometimes override signals or position sizing decisions based on conviction or extenuating circumstance. In general, the more discretion involved in a strategy, the harder it is to estimate the expected performance characteristics moving forward.

In conclusion, we’ve boiled down the active strategy universe into four simple vectors: time frame, asset classes, systematic factors, and use or non-use of discretion. There are certainly other ways to classify strategies, but this method gives some insight into how professional strategists might think about the active strategy universe. Although exceptions exist, active strategies are generally disciplined and based on well-researched market behaviors.

It is important for advisors to have a basic understanding of the general characteristics so they can more effectively match strategies to their clients.


Dave Walton

Dave Walton has been involved in active investing since 1999. Prior to co-founding StatisTrade, Mr. Walton spent 18 years ensuring the quality and reliability of cutting-edge technology products for one of the world’s largest computer chip manufacturers. Mr. Walton received his BS...


Point and Figure (P&F) is one of the oldest tools of technical analysis. It’s been used since at least the 1880s yet the technique works just as well today as it did 130 years ago. The basics of P&F haven’t changed over time but computers and software have allowed skilled analysts to derive more information from the columns of Xs and Os that are found on P&F charts.

In this book, du Plessis demonstrates how to apply 20th century tools like Bollinger Bands®, Donchian channels and Welles Wilder’s parabolic stop and reverse strategy to P&F charts.

Below are examples from the book showing how Donchian channels could be applied to P&F charts. Donchian channels were developed by Richard Donchian in the 1960s. They consist of two lines on a price chart: one shows the highest high over a time period, usually 20 days or four weeks; the second shows the lowest low over that time frame. There are a number of possible variations, such as using closing data only or a different number of days in the lookback period. Buy signals are given when prices reach a new 20-day high and sells are made at new 20-day lows.

du Plessis notes

“Donchian channels are ideally suited to P&F charts because, by plotting lines across the highest X and lowest O over a selected number of columns, they can be used to provide levels at which a change in the trend is signaled.

Since they are drawn at the highest high and lowest low of the columns under consideration, by definition the price can never break out of the channels, so in order to allow a price break, a number of columns must be omitted before the look back. Although omitting 1 column is enough to achieve a breakout, around 5 is more common.”

He explains that more columns can be omitted and changing the number of columns omitted can increase or decrease the number of trading signals. This means the system is easily adapted to meet individual preferences for risk and trading frequency.
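Here is one way such a scheme might be coded, assuming each P&F column is summarized by its highest X and lowest O. The column data, lookback, and omission count are illustrative, not taken from the book.

```python
# Sketch of Donchian channels on P&F columns: the upper line is the highest X
# and the lower line the lowest O over a lookback window of columns, omitting
# the most recent `omit` columns so that a breakout is possible.
columns = [  # (column_high, column_low) for each P&F column, oldest first
    (110, 100), (112, 104), (109, 101), (115, 105),
    (118, 108), (116, 110), (121, 112),
]

def pf_donchian(columns, lookback=5, omit=1):
    """Channel levels from the `lookback` columns ending `omit` columns ago."""
    window = columns[-(lookback + omit):-omit] if omit else columns[-lookback:]
    upper = max(high for high, _ in window)
    lower = min(low for _, low in window)
    return upper, lower

upper, lower = pf_donchian(columns, lookback=5, omit=1)
latest_high, latest_low = columns[-1]
print(upper, lower)
print("upside breakout:", latest_high > upper)
```

Raising `omit` widens the gap the price must cover, which is how the number of omitted columns tunes signal frequency.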

The first chart below shows channels applied to the S&P 500. The red lines omit only the last column, blue lines omit the last 5 columns and black lines omit the last 20 columns.

The next chart, also from the book, offers a detailed look at when trade signals are given.

du Plessis also explains how time-based indicators such as RSI, directional movement and MACD can be used with P&F charts. Among the other new insights provided in 21st Century Point and Figure are explanations of how column volume can be used to assess a column’s strength and how volume at box level can be used to assess support and resistance areas. New P&F-based market breadth indicators are also introduced as is a new P&F-based oscillator.

This book provides a number of new tools for P&F analysts and truly does bring this technique into the modern era.


Jeremy du Plessis, CMT, FSTA

Jeremy du Plessis, who holds the Chartered Market Technician (CMT) designation, has been involved with technical analysis since 1981 and founded Indexia Consulting Ltd to produce technical analysis software. He pioneered the computerization of Point and Figure charts, in particular the automation...

Michael Carr, CMT

Mike Carr, who holds a Chartered Market Technician (CMT) designation, is a full-time trader and contributing editor for Banyan Hill Publishing, a leading investment newsletter service. He is an instructor at the New York Institute of Finance and a contributor to various additional...


A previous blog discussed Wilt Chamberlain’s unsuccessful attempt at following up on his successful basketball career with a successful IPO. Wilt Chamberlain wasn’t the only sports star to go into the restaurant business. Few people realize it, but Mickey Mantle set up restaurants twice, once in Texas in the 1950s when he struck out, and a second time in the 1980s in New York City when he hit a home run.

Mickey Mantle played for the New York Yankees between 1951 and 1968 and is one of the few baseball players to hit over 500 home runs during his career, and believe it or not, without the aid of any steroids. Many consider Mantle to be the greatest switch hitter of all time, a skill which enabled him to hit for both average and for power. His lifetime batting average was .298 and he hit some of the longest home runs in history, one measuring 643 feet at Tiger Stadium in 1960.

Mickey Mantle replaced Joe DiMaggio in centerfield in 1952, appeared in 12 World Series, helped the Yankees win seven of them, and played in sixteen All-Star Games. Mantle was Most Valuable Player three times and by 1961 he was the highest paid player in baseball.

Country Cooking Without the Sizzle

Mickey Mantle incorporated Mickey Mantle’s Country Cookin’, Inc. in Texas on April 22, 1968, during the last year of his baseball career. The company authorized 2,000,000 shares, and 1,000,000 shares were outstanding on July 11, 1969.  Mickey Mantle’s Country Cookin’, Inc. offered 200,000 shares at $15 per share on July 11, 1969 through Pierce, Wulbern, Murphy, Inc. of Jacksonville, Florida and through D.A. Campbell Co., Inc. in New York.

The restaurant’s menu focused on country vittles, including Chicken & Dumplins’, Ham and Lima Beans, Country Beef Stew, Country Fried Chicken, Texas Chili, Catfish Filets, Chili & Beans, Chicken Fried Steak, a Country Smoked Ham Sandwich and a Country Pork Sausage Sandwich. Meals were $1.25 and the sandwiches were $1.00. If you wanted to go all out, you could get an eight-piece Chicken Bucket for $3.25.

The first Country Cookin’ restaurant opened up at 3651 Martin D. Love Freeway. Company-owned restaurants were also located in San Antonio and in Irving, Texas. Mickey Mantle franchised the concept, and restaurants soon popped up in Florida, Louisiana and Texas. In 1970 the company tried to acquire 35 Best Steak Home restaurants, but the deal fell through. Largely because of lawsuits, overextension and poor management, the restaurants did not do well.

Shares in Mickey Mantle’s Country Cookin’, Inc. traded over-the-counter, and as you can see by the graph below, the shares steadily declined in value. Offered at $15, shares fell below $10 by the end of the month and were down to $1 a year later. The company received more income from franchise fees than from sales, and this could only spell long-run trouble for the restaurant. As a sign of its problems, the company changed its name to Invesco International, Inc. on June 30, 1969, and reincorporated in Nevada where it became a coal mining company.

The 1952 Topps Mickey Mantle #311 rookie baseball card sells for over $10,000, but if that is beyond your budget, hundreds of souvenirs remain from the Mickey Mantle’s Country Cookin’ restaurants and can be purchased on eBay. Souvenirs include postcards, china, coffee mugs, carrying trays, menus, chairs, pot holders, ordering pads, stock certificates and prospectuses. Souvenirs are also available from Mickey Mantle’s Holiday Inn located in Joplin, Missouri, including bars of soap, post cards, matches and even room keys, though they probably don’t work anymore.

Mantle’s Business Career Booms

The Country Cookin’ restaurants weren’t Mickey Mantle’s only business venture. As mentioned above, he ran a Holiday Inn in Joplin, Missouri. Mantle also created real estate and land developments (Willowwood and Arbolado), the Mickey Mantle Billiard Center in Milwaukee, Wisconsin, and the Mickey Mantle Bowling Center at 200 Exchange Park North in Dallas, from which some cigarette lighters survive.

In 1988, Mickey Mantle’s Restaurant and Sports Bar was opened in New York at 42 Central Park South at 59th Street.  Although Mantle made frequent appearances at the restaurant, he let others run the business. The sports bar continued to operate successfully for over 20 years, perhaps because it focused on a single theme, avoided expansion, and Mantle acted as promoter rather than manager.

For the record, Mantle’s favorite food was the chicken-fried steak, usually topped off with some drinks from the bar.  Mantle died in 1995 and the sports bar closed on June 2, 2012 after failing to pay rent for several months. Though the sports bar remains closed, it has a Facebook page for the curious.

Mickey Mantle was an entrepreneur throughout his life, and the success of the sports bar shows that he learned from his first time at bat in the restaurant business.


Dr. Bryan Taylor

Dr. Bryan Taylor President & Chief Economist, Global Financial Data Dr. Bryan Taylor serves as President and Chief Economist for Global Financial Data. He received his B.A. from Rhodes College, his M.A. from the University of South Carolina in International Relations, and...


Editor’s note: This was recently published at Alvarez Quant Trading and is reprinted here with permission. A spreadsheet with the complete results of this strategy is available at the web site.

My recent research has been in ETFs, which I have not explored in several years. ETF sector rotation has always intrigued me. The idea seems so simple that it should work: always be in the sector that has been doing the best. I like simple, but does it work? If not, can we make it work?

ETF Universe

For these tests, we will be using the Select Sector SPDR ETFs. They have a long history. The list is:

  • Consumer Discretionary (XLY)
  • Consumer Staples (XLP)
  • Energy (XLE)
  • Financials (XLF)
  • Health Care (XLV)
  • Industrials (XLI)
  • Materials (XLB)
  • Technology (XLK)
  • Utilities (XLU)

Testing will be from 2005 to 2014.


Our baseline will be buy and hold on the SPY. My typical goal when trading ETFs is to beat the CAGR of buy and hold by 50% but with significantly smaller drawdowns. A maximum drawdown under 25% would be good and under 20% would be great.

Simple Rotation Test

The first test is a simple momentum method.

Rotation Rules

  • At the end of the month, rank the ETFs from high to low by their (6,12) month returns
  • Buy the top (1,2) ETFs on the next open

The six and twelve month values were chosen because those are the ones that are most often referenced in the research I have seen. The spreadsheet contains 3 and 9 month rotation periods.
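The ranking step might be sketched as follows, with hypothetical six-month returns standing in for real data.

```python
# A minimal sketch of the simple rotation rule: at month-end, rank the sector
# ETFs by trailing return and pick the top N. The returns are hypothetical.
six_month_return = {
    "XLY": 0.08, "XLP": 0.03, "XLE": -0.05, "XLF": 0.06, "XLV": 0.11,
    "XLI": 0.04, "XLB": 0.01, "XLK": 0.09, "XLU": 0.02,
}

def top_ranked(returns, n):
    """ETFs sorted high-to-low by trailing return; keep the first n."""
    return sorted(returns, key=returns.get, reverse=True)[:n]

print(top_ranked(six_month_return, 2))   # buy these on the next open
```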


Not a great start. The results match buy and hold but with a lot more trading. How can we improve on this? Something I have learned about most rotation strategies is that there are times when even the top-ranked ETF is not something you want to be trading.

Rotation with trend filter and backup ETF

A very popular topic recently is Dual Momentum, which has the concept that when an ETF does not pass some filter, instead of investing in that ETF you invest in some alternative ETF. This could be SHY (iShares 1-3 Year Treasury Bond) or TLT (iShares 20+ Year Treasury Bond).

New Rotation Rules

  • At the end of the month, rank the ETFs from high to low by their (6,12) month returns
  • Buy the top (1,2) ETFs on the next open
  • But if the top ranked ETF is below its (6,12) month moving average, then instead of buying that ETF, buy the alternative ETF (SHY, TLT)

In the following example, we are using the 6 month return and TLT as the alternative ETF. Suppose XLE and XLF are the two top-ranked ETFs by 6 month return. Before buying them, we check whether they are trading above their 6 month moving averages. If XLE is not, then instead of buying XLE we buy the alternative ETF, TLT.
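That substitution logic might look like the sketch below; the prices and moving averages are hypothetical.

```python
# Sketch of the trend-filter rule: take the top-ranked ETFs, but replace any
# that trades below its own moving average with the backup ETF (TLT here).
price = {"XLE": 70.0, "XLF": 24.0}
six_month_ma = {"XLE": 74.0, "XLF": 23.0}   # XLE is below its average

def apply_trend_filter(picks, price, ma, backup="TLT"):
    """Swap in the backup ETF for any pick below its moving average."""
    return [etf if price[etf] > ma[etf] else backup for etf in picks]

print(apply_trend_filter(["XLE", "XLF"], price, six_month_ma))
```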


The results have improved but are still short of my targets.

Rotation with dual ranking, trend filter and backup ETF

Another concept that I have seen with ETF rotation strategies is using more than one ranking. What we do is rank the ETFs by two time periods, add the rankings, and then rank again, taking the top ranked.

New Rotation Rules

  • At the end of the month,
    • Rank1 = rank the ETFs from high to low by their returns over the first lookback period (3,6,9,12) months
    • Rank2 = rank the ETFs from high to low by their returns over a second, different lookback period (3,6,9,12) months
    • Rank3 = Rank1 + Rank2
    • Rank the ETFs by Rank3. In case of ties, use Rank1 as the tie-breaker.
  • Buy the top (2) ETFs on the next open
  • But if an ETF is below its (6,12) month moving average, then instead of buying that ETF, buy the alternative ETF (TLT)
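The dual-ranking step might look like this in Python (a sketch with hypothetical names, computing the combined rank for the most recent month-end only):

```python
import pandas as pd

def dual_rank(monthly_closes: pd.DataFrame, lb1: int = 3, lb2: int = 6,
              top_n: int = 2):
    """Rank ETFs by two lookback returns, sum the ranks, re-rank,
    and break ties with the first ranking."""
    r1 = monthly_closes.pct_change(lb1).iloc[-1].rank(ascending=False)
    r2 = monthly_closes.pct_change(lb2).iloc[-1].rank(ascending=False)
    combined = (r1 + r2).to_frame("rank3")
    combined["tiebreak"] = r1                      # Rank1 breaks ties
    ordered = combined.sort_values(["rank3", "tiebreak"]).index
    return list(ordered[:top_n])
```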


Now we are getting somewhere. Both CAGR and maximum drawdown meet my goals, but only barely and only with one variation. Not shown here, but visible in the spreadsheet, is that the top variation made money every year.

Final Thoughts

I did not quite get the results I wanted but I am getting close. I will have to think about other ideas to try. I would like to find a sector rotation strategy that works. Got any ideas on what to try? Send them my way by clicking here.

Cesar Alvarez attended the University of California, Berkeley where he received his Bachelors of Science in Electrical Engineering and Computer Science in 1989 and his Masters of Science in Computer Science in 1990. Cesar was a Software Engineer on Excel versions 3, 4, and 5, helping Microsoft Excel go from single digit market share to owning the market. Cesar spent nine years as a professional market researcher for Connors Research, and he has been at the forefront of stock market research, having developed a number of successful trading systems now used by numerous investors and fund managers in the United States and internationally. Cesar has given trading presentations both over the web and in person to hundreds of traders.


Cesar Alvarez

For the last six years, Cesar Alvarez has written for his popular quant blog, Alvarez Quant Trading, helping traders learn about the markets. He spent nine years as the Director of Research for Connors Research, and numerous strategies he created have...


Editor’s note: this article was originally published by the National Association of Active Investment Managers and is reprinted here with permission.

I was recently named Chief Investment Officer at Sowell Management Services, a registered investment advisor that is responsible for about $500 million of client assets. Part of the gig is to communicate all things market related to the portfolio managers, the sales team, and the independent advisors who represent the firm.

During a staff meeting on Monday, the National Sales Manager wanted my take on the current environment for “tactical” investment strategies. He noted that most tactical managers had a rough go last year and wondered if the environment had changed with the calendar. My response was brief and to the point, “It’s been ‘sheer misery’ in the tactical space since the beginning of 2014… and no, nothing has changed.”

Not surprisingly, I was asked to put “some color” behind my view and to expand on why the environment has been tough on folks trying to employ a tactical approach to investing (which, according to Investopedia, is defined as: “An active management portfolio strategy that rebalances the percentage of assets held in various categories in order to take advantage of market pricing anomalies or strong market sectors.”)

I tried to respond as succinctly as possible and when I was done, I realized that others might find my explanation of the current environment worthwhile.

So, here was my response and some thoughts on what investors/advisors should be doing now…

Since the beginning of 2014, stocks have been working higher in a very choppy, sometimes violent fashion. Intraday volatility has increased dramatically as high-speed trend-following has become increasingly prevalent. For example, the S&P regularly moves plus or minus 0.5% within a matter of minutes these days – and without any news.

The next point is that each new high in the indices (and there have been more than a few over the last 15 months) has been met with selling. This price action has been affectionately referred to as the dreaded “breakout fakeout.” The problem is that “breakouts” tend to be considered a good thing by most trend or momentum-based market indicators.  So, in short, this means that the indicators most tactical managers employ have been consistently “fooled” each time a “fakeout” occurs.

Given that this trend quickly became quite obvious to most traders, a great many of these “breakout fakeouts” evolved into pullbacks as the “fast money” types sold short with glee each time a breakout failed.

However, after pulling back a handful of percentage points, each and every one of these declines (which are traditionally a friend to tactical managers) was quickly reversed as the next round of QE/monetary stimulus/friendly Fedspeak arrived from the likes of Mr. Bullard (see the October decline bottom), the ECB, China, Japan, etc.

In addition, the market has been a bit of a bucking bronco of late as we’ve seen no fewer than 10 changes in direction since December alone – and 7 already this year!

Perhaps the best statistic I’ve seen that describes the environment is this… From 1955 through 2012 the S&P experienced a “V-Bottom” every 1.6 years on average. However, since the beginning of 2013, the market has experienced a “V-Bottom” every 1.5 months on average.

Then there is the “fund flow” issue, which is likely responsible for the V-Bottoms. By now, most folks know about the money that has been flowing steadily into the U.S. market. But let’s spend a moment to review the cause and effect here.

With the U.S. Fed and the Bank of Japan printing trillions in fresh cash via their QE programs last year, the bottom line is that money had to go somewhere. And what we saw is that an awful lot of it wound up in the U.S. stock and bond markets (remember, money goes where it is treated best).

And while the U.S. has stopped its QE program, the ECB has now stepped in to pick up the slack to the tune of €60 billion a month for a total of 19 months. As such, there is underlying support for the stock market whenever a “dip” takes place.

So, with the “fast money” selling all new highs and the fresh new QE money buying the declines, you wind up with a choppy, mean-reverting environment that has eventually managed to move steadily higher.

In sum, the fast-paced changes in direction have been tough as tactical managers have had a very difficult time dealing with this up-one-minute and then down-the-next market.

So, what is working, you ask? In essence, this continues to be a time to be more “strategic” in your approach to the markets. Or another way to put it is that your risk management techniques have needed to be longer-term in nature in order to succeed.

But perhaps the best answer is that investors, advisors, and clients alike have to learn to diversify their portfolios properly. And no, I’m not talking about that ancient MPT stuff.

What I’m talking about is diversifying your portfolio by strategy, by manager, and by methodology. In other words, don’t put all of your eggs in one basket. The bottom line is that ALL (yes, ALL!) investment managers/strategies experience periods of underperformance from time to time. So, don’t freak out when it happens to you or a money manager you employ.

In 2013 it was asset allocation that was made to look silly as the term “diworsification” made a comeback. But since then, allocating across asset classes has worked just fine. Then since the beginning of 2014, tactical strategies have clearly been picked on by Ms. Market.

So, is it time to give up on all those folks trying to manage risk or stay on the right side of the market? In a word, no.

While no one knows how long the current environment – the same environment that is causing tactical managers to struggle – will last, there is one thing we do know for certain about the stock market. And that is right about the time you’ve got the answer, the game changes.

So, instead of dumping those underperforming tactical managers, a true contrarian might be licking their chops at what will likely amount to an opportunity when the environment changes. And it will change, you can count on it.


Dave Moenning

David D. Moenning is President and Chief Investment Strategist for Heritage Capital Research.  Dave owned his own investment management firm and has been managing client accounts on a discretionary basis since 1987. Thus, Dave has been live, on the market’s firing line...


Editor’s note: this was published in late-April and is presented as an example of long-term technical analysis.

Two weeks ago our Market Comment summarized the big picture for the 2009-2015 bull market. We concluded that after a 97% gain over nearly three and a half years, the markets were due for a well deserved rest. While some market analysts expect a big correction or even the end of the bull market after such a strong advance, our view is that the markets can remain bullish and “correct” by marking time within a relatively narrow trading range.

What has happened so far? The S&P 500 has moved sideways for two months. Last Friday’s decline, driven mainly by the combination of option-expiry day and the end of the 70-day cycle, was contained comfortably above the 1,975 to 2,025 zone that we highlighted last time. On Monday the loss was recovered.

As the S&P 500 moves sideways, it is clear that the bearish forces are growing in confidence. The S&P 500 has been sluggish since the end of February and to date is unable to break out above 2,120. The bears claim that the Index is beginning to roll over. Yes, the Transports continue to lag the Industrials, internal momentum is weakening and investment adviser sentiment (a contrary indicator) is still relatively bullish. And we can expect the annual chorus of “sell in May and go away” to break into song shortly.

However, the bears have to prove their case convincingly, and up until now they have been unable to do so. Putting the markets under modest selling pressure for limited periods of time is not a fatal wound to this powerful bull uptrend.  There have been no serious breakdowns in the major market indices and they remain above their respective 200-day Moving Averages.

Toronto, with a significant correction behind it and being historically strong in the latter stages of bull markets, remains bullish. Sector rotation continues, and as our recent stock reports on IMO, CNQ and SU suggest, the energy sector (after a small correction) might be emerging as a new leader in the next advance.

There are several important levels to watch in the next few weeks. For the S&P 500: 2,040 is the immediate key support level and below that the mid-December low of 1,972.56 is critical. On the upside any sustained breakout above the late February high of 2,119.59 would be very bullish. For the S&P/TSX Composite Index a sustained move above 15,500 would be very bullish. The mid-March low of 14,606 is important for the bulls to defend.

Our expected scenario: With the markets slightly overbought, we expect that the bears will try once again to test important support levels. If supports are broken, it will only be a temporary victory for the bears. The bulls, armed with positive cycle forces appearing in May, will fight back successfully.

It is nevertheless important to recognize that any late-stage bull market contains risk. Stops should be set appropriate to risk tolerance, but any further weakness should present buying opportunities in selected sectors and stocks.

In sum, we expect that some selling pressure will continue but could soon exhaust itself and the bulls could re-assume clear control of the markets. A “spring fling” in the form of a “May upside surprise” – possibly in the midst of bad news – is exactly the outcome that will fool many market participants. This bull market has over six years’ history of confounding the majority.

The S&P 500 has made three peaks in the past two months on slightly declining internal momentum. The Index continues to move around its 50-day Moving Average. The inability to breakout above 2,120 remains a concern.

Good support remains from 1,970 to 2,050, which includes the rising 200-day Moving Average. The short-term bull/bear battleground could revolve around the 2,040 to 2,050 zone, with the bears looking to push the S&P 500 Index below this area.

If the bears retain the short-term initiative then we expect the S&P 500 to probe down towards its rising 200-day Moving Average. The bulls should be able to hold their ground in the low 2,000s and then launch an attack on the 2,119.59 high once again.

Last week’s rise by the S&P/TSX Composite Index above 15,500 was bullish. Toronto has been in a well-defined uptrend since its December 2014 low.

Toronto is slightly overbought after its recent rally. The 200-day Moving Average should provide good support around 14,900 along with the uptrend line from mid-December. There is further support in the 14,500 to 14,600 zone.

A pause may be needed to work off the short-term overbought condition. But as we said in the last Market Comment, the Toronto market is in a potentially very bullish position. A sustained move above 15,500 will be the prelude to a move to new bull market highs above 15,685.


Ron Meisels

Ron Meisels is Founder and President of Phases & Cycles Inc. with over 50 years of stock market experience. He specializes in the independent research of Canadian and U.S. securities and markets using Behavior Analysis. Institutions ranked him among the top three analysts...


Editor’s note: This was published by Intalus at their educational blog. The code for implementing this strategy in Tradesignal is available by clicking here.


Every year, the old adage “Sell in May and go away” is widely discussed in the financial media. But is this adage reality or just a myth? And can this simple seasonal strategy work at all if everyone knows it? Using Tradesignal, traders, portfolio managers and analysts can examine these and other seasonal pricing anomalies quickly and safely to improve their investment decisions.


The Financial Times wrote about the “Sell in May” effect as early as 1964, but it took decades for the academic world to study this phenomenon, which holds that shares should be avoided during the summer months. The most comprehensive study on this subject was published in 2002[1]. The research duo Sven Bouman and Ben Jacobsen examined a total of 37 international stock market indices of different industrialized and emerging countries and calculated the respective performance and risk figures for the period from November to April (the winter period) and the period from May to October (the summer period) in order to compare them.

The results were clear: 36 of the 37 examined stock markets revealed a significant “Sell in May” effect; that is, the performance of a buy-and-hold investor was worse than that of an investor who held no shares in the months of May through October and re-entered the market in November, staying invested until the end of April. Also, from a risk perspective, the winter period turned out to be a better choice than the summer period: the standard deviation was lower for the “Sell in May” strategy compared to the passive buy-and-hold approach.
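The Bouman/Jacobsen winter-versus-summer comparison is easy to reproduce on any return series. A minimal sketch, assuming a pandas Series of monthly returns (the function name is hypothetical):

```python
import pandas as pd

def seasonal_split(monthly_returns: pd.Series):
    """Split monthly returns (DatetimeIndex) into the November-April
    'winter' months and the May-October 'summer' months, and compound
    each leg, mirroring the Bouman/Jacobsen comparison."""
    winter_months = [11, 12, 1, 2, 3, 4]
    is_winter = monthly_returns.index.month.isin(winter_months)
    compound = lambda r: (1 + r).prod() - 1   # chain-link the monthly returns
    return compound(monthly_returns[is_winter]), compound(monthly_returns[~is_winter])
```

Comparing the two compounded legs (and their standard deviations) on your own market of interest is a quick sanity check of the effect.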


To assess the extent to which the seasonal pattern still works today, every trader should create their own analysis. One possibility here is the application of an indicator which enables you to visualize such a seasonal pattern in no time at all. Figure 1 shows the seasonal pattern of the Dow Jones Industrial Average on the basis of the last 10, 20, 30 and 40 years. The result confirms the formation of a major high between late April and mid-May and the subsequent sideways or downward movement of the stock markets in the summer months. At the same time, the formation of a seasonal low can typically be observed at the beginning of October. Interestingly, this pattern has not undergone any great variations over time.

Let’s take a look at how the equity curve of this primitive seasonal approach would have developed in the past for the DAX. Figure 2 shows impressively that staying away from the stock market during the summer months on balance represented a good decision. On the basis of the equity curve you can see that this simple trading rule has outperformed a passive buy-and-hold investment in the DAX over the last 20 years, both in terms of performance and on the basis of volatility.

This statement is also true for other benchmarks like the S&P 500. Especially in bear market phases the long exit by the end of April turned out to be an excellent timing signal, since the majority of the drawdowns could be avoided by following the “Sell in May” approach – the summer months of 1990, 2001, 2002 and 2008 serve as good examples here.

Admittedly, one thing should not be concealed at this point: no seasonal pattern is set in concrete! While skipping the summer months has been beneficial in strong downward phases, in certain bull market years the flip side of the coin is evident: an investor who followed the “Sell in May” strategy in 1993, 1997, 2005 or 2012 missed significant gains.


Finally let us analyze the results.

For the DAX, the “Sell in May” approach can be summarized as follows: since 1991, 22 trades were carried out, 19 of which ended with a positive result. Excluding transaction costs, a net gain of around 12,000 points in total was generated, and the largest drawdown was around 3,100 points. Figure 3 shows an excerpt of the performance report – this time for the Dow Jones Industrial Average since 1950.

With a hit rate of 75 per cent, the seasonal strategy was successful. The report also includes another important message: The “Sell in May” strategy generated four false signals in a row in the late seventies – proof that the seasonal approach by no means has to work every year.


The reasons for the “Sell in May” or Halloween effect still aren’t exactly clear, but one thing cannot be dismissed out of hand: in the past, the famous adage of selling at the beginning of May and re-entering the stock market in October very often proved to be the right decision. However, it must be noted that the assessment of this seasonal strategy depends greatly on both the chosen market and the investment period. Basically, this price anomaly is a kind of loss-limiting approach, which also has its price if the market rises over the summer.

[1] The study can be downloaded here:


David Pieper

David Pieper is co-founder and Systematic Strategies Specialist at Intalcon GmbH. David started his professional career as an investment advisor and has gained several years of experience in investment research at LBBW, Landesbank Baden-Württemberg. He holds a certificate as a Certified International...


Technical analysis has been used by traders for centuries. But it has only been taught as a for-credit course at universities since 1996. In that year, the MTA Educational Foundation partnered with the University of Richmond to teach “Technical Analysis and the Securities Market” to undergraduates.

Dick Dickson, CMT, was the instructor in 1996. Dick worked with Professor John Earl, who is now Chair of the University’s Finance Department, along with Ralph Acampora and Martin Pring, to introduce the course to the school. With Dr. Earl’s assistance, a presentation to the Dean was completed, and the course, which focused on the practical application of technical analysis to the markets, was immediately approved.

In addition to learning from Dick, students were provided with insights from some of the country’s leading technical analysts. Ralph Acampora, CMT, Steve Nison, CMT, Alan Shaw, CMT, Phil Roth, CMT and Martin Pring presented as guest lecturers.

One of the students in that first class, Katie Stockton, CMT, described a memorable moment from the class:

I remember Ralph tearing The Wall Street Journal in two pieces, suggesting it was something we could ignore because charts contain all of the information we need. Dramatic, yes – but it certainly got my attention!

Katie is now the Chief Technical Strategist for BTIG in New York City and believes that class provided a benefit to her career. She recalled, “My interest in technical analysis started in college. I was fortunate to be a student at the University of Richmond’s undergraduate business school, which pioneered a program in technical analysis in 1996.”

Technical analysis has been taught at the school ever since then. Other schools also offer courses in technical analysis.  This is a sign that “the acceptance of technical research as a viable tool for investing has increased dramatically” according to one of the first students in the University of Richmond’s class, Katie Stockton.

The course description in the current University of Richmond student catalog informs students that the course “involves study of supply and demand through data generated by the action of markets and through the study of psychology and behavior of the various market participants. Will cover basic tools of technical analysis including the Dow theory, techniques of chart construction and interpretation, momentum and cycle studies, relative strength, industry group analysis, investor sentiment, contrary opinion, and intermarket relationships. Emphasis will be placed on practical application of these tools to the investment decision-making process for both the short- and long-term. Studies will be taken from both historical and real-time situations.”

Technical analysis is also used to manage one of the two student-managed investment funds at the University of Richmond. Dick advised the students managing this fund in addition to teaching until he relocated to Florida to work with Lowry Research.

The syllabus for the course Dick taught in 1996 is reproduced below. This document could serve as a roadmap for CMT candidates. Each week builds on knowledge gained in the previous weeks. This is not just the model for a successful course in academia, it is also the model for acquiring the knowledge needed to pass the CMT exams.


Michael Carr, CMT

Mike Carr, who holds a Chartered Market Technician (CMT) designation, is a full-time trader and contributing editor for Banyan Hill Publishing, a leading investment newsletter service. He is an instructor at the New York Institute of Finance and a contributor to various additional...


In this year’s Charles H. Dow Award paper, I described how the VIX Fix indicator could be applied as a trading strategy.  The VIX Fix is an indicator developed by the 2014 MTA Annual Award winner Larry Williams. It applies the concept of the VIX to any stock or stock market index. Complete details of how the indicator is used are in the paper but the basic idea is that low volatility is bullish. I use a moving average to define when volatility is low or high.
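For reference, the VIX Fix is commonly computed as the percentage distance of the current low below the highest close of a trailing window, with a moving average then used to separate low- from high-volatility regimes. A sketch (function names and default parameters are illustrative, not necessarily the paper's monthly settings):

```python
import pandas as pd

def vix_fix(close: pd.Series, low: pd.Series, lookback: int = 22) -> pd.Series:
    """Larry Williams' VIX Fix: how far the current low sits below the
    highest close of the past `lookback` bars, expressed as a percentage."""
    highest_close = close.rolling(lookback).max()
    return (highest_close - low) / highest_close * 100

def high_volatility(wvf: pd.Series, ma_len: int = 12) -> pd.Series:
    """True where the VIX Fix is above its moving average (high volatility);
    the article treats readings below the average as bullish."""
    return wvf > wvf.rolling(ma_len).mean()
```

Run on monthly bars, readings crossing above the moving average flag the kind of volatility expansion the article associates with potential downturns.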

In the first chart below I have applied the VIX Fix to the monthly chart of PowerShares QQQ ETF (QQQ), a tradable security that tracks the NASDAQ 100 index. For now, the VIX Fix is below the moving average and there is no sell signal in this security. The chart highlights how well the indicator works with monthly data to identify bear markets.

In the next chart, I apply this indicator to India’s Nifty Fifty index. Here we see the VIX Fix has just moved up above its moving average indicating a potential downturn in the market.


Amber Hestla-Barnhart

Bio coming soon.