Technically Speaking, September 2015

LETTER FROM THE EDITOR

This month’s issue of the magazine is the sixth in a row featuring content from the Annual Symposium. That meeting lasts just a few days, but it truly does provide months’ worth of ideas for attendees. Planning is underway for the 2016 Symposium, and each year’s event has been better than the last. It’s not too early to start making plans to attend.

This month’s magazine also includes examples of the latest research into technical analysis and historical perspectives on the field. As always, we hope you find actionable ideas in each issue.

Remember, submissions for the 2016 Charles H. Dow Award are now being accepted. More details are available by clicking here. Submissions for other awards, including the MTA Annual Award and the Memorial Award, will also be accepted soon. If you know of someone who should be recognized with one of the MTA’s awards, now is the time to plan their nomination.

You can always provide feedback on Technically Speaking by emailing us at editor@mta.org.

Sincerely,
Michael Carr

What's Inside...

QUALITY TRENDS: BET ON QUALITY AND MONITOR CONSISTENCY

Editor’s note: Eoin Treacy is a global strategist at Fullermoney.com. He provided details on a strategy that could help...

Read More

DOW JONES’S 22,000 POINT MISTAKE

Editor’s note: In the previous article, Eoin Treacy highlighted the importance of the index weighting scheme to performance. In this...

Read More

SAM HALE, CMT - IN HIS OWN WORDS

Sam Hale, CMT, passed away on August 4, 2015 at the...

Read More

Editor’s note: Richard Wyckoff was an early influence on Sam Hale, CMT, as noted elsewhere in this issue. This article...

Read More

DISCIPLINARY ACTION

On August 11, 2015, the Board of Directors of the Market Technicians Association (MTA) suspended the MTA membership of Vishal...

Read More

INTERVIEW WITH TYLER YELL

How would you describe your job?

My role at DailyFX is as a currency analyst and trading instructor. As an...

Read More

BLOOMBERG BRIEF HIGHLIGHTS: CHART-BASED TRADING STRATEGIES

Editor’s note: This article was originally published in the August 27 issue of Bloomberg Brief: Technical Strategies. Below is an...

Read More

THE SANDPIPER AND TRADING: HOW YOU CAN USE VOLATILITY MEASUREMENT TO GAIN AN ADVANTAGE IN YOUR TRADING

Editor’s note: This article was originally published at The Educated Analyst, an education blog maintained by Market Analyst.

As a...

Read More

PREDICTIVE MEDIA CONTENT ANALYTICS; 24/7 INFORMATION HAS FOREVER CHANGED FINANCIAL MARKET STRATEGIES

Editor’s note: Sentiment analysis has long been a part of technical analysis. In the past, sentiment was often measured in...

Read More

THERE ARE NO LONG-TERM STOCK MARKET TRENDS

In mid-July I was interviewed on Business Talk Radio.net and something very interesting happened. I was asked how long the longest...

Read More

RESEARCH UPDATE: HIGH FREQUENCY TRADING: A TOOL OF PANACEA OR FIASCO FOR MARKET LIQUIDITY AND ROBUSTNESS?

Abstract: The report discusses the merits of High Frequency Trading and its rapid growth as the future of...

Read More

QUALITY TRENDS: BET ON QUALITY AND MONITOR CONSISTENCY

Editor’s note: Eoin Treacy is a global strategist at Fullermoney.com. He provided details on a strategy that could help investors beat their benchmark at the MTA Annual Symposium in March. The complete presentation is available at the MTA’s Knowledge Base, the web’s free repository for everything related to technical analysis.

Many investment managers and individual investors set a relatively simple goal for themselves: they want to beat the benchmark. At the beginning of his presentation, Eoin Treacy explained the problem with that goal and then explained how it can be achieved.

  • To begin with, beating the benchmark is hard.
  • It can be done, but the only way it can be done is to try something different.

Once again, this goal sounds deceptively simple. There are many ways for an investment manager to do something different. The biggest problem here, as many investment managers have learned, is that doing something different can get you into trouble with your clients if it doesn’t work. Doing something different could also attract the attention of regulators if it doesn’t work well. In other words, career risk is often much more of a challenge than any market risk: being wrong by yourself could get you fired, but being wrong with everybody else by closely tracking a benchmark carries little career risk.

There are some general rules to help investment managers think differently. One idea is to focus on equal weighting rather than following the market cap weighting scheme used in most indexes. Successful investment managers often manage concentrated rather than broadly diversified portfolios. Portfolio concentration requires a high degree of conviction rather than a desire to hedge both market and career risks. Eoin also believes successful managers should bet on quality rather than mediocrity when selecting markets, sectors and individual stocks. To achieve success, he believes it’s important to identify where the crowd is going rather than following the crowd.
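To make the weighting distinction concrete, here is a minimal sketch of how an equal-weighted portfolio return can differ from a cap-weighted one. The names, market caps and returns are hypothetical values chosen only to illustrate the point, not data from the presentation.

```python
# Illustrative comparison of cap-weighted vs. equal-weighted portfolio returns.
# All names, market caps and returns below are hypothetical examples.

def portfolio_return(weights, returns):
    """Weighted sum of the individual returns."""
    return sum(w * r for w, r in zip(weights, returns))

# Hypothetical universe: name -> (market cap in $bn, one-period return)
universe = {
    "MEGA": (500.0, 0.02),   # very large company, small gain
    "LARGE": (100.0, 0.04),
    "MID": (20.0, 0.08),
    "SMALL": (5.0, 0.12),    # small company, large gain
}

caps = [cap for cap, _ in universe.values()]
rets = [ret for _, ret in universe.values()]

cap_weights = [c / sum(caps) for c in caps]
equal_weights = [1.0 / len(universe)] * len(universe)

print(f"Cap-weighted return:   {portfolio_return(cap_weights, rets):.2%}")
print(f"Equal-weighted return: {portfolio_return(equal_weights, rets):.2%}")
```

In this toy example the equal-weighted return is more than double the cap-weighted return because the smaller names, which carry little weight in a cap-weighted index, happened to perform best.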

To focus on something different, Eoin focuses on themes. He is looking for the best country or the best sector to beat the market. Looking ahead, he sees three big themes affecting the stock market:

  1. The rise of the global middle class.
  2. The exponential pace of technological innovation.
  3. Lower energy prices in real terms are not a short-term phenomenon.

A bull market requires liquidity and productivity growth. Global central banks are providing liquidity growth. Productivity growth is driven by two factors: the number of workers and the ability of each worker to produce more. Over the next twenty years, about 2 billion consumers will enter the global middle class, creating demand for new products while supplying the labor pool to create those products. Technological innovation will also help to increase productivity and is likely to exceed expectations. In the past, innovation has often been underestimated.

The third theme, lower energy prices, can be seen in a long-term chart of oil. Eoin noted, “Looking at the long-term chart of oil prices on a log scale we are presented with a series of cycles where prices rally impressively for a number of years but give way to more than decade long volatile ranging patterns. This is now the most likely path for oil prices following what had been a prolonged period of strong pricing.”

At the end of March, he noted the lows for oil are likely to be near $35 to $45 a barrel, a price range covering the recent price action in the market. Eoin also noted in his presentation that trends to follow in energy prices include more than just the price of oil. Solar power is an example of technological innovation. He believes technological innovation will be an important factor in oil markets, contributing to increased productivity and lower prices, and that it will also lead to increased availability and lower costs for solar and possibly other alternative power sources.

To benefit from the big themes, Eoin believes global large cap stocks that Fuller Money calls Autonomies will provide long-term profits. These companies are leaders in their fields and are truly global. He defines a global leader as one that generates at least 40% of its revenue outside its domestic market. Autonomies have solid records of expanding into new markets and they have strong balance sheets. These companies often have a long history of paying dividends, and their charts often show a pattern that can be thought of as a “big base.”

In his presentation, Eoin showed several charts of Autonomies.

These companies could be in the early stages of a multi-year bull market. Momentum will be a factor that helps define when the bull market comes to an end and as technicians we know momentum could change at any time.

Contributor(s)

Eoin Treacy

Eoin Treacy is a technical analyst at Fuller Treacy Money, as well as a writer, strategist, commentator and lecturer. He joined David Fuller at Fuller Money Global Strategy Service in 2003, and engaged in a management buyout of the company with David to...

Michael Carr, CMT

Mike Carr, who holds a Chartered Market Technician (CMT) designation, is a full-time trader and contributing editor for Banyan Hill Publishing, a leading investment newsletter service. He is an instructor at the New York Institute of Finance and a contributor to various additional...

DOW JONES’S 22,000 POINT MISTAKE

Editor’s note: In the previous article, Eoin Treacy highlighted the importance of the index weighting scheme to performance. In this article Dr. Bryan Taylor demonstrates that decisions made by index committees can have large impacts on an index’s performance. This article was originally published at the Global Financial Data blog and is reprinted here with permission.

One of the long-term components of the Dow Jones Industrial Average has been IBM. The company was originally added to the Dow Jones Industrials on March 26, 1932 in a reshuffle involving eight stocks including Coca-Cola, Nash Motors (later American Motors) and Procter & Gamble. On March 13, 1939, however, both IBM and Nash Motors were removed from the average and replaced by American Telephone & Telegraph and United Aircraft Corp. (now United Technologies).

AT&T was in the Dow Jones Utilities Average until June 1, 1938. Until then, the Dow committee had interpreted utilities in a broader sense to include electric, gas, and communications companies as providers of essential services. In 1938, the Dow Jones committee decided to restrict membership in the Utilities Average to power utilities.

The resulting reshuffle removed nine stocks, including AT&T, International Telephone & Telegraph, and Western Union, all of which were communications utilities rather than power utilities, from the Dow Jones Utilities Average. Since AT&T was such a huge company, it was moved over to the Dow Jones Industrial Average which required that another stock be removed to make room for AT&T. Thus, IBM was kicked out of the Dow Jones Industrial Average.

What if the Dow Jones committee had not redefined the Utilities Average to only include power utilities? What if IBM had stayed in the Dow Jones Industrial Average between March 13, 1939 when it was removed and June 29, 1979 when IBM replaced Chrysler in the Dow Jones Industrials? Obviously, the Dow Jones Industrials would be higher than it is today, but how much higher?

International Business Machines incorporated on June 16, 1911 as The Computing-Tabulating-Recording Co., a merger of The Computing Scale Company of America, The Tabulating Machine Company and The International Time Recording Company of New York. The company listed on the NYSE in November 1915, and on February 14, 1924, the company acquired International Business Machines and changed its name in a reverse acquisition.

IBM has been one of the best performers on the stock exchange in history. If you had invested $1 in IBM when it started trading OTC in August 1911, it would have grown to $40,000 today on a price basis. If you had reinvested your dividends, your $1 investment would have grown to $1,434,300. In the past 100 years, IBM has given over a million-fold return. The graph below shows the performance of IBM stock over the past 100 years.
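As a rough check on the scale of those numbers, here is a minimal sketch of the compound-growth arithmetic implied by the dollar figures quoted above. The 104-year holding period and the resulting annualized rates are back-of-the-envelope estimates, not figures from Global Financial Data.

```python
# Back-of-the-envelope annualized return implied by the figures quoted above:
# $1 grows to $40,000 (price only) or $1,434,300 (dividends reinvested)
# over roughly 104 years (August 1911 to 2015).

YEARS = 104

def cagr(ending_value, starting_value=1.0, years=YEARS):
    """Compound annual growth rate implied by a start value, an end value and a horizon."""
    return (ending_value / starting_value) ** (1.0 / years) - 1.0

print(f"Price-only CAGR:   {cagr(40_000):.1%}")      # roughly 10-11% per year
print(f"Total-return CAGR: {cagr(1_434_300):.1%}")   # roughly 14-15% per year
```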

AT&T incorporated in New York on March 3, 1885 and began trading on the NYSE in May 1900 after it had acquired American Bell Telephone Co. in March 1900. The company was forced to split up into “Ma Bell” and the “Baby Bells” by the U.S. Government on December 31, 1983. On November 18, 2005, AT&T Corp. (“Ma Bell”) was acquired by one of the Baby Bells, SBC Communications, which then changed its name to AT&T Inc. in a reverse acquisition.

AT&T has not performed as well as IBM over the past 100 years. If you had invested $1 in AT&T in May 1900, your investment would have grown to only $4.26 on a price basis, or $639 if you had reinvested all of your dividends back in the company, by the time AT&T was broken up in February 1984.

So what if the Dow Jones Committee had kept IBM in the Dow Jones Average between March 1939 and June 1979 and had never admitted AT&T, keeping it in the Utilities Average? What would the result have been? IBM closed at 187.25 on March 14, 1939 while AT&T closed at 166.125. IBM closed at 73.375 on June 29, 1979 while AT&T closed at 57.875. Price wise, the results appear to be similar.

The difference is that both stocks split, and the stocks had several rights offerings in the intervening 40 years. The cumulative effect of these stock splits and rights offerings is significant. You would have to adjust the stock price of AT&T by 7.15 to allow for the impact of stock splits and rights offerings, but you would have to adjust IBM stock by a factor of 562.48. If neither stock had split or provided rights offerings in those intervening forty years, AT&T stock would have been at 414 in June 1979, but IBM would have been at 41,272. IBM increased one hundred times more than AT&T during those intervening forty years.
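Here is a minimal sketch of that split and rights-offering adjustment, using only the closing prices and cumulative factors quoted above. Reproducing the full 23,582 index figure discussed below would additionally require the DJIA divisor history, which is not shown here.

```python
# Split/rights-offering adjustment arithmetic from the figures quoted above.
# Each June 1979 closing price is scaled by its cumulative adjustment factor
# to express it in unsplit, March 1939 terms.

stocks = {
    # name: (close on 1939-03-14, close on 1979-06-29, cumulative split/rights factor)
    "AT&T": (166.125, 57.875, 7.15),
    "IBM": (187.25, 73.375, 562.48),
}

for name, (price_1939, price_1979, factor) in stocks.items():
    unsplit_1979 = price_1979 * factor      # 1979 price as if no splits had occurred
    growth = unsplit_1979 / price_1939      # price multiple over the forty years
    print(f"{name}: unsplit 1979 price ~{unsplit_1979:,.0f}, {growth:,.1f}x the 1939 price")
```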

The DJIA stood at 151.1 on March 14, 1939 and 841.98 on June 29, 1979. Since the DJIA is price weighted, you can remove the impact of AT&T on the DJIA by subtracting out the price of AT&T allowing for the splits, and replacing this amount with the value of IBM stock, allowing for the splits in IBM. If you do this, you would find that the DJIA would have been at 23,582 in June 1979, not 841.98. In other words, IBM would have added 22,740 points to the DJIA had it never been removed.

The DJIA is currently trading above 16,000. If you add 22,740 points to this value, you would arrive at a DJIA close to 39,000. If IBM had stayed in the DJIA, CNBC and The Wall Street Journal would be preparing for the DJIA’s approaching rendezvous with 40,000. However, since the Dow Jones Committee removed IBM from the Dow Jones Industrial Average in 1939 and kept it out for forty years, we will have to wait several more decades to reach that goal.

Contributor(s)

Dr. Bryan Taylor

Dr. Bryan Taylor serves as President and Chief Economist for Global Financial Data. He received his B.A. from Rhodes College, his M.A. from the University of South Carolina in International Relations, and...

SAM HALE, CMT - IN HIS OWN WORDS

Sam Hale, CMT, passed away on August 4, 2015 at the age of 78. Sam spent many years as the Senior Technical Analyst – Futures and Options Divisions at A. G. Edwards before retiring in 2002. Prior to that, Sam enjoyed a successful, although brief, career in broadcasting in Atlanta, Chicago and New York. He shifted the markets from being his avocation into his full-time career in 1966.

His Wall Street career included seven years at Dean Witter before forming his own NASD research boutique. After three years as president of this firm, he became a member of the CBOE and a registered Market Maker in IBM, EK, and GM. While on the floor he remained a market timing consultant to other market makers, as well as to a member firm. He was recruited by A. G. Edwards & Sons, Inc. in 1984 where, as the senior technical analyst for the Futures and Options Department, he provided daily comments and forecasts on the stock index futures and options, as well as U.S. Treasury bonds and soybean futures. He was awarded the CMT designation in 1995. Sam was a past MTA Board Member and affiliate of the New York Society of Security Analysts. He was Chairman of the memorable MTA Silver Anniversary Seminar. That Seminar included remarks by eight MTA Annual Award winners and the never-to-be-forgotten laser show presented by the 2015 MTA Annual Award Winner, Walt Deemer. Sam was recognized with the Annual Award in 2009.

Sam’s retirement was eventful as he explained:

In March 2006, I suffered “sudden cardiac death” and was miraculously resuscitated three times. This was a life changer and, since then, I’ve devoted much of my time to ministering to widows and others who have suffered losses as a result of mismanagement, unauthorized discretion and/or failure to supervise. This ministry began after I received a phone call from the newly widowed wife of a friend of mine. I had only met her at the memorial service, though I had known him from my days in broadcasting. She asked if I would look at her accounts to see if she should “be in those stocks”. I sensed right away there had been serious abuse and began an effort to right that wrong. It required a period of 10 months, but, in the end, without the assistance of an outside attorney (who would have taken 30% of the settlement proceeds – if there were any) I was able to get her a cash deposit back to her account of $300,000. This was the beginning, and I’ve now been approached by, and have helped, 11 widows. Only one knows any of the others. I consider this a clear revelation of my “calling” and give thanks for every breath I take.

Over the years, Sam answered many questions from MTA members. Below are some of the words of wisdom he shared previously with readers of Technically Speaking.

When you started your professional career in the markets, the Dow had just touched 1,000 for the first time. It would be 18 years before it finally broke that resistance level for good. Did you employ market timing techniques during this time period? If so, what tools worked well? If not, what was the key to your success in this tough market?

I actually became interested in the market in 1960. I was doing the morning show at WJJD from studios at 230 N. Michigan Avenue in Chicago. (For more on Sam’s radio career, please click here.) Every day I would walk past the Merrill-Lynch showcase office a few buildings up the street where all the teleregister quotation boards were clicking and clacking. I had often read the “Dow Jones” numbers in newscasts but did not even know what those numbers represented. I stopped in the ML office one morning and was handed a book, How to Buy Stocks, by Louis Engel. I was “hooked”.

There weren’t the thousands of market related books back then. I soon had soaked up all I could get my hands on as if I were a sponge. I soon discovered “The Wyckoff Course” offered by Bob Evans out in Park Ridge and purchased the two leather-bound volumes with the fold over locked flap. Back then, long before other publishers bought SMI, Mr. Evans circulated taped “lessons” weekly. To this day, I rely on some of the principles I learned from the writings of Mr. Wyckoff.

My air shifts were generally only three hours and production recordings would require no more than one to two hours.  So, there was plenty of time to sit in the visitor’s galleries that were available in most brokerage offices back then and “read the tape”. After making eight times in the market what I made in broadcasting in 1965, I decided to test for a brokerage company job.

Enough background. Now to your question.

The backbone of my analyses was the Wyckoff P&F methodology. When based upon accurate data, correctly plotted, and a couple of simple “rules” followed with complete discipline, it has stood the test of time. The problem most would encounter with these graphs is the lack of quality data as there is no such thing as “tape reading” today.

I was surprised to learn that most commercial providers of P&F graphs interpolate the changes from the daily open, high, low and close data; not from the tick by tick data. Of course, the consolidated tape has become like the old commodity tickers that show representative prices, not every transaction. The entire data stream is not readable by the human eye – it’s a blur. Only with computers capable of capturing all these data points can one be confident that their graphs are accurate. Then, there is the problem of “corrections”. My computers scan the tape from the end of the session to pick up all the deletions, insertions and corrections.

In my opinion, those who ridicule the P&F approach are either not doing it correctly, or use inaccurate data. Or, maybe they are too lazy to do the work! Even if I had no other indicators, I could operate successfully in the markets with P&F.  Ask Bob Farrell, Alan Shaw or John Greeley what they think about this method.

You maintained P&F charts by hand. Can you comment on why you prefer this technique to merely looking at a software-generated chart?

I do have programs that generate these graphs but, you’re right, it also prints out a listing of the daily changes that I plot by hand. It keeps my attention focused and “close” to the market. Perhaps it is just psychological, but the habit persists.  I know a few other successful money managers who do a select number of bar charts (and P&F) by hand.

You’ve known many market technicians over the years. In the program for the MTA 25th Anniversary Seminar you cite James Alphier as, “the most thoroughly knowledgeable person I have ever met.” Many of our newer members know nothing about Jim except that his extensive research collection was recently donated to the MTA (Technically Speaking, June 2004). Can you tell us a little about the person and technician that Jim Alphier was?

Wow! You’ve done your homework on me. Actually, the comment about Jim was prefaced with … “In market history and lore”; he was the most thoroughly knowledgeable person I have ever met. That is true. There are others right up there with him, with whom I’ve become more acquainted in recent years, including Bill Doane and Walter Deemer.

Jim Alphier’s “passion” was the study of the markets. When we worked together (over 25 years ago), Jim had already accumulated a half-dozen filing cabinets filled with correspondence, data, research reports, etc. He contributed the chapter on technical analysis to the original issue of the classic, “The Commodity Futures Game”, for which he received minimal credit. Years ago, Bob Prechter told me that Jim’s paragraphs on the Elliott Wave were the most clearly written piece on EW by another that he had ever read.

Jim was doing all his “sentiment” indicators by hand when he arrived. Only after we entered a cross-confidentiality agreement did he permit me to do him the favor of programming those indicators on my old PDP-8/I minicomputer. To this day, I have honored that confidentiality and have never personally used or divulged his formulae. I was disappointed to learn from John Bollinger that these were not among the items contained in the treasure chests contributed to the MTA Educational Foundation. I plan to seek an opinion from an intellectual property attorney to see if it would be proper for me to leave them to the MTAEF upon my death. However, they may be of little, or no, current value in today’s market. I don’t recall their composition and shall not even look at them. I do recall they had intriguing titles.

In your work as an analyst, what analysis framework did you employ?

I am a “pure” technician. Long ago I realized that I did not have the talent for synthesizing fundamental data into my work and comments. This actually was a handicap in writing for brokers as they (and their clients) expect to have a “reason” for the observed effect, or forecast. Such things as John Carder’s fantastic “Earnings/S&P 500” graph of the 1973-1974 period; Paul Montgomery/Ned Davis’ magazine covers; or Walt Deemer’s McDonald’s earnings growth/stock price pieces are enough to convince me that I must stay focused on the methods that work for me. Fortunately, over time, my performance record overcame the need to comment on the fundamentals. Don’t get me wrong, I am not criticizing those who are capable of economic commentaries, I admire that talent. I just know that it is beyond me. My motto has always been, “I don’t care what they’re saying, I just care what they’re doing”!

What were your favorite indicators? Did you rely on the same indicators throughout your career? If not, did you stop using indicators because the indicator lost effectiveness?

In a project with GA Tech, I was the first “individual” allowed by the Communications Department of the NYSE to connect a mini-computer to their ticker in 1968. One of the indicators I constructed from those data was the best measure of supply/demand that I could ever imagine — the average volume per trade on price advances vis a vis declines.
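As I read that description, the indicator compares the average trade size on upticks with the average trade size on downticks. Below is a minimal sketch of that calculation over a stream of (price, size) ticks; the sample ticks are invented for illustration, and the exact construction Sam used is not documented here.

```python
# Average volume per trade on price advances vs. declines, a sketch of the
# supply/demand measure described above. The tick data is invented for illustration.

ticks = [  # (price, shares) in time order
    (50.00, 200), (50.05, 1200), (50.05, 300), (50.10, 2500),
    (50.05, 100), (50.15, 1800), (50.10, 150), (50.20, 3000),
]

up_volume = up_trades = 0
down_volume = down_trades = 0

last_price = ticks[0][0]
for price, size in ticks[1:]:
    if price > last_price:        # uptick: demand
        up_volume += size
        up_trades += 1
    elif price < last_price:      # downtick: supply
        down_volume += size
        down_trades += 1
    # trades at an unchanged price are ignored
    last_price = price

avg_up = up_volume / up_trades if up_trades else 0.0
avg_down = down_volume / down_trades if down_trades else 0.0
print(f"Average trade size on advances: {avg_up:,.0f} shares")
print(f"Average trade size on declines: {avg_down:,.0f} shares")
if avg_down:
    print(f"Demand/supply ratio: {avg_up / avg_down:.2f}")
```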

In researching for the 1995 MTA Journal paper on the ticker tape, I spoke with Don Worden and asked why he no longer used tick data in the studies for which he had become famous. He said because, “It doesn’t work anymore”. At that time, I reasoned that perhaps the reason for its “not working” for him was that he might be accepting all the trades from the consolidated tape as if they were chronologically sequential. They are not.

Trades in NYSE stocks from the regional exchanges and NASDAQ may be reported to the tape as much as 90 seconds behind the actual trade and still be within the rules. Therefore, I looked only at the NYSE prints. (The high-speed tape contains an electronic flag indicating the origin of the print.) However, now with “program trading” having increased to represent half, or more, of the total volume, I question the validity of my readings as program trades are arbitrage/hedge transactions that do not represent “pure” demand and supply.

Do the same analytical techniques apply to tick data as to daily or longer-term charts?

I assume you are referring to classical charting techniques and the simple answer would be yes. If a 15-minute bar chart of QQQ were not labeled, you would not be able to discern that it was not a daily graph just by glancing at it.

One important difference I have observed is volume. There is a common adage that a reaction on decreasing activity (volume) is bullish. When I traded on the floor of the CBOE I found that to generally be true on an intraday basis. We called it “tone change”. However, in the macro view, belief in such a “principle” could be costly. Looking at historical graphs you will notice that (on a trend basis) bear markets are accompanied by decreasing volume. Lows are recorded on low activity and highs on high activity.

You say, “Wait a minute! What about selling climaxes?”

My answer to that is that a sustained up-trend rarely, if ever, begins from a selling climax. The sustained trend begins from the secondary test. The volume contracts on that successful secondary test.

You have done some work with Fibonacci targets. Do you have any thoughts as to why this technique works?

Of course it is pure conjecture, but I think there is so much yet to be learned and these relationships remain a mystery.  The notion that such “retracements” are of actual value to traders is “poo-pooed” by some noted analysts. They have run historical tests of the simple phi and PHI measures on a large set of data and determined no predictive value. I would suggest that their analysis is too simplified. One must dig deeper.

Can you comment on any specific Fibonacci techniques you find useful?

I look at a market index or average as being a simple measure of mass psychology. There are patterns in human behavior, some of which can be discerned in market statistics. Analysis that incorporates “Fibonacci” techniques requires one to go well beyond the cursory analysis that I often see quoted. I touched on some of this in the chapter I contributed to the 2002 book, Market Analysis for the New Millennium, edited by Robert R. Prechter, Jr. and available in the MTA library.

That other irrational number, PI, is also valuable in such studies. In fact, the 8.6-year “cycle” promoted by “Princeton Economics” a few years ago is simply PI in calendar days: 1,000 times PI is about 3,141 days, or roughly 8.6 years.

Over your career, you succeeded in almost every position the investment business offers. Do you think there is a different psychology required to succeed in publishing analysis and trading? Or, do you simply apply the same personal discipline to master diverse tasks?

In my humble opinion, there are at least three distinct qualities required for an individual to be successful in the markets. Here, I am referring to a discretionary, not systems, trader. It is imperative that the operator understands these and honestly evaluates their own skills. Analysis, strategy and execution are the areas to be mastered. Some creditable analysts are not successful traders. Some may actually also be good strategists, yet not be successful in long-term trading. Few, in my opinion, have all these skills. Therefore, one must recognize where they are weak and secure input from one talented in that area. So, first you must know yourself. Importantly, this means knowing your risk tolerances and the time frame in which you are most comfortable operating.

Do you have any advice for those starting out in the analyst profession today?

If you don’t have a “passion” for market analysis, discover where your passion is and go there. If you have the passion, nothing will keep you from reading, listening and then, proving it for yourself. There are many that could be classified as “charlatans” who have done a tremendous disservice to the craft by marketing their “discoveries” as the Holy Grail. No one has “the” answer. You must learn what works for you and then simply have the discipline to ALWAYS play YOUR game.

If you can, hook-up (intern) with an achiever in the market. I am so very proud of the interns I engaged from Emory and GA Tech over a number of years. They had little or no introduction to technical analysis before coming with me and several will tell you it changed their life. One is a Goldman Sachs-SLK Specialist on the NYSE floor; one a quantitative analyst with an international hedge fund, another an analyst with a boutique whose service is used by the elite at a base cost of $75K  per year. These guys (and gals) were not successful because of me. I selected winners to begin with. Super bright, eager to learn and the discipline to expend the time and energy required – I simply recognized their ability and pointed them toward their “passion”.

Thanks, Sam, for your time and insights.

Contributor(s)

WYCKOFF LAWS AND TESTS

Editor’s note: Richard Wyckoff was an early influence on Sam Hale, CMT, as noted elsewhere in this issue. This article was originally published at www.HankPruden.com as an example of the application of Wyckoff’s principles.

Wyckoff is a name gaining celebrity status in the world of Technical Analysis and Trading. Richard D. Wyckoff, the man, worked in New York City during a “golden age” for technical analysis that existed during the early decades of the 20th Century. Wyckoff was a contemporary of Edwin Lefèvre, who wrote The Reminiscences of a Stock Operator. Like Lefèvre, Wyckoff was a keen observer and reporter who codified the best practices of the celebrated stock and commodity operators of that era. The results of Richard Wyckoff’s effort became known as the Wyckoff Method of Technical Analysis and Stock Speculation.

The Wyckoff Method is a practical, straightforward bar chart and point-and-figure chart pattern recognition method that, since the founding of the Wyckoff and Associates educational enterprise in the early 1930s, has stood the test of time.

Around 1990, after ten years of trial-and-error with a variety of technical analysis systems and approaches, the Wyckoff Method became the mainstay of The Graduate Certificate in Technical Market Analysis at Golden Gate University in San Francisco, California, U.S.A. During the past decade dozens of Golden Gate graduates have gone on to successfully apply the Wyckoff Method to futures, equities, fixed income and foreign exchange markets using a range of time frames. Then in 2002 Mr. David Penn, in a Technical Analysis of Stocks & Commodities magazine article, named Richard D. Wyckoff one of the five “Titans of Technical Analysis.” Finally, Wyckoff is prominent on the agenda of the International Federation of Technical Analysts (IFTA) for inclusion in the forthcoming Body of Knowledge of Technical Analysis.

The Wyckoff Method has withstood the test of time. Nonetheless, this article proposes to subject the Wyckoff Method to the further challenge of a real-time test under the natural laboratory conditions of the current U.S. stock market. To set up this “test,” three fundamental laws of the Wyckoff Method will be defined and applied.

THREE WYCKOFF LAWS

The Wyckoff Method is a school of thought in technical market analysis that necessitates judgment. Although the Wyckoff Method is not a mechanical system per se, high reward/low risk opportunities can nevertheless be routinely and systematically identified based on what Wyckoff identified as three fundamental laws (see Table #1).

PRESENT POSITION OF THE U.S. STOCK MARKET IN 2003: BULLISH

Charts #1 and #2 show the application of the three Wyckoff Laws to U.S. stocks during 2002-2003. Chart #1, a bar chart, shows the decline in price during 2001-02, an inverse head-and-shoulders base formed during 2002-2003 and the start of a new bull market during March/June 2003. The upward trend reversal defined by the Law of Supply vs. Demand, exhibited in the lower part of the chart, was presaged by the positive divergences signaled by the Optimism/Pessimism (on-balance-volume) Index. These expressions of positive divergence in late 2002 and early 2003 showed the Law of Effort (volume) versus Result (price) in action. Those divergences reveal an exhaustion in supply and the rising dominance of demand or accumulation.

The bullish price trend during 2003 was confirmed by the steeply rising OBV index; the accumulation that took place during the trading range continued as the price rose in 2003. Together, the Laws of Supply and Demand and Effort vs. Result revealed a powerful bull market underway.

THE “NINE CLASSIC BUYING TESTS” OF THE WYCKOFF METHOD

The classic set of “Nine Classic Buying Tests” (and “Nine Selling Tests”) was designed to diagnose significant reversal formations: the “Nine Classic Buying Tests” define the emergence of a new bull trend (See Table #2). A new bull trend emerges out of a base that forms after a significant price decline. (The “Nine Selling Tests” help define the onset of a bear trend out of a top formation following a significant advance.) These nine classic tests of Wyckoff are logical, time-tested, and reliable.

As the reader approaches this case of “Nine Classic Buying Tests,” he/she ought to keep in mind the following admonitions from the Reminiscences of a Stock Operator:

“The average ticker hound – or, as they used to call him, tapeworm – goes wrong, I suspect, as much from overspecialization as from anything else. It means a highly expensive inelasticity. After all, the game of speculation isn’t all mathematics or set rules, however rigid the main laws may be. Even in my tape reading something enters that is more than mere arithmetic. There is what I call the behavior of a stock, actions that enable you to judge whether or not it is going to proceed in accordance with the precedents that your observation has noted. If a stock doesn’t act right don’t touch it; because, being unable to tell precisely what is wrong, you cannot tell which way it is going. No diagnosis, no prognosis. No prognosis, no profit.

“This experience has been the experience of so many traders so many times that I can give this rule: In a narrow market, when prices are not getting anywhere to speak of but move within a narrow range, there is no sense in trying to anticipate what the next big movement is going to be – up or down. The thing to do is to watch the market, read the tape to determine the limits of the get-nowhere prices, and make up your mind that you will not take an interest until the price breaks through the limit in either direction. A speculator must concern himself with making money out of the market and not with insisting that the tape must agree with him.

“Therefore, the thing to determine is the speculative line of least resistance at the moment of trading; and what he should wait for is the moment when that line defines itself, because that is his signal to get busy.”

Point #4 on the charts identifies the juncture when all Nine Wyckoff Buying Tests were passed. The passage of all nine tests confirmed that an uptrending or markup phase had begun. The passage of all Nine Buying Tests determined that the speculative line of least resistance was to the upside.

FUTURE: A MARKET TEST IN 2004

The authors, as academics, are intrigued by the natural laboratory conditions of the stock market. A prediction study is the sine qua non of a good laboratory experiment. The Wyckoff Law of Cause and Effect seemed to us to provide an unusually fine instrument for conducting such an experiment, a “forward test.” Parenthetically, it has been our feeling, shared by academics in general, that technicians have focused too heavily upon “backtesting” and not sufficiently upon real experimentation. The time series and metric nature of the market data allow for “forward testing.” Forward testing necessitates prediction, followed by the empirical test of the prediction with market data that tell what actually happened.

How far will this bull market rise? Wyckoff used the Law of Cause and Effect and the point-and-figure chart to answer the question of “how far.” Using the inverse head-and-shoulders formation as the base of accumulation from which to take a measurement of the “cause” built during the accumulation phase, the point-and-figure chart (Chart #2) indicates 72 boxes between the right inverse shoulder and the left inverse shoulder. Each box has a value of 100 Dow points. Hence, the point-and-figure chart reveals a base of accumulation for a potential rise of 7,200 points. When added to the low of 7,200, this projects a price target of 14,400. Hence, the expectation is for the Dow Industrials to continue to rise to 14,400 before the onset of distribution and the commencement of the next bear market. If the Dow during 2004-2005 comes within plus or minus 10% of the projected 7,200-point rise, we will accept the prediction as having been positive.
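The horizontal count arithmetic behind that projection is simple enough to sketch directly from the figures quoted above (72 boxes, 100 Dow points per box, and a base low of roughly 7,200):

```python
# Wyckoff Law of Cause and Effect: point-and-figure horizontal count,
# using the figures quoted in the text above.

box_count = 72       # boxes counted across the accumulation base
box_size = 100       # Dow points per box
base_low = 7_200     # approximate low of the accumulation range

cause = box_count * box_size    # 7,200 points of "cause"
target = base_low + cause       # projected "effect"

print(f"Projected rise: {cause:,} points")
print(f"Price target:   {target:,}")    # 14,400, as stated in the text
```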

Editor’s note: In 2005, the high was 10,984. Although this was below the forecasted high, the Dow continued moving higher without a significant pullback until October 2007 when it reached 14,197.

CONCLUSIONS

In summary, U.S. equities are in a bull market with a potential to rise to Dow Jones 14,400. The anticipation is for the continuance of this powerful bull market in the Dow Industrial Average of the U.S.A. through 2004. This market forecast is the “test” to which the Wyckoff Method of Technical Analysis is being subjected.

Part (B) of “Wyckoff Laws: A Market Test” will be a report in year 2005 about “What Actually Happened.” As with classical laboratory experiments, the results will be recorded, interpreted and appraised. This sequel will invite a critical appraisal of the Wyckoff Laws and in particular a critical appraisal of the Wyckoff Law of Effort vs. Result. The quality of the author’s application of the Wyckoff Laws will also undergo a critique. From these investigations and appraisals, we shall strive to extract lessons for the improvement of technical market analyses. Irrespective of the outcomes of this market test, we are confident that the appreciation of the Wyckoff Method of Technical Market Analysis will advance and that the stature of Mr. Richard D. Wyckoff will not diminish.

REFERENCES

  • Forte, Jim, CMT, “Anatomy of a Trading Range,” Market Technicians Association Journal, Summer-Fall 1994.
  • Hutson, Jack K., Editor, Charting the Market: The Wyckoff Method, Technical Analysis, Inc., 1986.
  • Lefèvre, Edwin, Reminiscences of a Stock Operator, Wiley Press (original, Doran & Co., 1923).
  • Penn, David, “The Titans of Technical Analysis,” Technical Analysis of Stocks & Commodities, October 2002.
  • Pruden, Henry (Hank) O., “Wyckoff Tests: Nine Classic Tests For Accumulation; Nine New Tests for Re-accumulation,” Market Technicians Association Journal, Spring/Summer 2001.
  • Pruden, Henry (Hank) O., “A Test of Wyckoff,” The Technical Analyst, February 2004.
  • Charts courtesy of Wyckoff/Stock Market Institute, 13601 N. 19th Avenue #1, Phoenix, Arizona, U.S.A. 85029-1672.

Contributor(s)

Henry Pruden, Ph.D.

Henry Pruden, PhD, was a leading technical analyst with decades of active trading experience until his sudden passing in 2017. He was the Executive Director of the Institute of Technical Market Analysis and President of the Technical Securities Analysts Association of San Francisco,...

Bernard Belletante, PH.D.

Dr. Bernard Belletante is a Professor of Finance and Dean of the Euromed-Marseille Ecole de Management. He holds a Ph.D. from Universite Lumiere, Lyon II, France. Dr. Belletante has published 17 books and over 90 papers. He has served as director of...

DISCIPLINARY ACTION

On August 11, 2015, the Board of Directors of the Market Technicians Association (MTA) suspended the MTA membership of Vishal B. Malkan, including the right to use the Chartered Market Technician (CMT) designation, for a period of 36 months. The suspension results from a determination that Mr. Malkan violated Ethical Standard 1 of the MTA’s Code of Ethics, which requires all members to maintain at all times the highest standards of professional integrity.  Mr. Malkan failed to disclose in his Personal Conduct Statement (PCS) that he had been the subject of a customer complaint. The MTA suspension was imposed for failing to disclose this action for several years.

Contributor(s)

INTERVIEW WITH TYLER YELL

How would you describe your job?

My role at DailyFX is as a currency analyst and trading instructor. As an analyst, I take monetary policy and look for relative edges to present to our clients. As an instructor, I teach traders new to the FX or currency market how individual account management can be optimized through understanding risk in this particular market as well as using technical analysis in order to recognize when a trade idea has been invalidated.

What led you to look at the particular markets you specialize in?

My prior role was focused on equities; however, an increasing number of my high net worth clients began asking about large moves in the currency market. This was back in 2006 and I knew very little about currencies’ effects on other markets, much as described by John Murphy in Trading with Intermarket Analysis: A Visual Approach to Beating the Financial Markets Using Exchange-Traded Funds.

As I began to look into the currency market, I saw that the health of an economy’s currency relative to other trade partners was much like the health of the blood in a doctor’s patient. In other words, if the patient is sick the doctor can likely tell from the blood profile that certain measures are off. Similarly, if a currency becomes too strong relative to trade partners or too weak, that will spin off multiple effects that can lead to excellent trading opportunities.

Once I began to realize how the relative value of a currency affected assets like bonds, stocks, and commodities I was hooked.

Do you look at any fundamental or economic inputs to develop your opinions?

Yes, a key part of my daily role starts with the Bloomberg terminal’s economic calendar as well as overnight events in Europe and Asia. Much of the valuation in currencies is based on anticipation of future monetary policy adjustments. Therefore, it is very helpful to know which high-importance economic events are coming up that can weigh on monetary policy.

I help run a live trading room called DailyFX on-Demand. This trading room analyzes in real time how news events affect the currency landscape. Because it is trading-focused, I would say there’s a nice blend of fundamentals and technicals, though as a trader myself it is likely two-thirds technical and one-third fundamental.

For example, inflation has become the number one missing ingredient that multiple central banks need in order to hike interest rates. Because interest rate direction is seen as the basic barometer for currency strength or weakness, it is very important to keep tabs on economic releases that impact inflation.

Another key component of currency strength or weakness is GDP. Therefore, I need to keep tabs on components of growth like retail sales and employment data.

What advice would you have for someone starting in the business today?

Find a market you are passionate about and then get really good at applying analysis to that market. You likely know if the market is right for you if you easily slip into a state of flow where time passes faster than you thought possible when analyzing the charts.

Learn to write clear, succinct, and actionable analysis and always remain humble. Anyone who is too proud likely has too little if any risk on the table and runs the risk of not being taken seriously. Become comfortable with communicating your ideas verbally and on video. If you expect your role to be client driven, you will likely be asked to provide timely analysis in an easy to digest manner.

Find heroes in the business across multiple assets. Find men and women who have traded these markets, respect the risk, and have a better grasp on how to manage their risk within the chaos of financial markets. The first place a lot of people look for these heroes is Jack D. Schwager’s Market Wizards. Another new and up-and-coming resource is RealVisionTV, a subscription-based on-demand video service providing a library of content from industry insiders on economics and investing.

What is the most interesting piece of work you’ve seen in technical analysis recently?

Being on the front lines of working with individual traders, I would say sentiment or contrarian analysis and behavioral finance continue to be the more interesting pieces of work I’ve seen in technical analysis. An incredibly valuable tool that DailyFX offers is the Speculative Sentiment Index, which aggregates retail open positions and then displays them over price and quantifies them in ratio form. The chart below of GBPUSD helps to visualize net retail positioning over price.

You can see there was an aggregate demand to effectively call a top as the British pound was rising versus the US dollar during the second half of 2013 and first half of 2014. While some traders caught the top on short positions, they only rode it for about a few hundred basis points before flipping to long and, in aggregate, fighting the trend until the low in March 2015. Just a simple understanding of biasing your signals against the retail crowd could go a long way toward improving the effectiveness of your analysis.
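For readers who want to experiment with the idea of biasing signals against the retail crowd, here is a minimal sketch of a contrarian read of a long/short positioning ratio. The position counts and the threshold are illustrative assumptions, not the actual DailyFX Speculative Sentiment Index methodology.

```python
# Illustrative contrarian read of retail positioning, in the spirit of a
# speculative sentiment ratio. The counts and threshold are invented examples,
# not DailyFX's actual methodology.

def sentiment_ratio(longs, shorts):
    """Ratio of retail longs to shorts; negative when the crowd is net short."""
    return longs / shorts if longs >= shorts else -(shorts / longs)

def contrarian_bias(ratio, threshold=1.5):
    """Fade the retail crowd once positioning is lopsided beyond the threshold."""
    if ratio >= threshold:
        return "crowd heavily long, contrarian bias: bearish"
    if ratio <= -threshold:
        return "crowd heavily short, contrarian bias: bullish"
    return "positioning mixed, no contrarian signal"

# Hypothetical GBPUSD snapshot: 4,000 retail longs vs. 12,000 retail shorts
ratio = sentiment_ratio(longs=4_000, shorts=12_000)
print(f"Ratio: {ratio:+.1f} ({contrarian_bias(ratio)})")
```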

While the DailyFX Speculative Sentiment Index is big picture, behavioral finance helps investors see what’s happening internally in a helpful manner. Once investors understand their propensity to become overconfident, or to only look for data that confirms their trade ideas, or to try and trade last week’s news story today, they may find the need to automate their approach. As long as automation or a quantitative approach is founded on sound principles, that can be a very good thing in terms of effective analysis and investing.

What research area do you think offers the greatest potential in technical analysis at this time?

If possible, I would like to hedge my bets by offering a few different fields. Given the increase in technology and computing power, I see synergy of multiple forms of analysis allowing more confident and probable directional calls in the near future.

John Murphy has done a wonderful job of pioneering the field of intermarket analysis and, given the market I am in, I would not want to be without it as it is pivotal to my approach. Naturally, I believe intermarket analysis also complements other forms of analysis that I find very effective, such as Elliott Wave. Elliott Wave is about as controversial as religion and politics in the world of analysis, but I have found it consistently too effective to put aside. I believe combining intermarket analysis, Elliott Wave, and the confirmation of sentiment analysis offers the greatest potential for technical analysis at this time. Additionally, the increase in data and the ability to write scripts for computers to run, so that analysts do not hit brain fatigue early in the process, should also propel the value we bring to our clients or employers.

Tyler Yell is a Currency Analyst and Instructor for DailyFX, the research arm of FXCM. He started at JPMorgan Chase before coming to FXCM in late 2007 to focus on the Foreign Exchange market and its effect on the macro picture from a purely equities point of view. Over the years, he has gained an appreciation for the unique ability of Technical Analysis to manage risk and help investors scan the globe for favorable investments that are lacking in other analytical disciplines. Tyler focuses on Intermarket Analysis, Sentiment, & Volume to add depth to his Technical analysis, which is built around Elliott Wave and Ichimoku as a foundation. You can talk markets with Tyler on Twitter @ForexYell.

Contributor(s)

Amber Hestla-Barnhart

Bio coming soon.

BLOOMBERG BRIEF HIGHLIGHTS: CHART-BASED TRADING STRATEGIES

Editor’s note: This article was originally published in the August 27 issue of Bloomberg Brief: Technical Strategies. Below is an extract of that article.

Charts have been the basis of technical analysis since Charles Dow learned how to draw point and figure charts from other traders on the floor of the New York Stock Exchange. Despite the proliferation of indicators, charts are still the basis of technical analysis. In a recent Bloomberg Brief: Technical Strategies, Dean Rogers, Senior Analyst at Kase & Co., presented a complete trading strategy for natural gas.

Blue trend lines highlight a bearish flag pattern. This indicates traders should consider short positions. However, a move above $2.75, based on a confluence of several important retracement and extension levels, would indicate potential upside in natural gas. Indicating a reversal level is an important part of any analysis.

Simple trendline analysis can offer valuable trading insights. From the group of charts below, it’s possible to conclude:

  • The dollar trend has lost impetus with euro and sterling grinding upwards.
  • Commodities continue to flounder though the downward momentum has decreased.
  • Equities are trying to correct following a sharp drop.
  • Rates are climbing slowly higher from their low base.

Contributor(s)

THE SANDPIPER AND TRADING: HOW YOU CAN USE VOLATILITY MEASUREMENT TO GAIN AN ADVANTAGE IN YOUR TRADING

Editor’s note: This article was originally published at The Educated Analyst, an education blog maintained by Market Analyst.

As a trader do you find yourself sometimes looking at a price chart and wondering why a stock reversed at a specific level?  Or possibly more often you may want to know where it will stop falling so you can buy it. If so, you are thinking about what the extreme is…“what is the farthest?”

You are thinking about volatility.

You are not alone. Every person and institution involved in traded markets thinks about volatility. The difference is you are not likely thinking about volatility the same way most of your competitors are.

The real point of this series of articles is to show how you can use volatility measurement to gain an advantage in your trading.

When I use the term ‘competitors’ I’m referring to participants in markets that determine price direction. The big money is what moves markets: investment banks, hedge funds, pension funds, large proprietary trading firms, and private and public corporations hedging their risks.

Here’s the difference that really matters: The big money rarely measures prices. Most of their trading is based on volatility measurement.

Read on and I’ll explain why this is.

The Big Money

Institutional firms comprising the big money do most transactions in the form of options, futures and derivative contracts. They find it necessary to be highly specific about what they are trying to achieve, and outright ownership positions rarely accomplish that. However, this also makes their trading very complex and opaque. No matter that we all see excessive claims of ‘transparency’ advertised, they try hard to make transactions confusing, mostly so they can charge clients larger fees.

Along came the year 2008. This market crash was specifically a liquidity crisis. The lack of liquidity was due to excessively leveraged positions of the derivatives contracts mentioned above.

When central banks were forced to bail out and backstop the world’s largest financial institutions, they did so, but not without future restrictions. It has taken some years to implement, but the iron-clad rules are now in place and being enforced.

For instance, in the US the Federal Reserve requires that a multitude of risk calculations be performed daily and reported to it. These calculations are complex risk models and directly affect the amount of cash a bank must hold as capital reserves, which directly affects the amount of money to be made by bank traders and executives.

Always follow the money right?

Why Care About the Whims of the Central Banks?

The principal input into any risk model is volatility. More specifically it is Implied Volatility.

In order to see a very real and practical example of this, let’s look at how a stock option is priced. Below is an illustration of inputs to the common Black-Scholes option pricing model.

The model above calculates theoretical option price. It takes seven inputs. The input with the most effect on the price is Historical Volatility.

In the real world the pricing model is different in that it swaps the position of volatility with price.

As you can see above, once someone has actually paid money for the option, the model is changed to produce a volatility value called Implied Volatility. Specifically, the ‘historical volatility’ value at upper right is replaced by the option price, and the result of the model is then a volatility value. This implied volatility value is a direct result of what the market is telling us.

Demand for purchasing derivatives contracts (options, futures, OTC derivatives) moves prices of those contracts. As you can see above, the price of those contracts changes the implied volatility. The implied volatility data is what institutions use to make decisions.
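To make that inversion concrete, here is a minimal sketch that prices a European call with the Black-Scholes formula and then backs out the implied volatility from a quoted option price by bisection. The inputs are illustrative and the model is simplified (no dividends); production risk models use far richer volatility surfaces.

```python
# Black-Scholes call pricing and implied volatility by bisection.
# Inputs below are illustrative; the model ignores dividends.
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(spot, strike, t, rate, vol):
    """Black-Scholes price of a European call (no dividends)."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol * vol) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    return spot * norm_cdf(d1) - strike * exp(-rate * t) * norm_cdf(d2)

def implied_vol(price, spot, strike, t, rate):
    """Back out the volatility implied by a traded option price (bisection)."""
    lo, hi = 1e-4, 5.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if bs_call(spot, strike, t, rate, mid) < price:
            lo = mid          # model price too low: volatility must be higher
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical example: a three-month at-the-money call trading at $4.00
iv = implied_vol(price=4.00, spot=100.0, strike=100.0, t=0.25, rate=0.01)
print(f"Implied volatility: {iv:.1%}")
```

Because the call price rises monotonically with volatility, the bisection always converges; a higher traded price therefore implies a higher volatility, which is exactly the market signal the risk models consume.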

Last and most importantly: commitments made via derivatives contracts force buying and selling to fulfill the contract terms.

Big Money Pays for Volatility Data

I spent a year working on a consulting contract at a major US bank. Because of my experience in volatility data analysis, I was part of a team that evaluated large sets of implied volatility data for use in risk models. This was a very high profile project because it directly affected the paychecks of some important people in the bank’s upper hierarchy.

These institutions buy daily feeds of overnight implied volatility for literally thousands of global derivatives contracts.  Some of the sources were Thomson Reuters, Bloomberg, Ivolatility, MarkIT, etc. I was astonished to learn that the coming year’s budget for this data was growing over 50% to north of three million dollars annually.

Volatility measurement is central to all of their risk and trading decisions. For instance, if volatility increases, then risk increases and they must decrease the institution’s positions. That means selling. They do this because they have no choice. The central bank regulators require it. Negotiating is not an option for them.

This is why volatility measurement is the single most important technical component for any form of financial trading.

Volatility is not a new phenomenon. It’s all around us. What follows is an excerpt from my book Volatility-Based Technical Analysis.

Volatility In the Natural World: The Sandpiper

Do you know what a sandpiper and successful trader have in common? This is not a jest, or a trick question. Before this series of articles is done, you will understand their common ground.

The lowly little sanderling, a type of beach-dwelling sandpiper, stands just four to eight inches tall. This small bird eats by dodging the surf and digging up tiny marine life at the water’s edge. Even a very small wave is much taller than the sanderling, and this little guy can only see the next wave in front of him.

Its challenge is to gauge when to run out into the dangerous crashing waves, and do it without hesitation. The water will not pause for the convenience of a simple little bird. The waves come in regularly, then recede to reveal all the little aquatic invertebrates on which it feeds. The best food is the furthest out, nearest to the greatest wave height. Timing is everything!

Watch a sanderling a bit longer and you notice something even more incredible than his innate sense of wave timing. He knows exactly how far to go into the surf’s receding wave to dig for the best food, then darts back toward the shore just ahead of the next incoming wall of bird-bone-breaking water. Think about it. He knows the farthest point that can be reached before he needs to turn around and come back, all the while grabbing the bite of food necessary to sustain another ten minutes of life on the beach. This bird does this over and over, all day long.

Does this challenge sound familiar? When to move? How far to go? When to get out? These are all critical choices we make each time we trade. If you could make these decisions as flawlessly while trading as the sandpiper does while eating, wouldn’t riches be just around the next bend? But how does the bird do it?

Each wave comes in at different heights with varying levels of stored energy which propel it towards the beach. But there is much more to consider if you want to get really scientific about it. After a succession of incoming waves, there is a surplus of water rushing back to sea and thus opposing the effectiveness of the next wave. Our little feathered friend on the beach is so good at gauging all of these factors that he is actually measuring volatility.

Yes, somewhere in that tiny brain is a volatility computer. It’s measuring all of the factors mentioned, plus more that we probably don’t comprehend. Based on what has been happening all around him, the sandpiper calculates just where the next wave will break. He observes the past and predicts the future very accurately. Perhaps most importantly, the bird knows the best food is to be had at the extremes. That is to say, the best eating is when the sea has receded the farthest.

This last concept is one that technical traders very often overlook. The middle ground is safe, but there is not much there to eat. It’s out near the edge of perceived danger where most of the money can be made. Maybe Wall Street should be recruiting sandpipers instead of quantitative analysts and traders!

Volatility and TA

Instead of looking only at how far price moved, technical analysis calculations should consider the movement of volatility. Indicators and oscillators typically measure the movement of price, and sometimes volume. While this is helpful, it captures only a small part of what your competitors (big money) are using in their trading decisions. Remember, the big money is driving supply and demand with volatility data fed into trading algorithms and risk models.

Wouldn’t it be more appropriate for an oscillator calculation to measure the movement of the instrument’s volatility? Yes, it would, and that’s the secret sauce.

Even more important is the proximity of price to previous volatility extremes. Remember that out near those extremes is where the sandpiper finds the best food, and where we find the best trading decisions. To do this we must first convert volatility into price. Specifically for technical analysts, we need volatility to be plotted as a price level on a chart.
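As a purely illustrative sketch (not the MetaSwing calculations mentioned in the author’s bio), the following Python snippet estimates historical volatility from closing prices and projects it forward as upper and lower price levels that could be plotted on a chart. All names, parameters and the toy series are hypothetical.

    # Hypothetical sketch: express volatility as price levels around the last close.
    import math

    def annualized_vol(closes, lookback=21):
        rets = [math.log(closes[i] / closes[i - 1]) for i in range(1, len(closes))]
        window = rets[-lookback:]
        mean = sum(window) / len(window)
        var = sum((r - mean) ** 2 for r in window) / (len(window) - 1)
        return math.sqrt(var) * math.sqrt(252)          # annualize daily volatility

    def volatility_price_levels(closes, horizon_days=10, k=1.0):
        """Project +/- k standard deviations of expected movement as price levels."""
        vol = annualized_vol(closes)
        move = k * vol * math.sqrt(horizon_days / 252)  # expected fractional move
        last = closes[-1]
        return last * math.exp(-move), last * math.exp(move)

    closes = [100 + 0.3 * i + 1.5 * math.sin(i / 3) for i in range(60)]  # toy series
    print(volatility_price_levels(closes))              # (lower level, upper level)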

Contributor(s)

Kirk Northington, CMT

Kirk Northington, who holds a Chartered Market Technician (CMT) designation, is a quantitative technical analyst and the founder of Northington Trading, LLC. He is also the creator of MetaSwing, advanced analytic software for Bloomberg Professional, MetaStock and TradeStation. He trades his own accounts, and...

PREDICTIVE MEDIA CONTENT ANALYTICS; 24/7 INFORMATION HAS FOREVER CHANGED FINANCIAL MARKET STRATEGIES

Editor’s note: Sentiment analysis has long been a part of technical analysis. In the past, sentiment was often measured in surveys and a significant lag existed between the time questions were answered and results were distributed. Real-time communications lessen that time lag and create new opportunities for measuring sentiment. Rather than measuring just attitudes towards the market, it’s now possible to quantify the tone of headlines related to economic news. That application is the topic of this paper, which was originally published by TrendPointers and is reprinted here with permission.

Executive Summary: The new MacroSentiment Analytics© (MSA) methodology provides causal, or anticipatory, sentiment signals that are shown to significantly outperform the benchmark S&P 500 over a recent, nearly five-year period. The complex-text analytics process, applied to the 24/7 flow of an adaptive lexicon of financially relevant information, is a new design to extract the net meaning of content from the continuous and cumulative body of decision-relevant “news”. The results provide new leading measures of macro-influences on market behavior, which are used in biweekly models of market direction.

Market Performance. MSA Signal directional accuracy is high, averaging 57%, and the Signals catch the largest macro-influenced turns, which account for the majority of the profits to be obtained. A hierarchy of three Signal variations outperformed the S&P 500 benchmark by 4.52%, 5.49%, and 11.87% respectively, using a simple binary entry/exit signal execution.
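For readers who want to see what a simple binary entry/exit execution looks like mechanically, here is a hypothetical Python sketch using toy signals and returns; it is not the MSA model, its signals or its data.

    # Hypothetical sketch: hold the benchmark when the prior-period signal is 1, else hold cash.
    def cumulative_return(period_returns):
        total = 1.0
        for r in period_returns:
            total *= 1.0 + r
        return total - 1.0

    def binary_signal_backtest(signals, benchmark_returns):
        # the signal observed at the end of period t decides exposure for period t+1
        strategy = [benchmark_returns[i + 1] if signals[i] == 1 else 0.0
                    for i in range(len(signals) - 1)]
        return cumulative_return(strategy), cumulative_return(benchmark_returns[1:])

    signals = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]                 # toy biweekly signals
    bench = [0.01, 0.02, -0.03, 0.015, -0.01, 0.02, 0.005, -0.02, 0.01, 0.03]
    print(binary_signal_backtest(signals, bench))            # (strategy, benchmark)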

Applications. There are many styles of investment management, from long-term asset allocation to active trading strategies. But all styles are subject to the same continuous flow of economic reports, measures of consumer activity, and many analyses of the influences on forward expectations and economic behaviors. Large-scale portfolio management can apply the continuous MacroSentiment Analytics trends and signals to their asset management strategies. Traders and funds that plan around shorter time frames and persistent volatility can use the biweekly signals and interventions. The TrendPointers/RavenPack analytics provide a superior set of core signals and intervention metrics that are applicable to long and short portfolios, and over time frames from one to eight weeks.

Introduction

How The Internet Has Provided Us With Predictive Content Analytics

The 18th-century philosopher/economist David Ricardo is credited with popularizing the investing concept of “buy low, sell high.” But as one 21st century writer noted, Ricardo failed to explain how to identify the “high” and the “low”. We’ve now spent more than two centuries trying to reliably identify and quantify those decision points.

Ricardo had the right instincts, but lacked the proverbial 30,000-foot view of the markets to observe and anticipate the speed and shifts of lows and highs along continuous trajectories. The necessary analytics Ricardo could not have dreamed of are now available.

The rapid development of 24/7 Internet information has put the entire realm of the potential knowledge of the buy low/sell high quest at our literal fingertips, if we know exactly what data to look for, how to extract it from all the noise, and how to correlate the data with our ultimate investment targets and risk parameters.

In the late 1990s the founders of TrendPointers began working on Internet-based analytics. It was evident that information available via the net would soon replace survey, consensus and audit systems with more accurate and timely research methods. We could now capture nearly real-time activity to better isolate the decision-relevant variables, eliminate many data collection limitations, and continuously test and update for statistical correlation to the targets.

Hence, TrendPointers developed a set of indicators that model the antecedents of macro-influences on economic attitudes and behavior that eventually manifest in consumer spending and market performance.

But with the “Big Data” explosion of investment-relevant information, there is even less time to acquire, analyze and act upon the available information. The increasingly complex, volatile and uncertain corporate, macroeconomic and geopolitical influences have driven the interest to find analytics that provide measurable anticipatory benefits over conventional measures.

To capture the most contemporaneous market relevant influences, TrendPointers has partnered with RavenPack to provide timely, forward-looking, market-specific sentiment. The respective data sets are combined and used as both initial inputs and feedback within the nonlinear modeling process.

This paper describes how TrendPointers and RavenPack data produce new leading indicators that:

  • Accurately capture the variable and often unstructured influences on overall economic behavior and specifically financial markets.
  • Are able to continuously adapt to the changing configuration of the influences.
  • Create effective leading indicators and models of the shifts in market influences to provide more time to plan, modify or take preemptive asset management actions.

The MacroSentiment Concept – Why Better Leading Indicators Are Now Possible

The phrase “cause-and-effect” may be a cliché, but the reality is that the sheer speed, volume and volatility of today’s “news” is both a report on and an influence upon decisions that eventually manifest in measurable activity, including the subject of this paper: financial market performance.

TrendPointers began work in 2004 to develop a predictive media content analytics methodology and captured the necessary 4+ years of data to test the methods and forecasting capabilities. The system has been updating the weekly and biweekly MSA database in its present format since September 2006.

The financial markets offer the most leveraged applications of predictive content analytics because of the natural responsiveness of the markets to contemporaneous information. We simultaneously harvest multiple facets of economic issues as they occur, and recalibrate the models as new data enters the systems via continuous feedback of the observable behavior.

The extraordinary benefit of Internet-based news is that it contains virtually all public factual and speculative issues, opinions and economic information. Relevant news and information emerges daily and is continuously changing. The total body of news that we define is effectively self-weighted by the real or perceived importance as exhibited by the duration and depth of coverage of the individual topics.

The current volume of financial/investment information has expanded while the temporal value of the data has been compressed. The importance of macro-factors as a driving element of portfolio performance attribution continues to increase. The macro-influences can be seen in investment returns as the standard deviation of excess returns has been decreasing steadily over the last several decades. For large-portfolio asset management or short-term trading, there is an increased risk to a strategy if the available information is not properly aligned with contemporaneous behavior, or if the data does not represent the key current influences in their proper proportions.

The Architecture of MacroSentiment Content Analytics

Transforming Unstructured Qualitative Data Into Systematic Quantitative Measures

We do not want to create the impression that the Internet just delivers macro-sentiment solutions in our e-mail every day. The 24/7 news flow provides the raw ingredients, but the actual creation of highly-correlated, leading indicators is the result of a definition/sourcing/ analytics/modeling process that is continually adaptive to the nature of the information being collected.

The agnostic TrendPointers approach to complex text analytics is to harvest the depth and breadth of relevant news from which to create trajectories of sentiment that will be highly correlated to target outcomes. Since we cannot know in advance which issues will have the most impact, we capture a broad range of issues that have been historically identified through extensive editorial analysis. The analysis process codes each article to yield a single net expression of sentiment, across up to 10,000 news items per month, and yields quantitative measures of the distribution of sentiment on a scale of Positive/Uncertain/Negative.
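As an illustration of the coding step only (the actual MSA lexicon, sources and process are proprietary), the following Python sketch labels toy articles as Positive, Uncertain or Negative with a tiny hypothetical lexicon and summarizes the distribution.

    # Hypothetical sketch: label articles and summarize the Positive/Uncertain/Negative mix.
    from collections import Counter

    POSITIVE = {"growth", "recovery", "gains", "expansion", "hiring"}
    NEGATIVE = {"recession", "losses", "layoffs", "default", "slowdown"}

    def label_article(text):
        words = text.lower().split()
        score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
        if score > 0:
            return "Positive"
        if score < 0:
            return "Negative"
        return "Uncertain"

    def sentiment_distribution(articles):
        counts = Counter(label_article(a) for a in articles)
        total = sum(counts.values())
        return {label: counts[label] / total
                for label in ("Positive", "Uncertain", "Negative")}

    articles = [
        "Hiring and expansion point to continued growth",
        "Layoffs accelerate as recession fears deepen",
        "Mixed data leaves the outlook unchanged",
    ]
    print(sentiment_distribution(articles))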

The Methodology: Why a True Breakthrough Is Now Possible

The fundamental concept of the Macro Sentiment Analytics methodology is to eliminate as many of the filters and lagging information as possible in the collection of sentiment data. Thus, the MSA methodology samples the original news as it emerges with the following design:

  • Sources: News analyzed independently from mass media and business media, as each offers different perspectives, depth and duration of coverage.
  • Three universal macro-economic contexts that are continually present in the news, but whose individual contributions are continually variable. Thus, the MSA methodology is continually adaptive to the changing lexicon and impact for the three categories:
    • Overall Economy
    • Housing / Real Estate / Construction
    • Recession / Recovery – the relative perspectives of a strong or weak economy

The sampling of news is conducted daily, summarized weekly and biweekly, and the raw data is then used to create a variety of summary analytic measures. The current database has been in effect and updated continuously since September 2006.

The MacroSentiment Signal Methodology – Creating A True Leading Indicator

No one number can ever be the definitive leading indicator of market direction. But a composite of correlated variables that continuously capture the most important environmental influences on the markets will provide the foundation for a sensitive leading indicator.

Step 1: Economic Macro Sentiment Index (EMSI). TrendPointers first developed an absolute sentiment measure, the Economic Macro Sentiment Index (EMSI), a summary of multiple measures from the original internet-based news.

Step 2: Economic Environment Status Index (EESI). With EMSI as the foundation, the next step was to create a single leading index, the Economic Environment Status Index (EESI), with significant causal explanatory power and whose underlying component relationships retain power and relevance over time.

The EESI composite indicator was tested against 170+ internally and externally published variables to explain S&P 500 price changes. Comparisons were made using lags of 1 to 7 biweekly periods in the explanatory variable. The resulting EESI index was found to have the largest correlation (CV=.46) in explaining change in the S&P 500; the next strongest predictor had a correlation of CV=.29. Figure 2 displays the cumulative correlations of the EESI sentiment index.
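A minimal Python sketch of this kind of lag comparison, using synthetic data rather than the EESI or S&P 500 series, might look like the following; the strongest lag falls out of a simple correlation scan.

    # Sketch on synthetic data: correlate a lagged indicator with benchmark changes, lag = 1..7.
    import numpy as np

    rng = np.random.default_rng(0)
    indicator = rng.normal(size=200)                          # toy candidate index
    benchmark_change = np.roll(indicator, 3) * 0.5 + rng.normal(scale=0.8, size=200)

    def lagged_correlations(x, y, max_lag=7):
        """Correlation of x(t - lag) with y(t) for lag = 1..max_lag."""
        return {lag: float(np.corrcoef(x[:-lag], y[lag:])[0, 1])
                for lag in range(1, max_lag + 1)}

    corrs = lagged_correlations(indicator, benchmark_change)
    best = max(corrs, key=lambda lag: abs(corrs[lag]))
    print(corrs)
    print("strongest lag:", best)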

The EESI uses the S&P 500 as the most universally accepted benchmark for economic markets. TrendPointers also produces signals for several other asset classes and custom asset targets can also be developed.

The EESI Leading Indicator – The composite leading indicator is shown to have a consistently high correlation to core economic measures of the overall economy. While there are hundreds of economic metrics, we use three that have both high popular interest and also represent the fundamental nature of the consumer economy, major capital investment and the financial markets.

Step 3: Market Potential Indicator (MPI). Once the underlying leading economic indicator was developed, the next step was to refine it for the financial markets using the benchmark S&P 500. After extensive testing of external candidates for capturing the market feedback data, we found that the RavenPack data provided the most beneficial enhancement to augment the power of the EESI composite index to anticipate movements in the broader markets; and the S&P 500 index specifically. The RavenPack proprietary data is organized as equal-weighted S&P 500 components with pricing related to sentiment measures on a daily basis – these provide the most immediate measure of the influences of macro sentiment on actual market pricing.

The process uses literally thousands of model simulations and dynamic variable compositions for a single asset signal to find the highest probability outcomes. Unlike many technical analyses, which look for tops, bottoms, formation of a trend, etc. the Market Potential Indicator (MPI) incorporates multiple representations of direct and indirect market influences and activity.

The MPI indicator may be used as an input in more complex modeling systems. Its benefit is illustrated in a simple regression analysis. By regressing the change in the S&P 500 index on MPI (t-1), the slope coefficient shows high statistical significance (t-value of 6.88).
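That single-variable check can be illustrated with a short Python sketch on synthetic data (the actual MPI series is proprietary); the t-statistic of the slope is the quantity quoted above.

    # Sketch on synthetic data: regress next-period index change on the prior MPI reading.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    mpi = rng.normal(size=121)                                 # hypothetical MPI readings
    # toy relationship: the index change over period t depends on MPI observed at t-1
    spx_change = 0.4 * mpi[:-1] + rng.normal(scale=1.0, size=120)

    result = stats.linregress(mpi[:-1], spx_change)            # MPI(t-1) vs. change(t)
    t_value = result.slope / result.stderr                     # t-statistic of the slope
    print(f"slope={result.slope:.3f}  t={t_value:.2f}  p={result.pvalue:.4f}")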

MacroSentiment Model Results

The objective of MSM signals is to capture and transform MacroSentiment trends and environmental influences into high-probability directional market signals.

Editor’s note: additional test results can be found in the original paper. Two other tables are reproduced below.

Summary & Conclusions

Alan Greenspan, November 2007: “You know, I’ve been dealing with these big mathematical models for forecasting the economy… if I could figure out a way to determine whether or not people are more fearful, or changing to more euphoric, and have a third way of figuring out which of the two things are working I don’t need any of this other stuff. I could forecast the economy better than any way I know. The trouble is that we can’t figure that out.”

It’s been a long stretch from David Ricardo to Alan Greenspan. Neither of them had the benefit of what we take for granted today: an enormous, continuous flow of information from countless perspectives, information that typically has a very short useful life and whose insights are often buried in the vast flow of Internet content.

Historically we all have used relatively few standard sources for analytic inputs, from official government reports to surveys of consumers and businesses. The one common element was that we all had to wait for what was essentially lagging information.

We can now begin to address Greenspan’s lament of not being able to read people’s minds. While we cannot literally read minds, we can now capture the information that is going into those minds, in near real-time, and then apply those measurements to traditional statistical methods and model the likely outcome of the information on behavior.

The macro-sentiment we capture is real, but its ultimate accuracy is only evident after the fact. The reality of the modern markets is that we must all act on the best available information. So the new, efficient market is now based on the earliest available information that will affect the markets and consumers, and the ability to act on it before it becomes common knowledge.

Macro Sentiment Analytics and models provide high-value tools that professionals can employ alongside their own domain knowledge and corollary data, and within their own risk management strategies.

About RavenPack Data: RavenPack News Analytics (RPNA) provides real-time structured sentiment, relevance and novelty data for entities and events detected in unstructured text published by reputable sources. Publishers include Dow Jones Newswires, Barron’s, the Wall Street Journal and over 19,000 other traditional and social media sites. Over 14 years of Dow Jones newswires archive and 7 years of historical data from web publications and blogs are available for back testing. RavenPack detects news and produces analytics data on over 34,000 listed stocks from the world’s equity markets, over 2,500 financially relevant organizations, 138,000 places, 150 currencies and 80 commodities.

About TrendPointers MacroSentiment Analytics: TrendPointers has developed a new complex text analytics methodology, Macro Sentiment Analytics, to define, harvest, analyze and transform the complex text in the 24/7 news flow into quantitative measures of macro influences on economically sensitive behavior. The MSA analytics and leading indicators are highly correlated with, and lead, conventional economic indicators such as Retail Sales, Housing Starts and market indices by two weeks to several months or more. The Macro Sentiment Analytics are a near-universal measure of leading economic sensitivity and are being applied to digital media spend optimization, business planning and equity market signals. The MSA database has been updated weekly since its inception in September 2006.

Contributor(s)

Richard Spitzer
Bill Lattyak
Peter Hafez

RESEARCH UPDATE: HIGH FREQUENCY TRADING: A TOOL OF PANACEA OR FIASCO FOR MARKET LIQUIDITY AND ROBUSTNESS?

Abstract: The report discusses the merits of High Frequency Trading (HFT) and its rapid growth as the future of computerized trading. A lack of evidence and analysis in the recent literature leaves much of the contemporary discussion at the level of an interim typology, and in turn calls for caution and vigilance in the use of HFT, particularly in light of the May 6, 2010 flash crash. Efforts to introduce controls have been gradually increasing in recent years, through the likes of MiFID I and MiFID II in the European Union. The aim of this paper is therefore to provide an overview of recent research on the effects of HFT on market quality and robustness. The objective is to evaluate recent literature and assess the main theoretical frameworks and empirical findings. Understanding the implications of HFT for market quality could also offer a better understanding of the cost of capital. Furthermore, awareness of the changing market microstructure opens up new policy options for controlling market manipulation and helps sustain investors’ long-term welfare.

1. Introduction

The advent of algorithmic and high frequency trading (HFT) has become a popular topic over the last decade. The proliferation of digitization in securities markets has enabled trade transactions to be executed at the speed of milliseconds. The implications of HFT for market liquidity, short-term volatility and price discovery have caught the attention of many researchers and academics. Yet the impact of automation and HFTs on market quality remains controversial. Not only has HFT been associated with an increased probability of flash crashes, it also raises questions about market abuse.

The aim of this paper is to provide an overview of recent research on the effects of HFT on market quality and its robustness. The objective is to evaluate recent literature and assess the main theoretical frameworks and empirical findings. Understanding the implications of HFT for market quality could also offer a better understanding of the cost of capital. Furthermore, awareness of the changing market microstructure opens up new policy options for controlling market manipulation and helps sustain investors’ long-term welfare.

1.1 The ambiguity surrounding High Frequency Trading and the role of latency

Perhaps one reason why the impact of HFT on market quality has not been clear-cut is that (i) an exact definition of high frequency trading has not yet been constructed (according to the US SEC) and (ii) the role of latency in the market remains ambiguous. First, it is important to make a distinction between algorithmic trading and high frequency trading.

Algorithmic trading (AT) refers to the use of computer-based systems to execute trading decisions, typically by institutional investors, large hedge funds and trading desks. Algorithms serve as a tool for controlling execution costs by splitting large orders and progressively placing them in the market, as sketched below. Submitting a large order all at once may cause a negative price impact, so feeding fractions of the order into the market in a timely manner allows more effective control of market risk (Kim, 2007).
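The sketch below is a hypothetical, simplified illustration of such order splitting (a TWAP-style schedule); real execution algorithms are considerably more sophisticated and venue-specific, and all names and parameters here are illustrative.

    # Hypothetical sketch: split a parent order into equal child orders released at fixed intervals.
    from datetime import datetime

    def twap_schedule(total_shares, start, end, n_slices):
        """Return (release_time, shares) pairs spread evenly across the trading window."""
        step = (end - start) / n_slices
        base, remainder = divmod(total_shares, n_slices)
        schedule = []
        for i in range(n_slices):
            shares = base + (1 if i < remainder else 0)   # distribute any remainder shares
            schedule.append((start + i * step, shares))
        return schedule

    start = datetime(2015, 9, 1, 9, 30)
    end = datetime(2015, 9, 1, 16, 0)
    for when, qty in twap_schedule(100_000, start, end, 13):
        print(when.time(), qty)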

High frequency trading is associated with algorithmic trading in the sense that it also uses automated systems to execute trades. However, the common consensus relates HFT to the speed of accessing and analyzing new information and then executing trades in high-speed succession. Unlike algorithmic trading, where the split orders may be held for longer periods, i.e. days or weeks, HFT closes positions by the end of each trading day (Benos and Sagade, 2012).

As the main distinguishing factor of HFT is speed, as the term itself suggests, it is crucial to point out the role of latency in assessing market quality. The fact that different empirical studies use different estimates as a proxy for latency, and different measures of HFT activity, implies that any inference regarding HFT’s impact on market quality should be taken with a grain of salt. For example, Zhang (2010), in his study “The Effect of High-Frequency Trading on Stock Volatility and Price Discovery”, infers HFT activity from institutional holdings and turnover in each calendar quarter. Boehmer, Fong and Wu (2012) indirectly estimate HFT activity by calculating the “trading volume to message traffic” ratio.
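As an illustration of this style of proxy (in the spirit of, but not identical to, the measures used in the cited studies), the following Python sketch computes a volume-to-message-traffic ratio from toy per-minute counts.

    # Sketch: trading volume relative to total message traffic (orders, cancels, trades).
    def volume_to_message_ratio(trade_volume, order_messages, cancel_messages, trade_messages):
        """Lower values suggest more quoting and cancelling per unit of volume actually traded."""
        messages = order_messages + cancel_messages + trade_messages
        return trade_volume / messages if messages else float("nan")

    # toy per-minute observations: (volume, orders, cancels, trades)
    minutes = [(12_000, 800, 650, 120), (9_500, 1_400, 1_300, 90), (15_000, 600, 450, 150)]
    for vol, orders, cancels, trades in minutes:
        print(round(volume_to_message_ratio(vol, orders, cancels, trades), 2))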

A vast body of research advocates in favor of HFT and its impact on market quality. Most of it addresses the role of low latency in absolute terms: the idea is that low-latency trading benefits the market by (i) reducing intermediation costs, (ii) improving liquidity, and (iii) reducing short-term volatility. However, latency can also be thought of in relative terms, as suggested by Friederich and Payne (2012). Their paper, focusing on the negative externalities of HFT and market abuse, suggests that the effect of latency is channeled through the widening discrepancy in latencies experienced by different market participants. Biais and Foucault (2014) also discuss the impact of HFT in terms of relative latency, i.e. speed differences across market participants, and conclude that because HFTs possess a speed advantage, this might discourage “slow” traders from participating in the market and increase systemic risk.

2. High Frequency Trading strategies

HFT is not an independent trading strategy, but a different and much faster way of implementing existing strategies. The objective of HFTs is to execute these transactions before anyone else, exploiting greater processing and execution speed to obtain trading profits while holding essentially no asset inventory (Cartea & Penalva, 2011). The main strategies can be divided into three categories (Chlistalla, 2011):

  • Liquidity providing strategies: HFTs replicate the role of traditional market makers, using the same business model but incurring lower costs due to automation. Every millisecond, they post several limit orders, providing liquidity and immediacy on both sides of the trading book. In contrast to designated market-makers, however, they have no obligation to stand ready to provide liquidity, even in adverse market conditions. In addition, they face no restriction on the amount of liquidity they can demand through market orders (Brogaard, et al., 2013). Finally, HFTs tend to assume little inventory risk, as they try to end the trading day with a delta-neutral position.
  • Statistical arbitrage strategies: HFTs ensure that the same asset trades for the same price in different venues, by contemporaneously selling the overpriced asset and buying the underpriced one. Alternatively, they exploit discrepancies between derivatives theoretical and market prices, or between an index (through ETF or futures) and its underlying components.
  • Liquidity detection strategies: Institutional traders use trading algorithms to minimize the market impact of a large trade. These algorithms typically break the order into small pieces in order to hide the real size of the transaction. HFTs try to uncover the existence of these large trades by sending many small orders waiting to be executed. Once they spot the opportunity, they take the liquidity ahead of the institutional investors and drive the price up (momentum ignition).

Other strategies include quote stuffing, i.e. flooding the market with huge numbers of orders and cancellations in rapid succession in order to cause stale pricing or false mid-prices, and layering, which consists of placing a number of sell orders at several price points to simulate strong selling pressure that drives the price down (and vice versa for buy orders) (Tse, et al., 2012).

All these strategies have an unclear impact on market quality, in terms of liquidity, information content of trades, price discovery, volatility and robustness.

In recent years, academics have debated the pros and cons of this ultra-fast form of trading, producing over time a rich literature on the subject.

3. Evidence for High Frequency Trading

3.1. Liquidity implications

During the last several years HFTs have gradually been replacing human market makers as the main source of liquidity, especially in stock markets. At the same time, the bid-ask spread, one common liquidity measure, has been steadily narrowing (Figure 1). Of course, correlation does not imply causation, so we need to demonstrate how HFT has increased market liquidity.

There are several pieces of evidence that show the increasingly important role of HFTs in providing market liquidity. Using a unique dataset from Nasdaq OMX that distinguishes HFT from non-HFT quotes and trades, Brogaard (2010) shows that HFTs participate in 77% of all trades, demanding liquidity in 50.4% of all trades and supplying liquidity in 51.4% of all trades. Similar results are obtained by Hendershott & Riordan (2013), who examine AT and its role in price discovery in the 30 DAX stocks on the Deutsche Boerse in January 2008: AT demands liquidity for 52% of the volume and supplies liquidity for 50% of the volume.

Hendershott, et al. (2011) try to address the problem of causation versus correlation by focusing their analysis on the larger category of algorithmic trading (AT), which includes HFT. They argue that if algorithms are cheaper and/or better at supplying liquidity than traditional market makers, then AT may result in more competition in liquidity provision, which, in turn, should lower the cost of trading. However, the effects could go the other way if the algorithms are used to demand liquidity: when used by liquidity demanders, algorithms may make them better able to identify and pick off quotes posted by liquidity providers, which could lead to higher transaction costs.

Since they cannot directly identify the trades generated by algorithms, they use, as a proxy, NYSE electronic message traffic normalized by trading volume. If we assume that AT is providing liquidity, the variation of this proxy is essentially due to the submission or cancellation of limit orders, which should represent AT activity. The authors used an event-study approach, exploiting the introduction of autoquote [1] in the NYSE market structure as an instrumental variable. They showed that, for the largest-capitalization stocks, AT effectively improves liquidity, as bid-ask spreads narrow after the introduction of autoquote, perhaps due to a decline in adverse selection or a decrease in the amount of price discovery associated with trades.
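Stripped of the instrumental-variable machinery, the core before/after comparison can be sketched in Python as follows, using hypothetical quotes; the actual study controls for many confounding factors that this sketch omits.

    # Hypothetical sketch: compare the average quoted spread before and after an event date.
    def quoted_spread(bid, ask):
        mid = 0.5 * (bid + ask)
        return (ask - bid) / mid                        # relative quoted spread

    def average_spread_change(quotes, event_index):
        """quotes: list of (bid, ask); compare mean spread before vs. after the event."""
        spreads = [quoted_spread(b, a) for b, a in quotes]
        before = sum(spreads[:event_index]) / event_index
        after = sum(spreads[event_index:]) / (len(spreads) - event_index)
        return before, after, after - before

    quotes = [(99.90, 100.10), (99.92, 100.12), (99.95, 100.09), (99.97, 100.05), (99.98, 100.04)]
    print(average_spread_change(quotes, event_index=2))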

This result is consistent with Hasbrouck & Saar (2013), who link higher low-latency activity on the NASDAQ market to lower posted and effective spreads. However, there are no significant effects for small-cap stocks. In addition, this analysis focuses on AT overall, and it is not possible to isolate the effects produced by HFT.

In addition, some of the clearest evidence on liquidity was provided by Menkveld (2013) in the European market. After studying the presence of HFT on both the Euronext and Chi-X markets, the paper shows that more than 60% of the market consists of passive traders (providing liquidity). Moreover, the author also shows that while the Belgian market (without the presence of HFT) still suffers a high bid-ask spread, the participation of HFT has helped the Dutch market achieve an approximately 50% reduction in the spread.

Most of the research has been in favor of HFT, trying to demonstrate its important role in improving market liquidity. However, some other research has not led to the same result, arguing that HFT can damage market liquidity and hence reduce market robustness. One of the clearest strategies employed by HFT traders is to earn profits by front-running non-HFT activity. HFT funds achieve front-running by analyzing the pending orders of non-HFT funds, identifying their strategies and then using this asymmetric information to trade ahead of those traditional funds. As a result, HFTs are able to drive the price up or down by buying or selling with their own accounts before customers’ or non-HFTs’ orders are filled. This makes non-HFTs incur higher transaction costs; hence, HFTs gain at the direct expense of non-HFT funds. Moreover, Grossman and Stiglitz (1980) show that this type of strategy poses a large threat of reducing not only liquidity but also price efficiency. By reducing non-HFTs’ gains, front-running (or anticipatory trading) decreases those non-HFTs’ motivation to perform fundamental research as well, which leads to a reduction in market information production in the long run.

Employing one year of return and trade data for NYSE and NASDAQ-listed stocks, Hirschey (2013) provides further insight into this problem by analyzing periods of aggressive buying and selling by HFTs. The paper shows evidence supporting the hypothesis that an aggressive HFT trade is followed by aggressive non-HFT trades in the same stock, as well as by a rise in the stock’s price. In addition, the front-running evidence is much stronger when non-HFTs do not properly disguise their order flow. Although the result could be driven by factors other than front-running (for example, HFTs analyzing news faster using better technology), the remaining tests in the study help to partially address these doubts. Still, the author believes the study has limitations and proposes that further research focus on this aspect in order to deliver clearer evidence of front-running.

3.2. Information content and price discovery

Asymmetric information and price discovery have long been of interest in economic models. These models can easily be applied to HFT, because the basic economics of market making and the effects on markets of differentially informed investors are the same whether the market is automated or manual (Jones, 2013).

Brogaard (2013) analyzes the effects HFTs have on price discovery, using transaction-level data from NASDAQ that identifies HFTs. They argue that informed HFTs play a beneficial role in price efficiency by trading in the opposite direction to transitory pricing errors, and in the same direction as future efficient price moves. They create a model in which price movements are decomposed into permanent and temporary components: the former is considered the result of new information and the latter is interpreted as pricing error or noise. HFTs’ trading in the direction of the efficient price is done mainly through liquidity-demanding orders (market orders), which are subject to the bid-ask spread and trading fees; the informational advantage, however, is still able to generate trading profits, as it exceeds these costs. In contrast, their liquidity-supplying orders (limit orders) are adversely selected; nevertheless, the adverse selection costs are lower than the bid-ask profit and liquidity rebates. Overall, HFTs have a beneficial effect on the price discovery process because, as a result of their activity, prices are more informative, which can lead to better resource allocation. When trading on superior information, HFTs impose adverse selection costs on other market participants; these costs, however, are balanced by the positive externalities generated by greater price efficiency.

AT monitors markets more efficiently than human traders, and algorithms update their quotes in response to the information they collect. An increase in algorithmic activity causes more of the change in the efficient price to occur through quote updates rather than through trades (Hendershott, et al., 2011). AT thus increases the amount of price discovery that occurs without trades, i.e. simply by looking at the different quotes. This informational advantage is even greater for HFTs, as they are able to extract information from observing prices and react within a few milliseconds. However, a limitation of both the Brogaard (2010) and Hendershott et al. (2011) data sets is pointed out by Cartea and Penalva (2012): those studies were not able to identify the proportion of AT and HFT in the activity of the sample firms. Moreover, it is possible that a large proportion of HFT strategies was not captured in the data set as it should have been (Cartea and Penalva, 2012, page 35).

In one of the most influential papers on market microstructure, Glosten & Milgrom (1985) argue that one of the reasons for the existence of the bid-ask spread is information asymmetry. In their model, market makers compete against each other when transacting with potentially informed traders. Market makers thus face a “lemons” problem (Akerlof, 1970), since a customer agreeing to trade at the specialist’s quotes may be trading because he knows something that the specialist does not. The specialist must therefore recoup the adverse selection costs suffered in trades with the well informed through gains in trades with liquidity traders. These gains are achieved by setting a spread.

HFTs, however, are not exposed to such an informational disadvantage, as they have a superior ability to extract information from the market. As a result, they incur lower adverse selection costs, and this cost reduction should be reflected in a narrower bid-ask spread (Jones, 2013). In addition, since they have replaced human activity with ultra-fast automated processes, they have significantly lower variable costs than traditional market makers, which can further narrow the spread.

Alongside the benefits mentioned, many researchers have analyzed the other side of HFT’s effect on price efficiency. Since HFT strategies involve very short holding periods, researchers have questioned whether the price efficiency produced by HFT is only a short-term phenomenon or a long-term effect that can improve market robustness. In theory, Froot et al. (1992) demonstrate that the short holding period may lead HFT traders to put more weight on short-term information, which lessens market efficiency by reducing the incentive to perform fundamental research. In addition, momentum and positive-feedback strategies, when excessively employed by HFT traders, make stocks deviate significantly from their fundamental values, so the market becomes less and less price efficient in the long run (De Long et al., 1990). In more recent research, Zhang (2010) draws the same conclusion, showing that HFT is negatively correlated with price efficiency and the market’s ability to reflect information.

4. Evidence against High Frequency Trading

4.1. Volatility implications

One of the widespread concerns about HFTs is that their activity could increase short-term volatility, which could ultimately harm non-HFTs. For example, in its concept release on equity market structure, the Securities and Exchange Commission (SEC) argued: “Short-term price volatility may harm individual investors if they are persistently unable to react to changing prices as fast as high frequency traders. […] Excessive short-term volatility may indicate that long-term investors, even when they initially pay a narrow spread, are being harmed by short-term price movements that could be many times the amount of the spread” (Securities and Exchange Commission, 2010).

However, the academic evidence on this matter is not entirely clear; hence we are not able to provide a single answer.

Brogaard (2010) uses a unique dataset from Nasdaq OMX that distinguishes HFT from non-HFT quotes and trades. He then runs an OLS regression to observe whether there is a relationship between HFT and volatility. The results suggest that HFT and volatility are not highly related, especially contemporaneously. Since he can identify high-frequency trades, he compares the price paths of stocks with and without HFT being part of the data-generating process, which also suggests that HFT reduces volatility to a degree. Similar results are obtained by Hasbrouck & Saar (2013), who use OLS to show that higher low-latency activity is associated with lower short-term market volatility. Finally, Hendershott & Riordan (2013) argue that AT demanding liquidity during times when liquidity is low could result in AT exacerbating volatility.

Contributor(s)