New Banking Crisis, No Double Dip and Flattening the Yield Curve for Growth

Jack Ewing, in The New York Times, writes about an impending banking crisis arising from the need for banks around the world to roll over more than $5 trillion in short-term debt by 2012. The problem is probably biggest for European banks, but about $1.3 trillion of the debt is in the U.S.

Dave Kansas, in the Wall Street Journal, writes about why the double dippers are wrong.

While I won’t dispute much of what Kansas cites as his reasons for optimism, I can’t help but notice that some of his points are simply expectations about what he thinks will happen. Ewing, on the other hand, discusses facts that are pretty cut and dried. The debate lies in the consequences of those facts.

Take a look at both articles. If you had to vote for one or the other as being a more significant discussion of what lies ahead, which would it be?

Now, take a look at what Caroline Baum has to say at Bloomberg.com. She says that a recession is virtually impossible with zero interest rates at the short end of the yield curve unless long rates also go to zero. This is close to what happened in Japan over the past twenty years, when short-term interest rates were held near zero much of the time. Baum does not discuss the Japanese experience in her article.

Baum says that a flat or inverted yield curve is a necessary condition for a recession. It is true that one has occurred before every U.S. recession of the last 40+ years. It is also true that a steep yield curve, 300 basis points or more, has traditionally preceded or accompanied strong growth. We now have a yield curve steepness well above 300 bp.
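For concreteness, steepness here is simply the spread between a long-maturity and a short-maturity Treasury yield, expressed in basis points. The yields in the sketch below are illustrative placeholders, not quotes from any of the articles discussed:

```python
# Yield curve steepness measured as the long-minus-short Treasury spread, in basis points.
# The yields used here are illustrative placeholders, not market quotes from the articles.

def steepness_bp(short_yield_pct: float, long_yield_pct: float) -> float:
    """Spread between the long and short yields, expressed in basis points."""
    return (long_yield_pct - short_yield_pct) * 100

short_rate = 0.2   # a near-zero short-term rate, as discussed above (placeholder)
long_rate = 3.7    # a longer-maturity yield (placeholder)

print(f"Steepness: {steepness_bp(short_rate, long_rate):.0f} bp")  # 350 bp, i.e. above 300 bp
```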

However, banking analyst Chris Whalen has suggested that growth could be encouraged by flattening the yield curve with an increase in the short-term rate. Whalen’s logic seems quite perverse, but it is centered on the thesis that flattening the yield curve would decrease the attractiveness of the Treasury “curve trade” and cause banks to start seeking private lending opportunities. One weakness in Whalen’s argument is that there seems to be little demand in the private sector for loans. The low level of C&I (commercial and industrial) lending by banks was discussed yesterday.

If you think that anyone has figured out what the coming months, quarters and even years hold in store, then you had better stop reading what others are saying. Pick several essays at random and each is likely to have major points in disagreement with all the others.

Disclosure: No positions.

S&P 500 Earnings Can Move Higher

The following graph shows inflation-adjusted earnings for the S&P 500 for the past 75 years. I often see patterns when looking at charts and have added some visual “guidelines” to the graph.

The implication is that earnings are right where they belong, if the growth trends of the past 40 years are to be continued. This is debatable, of course, for the following reasons:

1. Unprecedented unemployment problems compared to the last 40 years;

2. Continued deleveraging from the credit bubble;

3. Higher energy costs likely inhibiting any growth scenario;

4. Lack of any new economic expansion driver being evident; and

5. The current rebound being largely the result of temporary inventory rebuilding.

Other arguments could be used to counter the idea that S&P 500 earnings are right where they belong. One is that we have rebounded too fast, resulting in an overshoot that will be pulled back as inventory rebuilding subsides and expense/revenue ratios increase.

Part of the rebound in earnings has resulted from cutting expenses, particularly payroll expenses. If employment starts to improve and payrolls increase, productivity growth will probably slow, and earnings may give back some of the recent gains. Of course, increased employment should increase consumption. However, a more conservative attitude toward spending may be the new normal. In such a situation, payrolls could increase faster than earnings.

Jeff Miller has questioned a number of things about the discussion up to this point. This will address some of his concerns.

Here is another view of the earlier graph:

This view of the graph sees only one excursion outside the trend: the crash of 2008-09. In this view there was no dot-com stock bubble and no credit-derived stock bubble; those two peaks are just high points within the 75-year trend. The trend in real earnings here is more than double the one previously proposed as normal: growth of about 125% over 45 years (vs. 50% in the earlier view), which works out to about 1.8% per year compounded.
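As a quick check on the compounding arithmetic (assuming the 50% figure refers to the same 45-year span):

```python
# Quick check of the compound annual growth rates quoted above:
# 125% total real earnings growth over 45 years vs. 50% over the same span.

def annualized(total_growth: float, years: int) -> float:
    """Convert total fractional growth over `years` into a compound annual rate."""
    return (1 + total_growth) ** (1 / years) - 1

print(f"125% over 45 years: {annualized(1.25, 45):.1%} per year")  # ~1.8%
print(f" 50% over 45 years: {annualized(0.50, 45):.1%} per year")  # ~0.9%
```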

Jeff has also questioned the adjustment of S&P 500 earnings per share for inflation. He raises a good point; I have not justified that. It may be more meaningful to adjust total corporate earnings for inflation, which would be a stronger measure of real earnings for stocks. Adjusting earnings per share may or may not track the total.

I also think that real corporate earnings should be normalized to population to get a good measure of national productivity, insofar as corporate earnings reflect that. All of these questions will be examined as I develop future articles in my series on the business cycle.

The final takeaway from this discussion is that projections of further real earnings growth in 2010 and 2011 amount to predicting either (1) a new earnings bubble, (2) a new normal that establishes a higher trend rate for earnings growth, or (3) a continuation of the 75-year trend.

Of course, there may be no further S&P 500 earnings growth in 2010 and 2011, in which case the entire discussion of a bubble or a magical new normal becomes moot.

Many thanks to ChartOfTheDay.com for another seemingly simple chart that evoked great discussion. This article could be called a “macro view,” as opposed to a bottom-up, company-by-company analysis of earnings potential. I believe that “micro view” has more merit, but the long-term trend analysis does add an element of perspective for me.

Disclosure: No positions.

On Leverage

Steve Hanke at The Cato Institute has constructed the following picture of worldwide leverage (here), based on dollar-denominated activities.

This picture assumes that the leverage of the world is balanced on the financial structure of the United States. In view of the worldwide ownership of U.S. sovereign debt and dollar-denominated financial derivatives, this is very logical.

Use of Leverage

Accepting the concept that the U.S. financial structure is leveraged as shown in Hanke’s diagram, and that the world financial structure is leveraged off the U.S. financial structure, let’s look at how levers are supposed to work. The three classes of lever are shown in the following diagram from TheFreeDictionary.com (here).

The world financial system presumably acts as the fulcrum, multiplying the leverage available for work in the world that depends on financing productive activities. The fulcrum Hanke has drawn looks nothing like the fulcrum in any of the three stable mechanical systems. In fact, if you substitute Hanke’s inverted triangle for the fulcrum in any of the three lever diagrams, even the slightest load or effort tips the fulcrum off its balancing point and the mechanical system collapses.
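For reference, here is the standard law of the lever that underlies the analogy (basic physics, not anything taken from Hanke’s piece); it is also the sense in which “mechanical advantage” is used near the end of this discussion:

```python
# Law of the lever (standard physics, for reference): effort x effort-arm = load x load-arm,
# so a small effort applied far from the fulcrum balances a large load close to it.
# The arm lengths below are arbitrary illustrative values.

effort_arm = 10.0   # distance from the applied effort to the fulcrum
load_arm = 0.5      # distance from the load to the fulcrum
mechanical_advantage = effort_arm / load_arm

effort = 1.0        # units of force applied
balanced_load = effort * mechanical_advantage

print(f"Mechanical advantage: {mechanical_advantage:.0f}x")
print(f"A {effort:.0f}-unit effort balances a {balanced_load:.0f}-unit load")
```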

Over Leverage

I would suggest that the financial system, existing in a vacuum, can remain stable, balanced on the point of the inverted pyramid. But when financial leverage becomes extreme, the internally stable structure cannot withstand any external force (economic activity) from outside the complex interrelationships that are so carefully balanced on that single point.

The situation described in the previous paragraph is called a metastable condition in the physical world. Stability exists only in the absence of a stimulus sufficient to initiate movement to a more stable condition. A metastable condition can be compared to a balancing act.

Systems of fractional reserve banking are used to multiply the effect of a given stock of money. One dollar of reserves can have the effect of several dollars when credit several times the value of the reserve dollar is created. This works well when the credit is used to increase production of things of utility. It leads to metastable conditions when the credit is used simply to create more credit.
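Here is a minimal sketch of the textbook deposit-expansion arithmetic, assuming an illustrative 10% reserve ratio (a round number chosen for the example, not a regulatory figure):

```python
# Textbook deposit-expansion sketch: one dollar of reserves supporting several dollars
# of credit under a fractional reserve requirement. The 10% reserve ratio is an
# illustrative assumption, not a figure from this article.

def deposit_expansion(initial_deposit: float, reserve_ratio: float, rounds: int = 50) -> float:
    """Sum the deposits created as each loan is redeposited and re-lent."""
    total_deposits = 0.0
    deposit = initial_deposit
    for _ in range(rounds):
        total_deposits += deposit
        deposit *= (1 - reserve_ratio)   # the bank lends out everything above the required reserve
    return total_deposits

reserve_ratio = 0.10
created = deposit_expansion(1.00, reserve_ratio)
print(f"Total deposits supported by $1.00 of new reserves: ${created:.2f}")
print(f"Theoretical money multiplier (1 / reserve ratio): {1 / reserve_ratio:.1f}x")
```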

Valueless Money

Credit is surrogate money. Money has no value in itself; only the things it can be exchanged for have value. If credit is used to do nothing but create more surrogate money, it is valueless. A measure of marginal valuelessness is inflation. It can be argued that 95% of today’s U.S. dollar is valueless: that is the share of the dollar’s purchasing power destroyed by inflation over the past century.

The above argument overstates the loss of value, because many things of utility have been created during the century that did not exist at its beginning. But it is certainly true that a significant part of the inflation of the past hundred years represents no value. A Ph.D. candidate could write a thesis analyzing the value obtained and the valuelessness accumulated over the past century.
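As a rough check on the 95% figure, here is the purchasing-power arithmetic, assuming an average inflation rate of about 3% per year over the century (a round-number assumption for illustration, not a measured figure):

```python
# Rough purchasing-power check for the "95% valueless" claim. The 3% average annual
# inflation rate over 100 years is an assumption chosen for illustration, not a
# measured figure from this article.

avg_inflation = 0.03
years = 100

price_level_multiple = (1 + avg_inflation) ** years       # prices roughly 19x higher
remaining_purchasing_power = 1 / price_level_multiple     # about 5 cents on the dollar
value_destroyed = 1 - remaining_purchasing_power          # about 95%

print(f"Price level multiple after {years} years: {price_level_multiple:.1f}x")
print(f"Remaining purchasing power of $1.00: ${remaining_purchasing_power:.2f}")
print(f"Share of value destroyed by inflation: {value_destroyed:.0%}")
```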

Limits of Leverage

I expect that such a research project would produce a curve such as the following, which would define the limits of leverage:

This curve could have as much (and as little) value as the famous (infamous) Laffer Curve relating the amount of tax revenue to the level of taxation (here). It has a relationship to the following graph, based on real data, which is adapted from what I published earlier this year in “The Declining Usefulness of Debt” (here). Go to that article for details and acknowledgements regarding the origin of the relationship graphed.

The above graph shows that the improvement in GDP per added dollar of debt has been declining for the past 40 years and is now approaching only ten cents of increased GDP for every added dollar of debt. The debt referred to here is total debt: all public and private debt combined.
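For anyone who wants to reproduce that kind of chart, here is a minimal sketch of the marginal “GDP added per dollar of added debt” calculation. The sample series are placeholder numbers used only to show the mechanics; they are not the data behind the graph:

```python
# Sketch of the marginal "GDP added per dollar of added debt" calculation.
# The series below are placeholder values for illustration only; they are not
# the data underlying the graph referenced in the article.

def marginal_gdp_per_dollar_of_debt(gdp, total_debt):
    """Change in GDP divided by change in total (public + private) debt, period over period."""
    ratios = []
    for i in range(1, len(gdp)):
        delta_gdp = gdp[i] - gdp[i - 1]
        delta_debt = total_debt[i] - total_debt[i - 1]
        ratios.append(delta_gdp / delta_debt if delta_debt else float("nan"))
    return ratios

# Placeholder annual series (in $ trillions), purely to show the mechanics.
gdp_sample = [13.0, 13.5, 14.0, 14.4]
debt_sample = [40.0, 43.0, 46.5, 50.5]

print([round(r, 2) for r in marginal_gdp_per_dollar_of_debt(gdp_sample, debt_sample)])
# -> [0.17, 0.14, 0.1]: each added dollar of debt buying less and less added GDP
```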

Summary

Archimedes famously said, “Give me a place to stand and with a lever I will move the whole world.” He didn’t mention the fulcrum; that was simply implied. Today, in finance, the fulcrum is the crux of the matter, and it has been neglected. In addition, if the lever becomes long enough to provide a very large mechanical (or financial) advantage, destabilizing forces can turn the slightest deviation from equilibrium into a loss of control and a crash.

Archimedes may have been able to move the world with a big enough lever, but it would be quite another matter for him to have controlled the movement.

Healthcare Costs and the Human Genome

Nicholas Wade had a very interesting article Tuesday morning in The New York Times (here). He reports that Dr. Stephen R. Quake, a Stanford University engineer, has invented, designed and built a machine called the Heliscope Single Molecule Sequencer. The machine is about the size of a refrigerator and costs $1,000,000 to buy from Helicos Biosciences, a company founded by Dr. Quake.

The machine works by splitting the double helix of DNA into single strands and breaking the strands into small fragments that average 32 DNA units in length. Light emitted when a new helix is formed on these fragments characterizes the sequence of the original DNA. The light is analyzed by computer and compared to human genome sequences already on file. This is done for billions of these small fragments, and the differences between the test sequences and those most common in the general population identify what makes that individual unique.

Only Seven Individual Genomes Recorded to Date

Dr. Quake has decoded his own genome. He says the cost was $50,000. It is not clear whether this was the operational cost only or includes some depreciation allowance for the machine; I infer that some machine capital cost was included, based on the time of analysis (four weeks) and the human involvement (three people). I was amazed to learn that Dr. Quake is only the seventh individual in history to have his genome decoded. The first individual human genome decoding was in 2003. Only two of these individuals are named: J. Craig Venter, a pioneer of DNA decoding, and James D. Watson, the co-discoverer of the DNA double helix. Dr. Quake has compared his genome with those two and found the DNA overlaps shown in the following graphic from the NYT.

In addition to the seven individuals, the Human Genome Project has mapped the DNA sequences of a large group collectively to represent a human population “mosaic”. The project website is here. Although the Human Genome Project was declared complete in 2003, approximately 8% of the human sequence remains to be characterized, according to Wikipedia.

Technology Roadmap

Wade gives a brief history and projection of the future:

For many years DNA was sequenced by a method that was developed by Frederick Sanger in 1975 and used to sequence the first human genome in 2003, at a probable cost of at least $500 million. A handful of next-generation sequencing technologies are now being developed and constantly improved each year. Dr. Quake’s technology is a new entry in that horse race.

Dr. Quake calculates that the most recently sequenced human genome cost $250,000 to decode, and that his machine brings the cost to less than a fifth of that.

“There are four commercial technologies, nothing is static and all the platforms are improving by a factor of two each year,” he said. “We are about to see the floodgates opened and many human genomes sequenced.”
He said the much-discussed goal of the $1,000 genome could be attained in two or three years. That is the cost, experts have long predicted, at which genome sequencing could start to become a routine part of medical practice.

(My underlining added for emphasis.)

The dramatic progression of cost per individual for genome characterization is shown in the following graph:

Dr. Quake’s process is accurate to within 5 errors per 100,000 sequences. George Church, a leading biotechnologist at Harvard Medical School, has said that the next real breakthrough in this technology should bring the cost to $5,000 per individual with only one error per 100,000 sequences. I have put my SWAG (Stupid Wild Ass Guess) for the possible timing of Church’s breakthrough requirement on the graph in red. The reader should recognize the great uncertainty in placing a breakthrough on a future timeline.

What is the Potential Market for Decoding Machines?

That is a difficult question to answer because the competing technologies are still evolving rapidly. With Dr. Quake’s machine, the analysis took four weeks and involved three people. To make market estimates, one has to make assumptions about how the analysis time per person will decline and how many individuals will be decoded per unit of time.

One thing is evident: at whatever level of usage, the cost of the machines is a small part of the total cost. Look at 100,000 people per year with the Quake machine (red region). Assume a 10-year machine life and $20,000 per test (40% of Dr. Quake’s processing cost of $50,000, assuming economies of scale with repetition of procedures). With these assumptions, the machine cost (at $1 million per machine) of $7.69 billion is nearly 40% of the cost of 10 years of processing ($20 billion). If the machine life is 20 years, the machine cost falls to roughly 20% of the processing cost.

If we assume that the future high-throughput technology (blue region) can be accomplished with machines costing $1 million, the capital cost for 100,000 tests per year falls to $962 million. Processing costs (at $5,000 per test) for 10 years are $5 billion. Capital cost is now less than 20% of the processing cost. Only at $1,000 per individual do the two costs become comparable over 10 years.
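Here is a back-of-the-envelope sketch of the arithmetic behind both scenarios. The throughput figures (about 13 tests per machine per year today, i.e., four weeks per test, and roughly 104 tests per machine per year for the future technology, the rate that reproduces the $962 million capital figure) are implicit assumptions, not vendor specifications:

```python
# Back-of-the-envelope capital vs. processing cost comparison for genome sequencing.
# Throughput figures are reconstructions of the assumptions above, not vendor specs:
#   current machine: 4 weeks per test  -> ~13 tests per machine per year
#   future machine:  ~104 tests per machine per year (reproduces the $962M capital figure)

def scenario(tests_per_year, tests_per_machine_per_year, machine_price,
             cost_per_test, machine_life_years):
    machines_needed = tests_per_year / tests_per_machine_per_year
    capital_cost = machines_needed * machine_price
    processing_cost = tests_per_year * cost_per_test * machine_life_years
    return capital_cost, processing_cost

# Quake machine today: 100,000 tests/yr, $20,000/test, 10-year machine life
cap, proc = scenario(100_000, 13, 1_000_000, 20_000, 10)
print(f"Current: capital ${cap/1e9:.2f}B vs. processing ${proc/1e9:.0f}B ({cap/proc:.0%} of processing)")

# Future high-throughput machine: $5,000/test, 10-year life
cap, proc = scenario(100_000, 104, 1_000_000, 5_000, 10)
print(f"Future:  capital ${cap/1e9:.3f}B vs. processing ${proc/1e9:.0f}B ({cap/proc:.0%} of processing)")
```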

The potential market for these machines appears to be of the order of 1,000 to a few thousand machines per year, or an annual cap ex of $1 to $5 billion. This will depend on how much benefit can be obtained from application of individual genomes to curing and preventing disease. That question will be discussed next.

What Benefits Might Result?

Two outcome benefits are expected. One is the identification of genetic predispositions to specific diseases, enabling early intervention to prevent disease development or slow progression from the early stages. The second is more accurate diagnosis of presented symptoms, enabling focused treatments and reducing the use of other diagnostic testing.

Are the benefits worth the cost? That is difficult to answer because application of the technology is in its infancy. Many diseases turn out to be caused by complex combinations of a number of rare variants, and identifying the specific DNA sequences responsible has often not been possible. It is possible that statistical comparison of each individual against a pool of genomes from individuals with specific diseases will be the first procedure used to narrow the scope of further diagnostic tests. Eventually, of course, specific complex genetic patterns may be identified; early applications are likely to be probabilistic.

Is There an Economic Case?

An assessment can be made with a macro analysis. The total U.S. health care bill is of the order of $2 trillion annually. If we take 1,000,000 DNA decoding tests annually as a benchmark, the cost would be about $11 billion per year, with a 10-year cap ex amortization. At this level the program would cost less than 1% (about 0.55%) of total health care spending, so the cost barrier to implementation is low. If savings of even a few percent are realized by eliminating unnecessary tests and errant therapies resulting from misdiagnosis, and by treating early and preventing many diseases, this could have a huge cost reduction impact and improve health outcomes for the populace.
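For those who want to check the arithmetic, here is one set of assumptions that reproduces the roughly $11 billion and 0.55% figures above. The $10,000 per-test processing cost and the per-machine throughput are illustrative assumptions I am filling in; they are not hard numbers from the Wade article:

```python
# Macro cost sketch for 1,000,000 genome decodings per year. The $10,000 per-test
# processing cost and ~104 tests/machine/year throughput are illustrative assumptions
# chosen to reproduce the ~$11B and 0.55% figures in the text.

tests_per_year = 1_000_000
cost_per_test = 10_000            # assumed processing cost per genome
tests_per_machine_per_year = 104  # assumed throughput per machine
machine_price = 1_000_000
amortization_years = 10

processing_per_year = tests_per_year * cost_per_test
capital_per_year = (tests_per_year / tests_per_machine_per_year) * machine_price / amortization_years
total_per_year = processing_per_year + capital_per_year

us_healthcare_spend = 2_000_000_000_000   # roughly $2 trillion per year

print(f"Annual program cost: ${total_per_year / 1e9:.1f}B")
print(f"Share of U.S. health care spending: {total_per_year / us_healthcare_spend:.2%}")
```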

The cost reductions and outcome improvements should not be measured only against the year in which the sequencing expense is incurred. An individual’s genome is time invariant (barring a spontaneous mutation), so all future diagnoses and treatments will benefit from the one-time sequencing.

Be Careful What You Wish For

In the NYT article, Dr. Quake explains an alarming discovery in his own genome: he found that he carries a marker for heart disease. From the NYT:

Dr. Quake said that analysts were annotating his genome and had found a variant associated with heart disease. Fortunately, Dr. Quake inherited the variant from only one parent; his other copy of the gene is good.
“You have to have a strong stomach when you look at your own genome,” he said.
Dr. Quake said he was making his genome sequence public, as Dr. Venter and Dr. Watson have done, to speed the advance of knowledge.
Some people may decline the opportunity to have their genome analyzed. They may prefer to follow the maxim: Ignorance is bliss. Others may worry that “defects” in their genome could be used to discriminate against them, in employment or with respect to life insurance, for example.

If it can be determined that medical outcomes are much improved for many people at lower cost, lowered medical insurance premiums could become an incentive to participate in genome screening. And if someone prefers to pay the higher cost of ignorance and accept the risk of being surprised some day by an advanced stage of a disease that might have been preventable, of course they should have that choice.

Conclusion

We have a medical and technological revolution unfolding. The first process for decoding the human genome was developed by Frederick Sanger in 1975. It took many years for the early work to evolve to the Human Genome Project and, from there, to the new decoding procedures of today. The progress in the near future is expected to be very rapid. In the NYT article, Nicholas Wade presented the following quote from Dr. Quake:

“There are four commercial technologies, nothing is static and all the platforms are improving by a factor of two each year,” he said. “We are about to see the floodgates opened and many human genomes sequenced.”

With America looking for ways to bring a new level of cost control to spiraling health care costs, what could be better than new technology that not only reduces costs but also produces better outcomes?

By John Lounsbury, http://piedmonthudson.wordpress.com
John Lounsbury — Seeking Alpha