Saturday, 31 December 2011

Buses are another matter

From Matt Turner:


First, two commonly suggested responses to traffic congestion—expansions of the road and public transit network—do not appear to have their desired effect:  road and public transit expansions should not be expected to reduce congestion.  Second, traffic levels do not help to predict which cities build roads. Therefore, new roads allocated to metropolitan areas on the basis of current rules are probably not built where they are most needed, which suggests that more careful reviews of highway expansion projects be required. Third, reductions in travel time caused by an average highway expansion are not sufficient to justify the expense of such an expansion. Whether or not other benefits of these expansions may justify their expense remains unresolved. In any case, expansions of the bus network are more likely to pass a cost–benefit test than expansions of the highway network.

No wonder he can't understand benefit-cost analysis.

In his advocacy for California High-Speed Rail, Will Doig can't even get the population of California right.  He says that over the course of this century, the population will more than double, from 25 million to 60 million.  Well, at the beginning of this century, the census population was around 34 million.

Of course, you can find this out in thousands of places.  He might start here--at the Census web site.  But of course, it is a lot easier just to make stuff up, which is something that rail advocates enjoy doing.  They are about as reliable as the intelligent design people--just cuddlier and not as dangerous.


Choice words from William Black (h/t Rik Osmer)

He writes:

If one had to pick one person in the private sector most responsible for causing the global financial crisis it would be Wallison.  As I explained, he is the person, who with the aid of industry funding, who has pushed the longest and the hardest for the three “de’s.”  It was the three “de’s” combined with modern executive and professional compensation that created the intensely criminogenic environments that have caused our recurrent, intensifying crises.  He complained during the build-up to the crisis that Fannie and Freddie weren’t purchasing more affordable housing loans.  He now claims that it was Fannie and Freddie’s purchase of affordable housing loans that caused the crisis.  He ignores the massive accounting control fraud epidemics and resulting crises that his policies generate.  Upon reading that Fannie and Freddie’s controlling officers purchased the loans as part of a fraud, he asserts that the suit (which refutes his claims) proves his claims.

The piece is long, but worth reading in its entirety. 

Stegman as Geithner's Advisor on Housing

The news that Michael Stegman will be taking a leave from MacArthur to advise Tim Geithner on housing is very good.  It is important for three reasons: (1) Mike has been a leading sensible voice on housing issues for at least 30 years; (2) Treasury has recognized the importance of having an in-house housing person at a senior level; (3) Mike will remind Geithner that users of housing are at least as important as those who lend for housing.


Tuesday, 27 December 2011

Jeremy Stein for Fed Governor (reprise)

Personally, I am a big fan of Stein's work. The shortest way to explain why is to list the titles of his five most cited papers:

  • Herd Behavior and Investment
  • A Unified Theory of Underreaction, Momentum Trading and Overreaction in Asset Markets
  • Risk Management: Coordinating Corporate Investment and Financing Policies
  • Bad News Travels Slowly: Size, Analyst Coverage and the Profitability of Momentum Strategies
  • Internal Capital Markets and the Competition for Corporate Resources.

Stein has spent his career trying to figure out how capital markets really work instead of pledging fealty to models that don't work very well.  I can't think of a better intellectual qualification for a Federal Reserve Board member.

Sunday, 25 December 2011

Joe Nocera nails it

He writes:

...Peter Wallison, a resident scholar at the American Enterprise Institute, and a former member of the Financial Crisis Inquiry Commission, almost single-handedly created the myth that Fannie Mae and Freddie Mac caused the financial crisis. His partner in crime is another A.E.I. scholar, Edward Pinto, who a very long time ago was Fannie’s chief credit officer. Pinto claims that as of June 2008, 27 million “risky” mortgages had been issued — “and a lion’s share was on Fannie and Freddie’s books,” as Wallison wrote recently. Never mind that his definition of “risky” is so all-encompassing that it includes mortgages with extremely low default rates as well as those with default rates nearing 30 percent. These latter mortgages were the ones created by the unholy alliance between subprime lenders and Wall Street. Pinto’s numbers are the Big Lie’s primary data point.

Two things: First, Pinto and Wallison's definition of "subprime" is any loan that goes to a neighborhood they wouldn't live in or to a person they wouldn't have lunch with.  According to the American Housing Survey, there were around 52 million mortgages outstanding in the US in 2009.  This means that according to Wallison and Pinto, the median borrower is a subprime borrower.  I guess this means they think that half of homeowners with mortgages should be renting in Potterville.

Second, in his piece, Nocera should put quotes around the word "scholar."

Friday, 23 December 2011

Simon Johnson underlines a problem...one that could point to a solution.


He writes:


Santa Claus came early this year for four former executives of Washington Mutual, which failed in 2008. The executives reached a settlement with the FDIC, which sued them for taking huge financial risks while “knowing that the real estate market was in a ‘bubble.’ ” The FDIC had sought to recover $900 million, but the executives have just settled for $64 million, almost all of which will be paid by their insurers; their out-of-pocket costs are estimated at just $400,000.
To be sure, the executives lost their jobs and now must drop claims for additional compensation. But, according to the FDIC, the four still earned more than $95 million from January 2005 through September 2008. This is what happens when financial executives are compensated for “return on equity” unadjusted for risk. The executives get the upside when things go well; when the downside risks materialize, they lose nothing (or close to it).
Just thinking aloud here, but if bank executives were compensated based on return on assets (i.e., the returns to both debt and equity), rather than return on equity, a lot of the misaligned incentives in their pay packages would go away.  Among other things, it would discourage races to the bottom.
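To make the point concrete, here is a toy sketch of why compensating on raw ROE rewards leverage while ROA does not. The balance-sheet numbers below are hypothetical illustrations, not any actual bank's figures:

```python
# Toy comparison: two banks with identical return on assets (ROA) but
# different leverage.  Compensating on raw ROE rewards the riskier
# balance sheet even though asset performance is the same.

def roe_from_roa(roa: float, assets: float, equity: float) -> float:
    """Return on equity implied by a given ROA and capital structure."""
    return roa * assets / equity

# Both banks earn 1% on $100 of assets; Bank B holds only $4 of equity.
bank_a = roe_from_roa(roa=0.01, assets=100.0, equity=10.0)  # 10x leverage
bank_b = roe_from_roa(roa=0.01, assets=100.0, equity=4.0)   # 25x leverage

print(f"Bank A ROE: {bank_a:.1%}")
print(f"Bank B ROE: {bank_b:.1%}")
# Judged on ROA, the banks are identical; judged on unadjusted ROE, the
# more leveraged bank looks 2.5 times better -- with none of the extra
# downside risk charged back to the executives.
```

Because ROA treats returns to debt and equity symmetrically, a manager can't improve the metric simply by shrinking the equity cushion.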


Why Fannie and Freddie will likely last

I was talking with SF Chronicle columnist Kathleen Pender yesterday, and she shared a trenchant observation:  now that Congress has figured out a way to use the GSEs to raise revenue (via raising G-fees), it will always have an incentive to keep them.  Specifically, Congress has now tied itself to the GSEs, because it will take a while for increased G-fees to repay the cost of the payroll tax cut.


Wednesday, 21 December 2011

Why to worry about Chinese house prices.

Getting good data from China is problematic, but it is pretty clear house prices there are falling (see here, here and here).  At first blush, this shouldn't cause too much worry, because the Chinese use far less leverage to buy houses than Americans, and so the probability of being upside down there remains pretty low.

The problem, however, is that municipalities in China have lots of debt.  The actual amount is controversial, but the fact that it is a lot is not.  Chinese municipalities service debt using land sales.  So if property values fall a lot.....

Saturday, 17 December 2011

It is possible to hold the following two views at the same time

(1) The executives for Fannie Mae and Freddie Mac should be held to account for their contributions to the crisis; and

(2) Compared with banks, shadow and otherwise, Fannie and Freddie were pikers in their contributions to the crisis.


Thursday, 15 December 2011

A New Survey on Information Management

I'd like to interrupt the usual programming here to ask you a favor.

The startup I'm working on will ship its first product in 2012.  As part of our development, we'd like to get some data on how people are affected by information overload.  We hope our product will help with that problem, but we need to understand better how people feel about the problem and what they're doing about it today.  So we're doing a survey.

I think that you, the folks who read Mobile Opportunity, are a very good cross-section of technology users, so I'd like to ask you to take the survey.  I know you have much better things to do with your time than fill out a survey, but we could really use your help.  It'll take about ten minutes, and it's almost all multiple choice.  I'll share the results here, so you can learn more about your fellow readers and how you compare to them.

To go to the survey, click here.  Note: The survey is now closed.  You can read about the results here.

Once our company gets closer to launching, I will start up a separate weblog to talk about the product.  I'll also keep on writing Mobile Opportunity, with its current focus.

Thanks in advance for your help.  I really appreciate it.  And I'll have a new post for you next week.  It's a pretty long one that I've been working on for a while.

Why don't economists have more influence in the White House?

I was talking to someone who was an official in the administration about this.  The person told me the problem is, in part, that economists have poor social skills.  Maybe as part of grad school there should be a one-credit charm school elective.

Wednesday, 07 December 2011

Who would pay a 73 percent income tax? Not necessarily the rich.

A paper which is receiving considerable attention (see here, here and here) is Diamond and Saez's Journal of Economic Perspectives piece on optimal marginal tax rates.  They put the rate at 73 percent, and declare it an optimum because it would maximize revenue that could then be used for other things.  In particular, they argue that the utility lost to the rich would be much less than the utility gained by lower income people via government programs.  I do believe that many government programs leave people better off, but I am skeptical about whether the optimal size of government is that which is supported by a revenue maximizing income tax.

In any event, one aspect of the paper bothers me: if one searches for the word "incidence," it is not found.  Incidence reflects who really bears the burden of a tax.  If one taxes a person or a business, they might absorb it, or they might pass it on to someone else.

The formula for the share of the incidence of a tax borne by those who demand a taxed good is (Supply Elasticity)/(Supply Elasticity - Demand Elasticity).  (I apologize for not having elegant formulas--I don't know how to paste them into Blogger).  Because demand curves are generally downward sloping, demand elasticity has a negative sign, so the demanders' share reflects how large the supply elasticity is relative to the sum of the absolute values of the demand and supply elasticities.

Now let's think about supply elasticity at the revenue maximizing point.  It is exactly one, in that the reduction in labor offered exactly offsets any increase in the rate.  To illustrate, let us just assume for a moment that demand elasticity is -1.  Then half the incidence of the tax is on the supplier of labor or capital (a.k.a. the rich) and half the incidence is on the demander.  This means that the burden on the rich person is 36.5 percent, not 73 percent.

What we do know is that as tax rates fall, the supply elasticity of the wealthy falls.  Why?  Because we know that at lower tax rates, raising rates raises revenue: the supply response to an increase in taxes is smaller.  Let's assume that at a 50 percent marginal tax rate, the elasticity of labor supply for the rich is .25.  Now the incidence on demanders is .25/1.25, or 20 percent of the tax burden; 80 percent falls on the rich.  Hence with a 50 percent tax rate, the effective tax on the rich is 40 percent, which is higher than the 36.5 percent they would bear with a 73 percent rate!
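The arithmetic in the last two paragraphs is easy to check in a few lines. The elasticity values below are the post's illustrative assumptions, not empirical estimates:

```python
# Check of the incidence arithmetic above.  Elasticity values are the
# post's illustrative assumptions, not empirical estimates.

def demander_share(supply_elast: float, demand_elast: float) -> float:
    """Share of a tax borne by demanders: Es / (Es - Ed), where Ed < 0."""
    return supply_elast / (supply_elast - demand_elast)

def burden_on_supplier(rate: float, supply_elast: float, demand_elast: float) -> float:
    """Effective tax rate borne by the (rich) supplier of labor or capital."""
    return rate * (1.0 - demander_share(supply_elast, demand_elast))

# 73% revenue-maximizing rate, supply elasticity 1, demand elasticity -1:
print(burden_on_supplier(0.73, 1.0, -1.0))   # 0.365 -> 36.5% borne by the rich

# 50% rate with the smaller supply elasticity of 0.25:
print(burden_on_supplier(0.50, 0.25, -1.0))  # 0.4   -> 40% borne by the rich
```

The lower statutory rate shifts more of the burden onto the supplier precisely because labor supply responds less at lower rates.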

These arguments all depend on assumed elasticity parameters, and so it is important to estimate them as best as possible.  I should also note that I am all for raising taxes, including on myself, to pay for the many government services that I do support.  Somedays I think that if I could change the tax code, I would just raise my own taxes by ten percent and then have policy that assured that everyone with income greater than mine would pay an effective tax rate no lower than mine.


Tuesday, 06 December 2011

Is it gloom, or is it underwriting?

More depressing house price numbers from CoreLogic this morning, with prices falling 1.3 percent month-over-month.  The National Association of Realtors says buyer traffic is down.

The fundamentals for buying right now are actually good.  Trulia's most recent calculation of the cost of owning vs the cost of renting shows that in 74 percent of cities, the cash flow cost of owning is less than the cash flow cost of renting, and I don't think this takes into account the tax benefits of owning.  One city where the price to rent ratio is out of whack--New York--has such a strange housing market that it is hard to know what to make of it; the other outlier is Fort Worth, and I really don't know what to make of that.

Since the Trulia calculations were released, rents are up a bit, house prices are down a bit, and mortgage interest rates have fallen, so buying should be even more attractive relative to renting.  To bring things a little closer to home, I am currently refinancing my house, and should I get the new mortgage, there is simply no way I could rent my house for less than the cost of owning (and I am including "hidden" costs of owning, such as maintenance).
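The cash-flow comparison behind this can be sketched quickly. Every number below (house price, down payment, rate, rent, property tax and maintenance rates) is a hypothetical placeholder, not Trulia's data and not my actual refinance terms:

```python
# Own-vs-rent cash-flow sketch.  Every number below is a hypothetical
# placeholder -- not Trulia's data and not actual refinance terms.

def monthly_mortgage_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard fixed-rate amortization payment: P*r / (1 - (1+r)^-n)."""
    r = annual_rate / 12.0
    n = years * 12
    return principal * r / (1.0 - (1.0 + r) ** -n)

def monthly_cost_of_owning(price: float, down_frac: float, annual_rate: float,
                           years: int, prop_tax_rate: float, maint_frac: float) -> float:
    """Mortgage payment plus 'hidden' carrying costs (property tax, maintenance).

    Ignores the tax benefits of owning, so it understates the case for buying.
    """
    loan = price * (1.0 - down_frac)
    payment = monthly_mortgage_payment(loan, annual_rate, years)
    carrying = price * (prop_tax_rate + maint_frac) / 12.0
    return payment + carrying

own = monthly_cost_of_owning(price=300_000, down_frac=0.20, annual_rate=0.04,
                             years=30, prop_tax_rate=0.011, maint_frac=0.01)
rent = 1_800.0  # hypothetical market rent for a comparable house
print(f"own: ${own:,.0f}/mo vs rent: ${rent:,.0f}/mo")
```

With these placeholder numbers, the monthly cash cost of owning comes in below the rent even after loading in property tax and maintenance.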

So why aren't we seeing a surge in buying?  The first possibility is that people expect rents, and therefore house prices, to fall.  I think falling rent in the near future is unlikely--multifamily vacancies have dropped a lot, and there has not been much new construction. The second is that people are gloomy about their income prospects, and don't want to be caught up in an illiquid investment like a house.  This is likely.  And third, there may be households who want to buy--who might have even qualified to buy in the years before the subprime nonsense--who simply can't get a loan.  Until lenders are more forward looking, it is hard to imagine housing getting off the floor. 

Sunday, 04 December 2011

Mark Thoma on the Ec 10 Walk-out

I really liked this:


I was going to stay out of this, partly because I don't find this particular expression of the protest very compelling, but I'll add one thing. A big part of the problem is what we are not supposed to talk about in economics, the politics that surrounds the profession and, in particular, policy prescriptions (and don't let Mankiw kid you, through the things he chooses to link, say on his blog, etc., he plays the political game, and plays it fairly well). The fact that one introductory class at Harvard has this much power to affect the national narrative is part of the problem not the solution. It is yet another reminder of just how concentrated power is in this society, and where it lies. Would a protest at a typical state university have gotten as much publicity? Nope. But when it's the institution that educates the rich and powerful, suddenly we are supposed to take note. And we do.

I started blogging in part because I was fed up with the way in which economic issues were presented at CNN and other mainstream news outlets prior to the Bush reelection. Those with the power to get on the air would make claims that were supposedly based upon economics, but were clearly false or at least highly misleading, and they would do so without an effective challenge from the hosts/anchors or other guests. It clearly had an effect on the national conversation, but it was all based upon using economics as a political rather than an analytical tool. So I don't think the problem is what we teach in economics courses, though we could certainly improve in some dimensions. Most courses are careful to cover market failures, etc., and how those problems can be solved through various types of interventions. The problem is the way economics is used by those with a political agenda. If the powerful had an interest in promoting ideas about market failure and the need for government to fix the problem, we'd hear about these ideas endlessly in the media. But those with power want the ability to use it unconstrained by government or any other force, and it should be no surprise that anti-government, anti-regulation, and anti-tax ideas come to dominate the conversation.

One strange thing about introductory economics: it emphasizes the virtues of competitive markets.  Yet if markets are competitive, agents can't earn economic (i.e., abnormally large) profits.  Consider the implications of this as certain members of the political class praise the "job-creators."

Friday, 02 December 2011

New rule: if you are going to call yourself an economist, you need to know the meaning of a confidence interval

I was listening to NPR on the drive in to work this morning, and heard a man who was labeled "an economist" say that the job growth numbers were disappointing, because measured job growth in November was 120,000, whereas the consensus forecast for job growth was 130,000.

The Bureau of Labor Statistics puts the 90 percent confidence interval of the monthly job growth estimate for the establishment survey at plus or minus 100,000.  This means the standard error of the estimate is about 61,000, so the difference between the BLS estimate and the consensus forecast number was less than .2 standard errors, which is essentially zero.  I suppose Mr. Economist would be happy had the number come in at 140,000, which would have been above expectations.
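One way to redo this arithmetic, assuming the sampling error is approximately normal:

```python
# Convert the BLS 90 percent confidence interval into a standard error,
# then ask how big the "miss" from the consensus forecast really was.
# Assumes the sampling error is approximately normal.

Z_90 = 1.645                      # two-sided 90% critical value, standard normal

ci_half_width = 100_000           # BLS 90% CI for monthly establishment-survey job growth
std_error = ci_half_width / Z_90  # roughly 61,000

measured, consensus = 120_000, 130_000
miss_in_se = (consensus - measured) / std_error

print(f"standard error: {std_error:,.0f}")
print(f"miss in standard errors: {miss_in_se:.2f}")  # ~0.16 -- statistical noise
```

A miss of about a sixth of a standard error is indistinguishable from zero; a forecaster who treats it as news is reading tea leaves.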

One needn't have a Ph.D. in economics to understand confidence intervals--one undergraduate course in statistics will do the trick.  Yet week after week, I hear people who call themselves economists yammering on about small movements in numbers that are as likely noise as anything else. 

Thursday, 01 December 2011

‘CAPABILITY & COMPETENCE ADVANCEMENT AGENDA’ (C2A2)

In my previous editorial, I had commented on how modern-day multinational and transnational corporations should have a structured capability & competence development process in place to achieve long-term success! I also went ahead and presented the C2A2 theory (Capability and Competence Advancement Agenda) – a benchmark model that transnational organisations could and should implement off-the-rack for developing capabilities and competencies. In that very editorial, I had mentioned details on how the capabilities and competencies of any corporation were of various kinds, but broadly could be divided into a few types of Structural Capabilities; namely, Doorway Capabilities, Elemental Capabilities, Enrichment Capabilities, and Power Leadership Capabilities (for a better understanding, refer to my strategy book, CULT, which I’ve co-authored with Arindam Chaudhuri; a faster way would be to go to a--sandeep.blogspot.com and search for the term “C2A2”; yes, the term sounds so hackneyed, but you can’t miss it I guess).

Yes, the Structural Capabilities Architecture gives a great method to slot your capabilities and competencies; but at the same time, capabilities are useless if not managed well. But then, you can’t just stand up in a board meeting and scream at your Presidents to, uh, manage the competencies better; certainly not in transnational organisations employing thousands. Then what structure exactly should one follow to organise the people to manage the structural capabilities of any organisation? For that, I present the Consequent Capabilities Architecture. There’s a statutory warning though: this stuff is not for kids, and least of all for glam-struck management students – it’s meant purely for CEOs, and that too of very large organisations.

CONSEQUENT CAPABILITIES ARCHITECTURE
Consequent Capabilities are named such because the nature of their existence is consequent to the nature of the main structural capabilities. But the most important aspect of them all is the fact that Structural Capabilities exist and improve or get discarded only because of Consequent Capabilities. Structural Capabilities are the display & end result of the power and efforts of Consequent Capabilities. Different types of Consequent Capabilities identify the need for Structural Capabilities, refine their efficiencies and effectiveness, remodel their alignments with overall corporate structures and processes, and finally ensure that the organization becomes the most intelligent corporate animal that responds demandingly & profitably to all that the environment has to offer.

The existence of Consequent Capabilities runs parallel to the main operational line of structural capabilities. That is, while the continuum from Doorway Capabilities to Power Leadership Capabilities focuses on competitive requirements (developing, improving, sustaining, or discarding competitive leadership), Consequent Capabilities focus on development perspectives (developing, improving, sustaining or discarding Structural Capabilities leadership). Consequent Capability Units (comprising respective managers and team members) are of four types:

LEARNING CAPABILITY UNITS (LC UNITS)
These Units are made up of teams that are associated with all the Consequent Capability Units (Transformation, Fortification and Exnovation) at all levels and have two prime responsibilities:
1. Documenting processes, structures, organizational initiatives, goals, objectives at various discernible levels of the transnational organisation.
2. Developing a sharing network that enables all levels in the organization to learn best practices, structures and initiatives of various Capability Units by initiating Consequent Capability Architecture intervention programmes aimed at educating, teaching, disseminating knowledge, information and data throughout the organisation.

LC Units play the role of historians and professors. The LC Units are repositories of information. LC Units involve themselves in organizational intervention exercises (including, but not restricted to training & development workshops, conferences, seminars, sales sessions etc) at every level to make sure that Units all across the organization share in the learning experiences from across the organization.

EXNOVATION CAPABILITY UNITS (EC UNITS)
Exnovation is literally defined as the opposite of Innovation (I wrote about Exnovation in one of my previous editorials; it’s also there in the book that I mentioned a few paragraphs above, CULT: The ultimate CEO guide to calling the shots without getting shot). Exnovation Capability Units are meant to monitor anomalies in organizational functioning and rectify them. EC Units are dedicated capability units that ensure Exnovation of aberrations to the strategic architecture and initiatives being undertaken by the organization. The primary responsibility of EC Units is to ensure that best practices and benchmarked processes are followed throughout the organisation to a tee, and that employees/structures not adhering to the pre-decided processes be either reassigned/retrained, or even retrenched in case retraining does not yield progress.

Recent lessons in Corporate Non-Governance (Reliant, Dynegy, Enron, Andersen, Tyco...) have ensured the rising importance of EC Units in the corporate governance functions of organizations. Internal audit teams, for example, are EC Units attempting to ensure that standard financial processes (for example, SEC guidelines, the Sarbanes-Oxley Act etc.) are not tampered with. The presence of Exnovation Capability Units is akin to the presence of antibodies in the human body in more ways than one. EC Units are dynamic in nature, both in size and in their project requirements.

Corporate governance: the Exnovation responsibility of the top management
The compelling need for Exnovation Capability Units does not arise only from the fact that they assist in maintaining normal operations, but more so from the fact that they fall in line with urgently required corporate governance norms. It’s the responsibility of the top management to ensure that EC Units are created at every critical level or process or department, staffed with competent ‘general specialists’, allocated resources for successful functioning, provided independent authority and responsibility to undertake transparent actions, provided with access directly to top level management, and ensured transparency with Prime Stake Controllers like Shareholders, employees, regulatory bodies like SEBI, Federal Trade Commission, European Commission etc.

There is another interesting standpoint that develops once an organization has implemented the EC Units structure through all the critical levels of the organization. By definition, fault lines, anomalies, aberrations or deviation occurrences need to be corrected. In this case, after judicious analysis has been done to confirm the findings, the faults should be prioritized according to the damage they might continue to cause to the existence and operations of the organization. Such impending irreparable faults and their consequent damage should be immediately communicated to relevant Prime Stake Controllers.

FORTIFICATION CAPABILITY UNITS (FC UNITS)
Fortification Capability Units are meant to continuously identify better processes and structures to achieve the predefined results. FC Units are capability units that ensure continuous improvements to the strategic architecture and processes being undertaken by EC Units at various levels. FC Units do not question the results to be achieved. They rather find out better methodologies of achieving the results. In traditional terms, FC Units attempt to be effective (doing the right things), while EC Units attempt to be efficient (doing things right). At each critical level of the organization, FC Units in organizations should be structurally above EC Units because FC Units dictate what optimal processes and structures should be present. EC Units ensure that the processes and strategic architecture laid out by FC Units are followed to the book.

Fortification responsibility of the top management
What is the need for Fortification Capability Units in organizations when managers & executives probably know what the right processes and structures to be followed are? The needs are a screaming many because of the following reasons:
• Managers’ vision to identify effective processes and structures is strangulated because of their stressed-out focus on achieving regular targets and meeting key performance measures. They basically do not have time to develop orientations towards designing newer and better processes and structures.
• Even if they get the time to develop more effective processes & structures, managers are myopically focused on their scope of operations without worrying about cross-structural and cross-process effectiveness.
• Further, operational managers generally lack knowledge of using quantitative and qualitative analytical tools to calculate relative strengths and value worth of processes and structures.

Efforts of Fortification Capability Units ensure continuous focus on effectiveness of processes and structures throughout the organization. Fortification Capability Units (and to a large extent Exnovation Capability Units) also ensure that the organization retains ground level implementation sense of strategies, irrespective of how high its vision might become. The top level management retains control over practical issues of how profitable & worthwhile individual processes and businesses are through extremely well researched methods of value chain efficiency analysis (of Exnovation Capability Units) and effectiveness analysis (of Fortification Capability Units).

TRANSFORMATION CAPABILITY UNITS (TC UNITS)

In the order of hierarchy, Transformation Capability Units at each level of the organization are above the Fortification Capability Units (who in turn are above the Exnovation Units). Transformation Capability Units are meant to continuously question and re-question not only the objective orientations of various levels of the organization, but also the need for the levels themselves. For example, a Transformation Capability Unit in the manufacturing plant of an organization not only would decide what should be the manufacturing benchmarks & objectives with respect to various parameters, but also would decide whether the manufacturing plant should be allowed to continue or not. Once the TC Unit decides on the worth of continuing the complete manufacturing plant, and once the TC Unit decides on the objectives that are worthwhile for the manufacturing plant to undertake, the Fortification Capability Unit takes over to design processes by which the plant would undertake the various objectives; and the Exnovation Capability Unit takes over later to ensure that the processes so designed by the Fortification Capability Unit are adhered to perfectly.

What Lou Gerstner was to IBM, Jack Welch was to GE: transformational catalysts beyond comparison. The early transformational initiatives of Jack and his team focused on the following strategies:
• Being in only those markets where GE could be number one or two (most importantly, to counter inflationary pressures)
• ‘Delayering’ the organizational structures (to transfer the strategic planning function from senior managers to direct business leaders)
• Going for quantum leaps rather than small steps (vision orientation of dramatic improvements in financials through practicable mergers, acquisitions & divestments)

The later transformational initiatives of Jack and his team focused on the following strategies:
• Improving service orientation (in 1980, 85% of GE’s revenues came from manufacturing; in 2000, 75% of $125 billion revenues came from service, entailing better profitability)
• Going global (in 1987, $31.7 billion revenues came from domestic US sales, and $8.7 billion came from global markets; in 1998, $57.7 billion came from domestic US sales, while $42.8 billion came from global sales)
• Using information technology tools to gain competitive advantage (taking GE online on the net to create a boundary-less world to seamlessly connect all stakeholders)

At this time, there might be a presumption that while Exnovation Capability Units operate only at lower levels of the organisation, Transformation Capability Units operate only at the higher levels of the organisation. Not so. True, TC Units have more importance at higher levels, and EC Units have more at lower levels, but every level requires its own EC, FC, TC and LC units.

What I’ve attempted in this massively self-aggrandizing and theoretical editorial is to tell you – the CEO – that the first step to becoming a world class organisation is documenting a plan to know, maintain, develop and even destroy your capabilities and competencies. And if you had no idea how to prepare that document, like I said once before, just blindly implement what I’ve presented here – and keep sending me the royalty.

Selasa, 29 November 2011

I now work for the Sol Price School of Public Policy as well as the Marshall School of Business at USC

The Price family gave a $50 million gift for the USC School of Policy, Planning and Development to be renamed the USC Sol Price School of Public Policy.  Mr. Price was an alum of USC, as is his grandson.

Mr. Price was the force behind Price Club, which later merged with Costco.  He was known for paying his workers well, treating his customers well, and not overpaying his executives.  He was ahead of his time with respect to racial integration and urban renewal.  Sometimes I feel an internal tension, because I admire both success in business and care a lot about social justice.  If all successful people in business were like Mr. Price, I would feel no such tension.  His obituary in the San Diego Jewish World contained the following:
“Most of life is luck,” he said in a 1985 newspaper interview. “Obviously you have to have the will and intensity, and in my case discipline and idealism had a lot to do with it. But if you move back a step, even that is luck.”
I can't think of a better way to look at life.  And whether you need mustard or Johnny Walker Black, there is no better place to go than Costco.  I am proud to now work at a school named for him.





Why I think Raphael Bostic is more likely right about FHA than Joe Gyourko/AEI/WSJ

A healthy debate has taken place between HUD Assistant Secretary Raphael Bostic and Wharton Professor Joe Gyourko on the financial future of FHA.  While FHA is thinly capitalized, Raphael argues that it will likely survive, while Joe thinks a large taxpayer-financed bailout is looming.  In the interest of full disclosure, I should note that Raphael is a colleague of mine at USC, while Joe invited me to be a visiting faculty member at Wharton for a semester.  I think highly of them and am grateful to them both.

I have two reasons to bet on Raphael's view:

(1)  At the time the dumbest mortgage business was being done, FHA was out of the picture.  While FHA's market share is typically in the neighborhood of 12-15 percent, during the period 2003-2007, its market share ranged from 3.77 to 9.66 percent.    FHA did not lower its underwriting standards to that of the shadow banking sector (a sector that was not subject to the Community Reinvestment Act, by the way) in order to keep market share--the government insurance program was far more disciplined than the private sector.

FHA's market share increased dramatically in 2009 and 2010, in large part because the private sector abandoned the low downpayment market.  In 2010 in particular, FHA gained market share despite raising its prices and tightening its underwriting.    FHA was also ramping up its market share after house prices collapsed.  While house prices have not been robustly rising since late 2008, they have not been falling precipitously either.  One could argue that the private sector has been backward looking, while the public sector has been more forward looking.

(2) The second reason I have is more speculative, and is something that I am currently in the middle of researching, but I want to put it out there as a hypothesis (and a hunch).  I suspect that there is such a thing as "burn-out" in default--if a household goes through a difficult time without defaulting, it becomes decreasingly likely to default.  Part of the reason for this is amortization, but that is a small reason.  More important, people who refuse to default even when their measured characteristics suggest that they should have revealed that they are "different," and in a manner that is unobservable.  

Now again, in the interest of full disclosure, I should note that I did not forecast the size of GSE losses, so maybe I shouldn't be taken that seriously.  But I think my first argument will stand up, and as I do more research, I will know more about the second.


Sabtu, 26 November 2011

Does slowing people down slow down the economy?

As my family and I were traveling back to LA from my parents' place in Arizona this weekend, we had to stop at three checkpoints.  Each stop delayed us--I would guess the average delay was 5-10 minutes.  One checkpoint bragged that it had arrested around 100 people--about 70 for immigration violations and 30 for crimes--over the course of 2011.

According to this web site, one of the highways I travelled on carries 10,000 cars per day.  Let's say the average stop takes five minutes, the average car has 1.3 people in it, and the value of people's time averages $15 per hour.  This means that each arrest costs a little under $60,000; perhaps there is a deterrent effect as well.  Is this worth it?  I really don't know.
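The back-of-the-envelope figure above can be checked directly. All numbers here are the post's own assumptions, not measured data:

```python
# Rough cost-per-arrest calculation using the post's assumptions.
cars_per_day = 10_000        # daily traffic on the highway
delay_hours = 5 / 60         # five-minute average stop
people_per_car = 1.3
value_of_time = 15           # dollars per person-hour of delay
arrests_per_year = 100

daily_cost = cars_per_day * delay_hours * people_per_car * value_of_time
cost_per_arrest = daily_cost * 365 / arrests_per_year
print(f"${cost_per_arrest:,.0f} per arrest")  # a little under $60,000
```

Note this counts only the time cost at one checkpoint; any deterrent benefit, or the delay at the other two checkpoints, is outside the calculation.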

But I can't help but notice that over the last ten years, the US, as a matter of security policy, has really gummed up the ability of people to get easily from one place to another. Is it a coincidence that the economy has underperformed over this time?  Perhaps.  I can't think of a way to run a regression to test the relationship between ease of travel and economic performance--but that doesn't mean that someone else can't.



Selasa, 22 November 2011

Remembering the date

In the long history of the world, only a few generations have been granted the role of defending freedom in its hour of maximum danger. I do not shrink from this responsibility—I welcome it. I do not believe that any of us would exchange places with any other people or any other generation. The energy, the faith, the devotion which we bring to this endeavor will light our country and all who serve it—and the glow from that fire can truly light the world.

Senin, 21 November 2011

More four year degrees won't solve the current problem

David Brooks and Thomas Friedman have recently taken to arguing that the solution to our income distribution woes is to encourage and enable more people to go to college.  I want to leave aside for a second the fact that our educational problems are much deeper than that--that our high school graduation rate is declining is to me the most alarming education statistic.

Rather, it is worth looking at what has happened to earnings by educational attainment over the past eight years.  The census has put out data for 2002-2010, and here is what it (Table A-6) shows:

Median earnings for men with a high school degree fell 12.1 percent between 2002-2010; earnings for women with a high school diploma fell 8.5 percent between 2002-2010; for men with college degrees, it was a fall of 8.0 percent; for women with college degrees it was flat.  So yes, education is increasing income inequality in that those with college degrees are losing less than those with high school diplomas.

I am the sort of person who would be fine with a GINI of .5 (a number that reflects lots of inequality) if it meant that the people who are materially worst off can live at a decent standard of living.  But currently, those who play by the rules (and I mean really play by the rules) are seeing their living standards erode.  Homilies about sending more people to college are at the moment pretty much beside the point.

Sabtu, 19 November 2011

Raphael Bostic takes on Joe Gyourko

The Assistant Secretary of PDR (and USC colleague) writes:


This week, HUD released its annual report to Congress on the financial status of the Federal Housing Administration (FHA) Mutual Mortgage Insurance (MMI) Fund.  The report demonstrates the long-term strength of the Fund while not shying away from the challenges it faces in the near-term due to ongoing stresses in the housing market.  While the independent actuary reports older books of business underwritten during the bubble years of 2000-2008 are expected to produce losses of more than $26 billion, it also finds that FHA has a very strong platform going forward, with insurance on loans booked since January 2009 posting an estimated net economic value of $18 billion. Indeed, the actuary reports that the Fund still retains positive capital, and that it should be able to rebuild capital to the statutory requirement of two percent of insurance-in-force very quickly once housing markets across the country exhibit sustained growth.

Notwithstanding findings of the independent actuary that the FHA MMI Fund retains positive capital four years into the worst housing crisis since the Great Depression, a report commissioned by the American Enterprise Institute (AEI) suggests that FHA both lacks an actuarially sound program and is in current need of a significant capital infusion. 

Read the whole thing. It has actually stunned me how well FHA has done relative to AEI's paragon of virtue, the private market.  Of course, it was the private-label securities market that drove down FHA's market share during the worst of the lending boom. FHA loans actually always required underwriting; underwriting in the private sector often disappeared.


Kamis, 17 November 2011

Read CRL on Disparities in Mortgage Lending

The Center for Responsible Lending's research team of Carolina Reid (who has been working tirelessly at developing data on subprime mortgages for some time now), Roberto Quercia, Wei Li and Debbie Grunstein Bocian has produced Lost Ground, 2011: Disparities in Mortgage Lending and Foreclosures. They argue:
(1) The nation is not even halfway through the foreclosure crisis. Among mortgages made between 2004 and 2008, 6.4 percent have ended in foreclosure, and an additional 8.3 percent are at immediate, serious risk.

(2) Foreclosure patterns are strongly linked with patterns of risky lending. The foreclosure rates are consistently worse for borrowers who received high-risk loan products that were aggressively marketed before the housing crash, such as loans with prepayment penalties, hybrid adjustable-rate mortgages (ARMs), and option ARMs. Foreclosure rates are highest in neighborhoods where these loans were concentrated.

(3) The majority of people affected by foreclosures have been white families. However, borrowers of color are more than twice as likely to lose their home as white households. These higher rates reflect the fact that African Americans and Latinos were consistently more likely to receive high-risk loan products, even after accounting for income and credit status.
It is really striking how African-Americans and Hispanics were steered into crappy loans, even controlling for income and credit history. Beyond all this, the web site accompanying the report has really nicely organized data on severely delinquent loans and loans in foreclosure by state, race, ethnicity and MSA.

Holman Jenkins makes me spit out my coffee this morning.

He spins this scenario:

Take this case: Workers in a rail yard see men in suits prowling around. Rumors fly the company is being sold. One worker buys call options on his employer's stock and, because the rumors turn out to be right, is hauled up on insider-trading charges. Had the rumors been wrong, had the worker lost money, had the men in suits been federal railroad inspectors, think the feds would have filed a case?
The natural lesson we draw from this little piece of fiction: if Spencer Bachus buys a short position after he meets with Ben Bernanke, it's ok.

Selasa, 15 November 2011

Sometimes you have to hold your nose

Reporter Jim Puzzanghera of the LA Times asked me today whether I would restore conforming loan limits in certain high-cost areas to their pre-October 1 level of $729,750.  He wrote:


Although he'd like to see more data, Green thinks it's probably a smart move to increase the loan limits. And he agreed that the move was unlikely to hurt the FHA's finances. 
"My gut answer is, I'd probably raise it back right now," Green said. "The downside of not raising it is potentially pretty bad."
I really dislike the idea of subsidizing mortgages that only households earning more than $200,000 per year can afford.  At the same time, however, Nick Timiraos last week wrote:

Potentially more revealing is this data point from California, which has a higher share of markets affected by the declines: applications for purchase loans with balances between $625,500 and $729,750 were down by 25% from September and by 33% from one year ago. By contrast, overall purchase-loan applications in California were down by just 12% and 3%, respectively.
Housing is still very weak and many borrowers are underwater.  I wanted to see if lowering loan limits would lead the private sector to step in--I am not seeing any evidence that it is.  Beyond the data cited in the Timiraos story, flow of funds data show that private lending in other sectors of the economy remains moribund.

Maybe it is worth waiting for another month of data before the old limits are restored.  But it is not worth worsening things in the market to make a point.

Harry Frankfurt and Herman Cain

The Washington Post sends me to a Milwaukee Journal Sentinel interview with Herman Cain on Libya.




Watching the cringe-inducing answers reminded me of one of my favorite books of the last decade or so: Harry Frankfurt's On Bullshit. I am writing this from my house, and my copy of the book is in my office, so let me pull a quote from the book that is featured in a Slate review:

Both in lying and in telling the truth people are guided by their beliefs concerning the way things are. These guide them as they endeavor either to describe the world correctly or to describe it deceitfully. For this reason, telling lies does not tend to unfit a person for telling the truth in the same way that bullshitting tends to. ...The bullshitter ignores these demands altogether. He does not reject the authority of the truth, as the liar does, and oppose himself to it. He pays no attention to it at all. By virtue of this, bullshit is a greater enemy of the truth than lies are.

I am not naive. Among my favorite presidents, three--FDR, LBJ, and Bill Clinton--were excellent liars. They were not, however, bullshitters. Herman Cain is.

Minggu, 13 November 2011

The Prescience of Rudiger Dornbusch

As events have unfolded in Europe over the past year, I keep thinking back to an article written by Rudiger Dornbusch in Foreign Affairs.  The summary:

The battle for the common currency may be remembered as one of the more useless in Europe's history. The euro is hailed as a solution to high unemployment, low growth, and the high costs of welfare states. But the deep budget cuts required before integration are already causing pain and may trigger severe recessions. If the European Monetary Union goes forward, a common currency will eliminate the adjustments now made by nominal exchange rates, and the central bank will control money with an iron fist. Labor markets will do the adjusting, a mechanism bound to fail, given those markets' inflexibility in Europe.

He wrote the piece in 1996. It is now behind a pay-wall, but if you have access to a university library, you can probably get access to the piece.

I know this makes me elitist, but...

...people running for president should actually know stuff.  Jon Huntsman does, which seems to disqualify him immediately.


Jumat, 11 November 2011

What's the real difference between Brookings and AEI?

With Brookings, I need to read the study to know how it turns out.

With AEI, I don't need to read the study to know how it turns out. 

Kamis, 10 November 2011

Lessons From the Failure of Flash: Greed Kills

Adobe's decision to stop development of mobile Flash has deservedly gotten a lot of attention online.  It's a sad story for Adobe and Flash developers: a dominating standard on the PC web failed to get traction in mobile, and will now be abandoned gradually in favor of HTML 5.  But the story's not limited to mobile -- without a mobile growth path, I think Flash itself is destined to become a dwindling legacy standard everywhere (link).  I think the whole Flash business edifice is coming down.

How did Flash go from leader to loser?  There are a lot of explanations being floated online. Erica Ogg at GigaOm has a good list (link):

--Mobile flash didn't work very well
--It was opposed by powerful people like Steve Jobs
--It was out-competed by HTML 5

(And by the way, how in the world do you get out-competed by something as slow-moving as HTML 5?)

I agree with Erica, but it's more a list of symptoms than root causes.  It's like saying an airplane crashed because the wings fell off.  Yes, that's true, but why did the wings fall off?  If you look for root causes of the Flash failure, I think they go back many years to a fundamental misreading of the mobile market, and to short-term revenue goals that were more important than long-term strategy at both Macromedia and Adobe.

In other words, Flash didn't just die.  It was managed into oblivion.

The story of Flash is a great cautionary tale for companies that want to create and control software platforms, so it's worth looking at more closely.


A quick, oversimplified history of Flash

In the software world, there is an inherent conflict between setting a broad standard and making money.  If you have good software technology and you're willing to give it away, you can get people to adopt it very broadly, but you will go broke in the process.  On the other hand, if you charge money for your technology, you can stay in business, but it's very hard to get it broadly adopted as a standard because people don't want to lock themselves into paying you.

Clever software companies have long realized that you can work around this conflict by giving away one technology to make it a standard, and then charging for something else related to it.  For example, many open source software companies give away their core product, but charge for hosting and support and other services.  Android is another example -- it's a free operating system for mobile phone manufacturers, but if you use it in your phone Google also tries to coerce you into bundling its services, which extract revenue from your customers. 

In the case of Flash, the player software was given away for free on the web, and Macromedia (the owner of Flash at the time) made its money by selling Flash content development tools.  The free Flash player eventually took on two roles on the web: it was the preferred way to create artistically-sophisticated web content, including an active subculture of online gaming, and it became one of the most popular ways to play video.  Flash reached a point of critical mass where most people felt they just had to have the player installed in their browser.  It became a de facto standard on the web.

Enter Japan Inc., carrying cash.  The rise of mobile devices changed the situation for Flash.  Long before today's smartphones, with their sophisticated web browsers, Japan was the center of mobile phone innovation, and the dominant player there was NTT DoCoMo, with its proprietary iMode phone platform.  The folks at DoCoMo wanted to create more compelling multimedia experiences for their iMode phones, and so in early 2003 they licensed Macromedia's Flash Lite, the mobile version of Flash, for inclusion in iMode phones (link).

The deal was a breakthrough for Macromedia.  Instead of giving away the Flash client, the way it had on the PC, Macromedia could charge for the client, have it forced into the hands of every user, and continue to also make money selling development tools.  The company had found a way to have its cake and eat it too!  In late 2004, the iMode deal was extended worldwide (link), and I'm sure Macromedia had visions of global domination.

Unfortunately for Flash, Japan is a unique phone market, and DoCoMo is a unique operator.  The DoCoMo deal could not be duplicated on most phone platforms other than iMode.  Macromedia, and later Adobe, was now trapped by its own success.  To make Flash Lite a standard in mobile, it would have needed to give away the player, undercutting its lucrative DoCoMo deal.  When you have a whole business unit focused on making money from licensing the player, giving it away would mean missing revenue projections and laying off a lot of people.  Macromedia chose the revenue, and Flash Lite never became a mobile standard.

Without fully realizing it, Macromedia had undermined the business model for Flash itself. The more popular mobile became, the weaker Flash would be.

Enter the modern smartphone.  Jump forward to 2007, when the iPhone and other modern smartphones made full mobile web browsing practical.  Adobe, by now the owner of Flash, was completely unprepared to respond.  Even if it started giving away Flash Lite, the player had been designed for limited-function feature phones and could not duplicate the full PC Flash experience.  Meanwhile, the full Flash player had been designed for PCs; it was too fat to run well on a smartphone.  So the full web had moved to a place where Adobe could not follow.  The ubiquity of the Flash standard was broken by Adobe itself.

To make things worse, Adobe was by then in the midst of a strategy to upgrade Flash into a full programming layer for mobile devices, a project called Apollo (later renamed AIR).  The promise of AIR was to make all operating systems irrelevant by separating them from their applications.  At the time, I thought Adobe's strategy was very clever (link), but the implementation turned out to be woefully slow. 

So here's what Adobe did to itself:  By mismanaging the move to full mobile browsing, it demonstrated that customers were willing to live with a mobile browser that could not display Flash.  Then, by declaring its intent to take over the mobile platform world, Adobe alarmed the other platform companies, especially Apple.  This gave them both the opportunity and the incentive to crush mobile Flash.

Which is exactly what they did.


The lesson: Don't be greedy

There are a couple of lessons from this experience.  The first is that when you've established a free standard, charging money for it puts your whole business at risk.  Contrast the Flash experience to PDF, another standard Adobe established.  Unlike Flash, Adobe progressively gave up more and more control over the PDF standard, to the point where competitors can easily create their own PDF writers, and in fact Microsoft bundles one with Office.  Despite the web community's broad hostility toward PDF, it continues to be a de facto standard in computing.  There is no possible way for Adobe to make money directly from the PDF reader, but its Acrobat PDF management and generation business continues to bring in revenue.

The second lesson is that you have to align your business structure with your strategy.  I think Macromedia made a fundamental error by putting mobile Flash into its own business unit.  Adobe continued the error by creating a separate mobile BU when it bought Macromedia (link).  That structure meant the mobile Flash team was forced to make money from the player.  If the player and the Flash development tools had been in the same BU, management might have at least had a chance to trade off player revenue to grow the tools business.


What can Adobe do now?

The Adobe folks say the discontinuation of mobile Flash is just an exercise in focus (link).  They point out that developers can still create apps using Flash and compile them for mobile devices, and that Flash is still alive on the desktop.  Viewed from the narrow perspective of the situation that Adobe faces in late 2011, the changes to Flash probably are prudent.  But judged against Adobe's promise to create an "industry-defining technology platform" when it bought Macromedia in 2005 (link), it's hard to call the current situation anything other than a failure.

I think it's clear that Flash as a platform is dying; the end of the mobile Flash player has disillusioned many of its most passionate supporters.  You can hear them cussing here and here. Flash compatibility will continue to live on in AIR and other web content development tools, of course, but now that Adobe doesn't control the player, I think it will have trouble giving its tools any particular advantage.

What Adobe should do is start contributing aggressively to HTML 5, to upgrade it into the full web platform that AIR was originally supposed to be.  That's a role no one in the industry has taken ownership of, web developers are crying out for it, and Adobe implies that's what it will do.  But I've heard these broad statements from Adobe before, and usually the implementation has fallen far short of the promises.  At this point, I doubt Adobe has the vision and agility to pull it off.  Most likely it will retreat to what it has always been at the core: a maker of software tools for artistically-inclined creative people.  It's a nice stable niche, but it's nothing like the dominant leadership role that Adobe once aspired to.

Rabu, 09 November 2011

Do Richwine and Biggs show that, on average, teachers are overpaid? I don't think so. (Warning: a little wonky)

A recent study by Jason Richwine and Andrew Biggs of the Heritage Foundation and the American Enterprise Institute purports to show that teachers are on average overpaid.   I do not find their evidence convincing, and the reasons have less to do with their affiliations than with the technical nature of their work.  My problems with their paper are:

(1) They estimate a reduced form, which means it is difficult to interpret the meaning of their coefficients.

(2) Even if we accept their reduced form, there are issues in how the authors specify their explanatory variables.

(3) The authors' specification has a serious selectivity problem and

(4) Most disturbingly, they ignore their most convincing specification, a specification that supports the idea that teachers get paid 10 percent less in wages than those in other professions.

Let's turn to each problem in turn:

(1) Underlying any wage equation is a supply curve for labor and a demand curve for labor.  Let's write these out:

L(s) = a + bw + cX1 + e1
L(d) = d - fw + gX2 + e2

X1 and X2 are vectors of explanatory variables; e1 and e2 are regression residuals.

Let's say one of the elements in X2 is years of education--the demand for labor goes up in years of education after controlling for wages.  The coefficient g that multiplies years of education is thus easy to interpret--it is the wage premium associated with education.

The problem is that the authors estimate a reduced form, where they set L(s) = L(d).  The resulting equation is

w = d/(b+f) + gX2/(b+f) + e2/(b+f) - a/(b+f) - cX1/(b+f) - e1/(b+f)

If education appears in both the supply and demand equations--call it X--the reduced-form wage equation collapses to:

w = (d-a)/(b+f) + (g-c)X/(b+f) + (e2-e1)/(b+f)

So the coefficient on X is (g-c)/(b+f).  This coefficient helps with prediction of wages, but it does not allow us to disentangle the structural foundations of wages.  This is why reduced forms are problematic when we are trying to determine the impact of policy on outcomes.
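To see the identification problem concretely, here is a small simulation (all parameter values are invented for illustration). Regressing the equilibrium wage on education recovers only the composite coefficient (g-c)/(b+f), not the structural premium g:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Invented structural parameters:
#   supply:  L(s) = a + b*w + c*X + e1
#   demand:  L(d) = d - f*w + g*X + e2
a, b, c = 2.0, 1.5, 0.3
d, f, g = 10.0, 2.5, 1.1

X = rng.normal(14, 2, n)      # years of education
e1 = rng.normal(0, 1, n)
e2 = rng.normal(0, 1, n)

# Equilibrium wage implied by L(s) = L(d):
w = (d - a + (g - c) * X + (e2 - e1)) / (b + f)

# OLS slope from regressing w on X:
slope = np.polyfit(X, w, 1)[0]
print(slope, (g - c) / (b + f))  # both roughly 0.2
```

The regression fits fine, but no amount of data lets you back out g and c separately from the reduced form alone; that requires an instrument or some other structural restriction.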

(2) The authors assume that wages are linear in years of education.  This is clearly not true--the impact of education on wages tends to fall into "buckets": fewer than 12 years, 12-15 years, 16 years, and more than 16 years.  You get the idea.  Their mis-specification of the education variable could bias their other findings.
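As a sketch of what the bucketed specification looks like in practice (the data here are made up), years of education would enter the regression as indicator variables rather than a single linear term:

```python
import numpy as np

# Made-up years-of-education data.
educ = np.array([10, 12, 14, 16, 18, 11, 16, 13])

# Assign each observation to a bucket:
# 0: < 12 years, 1: 12-15 years, 2: 16 years, 3: > 16 years.
buckets = np.select(
    [educ < 12, educ <= 15, educ == 16],
    [0, 1, 2],
    default=3,
)

# Dummy (indicator) regressors, with bucket 0 as the omitted reference group.
dummies = (buckets[:, None] == np.arange(1, 4)).astype(float)
print(dummies.shape)  # (8, 3)
```

Each bucket then gets its own coefficient, so the estimated education premium is free to jump between buckets instead of being forced onto a straight line.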

(3) People who select themselves into teaching might have skills that do not show up in educational levels or on aptitude tests.  I have lots of education and do well on aptitude tests, but I think I would be at sea teaching second graders and REALLY at sea teaching middle schoolers.  Teaching students at these levels requires patience, insight and social skills that are not measured by aptitude tests.

The authors point to the interesting fact that people generally make less money when they move from teaching to non-teaching jobs.  There are alternative interpretations of this.  One is that teaching is a hard job, and so people willingly leave it for lower wages.  The second is that those who select out of teaching are those who have decided they are not very good at it.

(4) The most disturbing part of the paper is this:



"Table 2 shows how teacher salaries change depending on whether education or AFQT is included in the regression. The first row is the "standard" regression based on our CPS analysis in the previous section: Years of education are controlled for, but AFQT is not. The standard regression shows a teacher salary penalty of 12.6 percent.

The second row includes both education and AFQT in the same regression. The impact on teacher wages is small: The penalty decreases by less than two percentage points. The third row again includes AFQT but now omits education. With this specification, the change is dramatic: The teaching penalty is gone, replaced by a statistically insignificant premium.

How to interpret these results? On the one hand, the difference in IQ between teachers and other college graduates by itself has only a small effect on estimates of the teacher penalty. As the second row indicates, teachers with both the same education and AFQT score as other workers still receive 10.7 percent less in wages.

However, as we have shown, education is a misleading measure of teacher skills in several ways. In addition to the IQ difference between teachers and non-teachers, the education major is among the least challenging fields of study, and years of education subsequently have little to no effect on teacher quality. This suggests that eliminating education as a control variable and letting AFQT alone account for skills (as in the third row) may provide the most accurate wage estimates.

Replacing education with an objective measure of skills eliminates the observed teacher penalty, indicating that non-teachers with the same education as a typical teacher will likely have more applicable skills. We emphasize that a job is not necessarily less important or less challenging when the credentials for it are easier to obtain. Indeed, effective teachers are highly valuable to society and the economy."

So the authors have a regression with both education (which reflects Spence-type signalling, among other things) and IQ. The reduction in the R-squared when education is dropped suggests that after controlling for IQ, the coefficient on education continues to be statistically different from zero. When both IQ and education are included, teachers suffer a 10 percent wage discount relative to the private sector. Yet the authors ignore this result for the rest of the paper.

(FWIW, I really admire Michelle Rhee).

Kamis, 03 November 2011

THE MOTHER OF ALL INNOVATIONS – EXNOVATION©

WHY COMPANIES NEED TO MASTER THE ART OF NOT INNOVATING. IN OTHER WORDS, THE ART OF EXNOVATION – THE OPPOSITE OF INNOVATION

Good morning world. The mother of all anti-thesis theorems is here. Well, umm, it has actually been around for a few years. It was in 1996 that I conjured up this term called exnovation – which I defined as the opposite of innovation – and presumed that I had arrived on the global management scene; well, hadn't I finally created a better mousetrap? 15 years later, I see that the term exnovation is still known to almost zero individuals on this planet ('cept me, of course), and where known has taken on definitions that I never intended – and of course, nobody has beaten a path to my door yet. And that's when I decided to give it one more try – to define the term appropriately, so that organisations realise the need to incorporate exnovation as a critical process within organisational structures.

I accept that, in the present times, nothing excites corporate junkies more than the concept of innovation. Who in heavens would care about exnovation, for god's sake?! Would you wish your company to come out tops on the World's Most Innovative Companies lists, or would you wish to be the numero uno on the exnovation charter – in other words, the world's topmost 'non-innovating' company? One doesn't need to think too deeply to get the answer to that. Frankly, the term exnovation was perhaps doomed from its very definition.

And reasonably so. Iconic CEOs have grown in fame because of being innovative. How many CEOs do you know of in the world who are worshipped because they exnovated? The answer might surprise you. Quite a few. And to understand this dichotomy, you’d first have to understand the correct definition of exnovation.

Exnovation does not actually mean propagating a philosophy of not innovating within the organisation. Exnovation really means that once a process has been tested, modulated and finally mastered to super-efficiency within the innovative circles of any organisation, there should be a critical system ensuring that when this process is replicated across the organisation’s various offices, it is not changed but is implemented in exactly the manner in which it was perfected; in other words, no smart alec within the organisation should be allowed to tamper with an already super-efficient process. The responsibility for innovation should be the mandate of specialised innovation units/teams within an organisation and should ‘not’ be extended to each and every individual. The logic is that not every individual is competent at innovating – yet everybody wishes to innovate, and that is what can create a doomsday scenario within any organisation.

Consider the case of two call centers, where credit card customers call when they wish to report lost cards. Imagine one call center where all employees are trained by exnovation managers to follow tried and tested responses and processes; imagine another where each employee is given the independence to decide, innovatively, how to respond to the calling customer’s lost-card issue. Any guesses on which call center would ensure better productivity and customer satisfaction? Clearly, the one practising exnovation.

And that, my dear CEOs, is the responsibility of the exnovation units within an organisation – units staffed with managers and supervisors whose sole job is to ensure that best-practice processes and structures are followed to a T and are not tampered with by individuals or teams without a formal mandate.
Call them what you may – but any manager responsible for ensuring replication and mirror implementation of any efficient process is an exnovation manager.

And it’s a fact that CEOs and companies have thrived practising this management philosophy of exnovation. The last time this $421.85 billion-a-year topline-earning company allowed each and every employee to innovate was well before its stock was listed on the NYSE (on October 1, 1970). Till date, its “Save money. Live better” concept is based on standard processes, followed to the hilt and marginally improved over the years, to deliver maximum productivity and efficiency. What gives this company’s operations the push? Leveraging tested economies of scale (a process that economists have discussed for decades), sourcing materials from low-price suppliers (simply put – common sense), using a well-tested satellite-based IT system for logistics (a technology invented in the late 1950s; today, the company’s vehicles make about 120,000 daily trips to and/or from its 135 distribution centers spread across 38 US states, a count equal to the average number of vehicles that use the Lincoln Tunnel per day in New York City) and smarter financial and inventory management called ‘float’ (the firm pays suppliers in 90 days, but plans its stocks so that they get sold within 7 days).
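The ‘float’ arithmetic in that last sentence is worth making explicit: on the article’s own figures, the retailer sits on customers’ cash for roughly 83 days before suppliers have to be paid. A minimal sketch (the helper function is mine, purely illustrative; the 90-day and 7-day figures are the article’s):

```python
def float_days(payable_days: int, inventory_days: int) -> int:
    """Days the retailer holds sales cash before suppliers must be paid."""
    return payable_days - inventory_days

# The article's figures: pay suppliers in 90 days, sell the stock within 7.
print(float_days(90, 7))  # 83 days of interest-free supplier financing
```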

The company is #1 on the Fortune list: Walmart (2011; it has occupied pole position in the Fortune 500 rankings for the eighth time in ten years!). For that matter, recall the last time you heard of an innovation from Walmart. “After I came in as CEO, I looked at the world post-9/11 and realised that over the next 10 or 20 years, there just was not going to be much tailwind. It would be a more global market, it would be more driven by innovation. We have to change the company to become more innovation driven – in order to deal with this environment. It’s the right thing for investors.” Wise words from a wise CEO, spoken in the American summer of 2006, it seems. This protagonist was appointed CEO of a large conglomerate [which he refers to as “the company”] on September 7, 2001. When he took over the mantle, the company, having been led by his “strictly process-oriented” predecessor, had grown into a $415 billion giant (m-cap). So how has his “innovation-driven-change” focus worked for the investors and shareholders [to whom he wanted to do right]? Ten years have gone by, and under him the company has lost 58% of its value! And while America Inc. has become more profitable over the past decade, this company’s bottomline has actually shrunk by 14.91%. The first thing this innovation-lover of a CEO did when he took over was increase the company’s R&D budget by a billion dollars and spend another $100 million renovating the company’s New York innovation centre. Well, loving innovation is not wrong. What is wrong is forgetting that the best innovated products, processes and structures should not be tampered with!

In other words, Mike Duke, Walmart’s CEO, uses common sense to improve financials. Not innovation. Jeffrey Immelt forgot exnovation, which his predecessor Jack Welch had mastered. Yes, I’m talking about GE. Immelt, later in an HBR paper titled “Growth as a process”, confessed, “I knew if I could define a process and set the right metrics, this company could go 100 miles an hour in the right direction. It took time though, to understand growth as a process. If I had worked that wheel-shaped ‘execute-for-growth-process’ diagram in 2001, I would have started with it. But in reality, you get these things by wallowing in them a while. Jack was a great teacher in this regard. I would see him wallow in something like Six Sigma.” But this is not to say that Jack Welch was against innovation – in fact, he loved it; he simply ensured that not everybody in the organisation was allowed to do it. Immelt’s paper does state that “under Jack Welch, GE’s managers applied their imaginations relentlessly to the task of making work more efficient. Welch created a formidable toolkit and mindset to maintain bottomline discipline.”

Whatever best practices were innovated in GE’s group companies, Welch ensured that the same were exnovated too and shared with other group companies at GE’s Crotonville Training Centre and GE’s Management Academy. Subsequently, such best practices were implemented throughout the group with a combination of common sense and managerial judgement. From Six Sigma to the 20-70-10 rule, Welch was all about making GE’s traditional strength – process orientation – a religion for its employees. It’s easy to guess a name that Welch would have fired in his tenure at GE. What else, when you have a list of over 112,000 employees to choose from? [They were fired because they did not fit into the process-oriented culture of GE; according to a June 2011 HBR article titled ‘You Can’t Dictate Culture – but You Can Influence It’, by Ron Ashkenas, Managing Partner of Schaffer Consulting and a co-author of The GE Work-Out, “The real turning point for GE’s transformation came when Jack Welch publicly announced to his senior managers that he had fired two business leaders for not demonstrating the new behaviours of the company – despite having achieved exceptional financial results.”]

Next, tell us one innovation that Welch introduced. Difficult? In all probability, your answer will only end up naming a process he introduced at GE and ensured everyone – from his senior managers to the junior-most – followed to the hilt. Honestly, it wasn’t just innovation that created wealth on a massive scale for GE shareholders during Welch’s tenure – a 2,864.29% rise that made GE the world’s most valuable company, with an m-cap of $415 billion, well ahead of the then second-most valuable, Microsoft, at $335 billion – it was exnovation too; perhaps more so.

Talk about a petrochemical company which is the third-largest company in the world and the highest profit-maker ever (with a $30.46 billion bottomline in FY2010). The last time this company contributed anything in the name of innovation was when it developed the naphtha steam-cracking technology (which it uses till date to refine petrochemicals) in the 1940s. Since then, there have only been modifications and improvements on that technology. Even when others had started talking about biofuels and innovation, this company’s CEO was adamant on continuing to invest in the technology that made the $363.69 billion company (m-cap as on November 1, 2011) what it represents in the modern world. “I am not an expert on biofuels. I am not an expert on farming. I don’t have a lot of technology to add to moonshine. What are we going to bring to this area to create value for our shareholders that’s differentiating? Because to just go in and invest like everybody else – well, why would a shareholder want to own Exxon Mobil?”, said Rex Tillerson, the Chairman & CEO of Exxon Mobil – the second-largest Fortune 500 company. And this is what Fortune senior writer Geoff Colvin wrote in his article titled ‘Exxon = oil, g*dammit!’ about Tillerson’s attitude to innovating in fuels of the future: “The other supermajors are all proclaiming their greenness and investing in biofuels, wind power and solar power. Exxon isn’t. At Exxon it’s all petroleum. Why isn’t the company investing in less polluting energy sources like biofuels, wind, and solar? Remembering that Exxon is above all in the profit business, we know where to look for the answer. As a place to earn knockout returns on capital, alternative energy looks wobbly. It’s a similar story for alternative fuels for power generation. Exxon just doesn’t know much about building dams or burning agricultural waste. Its expertise is in oil and gas.” Translation – Exxon continues to work on set processes and ignores what Tillerson calls moonshine [read: innovative fuels].

And to talk about how efficient and bottomline-focussed this system at Exxon has become, Colvin has some lines to add: “At this supremely important job, it is a world champion. All the major oil companies bear about the same capital cost, just over 6%. But Exxon earns a return that trounces its competitors. Others could be pumping oil from the same platform, and Exxon would make more money on it. It is like taking the same train to work, but they get to the office first.” Can the way the most valuable company on Earth functions be a lesson for exnovation managers? Of course.

Next, the auto majors. Since Henry Ford introduced real innovation in the industry in the form of the assembly line, the Ford Motor Company hasn’t had much to boast about in this regard. And yet, it became the only Detroit major to bounce back without a Fed bailout. And how about the real innovator? It appears being an innovator does not pay well in the auto industry either! General Motors was ranked the #1 innovator (among 184 companies) by The Patent Board in its automotive and transportation industry scorecard for 2011. But all this came at the cost of the company’s bottomlines, which bled $76.15 billion in the seven years leading to 2010 [and this is not even counting the fateful year 2009, when GM got a fresh lease of life with the US Fed pumping in a huge $52 billion that ultimately saved America’s innovation pride]. And what about investors? If GM has the patents and is the king of innovation, should it not have been the best bet for investors? Count the numbers and decide: if an investor had invested $100 in GM stock exactly 10 years back, he would have just $78.42 left in his trading account – a return of negative 21.58%! Had the same sum been invested in four of the other big automakers in the world, the reading would have been quite different. Investing in Ford, the investor would have gained 22.72%; in Toyota, 39.52%; in Hyundai Motors, 89.4%; and in Volkswagen, 364.32%! These are companies that focus on design and maintaining a procedure that helps create cars with set standards of quality – not innovating or leading the rush for patents in clean-energy fuels! Message for GM – instead of investing billions of taxpayers’ funds in developing green-fuel and propulsion technologies, put people on a production process that will help launch more variants of the small diesel car (the Chevrolet Beat) for the BRIC markets. That should suffice.
Exnovate – like Toyota does with its production system that follows the 5S, Kaizen and Jidoka philosophies – and create a process of continuous improvement in small increments that make the system more efficient, effective & adaptable.
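The return figures quoted above follow from simple holding-period arithmetic; a quick check of the GM number (the dollar amounts and percentages are the article’s, the helper function is mine for illustration):

```python
def holding_period_return(initial: float, final: float) -> float:
    """Total return over the holding period, in percent."""
    return (final - initial) / initial * 100.0

# The article's GM example: $100 invested ten years earlier, $78.42 left today.
print(round(holding_period_return(100.0, 78.42), 2))  # -21.58
```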

In his May 2007 best-seller ‘The Myths of Innovation’, author Scott Berkun [who had worked on the Internet Explorer development team at Microsoft from 1994-1999], using lessons from the history of innovation, breaks apart one powerful myth about innovation – popular in the world of business – with each chapter. “Competence trumps innovation. If you suck at what you do, being innovative will not help you. Business is driven by providing value to customers and often that can be done without innovation: make a good and needed thing, sell it at a good price, and advertise with confidence. If you can do those three things consistently you’ll beat most of your competitors, since they are hard to do: many industries have market leaders that fail in this criteria. If you need innovations to achieve those three things, great, have at it. If not, your lack of innovation isn’t your biggest problem. Asking for examples kills hype dead. Just say “can you show me your latest innovation?” Most people who use the word don’t have examples – they don’t know what they’re saying and that’s why they’re addicted to the i-word.”

The fundamental question really is – could airlines like Singapore Airlines, Virgin Atlantic, China Southern, United Airlines, KLM Royal Dutch Airlines and Korean Air maintain their near 100% on-time departure record for flights to and from India (for August 2011, as per DGCA) had each of their management heads, employees and pilots innovated in their transactions? No. [That would surely have disastrous consequences!] Would renowned hospitals for heart surgeries be the same safe place for patients if their doctors were to innovate their processes and dig out new surgery styles each time? No. [Absurd!] Would Chinese steel companies like Hebei Iron and Steel, Baosteel Group, Wuhan Iron and Steel, Jiangsu Shagang and Shandong Iron and Steel Group feature among the world’s top ten volume producers of steel (source: World Steel Association, 2011) had they innovated on the manufacturing method every single day? Certainly not!

But really, I repeat ad nauseam that exnovation is not about refusing innovation within the company. Yes, a few of my examples may give off that air, but really, exnovation engenders an ideology that only some employees are gifted enough to analyse and innovate processes – and therefore such elitist employees should be placed in specialised innovation units with a sole responsibility to check processes and structures throughout the organisation and to innovatively improve them in whichever way possible. Employees who don’t have such innovative capacities may be better at simply implementing or following the processes; such employees should therefore be trained to ‘not innovate’ by exnovation managers.

The world believes that Steve Jobs was a great innovator. I would rather say he was the world’s second-greatest exnovator – one who ensured that even his innovation teams had to follow a structured, time-driven process to come up with innovative solutions and products. And when they did, the same were exnovated across all of Apple’s divisions and offices. That was the wonder of Steve Jobs the visionary.

In 2003, the globally renowned management author Jim Collins wrote an iconic article for Fortune magazine, titled ‘The 10 Greatest CEOs Of All Time’. At #1 on this all-time list, Jim ranked an individual named Charles Coffin. Jim wrote in that article, “Coffin oversaw two social innovations of huge significance: America’s first research laboratory and the idea of systematic management development. While Edison was essentially a genius with a thousand helpers, Coffin created a system of genius that did not depend on him. Like the founders of the United States, he created the ideology and mechanisms that made his institution one of the world’s most enduring and widely emulated.” If this is not one of the greatest combinations of innovation with exnovation, then what is? The institution Coffin co-founded with Edison was GE. Coffin passed away in 1926. Till date, he remains, for me, the world’s greatest exnovator.