Tuesday, 29 November 2011

I now work for the Sol Price School of Public Policy as well as the Marshall School of Business at USC

The Price family gave a $50 million gift for the USC School of Policy, Planning and Development to be renamed the USC Sol Price School of Public Policy.  Mr. Price was an alum of USC, as is his grandson.

Mr. Price was the force behind Price Club, which later merged with Costco.  He was known for paying his workers well, treating his customers well, and not overpaying his executives.  He was ahead of his time with respect to racial integration and urban renewal.  Sometimes I feel an internal tension, because I both admire success in business and care a lot about social justice.  If all successful people in business were like Mr. Price, I would feel no such tension.  His obituary in the San Diego Jewish World contained the following:
“Most of life is luck,” he said in a 1985 newspaper interview. “Obviously you have to have the will and intensity, and in my case discipline and idealism had a lot to do with it. But if you move back a step, even that is luck."
I can't think of a better way to look at life.  And whether you need mustard or Johnnie Walker Black, there is no better place to go than Costco.  I am proud to now work at a school named for him.





Why I think Raphael Bostic is more likely right about FHA than Joe Gyourko/AEI/WSJ

A healthy debate has taken place between HUD Assistant Secretary Raphael Bostic and Wharton Professor Joe Gyourko on the financial future of FHA.  While FHA is thinly capitalized, Raphael argues that it will likely survive, while Joe thinks a large taxpayer-financed bailout is looming.  In the interest of full disclosure, I should note that Raphael is a colleague of mine at USC, and that Joe invited me to be a visiting faculty member at Wharton for a semester.  I think highly of them and am grateful to them both.

I have two reasons to bet on Raphael's view:

(1)  At the time the dumbest mortgage business was being done, FHA was out of the picture.  While FHA's market share is typically in the neighborhood of 12-15 percent, during the period 2003-2007 its market share ranged from 3.77 to 9.66 percent.  FHA did not lower its underwriting standards to those of the shadow banking sector (a sector that was not subject to the Community Reinvestment Act, by the way) in order to keep market share--the government insurance program was far more disciplined than the private sector.

FHA's market share increased dramatically in 2009 and 2010, in large part because the private sector abandoned the low downpayment market.  In 2010 in particular, FHA gained market share despite raising its prices and tightening its underwriting.    FHA was also ramping up its market share after house prices collapsed.  While house prices have not been robustly rising since late 2008, they have not been falling precipitously either.  One could argue that the private sector has been backward looking, while the public sector has been more forward looking.

(2) The second reason I have is more speculative, and is something that I am currently in the middle of researching, but I want to put it out there as a hypothesis (and a hunch).  I suspect that there is such a thing as "burn-out" in default--if a household goes through a difficult time without defaulting, it becomes decreasingly likely to default.  Part of the reason for this is amortization, but that is a small reason.  More important, people who refuse to default even when their measured characteristics suggest that they should have revealed that they are "different," and in a manner that is unobservable.  

Now again, in the interest of full disclosure, I should note that I did not forecast the size of GSE losses, so maybe I shouldn't be taken that seriously.  But I think my first argument will stand up, and as I do more research, I will know more about the second.


Saturday, 26 November 2011

Does slowing people down slow down the economy?

As my family and I were traveling back to LA from my parents' place in Arizona this weekend, we had to stop at three checkpoints.  Each stop delayed us--I would guess the average delay was 5-10 minutes.  One checkpoint bragged that it had arrested around 100 people--about 70 for immigration violations and 30 for crimes--over the course of 2011.

According to this web site, one of the highways I travelled on carries 10,000 cars per day.  Let's say the average stop takes five minutes, the average car has 1.3 people in it, and the value of people's time averages $15 per hour.  This means that each arrest costs a little under $60,000; perhaps there is a deterrent effect as well.  Is this worth it?  I really don't know.
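The back-of-envelope arithmetic can be checked with a quick sketch; all the inputs below are the assumptions stated above, plus 365 travel days per year:

```python
# Cost of checkpoint delays per arrest, using the post's stated assumptions:
# 10,000 cars/day, 5 minutes per stop, 1.3 occupants per car,
# time valued at $15/hour, roughly 100 arrests over the year.

cars_per_day = 10_000
delay_hours = 5 / 60          # five-minute average stop
people_per_car = 1.3
value_of_time = 15            # dollars per hour
arrests_per_year = 100

daily_cost = cars_per_day * delay_hours * people_per_car * value_of_time
annual_cost = daily_cost * 365
cost_per_arrest = annual_cost / arrests_per_year

print(f"Daily cost of delay: ${daily_cost:,.0f}")
print(f"Cost per arrest:     ${cost_per_arrest:,.0f}")
```

The delay cost works out to roughly $59,000 per arrest, consistent with the "a little under $60,000" figure above.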

But I can't help but notice that over the last ten years, the US, as a matter of security policy, has really gummed up the ability of people to get easily from one place to another. Is it a coincidence that the economy has underperformed over this time?  Perhaps.  I can't think of a way to run a regression to test the relationship between ease of travel and economic performance--but that doesn't mean that someone else can't.



Tuesday, 22 November 2011

Remembering the date

In the long history of the world, only a few generations have been granted the role of defending freedom in its hour of maximum danger. I do not shrink from this responsibility—I welcome it. I do not believe that any of us would exchange places with any other people or any other generation. The energy, the faith, the devotion which we bring to this endeavor will light our country and all who serve it—and the glow from that fire can truly light the world.

Monday, 21 November 2011

More four year degrees won't solve the current problem

David Brooks and Thomas Friedman have recently taken to arguing that the solution to our income distribution woes is to encourage and enable more people to go to college.  I want to leave aside for a second the fact that our educational problems are much deeper than that--that our high school graduation rate is declining is to me the most alarming education statistic.

Rather, it is worth looking at what has happened to earnings by educational attainment over the past eight years.  The census has put out data for 2002-2010, and here is what it (Table A-6) shows:

Median earnings for men with a high school diploma fell 12.1 percent between 2002 and 2010; earnings for women with a high school diploma fell 8.5 percent; for men with college degrees, the fall was 8.0 percent; for women with college degrees, earnings were flat.  So yes, education is increasing income inequality, in that those with college degrees are losing less than those with high school diplomas.

I am the sort of person who would be fine with a GINI of .5 (a number that reflects lots of inequality) if it meant that the people who are materially worst off can live at a decent standard of living.  But currently, those who play by the rules (and I mean really play by the rules) are seeing their living standards erode.  Homilies about sending more people to college are at the moment pretty much beside the point.

Saturday, 19 November 2011

Raphael Bostic takes on Joe Gyourko

The Assistant Secretary for Policy Development and Research (and USC colleague) writes:


This week, HUD released its annual report to Congress on the financial status of the Federal Housing Administration (FHA) Mutual Mortgage Insurance (MMI) Fund.  The report demonstrates the long-term strength of the Fund while not shying away from the challenges it faces in the near-term due to ongoing stresses in the housing market.  While the independent actuary reports older books of business underwritten during the bubble years of 2000-2008 are expected to produce losses of more than $26 billion, it also finds that FHA has a very strong platform going forward, with insurance on loans booked since January 2009 posting an estimated net economic value of $18 billion. Indeed, the actuary reports that the Fund still retains positive capital, and that it should be able to rebuild capital to the statutory requirement of two percent of insurance-in-force very quickly once housing markets across the country exhibit sustained growth.

Notwithstanding findings of the independent actuary that the FHA MMI Fund retains positive capital four years into the worst housing crisis since the Great Depression, a report commissioned by the American Enterprise Institute (AEI) suggests that FHA both lacks an actuarially sound program and is in current need of a significant capital infusion. 

Read the whole thing. It has actually stunned me how well FHA has done relative to AEI's paragon of virtue, the private market.  Of course, it was the private-label securities market that drove down FHA's market share during the worst of the lending boom. FHA loans always required underwriting; underwriting in the private sector often disappeared.


Thursday, 17 November 2011

Read CRL on Disparities in Mortgage Lending

The Center for Responsible Lending's research team of Carolina Reid (who has been working tirelessly at developing data on subprime mortgages for some time now), Roberto Quercia, Wei Li and Debbie Grunstein Bocian has produced Lost Ground, 2011: Disparities in Mortgage Lending and Foreclosures. They argue:
(1) The nation is not even halfway through the foreclosure crisis. Among mortgages made between 2004 and 2008, 6.4 percent have ended in foreclosure, and an additional 8.3 percent are at immediate, serious risk.

(2) Foreclosure patterns are strongly linked with patterns of risky lending. The foreclosure rates are consistently worse for borrowers who received high-risk loan products that were aggressively marketed before the housing crash, such as loans with prepayment penalties, hybrid adjustable-rate mortgages (ARMs), and option ARMs. Foreclosure rates are highest in neighborhoods where these loans were concentrated.

(3) The majority of people affected by foreclosures have been white families. However, borrowers of color are more than twice as likely to lose their home as white households. These higher rates reflect the fact that African Americans and Latinos were consistently more likely to receive high-risk loan products, even after accounting for income and credit status.
It is really striking how African-Americans and Hispanics were steered into crappy loans, even controlling for income and credit history. Beyond all this, the web site accompanying the report has really nicely organized data on severely delinquent loans and loans in foreclosure by state, race, ethnicity and MSA.

Holman Jenkins makes me spit out my coffee this morning.

He spins this scenario:

Take this case: Workers in a rail yard see men in suits prowling around. Rumors fly the company is being sold. One worker buys call options on his employer's stock and, because the rumors turn out to be right, is hauled up on insider-trading charges. Had the rumors been wrong, had the worker lost money, had the men in suits been federal railroad inspectors, think the feds would have filed a case?
The natural lesson we draw from this little piece of fiction: if Spencer Bachus buys a short position after he meets with Ben Bernanke, it's ok.

Tuesday, 15 November 2011

Sometimes you have to hold your nose

Reporter Jim Puzzanghera of the LA Times asked me today whether I would restore conforming loan limits in certain high cost areas to their pre-October 1 level of $729,750.  He wrote:


Although he'd like to see more data, Green thinks it's probably a smart move to increase the loan limits. And he agreed that the move was unlikely to hurt the FHA's finances. 
"My gut answer is, I'd probably raise it back right now," Green said. "The downside of not raising it is potentially pretty bad."
I really dislike the idea of subsidizing mortgages that only households earning more than $200,000 per year can afford.  At the same time, however, Nick Timiraos last week wrote:

Potentially more revealing is this data point from California, which has a higher share of markets affected by the declines: applications for purchase loans with balances between $625,500 and $729,750 were down by 25% from September and by 33% from one year ago. By contrast, overall purchase-loan applications in California were down by just 12% and 3%, respectively.
Housing is still very weak and many borrowers are underwater.  I wanted to see if lowering loan limits would lead the private sector to step in--I am not seeing any evidence that it is.  Beyond the data cited in the Timiraos story, flow of funds data show that private lending in other sectors of the economy remains moribund.

Maybe it is worth waiting for another month of data before the old limits are restored.  But it is not worth worsening things in the market to make a point.

Harry Frankfurt and Herman Cain

The Washington Post sends me to a Milwaukee Journal Sentinel interview with Herman Cain on Libya.




Watching the cringe-inducing answers reminded me of one of my favorite books of the last decade or so: Harry Frankfurt's On Bullshit. I am writing this from my house, and my copy of the book is in my office, so let me pull a quote from the book that is featured in a Slate review:

Both in lying and in telling the truth people are guided by their beliefs concerning the way things are. These guide them as they endeavor either to describe the world correctly or to describe it deceitfully. For this reason, telling lies does not tend to unfit a person for telling the truth in the same way that bullshitting tends to. ...The bullshitter ignores these demands altogether. He does not reject the authority of the truth, as the liar does, and oppose himself to it. He pays no attention to it at all. By virtue of this, bullshit is a greater enemy of the truth than lies are.

I am not naive. Among my favorite presidents, three--FDR, LBJ, and Bill Clinton--were excellent liars. They were not, however, bullshitters. Herman Cain is.

Sunday, 13 November 2011

The Prescience of Rudiger Dornbusch

As events have unfolded in Europe over the past year, I keep thinking back to an article written by Rudiger Dornbusch in Foreign Affairs.  The summary:

The battle for the common currency may be remembered as one of the more useless in Europe's history. The euro is hailed as a solution to high unemployment, low growth, and the high costs of welfare states. But the deep budget cuts required before integration are already causing pain and may trigger severe recessions. If the European Monetary Union goes forward, a common currency will eliminate the adjustments now made by nominal exchange rates, and the central bank will control money with an iron fist. Labor markets will do the adjusting, a mechanism bound to fail, given those markets' inflexibility in Europe.

He wrote the piece in 1996. It is now behind a pay-wall, but if you have access to a university library, you can probably get access to the piece.

I know this makes me elitist, but...

...people running for president should actually know stuff.  Jon Huntsman does, which seems to disqualify him immediately.


Friday, 11 November 2011

What's the real difference between Brookings and AEI?

With Brookings, I need to read the study to know how it turns out.

With AEI, I don't need to read the study to know how it turns out. 

Thursday, 10 November 2011

Lessons From the Failure of Flash: Greed Kills

Adobe's decision to stop development of mobile Flash has deservedly gotten a lot of attention online.  It's a sad story for Adobe and Flash developers: a dominating standard on the PC web failed to get traction in mobile, and will now be abandoned gradually in favor of HTML 5.  But the story's not limited to mobile -- without a mobile growth path, I think Flash itself is destined to become a dwindling legacy standard everywhere (link).  I think the whole Flash business edifice is coming down.

How did Flash go from leader to loser?  There are a lot of explanations being floated online. Erica Ogg at GigaOm has a good list (link):

--Mobile Flash didn't work very well
--It was opposed by powerful people like Steve Jobs
--It was out-competed by HTML 5

(And by the way, how in the world do you get out-competed by something as slow-moving as HTML 5?)

I agree with Erica, but it's more a list of symptoms than root causes.  It's like saying an airplane crashed because the wings fell off.  Yes, that's true, but why did the wings fall off?  If you look for root causes of the Flash failure, I think they go back many years to a fundamental misreading of the mobile market, and to short-term revenue goals that were more important than long-term strategy at both Macromedia and Adobe.

In other words, Flash didn't just die.  It was managed into oblivion.

The story of Flash is a great cautionary tale for companies that want to create and control software platforms, so it's worth looking at more closely.


A quick, oversimplified history of Flash

In the software world, there is an inherent conflict between setting a broad standard and making money.  If you have good software technology and you're willing to give it away, you can get people to adopt it very broadly, but you will go broke in the process.  On the other hand, if you charge money for your technology, you can stay in business, but it's very hard to get it broadly adopted as a standard because people don't want to lock themselves into paying you.

Clever software companies have long realized that you can work around this conflict by giving away one technology to make it a standard, and then charging for something else related to it.  For example, many open source software companies give away their core product, but charge for hosting and support and other services.  Android is another example -- it's a free operating system for mobile phone manufacturers, but if you use it in your phone Google also tries to coerce you into bundling its services, which extract revenue from your customers. 

In the case of Flash, the player software was given away for free on the web, and Macromedia (the owner of Flash at the time) made its money by selling Flash content development tools.  The free Flash player eventually took on two roles on the web: it was the preferred way to create artistically-sophisticated web content, including an active subculture of online gaming, and it became one of the most popular ways to play video.  Flash reached a point of critical mass where most people felt they just had to have the player installed in their browser.  It became a de facto standard on the web.

Enter Japan Inc., carrying cash.  The rise of mobile devices changed the situation for Flash.  Long before today's smartphones, with their sophisticated web browsers, Japan was the center of mobile phone innovation, and the dominant player there was NTT DoCoMo, with its proprietary iMode phone platform.  The folks at DoCoMo wanted to create more compelling multimedia experiences for their iMode phones, and so in early 2003 they licensed Macromedia's Flash Lite, the mobile version of Flash, for inclusion in iMode phones (link).

The deal was a breakthrough for Macromedia.  Instead of giving away the Flash client, the way it had on the PC, Macromedia could charge for the client, have it forced into the hands of every user, and continue to also make money selling development tools.  The company had found a way to have its cake and eat it too!  In late 2004, the iMode deal was extended worldwide (link), and I'm sure Macromedia had visions of global domination.

Unfortunately for Flash, Japan is a unique phone market, and DoCoMo is a unique operator.  The DoCoMo deal could not be duplicated on most phone platforms other than iMode.  Macromedia, and later Adobe, was now trapped by its own success.  To make Flash Lite a standard in mobile, it would have needed to give away the player, undercutting its lucrative DoCoMo deal.  When you have a whole business unit focused on making money from licensing the player, giving it away would mean missing revenue projections and laying off a lot of people.  Macromedia chose the revenue, and Flash Lite never became a mobile standard.

Without fully realizing it, Macromedia had undermined the business model for Flash itself. The more popular mobile became, the weaker Flash would be.

Enter the modern smartphone.  Jump forward to 2007, when the iPhone and other modern smartphones made full mobile web browsing practical.  Adobe, by now the owner of Flash, was completely unprepared to respond.  Even if it started giving away Flash Lite, the player had been designed for limited-function feature phones and could not duplicate the full PC Flash experience.  Meanwhile, the full Flash player had been designed for PCs; it was too fat to run well on a smartphone.  So the full web had moved to a place where Adobe could not follow.  The ubiquity of the Flash standard was broken by Adobe itself.

To make things worse, Adobe was by then in the midst of a strategy to upgrade Flash into a full programming layer for mobile devices, a project called Apollo (later renamed AIR).  The promise of AIR was to make all operating systems irrelevant by separating them from their applications.  At the time, I thought Adobe's strategy was very clever (link), but the implementation turned out to be woefully slow. 

So here's what Adobe did to itself:  By mismanaging the move to full mobile browsing, it demonstrated that customers were willing to live with a mobile browser that could not display Flash.  Then, by declaring its intent to take over the mobile platform world, Adobe alarmed the other platform companies, especially Apple.  This gave them both the opportunity and the incentive to crush mobile Flash.

Which is exactly what they did.


The lesson: Don't be greedy

There are a couple of lessons from this experience.  The first is that when you've established a free standard, charging money for it puts your whole business at risk.  Contrast the Flash experience to PDF, another standard Adobe established.  Unlike Flash, Adobe progressively gave up more and more control over the PDF standard, to the point where competitors can easily create their own PDF writers, and in fact Microsoft bundles one with Microsoft Office.  Despite the web community's broad hostility toward PDF, it continues to be a de facto standard in computing.  There is no possible way for Adobe to make money directly from the PDF reader, but its Acrobat PDF management and generation business continues to bring in revenue.

The second lesson is that you have to align your business structure with your strategy.  I think Macromedia made a fundamental error by putting mobile Flash into its own business unit.  Adobe continued the error by creating a separate mobile BU when it bought Macromedia (link).  That structure meant the mobile Flash team was forced to make money from the player.  If the player and Flash development tools had been in the same BU, management might have at least had a chance to trade off player revenue to grow the tools business.


What can Adobe do now?

The Adobe folks say the discontinuation of mobile Flash is just an exercise in focus (link).  They point out that developers can still create apps using Flash and compile them for mobile devices, and that Flash is still alive on the desktop.  Viewed from the narrow perspective of the situation that Adobe faces in late 2011, the changes to Flash probably are prudent.  But judged against Adobe's promise to create "an industry-defining technology platform" when it bought Macromedia in 2005 (link), it's hard to call the current situation anything other than a failure.

I think it's clear that Flash as a platform is dying; the end of the mobile Flash player has disillusioned many of its most passionate supporters.  You can hear them cussing here and here. Flash compatibility will continue to live on in AIR and other web content development tools, of course, but now that Adobe doesn't control the player, I think it will have trouble giving its tools any particular advantage.

What Adobe should do is start contributing aggressively to HTML 5, to upgrade it into the full web platform that AIR was originally supposed to be.  That's a role no one in the industry has taken ownership of, web developers are crying out for it, and Adobe implies that's what it will do.  But I've heard these broad statements from Adobe before, and usually the implementation has fallen far short of the promises.  At this point, I doubt Adobe has the vision and agility to pull it off.  Most likely it will retreat to what it has always been at the core: a maker of software tools for artistically-inclined creative people.  It's a nice stable niche, but it's nothing like the dominant leadership role that Adobe once aspired to.

Wednesday, 09 November 2011

Do Richwine and Biggs show that, on average, teachers are overpaid? I don't think so. (Warning: a little wonky)

A recent study by Jason Richwine and Andrew Biggs of the Heritage Foundation and the American Enterprise Institute purports to show that teachers are on average overpaid.   I do not find their evidence convincing, and the reasons have less to do with their affiliations than the technical nature of their work.  My problems with their paper are:

(1) They estimate a reduced form, which means it is difficult to interpret the meaning of their coefficients.

(2) Even if we accept their reduced form, there are issues in how the authors specify their explanatory variables.

(3) The authors' specification has a serious selectivity problem and

(4) Most disturbingly, they ignore their most convincing specification, a specification that supports the idea that teachers get paid 10 percent less in wages than those in other professions.

Let's turn to each problem in turn:

(1) Underlying any wage equation is a supply curve for labor and a demand curve for labor.  Let's write these out:

L(s) = a + bw +cX1+ e1
L(d) =d - fw +gX2 + e2

X1 and X2 are vectors of explanatory variables; e1 and e2 are residuals from a regression equation.

Let's say one of the elements in X2 is years of education--the demand for labor goes up in years of education after controlling for wages.  The coefficient g that is multiplied by years of education is thus easy to interpret--it is a wage premium associated with education.

The problem is that the authors estimate a reduced form, where they set L(s) = L(d).  The resulting equation they arrive at is

w = d/(b+f) + gX2/(b+f) + e2/(b+f) - a/(b+f) - cX1/(b+f) - e1/(b+f)

If education appears in both the supply and demand equations--call it X, with supply coefficient c and demand coefficient g--the reduced-form wage equation reduces to:

w = (d-a)/(b+f) + (g-c)X/(b+f) + (e2-e1)/(b+f)

So the coefficient on X is (g-c)/(b+f). This coefficient helps with prediction of wages, but it does not allow us to disentangle the structural foundations of wages.  This is why reduced forms are problematic when we are trying to determine the impact of policy on outcomes.
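A small simulation makes the point concrete (the parameter values below are invented for illustration, not estimates): when education enters both the supply and demand equations, regressing the equilibrium wage on education recovers only the composite (g - c)/(b + f), never g or c separately.

```python
# Simulation sketch: with education X in both supply and demand, the OLS slope
# of the equilibrium wage on X is (g - c)/(b + f). Illustrative parameters only.
import random

random.seed(0)
a, b, c = 1.0, 2.0, 0.5    # labor supply: L_s = a + b*w + c*X + e1
d, f, g = 10.0, 1.5, 1.2   # labor demand: L_d = d - f*w + g*X + e2

n = 100_000
xs, ws = [], []
for _ in range(n):
    X = random.uniform(8, 20)   # years of education
    e1 = random.gauss(0, 1)
    e2 = random.gauss(0, 1)
    # impose L_s = L_d and solve for the equilibrium wage
    w = (d - a + (g - c) * X + e2 - e1) / (b + f)
    xs.append(X)
    ws.append(w)

# OLS slope of w on X (covariance over variance)
mx = sum(xs) / n
mw = sum(ws) / n
slope = sum((x - mx) * (w - mw) for x, w in zip(xs, ws)) / \
        sum((x - mx) ** 2 for x in xs)

print(f"estimated slope:  {slope:.3f}")
print(f"(g - c)/(b + f):  {(g - c) / (b + f):.3f}")  # 0.7/3.5 = 0.2
```

The regression cannot tell us whether a large slope reflects a strong demand-side premium (large g) or a weak supply-side response (small c); only the difference is identified.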

(2) The authors assume that wages are linear in years of education.  This is clearly not true--the impact of education on wages tends to fall into "buckets": fewer than 12 years, 12-15 years, 16 years, and more than 16 years.  You get the idea.  Their mis-specification of the educational variable could bias their other findings.
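A toy sketch shows why this matters (the wage levels below are invented for illustration): fitting a straight line to a step function of education imputes a within-bucket return to years that the step function does not have.

```python
# Toy illustration of the "buckets" point: wages that jump at credential
# thresholds are badly fit by a specification linear in years of education.
# The wage levels are hypothetical, chosen only to make the step visible.

def bucket(years):
    """Map years of education into the credential buckets from the post."""
    if years < 12:
        return "<12"
    if years <= 15:
        return "12-15"
    if years == 16:
        return "16"
    return ">16"

# hypothetical wage for each bucket (a step function of years)
wage_by_bucket = {"<12": 25_000, "12-15": 35_000, "16": 55_000, ">16": 70_000}

years = list(range(9, 21))
wages = [wage_by_bucket[bucket(y)] for y in years]

# OLS fit of wage on years (slope = covariance / variance)
n = len(years)
my = sum(years) / n
mw = sum(wages) / n
slope = sum((y - my) * (w - mw) for y, w in zip(years, wages)) / \
        sum((y - my) ** 2 for y in years)
intercept = mw - slope * my

# 12 and 15 years sit in the same bucket, so the true wages are identical,
# yet the linear specification predicts a sizable gap between them.
print("true wage, 12 vs 15 years: ",
      wage_by_bucket[bucket(12)], wage_by_bucket[bucket(15)])
print("linear fit, 12 vs 15 years:",
      round(intercept + slope * 12), round(intercept + slope * 15))
```

Dummy variables for the buckets would recover the steps exactly; forcing a single per-year coefficient smears the credential jumps across every year of schooling.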

(3) People who select themselves into teaching might have skills that do not show up in educational levels or on aptitude tests.  I have lots of education and do well on aptitude tests, but I think I would be at sea teaching second graders and REALLY at sea teaching middle schoolers.  Teaching students at these levels requires patience, insight and social skills that are not measured by aptitude tests.

The authors point to the interesting fact that people generally make less money when they move from teaching to non-teaching jobs.  There are alternative interpretations here.  One is that teaching is a hard job, and so people willingly leave for lower wages.  The second is that those who select out of teaching are those who have decided they are not very good at it.

(4) The most disturbing part of the paper is this:



"Table 2 shows how teacher salaries change depending on whether education or AFQT is included in the regression. The first row is the "standard" regression based on our CPS analysis in the previous section: Years of education are controlled for, but AFQT is not. The standard regression shows a teacher salary penalty of 12.6 percent.

The second row includes both education and AFQT in the same regression. The impact on teacher wages is small: The penalty decreases by less than two percentage points. The third row again includes AFQT but now omits education. With this specification, the change is dramatic: The teaching penalty is gone, replaced by a statistically insignificant premium.

How to interpret these results? On the one hand, the difference in IQ between teachers and other college graduates by itself has only a small effect on estimates of the teacher penalty. As the second row indicates, teachers with both the same education and AFQT score as other workers still receive 10.7 percent less in wages.

However, as we have shown, education is a misleading measure of teacher skills in several ways. In addition to the IQ difference between teachers and non-teachers, the education major is among the least challenging fields of study, and years of education subsequently have little to no effect on teacher quality. This suggests that eliminating education as a control variable and letting AFQT alone account for skills (as in the third row) may provide the most accurate wage estimates.

Replacing education with an objective measure of skills eliminates the observed teacher penalty, indicating that non-teachers with the same education as a typical teacher will likely have more applicable skills. We emphasize that a job is not necessarily less important or less challenging when the credentials for it are easier to obtain. Indeed, effective teachers are highly valuable to society and the economy."

So the authors have a regression with both education (which reflects Spence-type signalling, among other things) and IQ. The reduction in the R-squared when education is dropped suggests that after controlling for IQ, the coefficient on education continues to be statistically different from zero. When both IQ and education are included, teachers suffer a 10 percent wage discount relative to the private sector. Yet the authors ignore this result for the rest of the paper.

(FWIW, I really admire Michelle Rhee).

Thursday, 03 November 2011

THE MOTHER OF ALL INNOVATIONS – EXNOVATION©

WHY COMPANIES NEED TO MASTER THE ART OF NOT INNOVATING. IN OTHER WORDS, THE ART OF EXNOVATION – THE OPPOSITE OF INNOVATION

Good morning world. The mother of all anti-thesis theorems is here. Well, umm, it has actually been around for years. It was in 1996 that I conjured up this term called exnovation--which I defined as the opposite of innovation--and presumed that I had arrived on the global management scene; well, hadn't I finally created a better mousetrap? Fifteen years later, I see that the term exnovation is still known to almost zero individuals on this planet ('cept me, of course), and where it is known, it has taken up definitions that I never intended--and of course, nobody's beaten a path to my door yet. And that's when I decided to give it one more try--define the term appropriately so that organisations realize the need to incorporate exnovation as a critical process within organisational structures.

I accept that, in the present times, nothing excites corporate junkies more than the concept of innovation. Who in heavens would care about exnovation, for god's sake?! Would you wish your company to come out tops on the World's Most Innovative Companies' lists, or would you wish to be the numero uno on the exnovation charter--in other words, the world's topmost 'non-innovating' company? One doesn't need to think too deeply to get the answer to that. Frankly, the term exnovation was perhaps doomed from its very definition.

And reasonably so. Iconic CEOs have grown famous for being innovative. How many CEOs in the world can you name who are worshipped because they exnovated? The answer might surprise you: quite a few. And to understand this dichotomy, you'd first have to understand the correct definition of exnovation.

Exnovation does not mean propagating a philosophy of not innovating within the organisation. Exnovation in reality means that once a process has been tested, modulated and finally mastered to super-efficiency within the innovative circles of an organisation, there should be a critical system ensuring that when this process is replicated across the organisation's various offices, it is not changed but implemented in exactly the manner in which it was made super-efficient. In other words, no smart alec within the organisation should be allowed to tamper with an already super-efficient process. The responsibility for innovation should be the mandate of specialised innovation units/teams within an organisation and should 'not' be handed to each and every individual. The logic is that not every individual is competent at innovating – yet everybody wishes to innovate, and that is what can create a doomsday scenario within any organisation.

Consider the case of two call centers, where credit card customers call when they wish to complain about their lost cards. Imagine one call center where all employees are trained by exnovation managers to follow tried and tested responses and processes; imagine another where each employee is free to decide, innovatively, how to respond to the calling customer's lost card issue. Any guesses on which call center would deliver better productivity and customer satisfaction? Clearly, the one practising exnovation. And that, my dear CEOs, is the responsibility of the exnovation units within an organisation – units staffed with managers and supervisors whose sole job is to ensure that best-practice processes and structures are followed to a tee and not tampered with by individuals or teams without a formal mandate.
Call them what you may – but any manager responsible for ensuring replication and mirror implementation of any efficient process is an exnovation manager.

And it’s a fact that CEOs and companies have thrived practising this management philosophy of exnovation. The last time this $421.85 billion-a-year topline-earning company allowed each and every employee to innovate was well before its stock became a market commodity on the NYSE (on October 1, 1970). Till date, its “Save money. Live better” concept is based on standard processes, followed to the hilt and only marginally improved over the years, to deliver maximum productivity and efficiency. What gives this company’s operations their push? Leveraging tested economies of scale (a process economists have discussed for decades), sourcing materials from low-price suppliers (simply put – common sense), using a well-tested satellite-based IT system for logistics (a technology invented in the late 1950s; today, the company’s vehicles make about 120,000 daily trips to and/or from its 135 distribution centers spread across 38 US states, a count equal to the average number of vehicles that use the Lincoln Tunnel in New York City per day) and smarter financial and inventory management called ‘float’ (the firm pays suppliers in 90 days, but plans its stocks so that they sell within 7 days).
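The ‘float’ arithmetic is simple enough to sketch in a few lines of Python – a purely hypothetical illustration of the mechanism described above, not anything from the company itself: if inventory sells in about 7 days but suppliers are paid in 90, the retailer holds the cash from each sale for the difference.

```python
# Hypothetical sketch of the 'float' described above: inventory sells in
# about 7 days, but suppliers are paid in 90, so the retailer holds the
# cash from each sale for the difference before the supplier bill is due.

def float_days(days_to_sell: int, days_to_pay_supplier: int) -> int:
    """Days of interest-free financing per inventory cycle."""
    return days_to_pay_supplier - days_to_sell

# Figures quoted in the text: suppliers paid in 90 days, stock sold within 7.
print(float_days(7, 90))  # prints 83
```

Eighty-three days of effectively interest-free supplier financing on every inventory turn is exactly the kind of unglamorous, process-driven advantage the essay is describing.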

The company is #1 on the Fortune list: Walmart (2011; it has occupied pole position in the Fortune 500 rankings for the eighth time in ten years!). For that matter, recall the last time you heard of an innovation from Walmart. “After I came in as CEO, I looked at the world post-9/11 and realised that over the next 10 or 20 years, there just was not going to be much tailwind. It would be a more global market, it would be more driven by innovation. We have to change the company to become more innovation driven – in order to deal with this environment. It’s the right thing for investors.” Wise words from a wise CEO, spoken in the American summer of 2006, it seems. This protagonist was appointed CEO of a large conglomerate [which he refers to above as “the company”] on September 7, 2001. When he took over the mantle, the company, led until then by his “strictly process-oriented” predecessor, had grown into a $415 billion giant (m-cap). So how has his “innovation-driven-change” focus worked out for the investors and shareholders [to whom he wanted to do right]? Ten years have gone by, and under him the company has lost 58% of its value! And while America Inc. has become more profitable over the past decade, this company’s bottomline has actually shrunk by 14.91%. The first thing this innovation-lover of a CEO did on taking control was increase the company’s R&D budget by a billion dollars and spend another $100 million renovating the company’s New York innovation centre. Loving innovation is not wrong. What is wrong is forgetting that the best innovated products, processes and structures should not be tampered with!

In other words, Mike Duke, Walmart’s CEO, uses common sense – not innovation – to improve financials. Jeffrey Immelt, on the other hand, forgot exnovation, which his predecessor Jack Welch had mastered. Yes, I’m talking about GE. Immelt, in a later HBR paper titled “Growth as a Process”, confessed: “I knew if I could define a process and set the right metrics, this company could go 100 miles an hour in the right direction. It took time though, to understand growth as a process. If I had worked that wheel-shaped ‘execute-for-growth-process’ diagram in 2001, I would have started with it. But in reality, you get these things by wallowing in them a while. Jack was a great teacher in this regard. I would see him wallow in something like Six Sigma.” But this is not to say that Jack Welch was against innovation – in fact, he loved it; he simply ensured that not everybody in the organisation was allowed to do it. Immelt’s paper itself states that “under Jack Welch, GE’s managers applied their imaginations relentlessly to the task of making work more efficient. Welch created a formidable toolkit and mindset to maintain bottomline discipline.”

[Chart: Share price movements of the world’s largest oil companies]

Whatever best practices were innovated in GE’s group companies, Welch ensured that they were exnovated too – shared with the other group companies through GE’s Crotonville training centre and GE’s management academy. Such best practices were subsequently implemented throughout the group with a combination of common sense and managerial judgement. From Six Sigma to the 20-70-10 rule, Welch was all about making GE’s traditional strength – process orientation – a religion for its employees. It’s easy to guess a name that Welch would have fired in his tenure at GE. What else, when you have a list of over 112,000 employees to choose from? [They were fired because they did not fit into the process-oriented culture of GE. According to a June 2011 HBR article titled ‘You Can’t Dictate Culture – but You Can Influence It’, by Ron Ashkenas, Managing Partner of Schaffer Consulting and a co-author of The GE Work-Out, “The real turning point for GE’s transformation came when Jack Welch publicly announced to his senior managers that he had fired two business leaders for not demonstrating the new behaviours of the company – despite having achieved exceptional financial results.”]

Next, tell us one innovation that Welch introduced. Difficult? In all probability, your answer will only end up describing a process he introduced at GE and ensured everyone – from his senior managers to the junior-most employees – followed to the hilt. Honestly, it wasn’t just innovation that created wealth on a massive scale for GE shareholders during Welch’s tenure – a gain of 2,864.29% that made GE the world’s most valuable company, with an m-cap of $415 billion, well ahead of the then second-most valuable Microsoft at $335 billion – it was exnovation too; perhaps more so.

[Chart: Stock movement comparison of GE, GM, Ford and Walmart]

Or talk about a petrochemical company that is the third-largest company in the world and the highest profit-maker ever (with a $30.46 billion bottomline in FY2010). The last time this company contributed anything in the name of innovation was when it developed naphtha steam cracking technology (which it uses till date to refine petrochemicals) in the 1940s. Since then, there have only been modifications and improvements to this technology. Even when others had started talking about bio-fuels and innovation, this company’s CEO was adamant about continuing to invest in the technology that made the $363.69 billion company (m-cap as on November 1, 2011) what it represents in the modern world. “I am not an expert on biofuels. I am not an expert on farming. I don’t have a lot of technology to add to moonshine. What are we going to bring to this area to create value for our shareholders that’s differentiating? Because to just go in and invest like everybody else – well, why would a shareholder want to own Exxon Mobil?”, said Rex Tillerson, Chairman & CEO of Exxon Mobil – the second-largest Fortune 500 company. And this is what Fortune senior writer Geoff Colvin wrote about Tillerson’s attitude to fuels of the future in his article titled ‘Exxon = oil, g*dammit!’: “The other supermajors are all proclaiming their greenness and investing in biofuels, wind power and solar power. Exxon isn’t. At Exxon it’s all petroleum. Why isn’t the company investing in less polluting energy sources like biofuels, wind, and solar? Remembering that Exxon is above all in the profit business, we know where to look for the answer. As a place to earn knockout returns on capital, alternative energy looks wobbly. It’s a similar story for alternative fuels for power generation. Exxon just doesn’t know much about building dams or burning agricultural waste. Its expertise is in oil and gas.” Translation – Exxon keeps working its set processes and ignores what Tillerson calls moonshine [read: innovative fuels].

And to talk about how efficient and bottomline focussed this system at Exxon has become, Colvin has some lines to add: “At this supremely important job, it is a world champion. All the major oil companies bear about the same capital cost, just over 6%. But Exxon earns a return that trounces its competitors. Others could be pumping oil from the same platform, and Exxon would make more money on it. It is like taking the same train to work, but they get to the office first.” Can the way the most valuable company on Earth functions be some lesson for exnovation managers? Of course.

Next, the auto majors. Since Henry Ford introduced real innovation to the industry in the form of the assembly line, the Ford Motor Company hasn’t had much to boast about in this regard. And yet, it became the only Detroit major to bounce back without a federal bailout. And how about the real innovator? It appears that being an innovator does not pay well in the auto industry either! General Motors was ranked the #1 innovator (among 184 companies) by The Patent Board in its automotive and transportation industry scorecard for 2011. But all this came at the cost of the company’s bottomline, which bled $76.15 billion in the seven years leading to 2010 [and this is not counting the fateful year 2009, when GM got a fresh lease of life from the US government pumping in a huge $52 billion that ultimately saved America’s innovation pride]. And what about investors? If GM has the patents and is the king of innovation, should it not have been the best bet for investors? Count the numbers and decide: if an investor had put $100 into GM stock exactly ten years ago, he would have just $78.42 left in his trading account – a return of negative 21.58%! Had the same sum been invested in four of the other big automakers in the world, the reading would have been quite different: in Ford, the investor would have gained 22.72%; in Toyota, 39.52%; in Hyundai Motors, 89.4%; and in Volkswagen, 364.32%! These are companies that focus on design and on maintaining procedures that help create cars to set standards of quality – not on innovating or leading the rush for patents in clean-energy fuels! Message for GM: instead of investing billions of taxpayers’ funds in developing green-fuel and propulsion technologies, put people on a production process that will help launch more variants of the small diesel car (the Chevrolet Beat) for the BRIC markets. That should suffice.
Exnovate – like Toyota does with its production system that follows the 5S, Kaizen and Jidoka philosophies – and create a process of continuous improvement in small increments that make the system more efficient, effective & adaptable.
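For what it’s worth, the investor arithmetic above is easy to verify. Here is a quick, hypothetical Python sketch that converts the ten-year total-return percentages quoted in the text into the ending value of a $100 investment:

```python
# Hypothetical sketch: converting the ten-year total-return percentages
# quoted in the text into the ending value of a $100 investment.

def ending_value(initial: float, total_return_pct: float) -> float:
    """Final value of an investment given its total return in percent."""
    return initial * (1 + total_return_pct / 100)

ten_year_returns = {
    "GM": -21.58, "Ford": 22.72, "Toyota": 39.52,
    "Hyundai Motors": 89.4, "Volkswagen": 364.32,
}
for maker, pct in ten_year_returns.items():
    print(f"{maker}: ${ending_value(100, pct):.2f}")
# GM comes out to $78.42, matching the figure in the text.
```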

In his May 2007 best-seller ‘The Myths of Innovation’, author Scott Berkun [who had worked on the Internet Explorer development team at Microsoft from 1994-1999], using lessons from the history of innovation, breaks apart one powerful myth about innovation – popular in the world of business – with each chapter. “Competence trumps innovation. If you suck at what you do, being innovative will not help you. Business is driven by providing value to customers and often that can be done without innovation: make a good and needed thing, sell it at a good price, and advertise with confidence. If you can do those three things consistently you’ll beat most of your competitors, since they are hard to do: many industries have market leaders that fail in this criteria. If you need innovations to achieve those three things, great, have at it. If not, your lack of innovation isn’t your biggest problem. Asking for examples kills hype dead. Just say “can you show me your latest innovation?” Most people who use the word don’t have examples – they don’t know what they’re saying and that’s why they’re addicted to the i-word.”

The fundamental question really is: could airlines like Singapore Airlines, Virgin Airways, China Southern, United Airways, KLM Royal Dutch Airlines and Korean Air have maintained their near-100% on-time departure record for flights to and from India (for August 2011, as per DGCA) had each of their management heads, employees and pilots innovated in their transactions? No. [That would surely have disastrous consequences!] Would renowned hospitals for heart surgeries be the same safe place for patients if their doctors were to innovate their processes and dig out new surgery styles each time? No. [Absurd!] Would Chinese steel companies like Hebei Iron and Steel, Baosteel Group, Wuhan Iron and Steel, Jiangsu Shagang and Shandong Iron and Steel Group feature among the world’s top ten volume producers of steel (source: World Steel Organisation, 2011) had they innovated on the manufacturing method every single day? Absolutely not!

But really, I repeat ad nauseam that exnovation is not about refusing innovation within the company. Yes, a few of my examples may give off that air, but really, exnovation engenders an ideology that only some employees are gifted enough to analyse and innovate processes – and therefore such elitist employees should be placed in specialised innovation units with a sole responsibility to check processes and structures throughout the organisation and to innovatively improve them in whichever way possible. Employees who don’t have such innovative capacities may be better at simply implementing or following the processes; such employees should therefore be trained to ‘not innovate’ by exnovation managers.

The world believes that Steve Jobs was a great innovator. I would rather say he was the world’s second-greatest exnovator – one who ensured that even his innovation teams had to follow a structured, time-driven process to come up with innovative solutions and products. And when they did, the results were exnovated across all of Apple’s divisions and offices. That was the wonder of Steve Jobs the visionary.

In the year 2003, the globally renowned management author Jim Collins wrote an iconic article for Fortune magazine titled ‘The 10 Greatest CEOs of All Time’. At #1 on this all-time list, Jim ranked an individual named Charles Coffin. Jim wrote in that article: “Coffin oversaw two social innovations of huge significance: America’s first research laboratory and the idea of systematic management development. While Edison was essentially a genius with a thousand helpers, Coffin created a system of genius that did not depend on him. Like the founders of the United States, he created the ideology and mechanisms that made his institution one of the world’s most enduring and widely emulated.” If this is not one of the greatest combinations of innovation with exnovation, then what is? The institution was GE – formed when Coffin’s Thomson-Houston merged with Edison General Electric. Coffin passed away in 1926. Till date, he remains for me the world’s greatest exnovator.

Pulling out David Min's Graph from Mike Konczal's piece

Beyond reproducing Mike's post, I want to underline this graph from David Min:

This punches a hole in the argument that Pinto and Wallison make that Fannie and Freddie were making "dangerous loans" when they moved to higher LTV and lower FICO lending.  Their models allowed for offsets--if one had a very low LTV, one could get by with a relatively low FICO, and vice versa.  The private label market allowed for lending standards that were crappy in all dimensions.

Mike Konczal gives Six Reasons not to believe the meme that Fannie and Freddie caused the crisis

Reproduced with his kind permission:

1. Private markets caused the shady mortgage boom: The first thing to point out is that both the subprime mortgage boom and the subsequent crash were very much concentrated in the private market, especially the private label securitization (PLS) channel. The Government-Sponsored Entities (GSEs, or Fannie and Freddie) were not behind them. The fly-by-night lending boom, the slicing and dicing of mortgage bonds, derivatives and CDOs, and all the other shadiness of the mortgage market in the 2000s were Wall Street creations, and they drove all those risky mortgages.
Here’s some data to back that up: “More than 84 percent of the subprime mortgages in 2006 were issued by private lending institutions… Private firms made nearly 83 percent of the subprime loans to low- and moderate-income borrowers that year.”
As Center For American Progress’s David Min pointed out to me, the timing doesn’t work at all: “But from 2002-2005, [GSEs] saw a fairly precipitous drop in market share, going from about 50% to just under 30% of all mortgage originations. Conversely, private label securitization [PLS] shot up from about 10% to about 40% over the same period. This is, to state the obvious, a very radical shift in mortgage originations that overlapped neatly with the origination of the most toxic home loans.”

2. The government’s affordability mission didn’t cause the crisis: The next thing to mention is that the “affordability goals” of the GSEs, as well as the Community Reinvestment Act (CRA), didn’t cause the problems. Randy Krozner summarized one of the better studies on this so far, finding that “the very small share of all higher-priced loan originations that can reasonably be attributed to the CRA makes it hard to imagine how this law could have contributed in any meaningful way to the current subprime crisis.” The CRA wasn’t nearly big enough to cause these problems.
I’d recommend checking out “A Closer Look at Fannie Mae and Freddie Mac: What We Know, What We Think We Know and What We Don’t Know” by Jason Thomas and Robert Van Order for more on the GSEs’ goals, which, in addition to explaining how their affordability mission is a distraction, argues that subprime loans were only 5 percent of the GSEs’ losses. The GSEs also bought the highly rated tranches of mortgage bonds, for which there was already a ton of demand.

3. There is a lot of research to back this up and little against it: This is not exactly an obscure corner of the wonk world — it is one of the most studied capital markets in the world. What has other research found on this matter? From Min:
Did Fannie and Freddie buy high-risk mortgage-backed securities? Yes. But they did not buy enough of them to be blamed for the mortgage crisis. Highly respected analysts who have looked at these data in much greater detail than Wallison, Pinto, or myself, including the nonpartisan Government Accountability Office, the Harvard Joint Center for Housing Studies, the Financial Crisis Inquiry Commission majority, the Federal Housing Finance Agency, and virtually all academics, including the University of North Carolina, Glaeser et al at Harvard, and the St. Louis Federal Reserve, have all rejected the Wallison/Pinto argument that federal affordable housing policies were responsible for the proliferation of actual high-risk mortgages over the past decade.
The other side has virtually no research conducted that explains their argument, with one exception that I’ll cover below.

4. Conservatives sang a different tune before the crash: Conservative think tanks spent the 2000s saying the exact opposite of what they are saying now and the opposite of what Bloomberg said above. They argued that the CRA and the GSEs were getting in the way of getting risky subprime mortgages to risky subprime borrowers.
My personal favorite is Cato’s “Should CRA Stand for ‘Community Redundancy Act?’” from 2000 (here’s a write-up by James Kwak), which argues a position amplified in its 2003 Handbook for Congress financial deregulation chapter: “by increasing the costs to banks of doing business in distressed communities, the CRA makes banks likely to deny credit to marginal borrowers that would qualify for credit if costs were not so high.” Replace “marginal” with Bloomberg’s “on the cusp” and you get the same idea.

Bill Black went through what AEI said about the GSEs during the 2000s and it is the same thing — that they were blocking subprime loans from being made. In the words of Peter Wallison in 2004: “In recent years, study after study has shown that Fannie Mae and Freddie Mac are failing to do even as much as banks and S&Ls in providing financing for affordable housing, including minority and low income housing.”

5. Expanding the subprime loan category to say GSEs had more exposure makes no sense: Some argue that the GSEs had huge subprime exposure if you create a new category that supposedly represents the risks of subprime more accurately. This new “high-risk” category is associated with a consultant to AEI named Ed Pinto, and his analysis deliberately blurs the wording on “high-risk” and subprime in much of his writings. David Min broke down the numbers, and I wrote about it here. Here’s a graphic from Min’s follow-up work, addressing criticism:
[Graph from David Min’s follow-up work]
Even this “high risk” category isn’t risky compared to subprime and it looks like the national average. When you divide it by private label, the numbers are even worse. Private label loans “have defaulted at over 6x the rate of GSE loans, as well as the fact that private label securitization is responsible for 42% of all delinquencies despite accounting for only 13% of all outstanding loans (as compared to the GSEs being responsible for 22% of all delinquencies despite accounting for 57% of all outstanding loans).” The issue isn’t this fake “high risk” category, it is subprime and private label origination.
The Financial Crisis Inquiry Commission (FCIC) panel looked carefully at this argument and also ended up shredding it. So even those who blame the GSEs can’t get the numbers to work when they make up categories.

6. Even some Republicans don’t agree with this argument: The three Republicans on the FCIC panel rejected the “blame the GSEs/Congress” approach to explaining the crisis in their minority report. Indeed, they, and most conservatives who know this is a dead end, tend to take a “it’s a whole lot of things, hoocoodanode?” approach.

Peter Wallison blamed the GSEs when he served as the fourth Republican on the FCIC panel. What did the other three Republicans make of his argument? Check out these released FCIC emails from the GOP members. They are really fun, because you can see the other Republicans doing damage control and debating whether Wallison and Pinto were on the take for making this argument — because the argument makes no sense when looking at the data.

There are lots of great quotes: “Re: peter, it seems that if you get pinto on your side, peter can’t complain. But is peter thinking idependently [sic] or is he just a parrot for pinto?”, “I can’t tell re: who is the leader and who is the follower,” “Maybe this email is reaching you too late but I think wmt [William M. Thomas] is going to push to find out if pinto is being paid by anyone.” And then there’s the infamous event where Wallison emailed his fellow GOP member: “It’s very important, I think, that what we say in our separate statements not undermine the ability of the new House GOP to modify or repeal Dodd-Frank.”

The GSEs had a serious corruption problem and were flawed in design — Jeff Madrick and Frank Partnoy had a good column about the GSEs in the NYRB recently that you should check out about all this — but they were not the culprits of the bubble.

Wednesday, 02 November 2011

It is time to kill California High Speed Rail

Lisa Schweitzer notes that the cost estimate has been raised to nearly $100 billion (which she says still might understate the cost).

Let's do a little math about this--$100 billion throws off about $5 billion per year. If low income people work 250 days a year, and we were to fully subsidize the $5 LA metro day pass (and if I have my zeros right), we could fund 4 million people's transit per year.
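For the curious, the zeros do check out. A quick sketch, assuming a 5% annual payout on the $100 billion, a $5 day pass, and 250 work days per year:

```python
# Back-of-the-envelope check of the transit-subsidy math (assumptions:
# a $100B pot throwing off roughly 5% a year, a $5 LA Metro day pass,
# and 250 work days per year for a low-income worker).
endowment = 100e9
annual_yield = 0.05 * endowment        # about $5 billion per year
cost_per_person = 5 * 250              # $1,250 per rider per year
people_funded = annual_yield / cost_per_person
print(f"{people_funded:,.0f} riders funded per year")  # prints 4,000,000
```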