Wednesday, 27 April 2011

Sudeep Reddy on Bernanke's Press Conference

He writes (correctly in my view):

This press conference looks like a more intelligent, faster-paced version of a congressional hearing. It's a lot of the same questions -- bond-buying, high oil prices, long-term unemployment -- except you would've gotten five lawmakers making speeches about gas prices before asking a single question. Bernanke looks the same as he does at a hearing. He's not exactly thrilled to be there, but happy to take questions as long as you want.

Monday, 25 April 2011

Wendell Cox on Municipal Boundaries

From comments:

The arbitrariness of the municipal boundaries that drive much of the historical core city and suburb analysis does indeed create significant problems. We commented on this in PERSPECTIVES ON URBAN CORES AND SUBURBS (http://www.newgeography.com/content/002123-perspectives-urban-cores-and-suburbs). 

We concluded: "An eventual more precise analysis of urban cores and suburban trends will be welcome. Yet, as our analysis of trends in New Jersey indicated, even the growth in more urban core oriented municipalities was minuscule compared to the state's suburban growth. Further, much of the urban core growth in the nation came from areas that, although formally located within “city limits” actually were on the suburban fringe. This was true, for example, in Kansas City, Oklahoma City and even Portland. This suggests that the small share of growth reported in urban cores would be even less if it were based on census tract data; and suburbanization, as a way of life, may indeed be even more prevalent than this year’s numbers suggest."

Deborah Popper on "Subtracted Cities"

She writes about how shrinking cities can take control of their destinies, while being realistic about what those destinies imply:


Detroit stands as the ultimate expression of industrial depopulation. The Motor City offers traffic-free streets, burned-out skyscrapers, open-prairie neighborhoods, nesting pheasants, an ornate-trashed former railroad station, vast closed factories, and signs urging "Fists, Not Guns." A third of its 139 square miles lie vacant. In the 2010 census it lost a national-record-setting quarter of the people it had at the millennium: a huge dip not just to its people, but to anxious potential private- and public-sector investors.
Is Detroit an epic outlier, a spectacular aberration or is it a fractured finger pointing at a horrific future for other large shrinking cities? Cleveland lost 17 percent of its population in the census, Birmingham 13 percent, Buffalo 11 percent, and the special case of post-Katrina New Orleans 29 percent. The losses in such places and smaller ones like Braddock, Penn.; Cairo, Ill.; or Flint, Mich., go well beyond population. In every recent decade, houses, businesses, jobs, schools, entire neighborhoods -- and hope -- keep getting removed.
The subtractions have occurred without plan, intention or control of any sort and so pose daunting challenges. In contrast, population growth or stability is much more manageable and politically palatable. Subtraction is haphazard, volatile, unexpected, risky. No American city plan, zoning law or environmental regulation anticipates it. In principle, a city can buy a deserted house, store or factory and return it to use. Yet which use? If the city cannot find or decide on one, how long should the property stay idle before the city razes it? How prevalent must abandonment become before it demands systematic neighborhood or citywide solutions instead of lot-by-lot ones?
Subtracted cities can rely on no standard approaches. Such places have struggled for at least two generations, since the peak of the postwar consumer boom. Thousands of neighborhoods in hundreds of cities have lost their grip on the American dream. As a nation, we have little idea how to respond. The frustratingly slow national economic recovery only makes conditions worse by suggesting that they may become permanent.
Subtracted cities rarely begin even fitful action until perhaps half the population has left. Thus generations can pass between first big loss and substantial action. Usually the local leadership must change before the city's hopes for growth subside to allow the new leadership to work with or around loss instead of directly against it. By then, the tax base, public services, budget troubles, labor forces, morale and spirit have predictably become dismal. To reverse the momentum of the long-established downward spiral requires extraordinary effort.
Fatalism is no option: Subtracted cities must try to reclaim control of their destinies. They could start by training residents to value, salvage, restore and market unused sites and the material found there. They might supplement school drug-free zones with subtraction-action ones by reacting quickly when nearby empty properties show neglect. Children who see debris-filled plots and boarded-up buildings learn not to expect much from life. Just planting a few trees often makes a deserted lot look cared for.

Friday, 22 April 2011

Goodbye, IBM. Seriously.

For those of us who worked at Apple in earlier days, the company's current success is sometimes surreal.  I had one of those moments today.  Back in the mid 1990s, we were struggling to get to $10 billion in revenue per year, a figure that seemed ridiculously high.  This week, Apple reported quarterly revenue of $27 billion.  Apple is almost certainly now a $100 billion a year company.

To put that in perspective, Apple is now larger than companies like Honda, Sony, Deutsche Telekom, Procter & Gamble, Vodafone -- and IBM.  Apple is very close to passing Samsung and HP, which would make it the world's largest computing company.

In 1981, when IBM entered the PC business, Apple ran a big ad in the Wall Street Journal saying "Welcome, IBM. Seriously."  At the time, everyone thought it was a very cheeky move by a tiny upstart company.  No one -- and I mean absolutely no one -- would have believed that 30 years later Apple would be looking at IBM in the rear view mirror.

The spookiest thing is that Apple may still have a lot of room to grow in both mobile phones and tablets.  There's no way the company can keep growing like this indefinitely, but it's very hard to predict exactly when it'll slow down.

Thursday, 21 April 2011

Chris Leinberger, Wendell Cox, Drunks and Lampposts

In a New Republic piece, Chris Leinberger says that Wendell Cox's statement that the census shows that central cities have had a small fraction of urban population growth is beside the point.  He argues, correctly, that the census definitions of urban and suburban are pretty arbitrary: if one is in the City of Los Angeles, she lives in an urban area, if she is in Santa Monica or Beverly Hills or Pasadena, she is in a suburban area.  One visit to LA reveals that this is silly.  Just because the light is there doesn't mean the missing keys are there.

But Leinberger then makes a statement that is also un-illuminating:

Likewise, the suburbs of those core cities include classic subdivisions and McMansions, like the home of Tony Soprano, but they also include booming places like Old Town Pasadena, Reston Town Center near Dulles Airport outside D.C., and revitalized Jersey City and Hoboken, NJ, on the other side of the Hudson River from Manhattan.
I now live in Pasadena, and before that, I lived in Bethesda.  They are both indeed wonderful places (for me anyway); they are also quite "walkable."  But neither strikes me as booming, so I looked up their population growth between 2000 and 2010.  The Bethesda Census-Designated Place grew by a little over 9 percent; Pasadena grew by 2.3 percent.  The country as a whole grew by a little less than 10 percent.  It is hard to make a case for booming.

But perhaps the issue is supply.  If no houses are available, then it is not surprising that population has not grown much.  Ryan Avent has correctly made this point.  But the residential vacancy rates in both Bethesda and Pasadena are in excess of 7 percent--not huge, but not exactly tight either.  And Pasadena's condominium market continues to face serious problems.

Personally, I love the kind of communities Leinberger favors--I seem to live in them.  But some urbanists engage in hectoring that really bothers me.  Lots and lots of Americans appear to love their cars and their isolated houses.  So long as they internalize the costs their lifestyle imposes (and I have long been for a tax that puts a floor on gasoline costs), people should be able to live how they like and where they like.

Raphael Bostic on a layered housing problem

He gives an interview here:

There is a triple threat facing the elder African American LGBT population in the Detroit area. Even though small in number, this particular group of people encounters difficulties in finding retirement homes, safety, recognition and financial security. Dr. Raphael Bostic, the assistant secretary of Housing and Urban Development, attended an April 16 summit organized by KICK (an agency for LGBT African Americans) to address such concerns. Dr. Bostic spoke to BTL about discrimination and other issues faced by these elders.

What were the common concerns discussed at the KICK summit?
The elder LGBT population has significant challenges. They don't have children who can offer them help and support. If they are with a partner they often don't have access to their (partner's) pension funds, so they can become extremely vulnerable rather quickly. This is a really important conversation, and a lot of the gay and lesbian elder population has not been a (focus) of that conversation. Somehow they are a hidden population.
Elders in African American communities have difficulties, elders in general have difficulties and LGBT elders have difficulties, so this really overlays three types of groups. We don't really know much about the challenges that this group faces and they are forced to be invisible because sexual orientation and gender identity are not protected classes, so landlords can and do discriminate against these (people). So sometimes they have to go back into the closet. One of the things we are trying to work on is how often these issues arise so we can talk about it in an informed way and hopefully get to a place where that kind of discrimination happens a lot less frequently.

Quick Takes: The RIM Tragedy, Lame Market Research, Ebooks Closer to Tipping, Flip vs. Cisco, Google as Microsoft, Nokia and the Word "Primary"

Short thoughts on recent tech news...


RIM as Greek tragedy

I wrote last fall that I was worried about RIM's financial stability (link), but I never expected the company to start inflicting damage on itself.  RIM has always come across as a calm, dependable company.  Maybe not as flashy as some other firms, but reliable and smart.  But in the run-up to the PlayBook launch, the company started to look like its own worst enemy.

It's clear that the PlayBook was designed initially as a companion device for people who have BlackBerry phones, and only those people.  That's an interesting choice -- not one I would have made, but I can see RIM's logic.  But apparently RIM decided late in the game that it needed to market the tablet to a broader range of customers.  It started talking up the features those users would need, without making clear that the features would not be included in the device at launch.  Many of the things the company has been touting -- such as Android app compatibility and the ability to check e-mail messages independently of a BlackBerry -- were not available when the device shipped.  RIM has been marketing vaporware.  That guarantees disappointed reviews that focus on what the device doesn't do, rather than what it does.  Check out Walt Mossberg's write-up (link).

While this has been going on, RIM co-CEO Mike Lazaridis has been compounding the problem by creating a personal reputation as a loose cannon.  His latest escapade was ending a TV interview with BBC when they asked about security issues.  The use of the word "security" was mildly provocative, but if you've ever dealt with the British press, you know they specialize in goading people to get an interesting reaction.  The more senior your title, the more they'll poke at you, to see if you can take the heat.

The way this game works, there are several techniques you can use to deal with an aggressive question.  You can laugh at it, you can calmly point out the flaw in the question, you can answer it earnestly and patiently, and you can even pretend not to understand it (I did that once on a UK TV show and it drove the interviewer crazy because he didn't have time to rephrase the question).  But the one thing you can't do is stop the interview.  If you do that, the BBC will post a clip of you online that makes you look like a gimlet-eyed prima donna (link).

The fact that Lazaridis did this means either he's losing personal control under pressure, or not being properly briefed by his press people, or both.  Whatever the cause, it is unprofessional, and it's making RIM's challenges harder.

If you want to understand the damage being done, you can read the forward-looking obituary of RIM that Slate just ran (link).  Or check out this column by Rob Pegoraro of the Washington Post (link). Rob's a very fair-minded, professional journalist who isn't given to hyperbole.  But he called Lazaridis' actions "profoundly foolish from any sane marketing perspective...Seriously, does RIM not realize whom it’s competing with? The company is all but begging to get crushed by Apple."

I haven't written off RIM by any means.  They have a huge customer base, a great brand, and a long history of overcoming skepticism from people like me.  I hope they can do it again.  But at a minimum, RIM's management needs to recognize that they do not have the marketing skills needed to play in the world of increased smartphone competition.  They need professional help, immediately.  And I worry that the marketing problems are actually symptoms of much deeper disorder within the company.


The lamest market research study of the year

It's still early in the year, but I think someone's going to have to work pretty hard to do a lamer market research study than Harris Interactive's EquiTrends survey of mobile phone brands in the US.  Harris says the survey indicated that Motorola has the most "brand equity" of mobile phone brands in the US, followed by HTC, Sony Ericsson, Nokia, and Apple.  Harris also provided a nice chart of the results (link):



There are a couple of problems here.  The first is that the reportedly best-selling mobile phone brand in the US, Samsung, was not included in the results (link).  Oops.

The second problem is that Harris doesn't directly measure brand equity (which is a pretty fuzzy concept anyway).  What it measures is "Familiarity, Quality, and Purchase Consideration."  Those three ratings were combined into an overall brand equity score.

So this is a made-up rating created through a mathematical formula that Harris hasn't shared with the public, as far as I can tell.  But Harris assures us that it's meaningful: "Those companies with high brand equity are able to avoid switching behaviors of those brands that lack brand equity."  (link).  So, according to Harris's research, people in the US should be switching from other phone brands to Motorola.
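To see why an undisclosed composite is a problem, here's a hypothetical sketch (all numbers invented, not Harris data): the same three component ratings can produce opposite brand rankings depending entirely on the weights chosen to combine them.

```python
# Hypothetical illustration (invented numbers, not Harris data): the same
# three component ratings can yield different "brand equity" rankings
# depending on the undisclosed weights used to combine them.

ratings = {  # (familiarity, quality, purchase consideration), 0-100 scale
    "Brand A": (90, 60, 70),  # widely known, middling quality scores
    "Brand B": (60, 85, 80),  # less familiar, but rated higher on quality
}

def composite(scores, weights):
    """Weighted sum of the three component ratings."""
    return sum(s * w for s, w in zip(scores, weights))

# Weighting familiarity heavily puts Brand A on top...
w1 = (0.6, 0.2, 0.2)
# ...while weighting quality and consideration puts Brand B on top.
w2 = (0.2, 0.4, 0.4)

for w in (w1, w2):
    ranked = sorted(ratings, key=lambda b: composite(ratings[b], w), reverse=True)
    print(w, "->", ranked)
```

Without the formula, there is no way to tell whether the published ranking reflects customers or the weighting choice.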

But in the real world, the exact opposite has been happening.  Motorola has been losing share.  The number three rated brand, Sony Ericsson, has barely any distribution in the US, so it doesn't have much share to lose.  The number four brand, Nokia, has lost most of its US share.

Harris argues that Apple's mediocre score is driven by the sophistication of the iPhone:  "There is still a large audience of consumers that aren’t interested in a smartphone running their life, and Apple doesn’t have a product to meet that need."  I think that's correct, but HTC also sells only smartphones, and it was ranked number two.

And oh by the way, what's the margin of error in Harris's survey?  I can't find it disclosed anywhere, but my guess is that it's several points plus or minus, in which case everyone except Motorola is in a statistical tie.  That wouldn't have made for a cool looking marketing chart, though.
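For a sense of scale, the standard margin of error for a sample proportion is easy to compute; with a hypothetical sample size (Harris doesn't disclose theirs), a few points of slack is plausible:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion p with n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical: a score around 60 on a 0-100 scale (p = 0.6) with
# n = 1,000 respondents per brand gives roughly +/- 3 points --
# enough to make closely spaced brands statistically indistinguishable.
moe = margin_of_error(0.6, 1000)
print(f"+/- {moe * 100:.1f} points")
```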

It's been distressing to see websites pick up the Harris story and repeat it without questioning the results.  PC Magazine swallowed it whole (link), as did MocoNews (link).  A lot of other sites reprinted the Harris press release verbatim.  Even if you didn't dig into the flaws, the study ought to fail the basic sniff test of credibility -- does anyone really believe that HTC has a stronger brand in the US than Apple?

When I worked at Apple and Palm, we hated synthetic brand rating studies like this one (and the JD Power ratings, which are similar) because the results depend more on the secret formula used by the polling company than on the actual behavior of customers.  The polling companies construct these special methodologies because they can then sell long reports to the companies surveyed explaining the results, and also charge the winners for the right to quote the results in their marketing.  Check out the fine print at the bottom of the Harris press release: "The EquiTrend® study results disclosed in this release may not be used for advertising, marketing or promotional purposes without the prior written consent of Harris Interactive."  I don't know for sure that Harris charges to quote the survey, but that's the usual procedure.

The lesson for all of us is that you should never accept any market research study without looking into its background, even if it comes from a famous research company.


Ebooks: Here comes the tipping point

The continued strong sales of iPad, Kindle, and Nook in the US are bringing us steadily closer to the tipping point where it will pay an author to bypass paper publishing and sell direct to ebooks.  The latest evidence is from the Association of American Publishers, which reported that ebooks made up 27% of all book revenue in the US in January-February 2011 (link).  AAP correctly pointed out that the ebooks share was raised temporarily by people buying ebooks to read on all of the e-readers they got for Christmas.  The share will go down later in the year.

Still, at any share over about 20%, it will be more economical for an established author to self-publish through ebooks (where they can retain 70% of sales revenue) rather than working through a paper publisher (where they get at most 15% of revenue).  When we hit that point on a sustained basis, I expect that a lot of authors will move to electronic publishing quickly.
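The breakeven arithmetic above can be sketched directly, under the stated assumptions (author keeps 70% of ebook revenue self-publishing, at most 15% of all revenue through a paper publisher, and self-publishing forfeits the paper channel entirely):

```python
# Back-of-envelope breakeven for self-publishing vs. a paper publisher.
# Assumptions (from the text): 70% royalty on self-published ebooks,
# at most 15% royalty on total revenue via a traditional publisher,
# and a self-publishing author earns only on the ebook share of sales.

TRADITIONAL_ROYALTY = 0.15  # author's share of all revenue via a publisher
SELF_PUB_ROYALTY = 0.70     # author's share of ebook revenue self-publishing

def traditional_earnings(total_revenue=1.0):
    """Author earnings under a traditional publishing contract."""
    return TRADITIONAL_ROYALTY * total_revenue

def self_pub_earnings(ebook_share, total_revenue=1.0):
    """Author earnings if they self-publish and sell only ebooks."""
    return SELF_PUB_ROYALTY * ebook_share * total_revenue

# Breakeven: 0.70 * share = 0.15  =>  share of about 21.4%
breakeven_share = TRADITIONAL_ROYALTY / SELF_PUB_ROYALTY
print(f"Breakeven ebook share: {breakeven_share:.1%}")
```

At the 27% share AAP reported, the self-publishing side of the inequality already comes out ahead; the question is whether that share holds once the post-Christmas bump fades.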

It looks like we'll hit that point sometime this year or next.


Flip aftershocks

Silicon Valley has the attention span of a toddler in a candy store, but it was interesting to see how people around here lingered on the story of Flip's demise several days after the announcement.  There were dark suggestions of ulterior motives at Cisco -- that they had bought the company to strip it of its intellectual property (link) or that they shut down a viable company only so they could look decisive to Wall Street (link).  And that was just the stuff in the press.  I've heard even more pointed speculation from people working in Silicon Valley.

My guess is the real story is a lot more complicated and nuanced, but at this point it doesn't matter.  Killing Flip may have helped Cisco with Wall Street analysts, but the sequence of buying Flip and then shutting it down has seriously damaged the company's image in Silicon Valley as a leader and a partner.  Silicon Valley is a very forgiving place.  You can make huge strategic mistakes, and waste billions of dollars, and still you'll be forgiven as long as you did it in sincere pursuit of a reasonable business idea.  But Cisco's senior management is now viewed as either overconfident to the point of stupidity, or as the deliberate torture-murderer of a beloved consumer brand.  I've rarely seen this level of hostility toward a management team, and I don't think they will be forgiven anytime soon, if ever.

Does that have any practical impact on Cisco's business?  Not immediately; business is business.  But it will probably be a little harder for Cisco to make alliances and hire ambitious people in the future.


Google 2011 = Microsoft 2000?

It's spooky how Google is sometimes starting to remind me of Microsoft circa 2000.

The latest incident was a quote from a Google executive saying that the company wants iPhone to grow because Google makes a lot of money from it (link).  Microsoft used to say the same sort of thing about Apple, claiming that it made more when a Mac was sold rather than a Windows PC (link).  (The idea was that many Microsoft apps were bundled with Windows at low cost, whereas Mac customers bought Microsoft apps at retail.)
   
In both cases, the statements may be technically true, but what they really point out is that the company has deep internal conflicts between its various business units.  Yes, part of Microsoft wanted to make Macintosh successful, but another part of Microsoft wanted to kill Macintosh.  Microsoft as a whole wanted to do both at the same time, which created internal confusion.  Add in antitrust lawsuits by governments and Wall Street pressure for quarterly growth, and Microsoft quickly became distracted, inwardly focused, and slow-moving.

Parts of Google, I'm sure, think iPhone is great and want it to grow.  But I guarantee that the Android team is trying to kill iPhone (and Nokia, and HP/Palm).  Google has its own set of government distractions, plus a big old lawsuit from Oracle, plus legal action by Microsoft and Apple against Android licensees. 

There are huge differences between Google and Microsoft, of course.  Google is not under the same sort of Wall Street pressure that was applied to Microsoft, and Google's founders have not lost interest in running the company. 

But it's disturbing to see how quickly some of Microsoft's symptoms are showing up at Google.


Hey Nokia, how do you define "primary"?

Microsoft and Nokia said they have finalized the contract for their alliance.  There were a couple of interesting tidbits in the announcement:

--Both companies said they completed the negotiations sooner than they expected.  Usually that sort of statement is hype, but for an agreement of this size, it actually was a pretty fast turnaround.

--They went out of their way to say that Nokia will be paying royalties for Windows Phone similar to what other companies pay.  That's important legally and for regulators, so companies like Samsung can't complain that Microsoft is giving discriminatory pricing.  At the same time, the announcement also made it clear that Microsoft will be passing a ton of money to Nokia for various services and IP, which Nokia wanted on the record to help with its investors.  I think the net effect will be that Nokia gets a free Windows Phone license for a long time.  That will not please Samsung, HTC, and the other Windows Phone licensees, because it puts them at a price disadvantage.

--The companies are apparently cross-licensing a lot of patents.  I wonder if this will help Nokia with its IP warfare against Apple.

--In an interview with AllThingsD (link), Microsoft and Nokia said Windows Phone was Nokia's "primary smartphone operating system." That leaves open the door for Nokia to play with other smartphone operating systems, and it leaves completely unanswered the question of tablets.  I'm sure the Symbian/Meego fans will be all over that as a ray of hope for their platforms, but to me it just leaves some prudent wiggle room for Nokia in the future.  I'd love to know how the agreement defines the words "smartphone" and "primary" -- or if it even has definitions for them.

(Note: Edited on April 22 to fix an embarrassing typo.)

Tuesday, 19 April 2011

Why does S&P matter?

Are they telling us anything about the US fiscal condition that the whole world doesn't know?  Is there anything to suggest that their past insights have been particularly penetrating?  Just wondering...

Tuesday, 12 April 2011

All advice welcome

I am on a panel on budget issues with Paul Ryan on June 9 in Madison.  Needless to say, I am taking preparation for this very seriously.  I would therefore welcome any thoughts, links, etc. from anyone inclined to send them.  Don't worry if you think I have seen it before--I am looking for a dump of everything right now.

The Real Lesson of Cisco's Billion-Dollar Flip Debacle

Cisco announced that it's closing down the Flip camera business and revisiting its other consumer products.  With a purchase cost for Pure Digital (maker of Flip) of over $600 million, and now restructuring charges of $300 million (link), the total cost of Cisco's failed consumer experiment is probably north of a billion dollars, making it one of the larger business debacles in Silicon Valley in the last few years.

Most online analysis of the announcement doesn't really explain what happened.  The consensus is that Flip was doomed by competition with smartphones, but that says more about the mindset of the tech media than it does about Cisco's actual decisions.  I think the reality is that Cisco just doesn't know how to manage a consumer business.

There are important lessons in that for all tech companies.

Here are some samples from today's online commentary:

Gizmodo (link):  "The Flip Camera Is Finally Dead—Your Smartphone’s Got Blood on Its Hands."

Engadget (link):  "Cisco CEO John Chambers says the brand is being dispatched as the company refocuses, done in by the proliferation of high-definition sensors into smartphones and PMPs and the like."

ReadWriteWeb (link): "Single-purpose gadgetry has no place in today's smartphone-obsessed world."

ArsTechnica (link):  "Flip can't be faring well against the growing number of smartphones with built-in HD cameras. The quality of your typical smartphone video camera is comparable to the Flip, and people have their phones on them all the time."

Computerworld (link):  "More and more people are using their smartphones to take lower-quality video...the market for low-cost small video cameras that produce quick-and-easy videos is dead."

There's an old saying that when all you have is a hammer, every problem looks like a nail.  We need a similar proverb for news analysis -- when you're obsessed with smartphones, every market change looks like it was caused by them.

But did smartphones alone kill Pure Digital?  Two years ago, it was the most promising consumer hardware startup in Silicon Valley.  It had excellent products and a rabid customer base.  Two years later, it's completely dead.  That's a lot to blame on phones.  Plus, Cisco appears to be moving away from driving consumer markets in general.  The Umi videoconferencing system is being refocused on business, and Cisco CEO John Chambers said, "our consumer efforts will focus on how we help our enterprise and service provider customers optimize and expand their offerings for consumers, and help ensure the network's ability to deliver on those offerings."  In other words, we'll be working through partners rather than creating demand on our own (link).

Smartphones didn't cause all of that.  But they did play a supporting role in the drama.  They commoditized Flip's original features, putting the onus on Cisco to give it new features and innovations.  As Rachel King at ZDNet pointed out (link), Cisco failed to respond:

"The technology of Flip never really evolved since then, making it a very stale gadget. Sure, even once Cisco picked up Flip, new models continued to come out each year. Yet Cisco dropped the ball by never pushing further with Flip. It never moved beyond 720p HD video quality, and it never got HDMI connectivity."

Presenting a stationary target is enough to doom any consumer electronics product.  For example, what would have happened if Apple had stopped evolving the iPhone after version 1?  You'd have no app store, no 3G.  Today we'd be talking about iPhone as a cute idea that was fated to be crushed by commodity competition from Android. 

Just the way we're talking about Flip.

The important question is why Cisco failed to rise to the challenge.  Why didn't it innovate faster?  I don't know, because I wasn't there, but I'm sure the transition to Cisco ownership didn't help.  It was not a simple acquisition.  Cisco didn't just buy Pure Digital and keep it intact, it merged the company into its existing consumer business unit, which was populated by consumer people Cisco had picked up from various Valley companies in the previous few years.   Some of the key Flip managers were given new roles reaching beyond cameras, and there must have been intense politics as the various players jockeyed for influence.

Then there was the matter of Cisco's culture.  I had a great meeting at Pure Digital several years ago, prior to the merger.  They were housed above a department store in San Francisco, in a weird funky space with lots of consumer atmosphere.  The office was surrounded by restaurants and shops.

In contrast, visiting Cisco is like visiting a factory.  Every building on their massive campus looks the same, with an abstract fountain out front, the walls painted in muted tans and other muddy colors.  The buildings are surrounded by an ocean of cars.  The lobbies are lined with plaques of the company's patents, and the corridors inside have blown-up photographs of Cisco microprocessors.  In the stairwells you'll usually see a couple of crates of networking equipment, shoved under the stairs.  And all of the cubicles look the same.



The Cisco campus.



A typical Cisco building.

Cisco is an outstanding company, and an excellent place to work.  But it screams respectable enterprise hardware supplier.  To someone from a funky consumer company, going there would feel like having your heart ripped out and replaced with a brick.

Then there were the business practices to contend with.  As an enterprise company, Cisco is used to long product development cycles, direct sales, and high margins to support all of its infrastructure.  A consumer business thrives on fast product cycles, sales through retailers, and low margins used to drive volume.  Almost nothing in Cisco's existing business practices maps well to a consumer company.  But it's not clear that Cisco understood any of that.

The transition to Cisco management happened at a terrible time for Flip.  Just when the company's best people should have been focused obsessively on their next generation of camera goodness, their management was given new responsibilities, and Cisco started "helping out" with ideas like using Flip cameras for videoconferencing -- something that had nothing to do with Flip's original customers and mission.

If Pure Digital had remained independent, would it have innovated quickly enough?  Maybe not; it's very hard for a young company to think beyond the product that made it successful.  But merging with Cisco, and going through all of the associated disruptions, probably made the task almost impossible.

I'm sure that as the Flip team members get their layoff notices, we'll start to hear a lot more inside scoop.  But in the meantime, this announcement by Cisco looks like a classic case of an enterprise company that thought it knew how to make consumer products, and turned out to be utterly wrong.

That's not an unusual story.  It's almost impossible for any enterprise company to be successful in consumer, just as successful consumer companies usually fail in enterprise.  The habits and business practices that make them a winner in one market doom them in the other.

The lesson in all of this: If you're at an enterprise company that wants to enter the consumer market, or vice-versa, you need to wall off the new business completely from your existing company.  Different management, different financial model, different HR and legal.

You might ask, if the businesses need to be separated so thoroughly, why even try to mix them?  Which is the real point.

The other lesson of the Flip failure is that we should all be very skeptical when a big enterprise company says it's going consumer.  Hey Intel, do you really think you can design phones? (link)  Have you already forgotten Intel Play? (link)

I'll give the final word to Harry McCracken (link):  "You can be one of the most successful maker of enterprise technology products the world has ever known, but that doesn’t mean your instincts will carry over to the consumer market. They’re really different, and few companies have ever been successful in both."

Right on.

Minggu, 10 April 2011

Robert E. Baldwin 1924-2011

He was my dissertation advisor.  From his obituary:

Baldwin was also an "academic father" to scores of students, inspiring them with his quiet but deeply held passion for combining academic rigor with real-world applicability. Many of his students have become professors in universities across the world. His vocation is also carried on by his son, Richard, and son-in-law, Gene Grossman, both of whom are professors of economics specialising in international trade.
I was very fortunate to be among those scores.   

Jumat, 08 April 2011

A coincidence?

Rhonda Porter points out that a government shutdown would stop the mortgage underwriting process.

Lenders will not be able to order 4506Ts--tax transcripts--from the IRS.  This is the principal source of income verification.  Without it, lenders will not be able to underwrite borrowers.
 

AL CAPONE IN ALCATRAZ!

DO CIVIL & CLASS ACTION LAWSUITS HURT YOUR SHAREHOLDERS? DO PATENT LITIGATIONS ERODE YOUR COMPANY’S REPUTATION? DO INVESTORS VIEW YOUR COURTROOM ENGAGEMENTS AS A SIGN TO DUMP YOUR STOCK? OR ARE LEGAL CASES SIMPLY MUCH ADO ABOUT NOTHING WITH NO EFFECT ON YOUR STOCK PRICES?

There is Butch Cassidy and there is Walmart. Both are much talked about, and both have been forced to run the gauntlet of the legal system’s protectors; the similarities are strong. There is a difference, though. As much as Cassidy enjoyed biking around Wyoming’s mountainous curves with the Sundance Kid, keeping his shirt collar a good distance from the sheriff’s grasp, Walmart is a behemoth that does not mind sauntering down courtroom corridors. Its history is strewn with litigation. But isn’t it bad logic to be both a target and an instigator of so many lawsuits?

Walmart is the poster boy of the retail revolution, and the #1 company on the 2010 Fortune 500. The cases against it have covered various spaces – not paying suppliers on time, gender discrimination, failure to dole out fringe benefits to part-timers, deliberate selling of low-quality items, unfair remuneration and promotion policies, paying low wages (a lawsuit filed in 2001 stated that the average wage for a Walmart sales attendant was $13,861 a year, while the federal poverty line for a family of three was $14,630), environmental violations, and other accusations by government agencies. Suing the Bentonville retailer has become a wholesale affair, with the average count of lawsuits filed against it touching 5,000 per year (as per a Forbes report). But how much of a difference have the aspiring attorneys and plaintiffs made to the reputation and earnings of Walmart?

Numbers are proof. Yes, since 2001 the company has paid more than $2.5 billion in lawsuit settlements. But the parallel tale is that over the same decade, while opening 4,266 new outlets in 16 countries around the world, the company’s m-cap increased by $67.54 billion. As far as revenues go, the figures have improved 155.67% (despite two downturns since FY2000) to touch $421.85 billion (FY2010). The forecasts are bright. The company is fast approaching the $500 billion sales barrier, with estimates of $439.81 billion and $461.86 billion for FY2011 & FY2012 respectively (as per Thomson Financial). The truth is that the company has grown from strength to strength despite umpteen disputes. And it has not followed a strategy of hiding under a blanket of silence. The company meets each wave of allegations with a strong marketing and advertising push, turning ‘negative’ headlines into huge recall exercises. Imagine this – every single day of FY2010, on average, the company spent $65.75 million on advertising, an increase of 14% y-o-y. Little wonder that the retailer is set up for a better 2011 & 2012, with buyers across America and the world indoctrinated into the Walmart culture.

As for those who believe that legal affairs raise questions about a firm’s character, here is a correction: they don’t. Had litigation mattered, the percentage of American households visiting Walmart would never have been as high as 83% (in FY2010). Had litigation mattered, the company in question would have seen its stock crash on every piece of news of civil or class action charges. It does not happen that way.

On June 19, 2001, six Walmart employees from California, Illinois, Ohio, Texas & Florida filed a nationwide gender discrimination class action lawsuit against the company. The suit brought together about 1.5 million former and present employees, and was set to be the biggest class action against any company in American history, with damage claims running into billions of dollars. That day, the Walmart stock closed 0.69% higher. It gained a further 3.41% the next trading session. The case was last heard by the Supreme Court on March 29, 2011. And despite expectations of a multi-billion dollar setback for Walmart, the stock rose 0.13% the day before the hearing. Though it is hard to explain why the stock rose just 0.15% on April 1, 2011 (the day the Supreme Court ruled that the class action case against Wal-Mart must be reversed), we may safely assume that courtroom engagements involving well-known corporate brands have little power to drive negative market sentiment toward their stocks.
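The single-day moves the article keeps citing can be checked with a bare-bones event-study calculation: compare the stock's return on the announcement day with the market's return the same day. The sketch below uses hypothetical price series, not actual Walmart or index closes.

```python
# Bare-bones event-study arithmetic. The price series below are
# hypothetical illustrations, not actual Walmart or index closes.

def daily_return(prices, day):
    """Close-to-close return on trading day `day` (an index into `prices`)."""
    return prices[day] / prices[day - 1] - 1.0

def abnormal_return(stock, market, event_day):
    """Stock's event-day return minus the market's return the same day.
    A value near zero suggests the announcement moved little money."""
    return daily_return(stock, event_day) - daily_return(market, event_day)

# Hypothetical closes; the lawsuit announcement lands on index 3.
stock = [50.00, 50.10, 49.90, 50.24]
index = [1000.0, 1004.0, 1001.0, 1003.0]

ar = abnormal_return(stock, index, 3)
print(f"{ar:+.4f}")  # positive: the stock beat the market on the event day
```

A serious event study would also subtract the stock's expected return from a market model estimated over a pre-event window, but the sign of this simple difference is what the article's anecdotes turn on.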

Here is what Larry McQuillan, director of the Pacific Research Institute, explains in a report titled Wal-Mart Stands Up To Wave Of Lawsuits: “Fighting lawsuits makes the most long-term sense. The trial bar’s strategy against corporate America up to now has been to file a suit and bring the company to the table to get a settlement out of it. Wal-Mart has been a leader in not bowing to those pressures, unlike many companies that are afraid of bad publicity and want to settle. If you don’t defend yourself early on, and be persistent, you will be steamrolled.” Adds Prof. Kathryn Harrigan of Columbia Business School, “I would litigate everything. And if in the end the law made me do something, I’d fight to make sure my competitors had to do it as well. Shareholders shouldn’t be overly concerned about litigation exposure, because it’s a small price to pay.”

This is not the only encouraging sign for shareholders in a seemingly apocalyptic wasteland. On May 9, 2003, 596 pharmacists in Colorado won $45 million in damages against the discount retailer. When trading closed that day, the stock had appreciated 1.43%. On Dec. 22, 2005, the Alameda County Superior Court in Oakland, California slapped a $172 million fine on Walmart for violating a state law. The stock rose 0.23% that day. On Dec. 3, 2009, a Boston court stuck a $40 million bill on Walmart’s front door. Stock price change: a positive 0.89%.

There are other Al Capones too. Courtroom battles in the world of technology are common. Apple Inc. knows that well. It has been involved in many patent infringement cases over the past decade – both as the accused and as the plaintiff. From paying The Beatles $26.5 million and agreeing to stay out of the music industry on December 8, 1991 (until it launched iTunes), to selling faulty MacBook LCD screens and iPads with overheating batteries, it has taken it all well. Rather too well. And its investors are the happiest lot. From the time Steve Jobs returned to Apple in late 1996, the company’s m-cap has increased by 10,039.68% to touch $314.33 billion (as on April 5, 2011). And the rise has happened during a period when it was busy being slammed with court papers by companies like Cisco (on Jan. 10, 2007, for infringing upon and using Cisco’s registered iPhone trademark, a day after Jobs revealed Apple’s new bet, the iPhone; the Apple stock gained 4.07% that day), Nokia (for infringing on Nokia’s patents in virtually all of its mobile phones, portable music players and computers; two complaints, the last on Mar. 29, 2011 – a stock rise of 0.16%), and Xerox (which sued on April 10, 1990, alleging Apple stole Xerox’s GUI technology, which gave birth to Apple’s then-best-selling Macintosh – a stock gain of 3.32%). Apple has not been a silent observer either. Its cases against Nokia, HTC (on March 2, 2010, Apple sued HTC over 20 patent infringements related to the iPhone; HTC fired back by claiming that Apple had violated five of its own patents), Microsoft (a ruling went against Apple in September 1994, in a case where Apple tried to prevent Microsoft and HP from using GUI elements), and many more are proof that litigation is simply part of the larger brand-building process, to be accepted with a spirit of youthful optimism.

Not convinced yet? Here’s the big bite. On Oct. 1, 2010, the US Eastern District of Texas upheld a $625.5 million damages claim against Apple (for violating digi-tech patents held by Mirror Worlds) – the 4th-largest patent verdict in US history & the largest of 2010. It was meant to send the Apple stock plunging. Quite the contrary happened. When markets opened the next week, within two trading sessions the stock gained 3.70% – an m-cap gain of $9.49 billion.

After a long-drawn battle of four years, BlackBerry-maker RIM was forced by a US court to pay $612.5 million on March 3, 2006 to NTP Inc. (one of the earliest patent-holders of wireless email). The sum settled a dispute over RIM’s email service, then available to its 3 million users. The verdict was supposed not just to bring RIM under the scanner of many watchdogs; it was also predicted to put an end to the entire BlackBerry network in the US and raise questions about the company’s future. This is what appeared in an online Fortune article after the verdict: “The price of RIM’s shares was halted at $72.00 at 4:37 pm in anticipation of the announcement. RIM’s stock price soared after shares began trading after-hours, reaching as high as $86.30 in after-hours trade.” RIM’s m-cap had risen by $2.65 billion (19.86%) to touch $15.97 billion when the day ended. If such huge courtroom verdicts were destined to reduce citadels to dust, RIM would be much smaller than it is today. Perhaps gone. The reality is different. Its user base has swollen 1,733% in four years to 55 million, and its m-cap has risen to $28.79 billion.

There seems to be a common belief that involvement in lawsuits will “always” lead to negative returns for shareholders and poor financial results. Untrue. Prof. David Yermack of NYU’s Stern School of Business & Prof. S. Dahiya of Georgetown University, in their paper titled Litigation Exposure, Capital Structure, and Shareholder Value, analysed value creation and destruction in the tobacco industry and concluded that companies have gained in the past by following a strategy of radical litigation. They took the case of Brooke Group CEO Bennett LeBow, who believed that civil suits had positive outcomes. The paper concludes, “Brooke Group had a tiny market share, low margins, high leverage, and highly concentrated management ownership. Beginning in 1996, the firm reached settlements in lawsuits brought by class action plaintiffs and US state governments. These events led to impressive returns for shareholders of Brooke Group.”

Even in the case of shareholder litigation (considered the most vicious litigation type), as Prof. Georgi D. Kaltchev of International University College (Bulgaria) shows, the probability of shareholder wealth falling is low. In his November 2009 paper titled Securities Litigation and Stock Returns, Kaltchev finds that his hypothesis “that shareholder litigation announcements negatively affect stock returns, only finds partial support.” By his count, in more than 67% of cases, wealth is not lost.

If the company involved in litigation adopts a heavy PR, advertising and marketing strategy (the Promotion, Price & Place of the 5Ps, as Walmart did by saving the average American household $2,500, as per Global Insight), allows innovation (Product) to overshadow competition, and targets the right segment (Positioning, like BlackBerry), litigation and court cases will only play in favour of the accused.

For companies that earn their bread and butter in the IT space, lack of innovation and absence of the right positioning are poison. Why have Microsoft and Dell lost tremendous value in the market, even though they were quick to move into new emerging markets? Blame their stalled innovation engines, not litigation. About ten years back (January 3, 2000), Microsoft was the most valuable company in the world, with an m-cap of $466 billion. Then, besides 500-odd court cases, came a series of innovation hiccups. Vista failed, the ‘Courier’ tablet planned for launch in early 2009 was dumped, its entry into handset hardware with the Kin was a disaster, the Zune music player was an out-and-out failure, and its Windows software for smartphones is still scouting for a credible platform.

Of course, its SQL Server has made news, but what’s so innovative about a database server when everyone has already started talking about cloud computing? For Microsoft, the litigation (with no innovations to advertise!) has played against investor sentiment. Litigation at least shows that a company is still trying to test out something new.

That’s good. But when you keep investors guessing forever, you’re in trouble. Microsoft has lost 53.52% of its value since 2000, and Dell & Motorola are no different: from m-caps of $111 billion & $56 billion a decade back, the two are today valued at $27.51 billion & $14.87 billion – reductions of 75.22% & 73.45% respectively.

Pharmaceutical companies have lived with patent infringement lawsuits for decades. Their count increased from 81 in FY2005 to over 243 in FY2010. Over the same period, the generic players (the ones Big Pharma took to court) won 70% of the cases – which translates directly into $60 million in revenues during the first six months for a winning generic player. That gain, after spending $5 million on average on each challenge, sounds like quite a deal. With new drug development pipelines drying up and no new blockbuster in sight until at least 2015, the next three to four years will see many more lawsuits – increasingly, even generics suing generics. In short, the lawsuits will get to you sooner than you thought. Gear up.
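The back-of-envelope economics in that paragraph can be made explicit. Using the article's own round figures (a 70% win rate, roughly $60 million of first-six-months revenue on a win, about $5 million spent per challenge), the expected payoff per challenge works out as:

```python
# Expected value of one generic-maker patent challenge, using the
# article's round figures. Illustrative arithmetic, not audited data.

WIN_RATE = 0.70                  # generics' reported win rate
REVENUE_IF_WIN = 60_000_000      # first-six-months revenue after a win
COST_PER_CHALLENGE = 5_000_000   # average legal spend per challenge

expected_value = WIN_RATE * REVENUE_IF_WIN - COST_PER_CHALLENGE
print(f"${expected_value / 1e6:.0f}M expected per challenge")
```

On these numbers the expected payoff is around $37 million per challenge, which is why the article predicts the lawsuit count will keep climbing.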

There is a joke that does the rounds in America: after the Feds, it’s Walmart that gets the most summonses. It’s true. And consider this: how many people would be surprised to learn that companies in the technology & telecom industry are the ones sued most (with drug makers at #2)? My guess is none. My advice: ride on the opportunities. This time, they come in the name of litigation. If the courtroom-savvy, employee-whipping Walmart can, if the patent-infringing, Xerox-GUI-stealing Apple can, so can any other company. Advertise, innovate, grow, and don’t you worry about litigation. They never could catch even Al Capone on that.

Selasa, 05 April 2011

Now for Metro Areas (which are harder to define)

The ten largest MSAs in 1960 (SMSA for everything except Chicago and New York, which are CMSAs) with current rank in parentheses.


  1. New York (1)
  2. Chicago (3)
  3. Los Angeles (within a whisker of Chicago) (2)
  4. Philadelphia (5)
  5. Detroit (11)
  6. San Francisco (13)
  7. Boston (10)
  8. Pittsburgh (!!)  (22)
  9. St. Louis (18)
  10. Washington (8)
San Francisco is just San Francisco-Oakland--if one added San Jose, it would be in the same position in 1960 and further up the ranks now.  I forgot that Pittsburgh was once a top-ten MSA.  There is obviously a lot more persistence here.

50 Years of Population in the Ten Largest Municipalities in the US

Just for grins, I looked to see how the ten largest cities around the time I was born have changed in terms of rank.  The top ten in 1960 and their current rank:


  1. New York (still # 1)
  2. Chicago (now 3)
  3. Los Angeles (now 2)
  4. Philadelphia (now 5)
  5. Detroit (now 18)
  6. Baltimore (now 21)
  7. Houston (now 4)
  8. Cleveland (now 45)
  9. Washington, DC (now 24)
  10. St. Louis (now 58!)
New York is a bit larger now, and has for more than 50 years been more than twice as large as the second largest city, making the spirit of George Zipf happy. LA and Houston have gained population as well as rank; the other seven all lost.  St. Louis is hemmed in by a boundary that was drawn more than a century ago, but it doesn't lack land--the area north of downtown is basically field and forest.  Cleveland, Baltimore and, of course, Detroit have lots of empty space within their boundaries as well.
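The nod to George Zipf refers to the rank-size rule: the rank-r city should have roughly 1/r the population of the largest city. A quick sketch against approximate (rounded) 1960 census counts shows how the top of the 1960 list fit the rule:

```python
# Zipf's rank-size rule: predicted population of the rank-r city is the
# largest city's population divided by r. The 1960 counts below are
# rounded approximations, for illustration only.

def zipf_prediction(largest_pop, rank):
    """Population Zipf's law predicts for a city of the given rank."""
    return largest_pop / rank

pops_1960 = {
    1: 7_782_000,  # New York
    2: 3_550_000,  # Chicago
    3: 2_479_000,  # Los Angeles
    4: 2_003_000,  # Philadelphia
    5: 1_670_000,  # Detroit
}

for rank, actual in pops_1960.items():
    ratio = actual / zipf_prediction(pops_1960[1], rank)
    print(rank, f"actual/predicted = {ratio:.2f}")  # near 1.0 is a good fit
```

The ratios for ranks 2 and 3 come out a bit below 1.0, which is exactly the point above: New York has long been more than twice the size of the runner-up, slightly overshooting the strict Zipf prediction.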

For symmetry, let's look at where the current top ten were in 1960.

  1. New York (1)
  2. Los Angeles (3)
  3. Chicago (2)
  4. Houston (7)
  5. Philadelphia (5) (note: the top 5 were all in the top 7 in 1960).
  6. Phoenix (29)
  7. San Antonio (17)
  8. San Diego (18)
  9. Dallas (13)
  10. San Jose (57)
The cities in the second five moved pretty dramatically.  They all had lots of available land in 1960 and are all, of course, sunbelt (although I suppose one could argue that northern California is not sunbelt).


  

Ryan Avent doesn't believe 2.8 percent unemployment is a reasonable possibility

Ryan Avent on the assumptions underlying the Ryan plan:


http://www.economist.com/blogs/freeexchange/2011/04/facts_and_figures

A couple of other points: Ryan seems to think that the two basic problems facing the country are: (1) too many defined benefits and (2) too much equality.

Sabtu, 02 April 2011

Dalton Conley Talks about Intergenerational Wealth--and it is not pretty


Dalton gave a very nice talk at USC on Friday, presenting his Center for American Progress paper 

Wealth Mobility and Volatility in Black and White.  The CAP page on the paper summarizes the findings:


  • What family an individual comes from explains about three-quarters of where they end up in the wealth distribution as adults. For African Americans, however, the impact of family background is substantially lower, at 37 percent.
  • Individuals are more likely to maintain wealth than to attain wealth, or more precisely, low-wealth children are unlikely to become high-wealth adults, while high-wealth children are very likely to be high-wealth adults. Looking at previous years’ data, less than 10 percent of children who grew up in families in the bottom wealth quartile, which had a maximal cut off of about $8,000 in 1984, reached high wealth levels by adulthood between 1999 and 2003 (when the top group’s minimal value was $82,501 and the median was over $189,000). And over 55 percent of children who grew up in families in the top wealth quartile—over $155,000 of net worth back in 1984—held on to their high wealth levels by adulthood.
  • The strongest predictor of an adult’s relative wealth status is his or her income, which in turn is highly predicated on his or her parents’ income and wealth.
  • Wealthy white children are much more likely to become wealthy adults than wealthy African-American children: Over 55 percent of all white children raised by parents in the top wealth quartile hold onto the top wealth position as adults. This is contrasted to only the 37 percent of African-American children raised by parents in the top wealth quartile who hold onto the top wealth position as adults.

Jumat, 01 April 2011

The Five Most Colossal Tech Industry Failures You've Never Heard Of

The tech industry is famous for forgetting its own history.  We're so focused on what's next that we often forget what came before.  Sometimes that's useful, because we're not held back by old assumptions.  But sometimes it's harmful, when we repeat over and over and over and over the mistakes that have already been made by previous generations of innovators.

In the spirit of preventing those repeated failures, I spent time researching some of the biggest, but most forgotten, failures in technology history.  I was shocked by how much we've forgotten -- and by how much we can learn from our own past.


5. Atari Suitmaster 5200

Video console manufacturer Atari was notorious for its boom-and-bust growth in the 1980s.  The company's best-known failure was probably the game cartridge E.T. the Extra-Terrestrial, which Atari over-ordered massively in anticipation of hot Christmas sales that never materialized.  Legend says that truckloads of E.T. cartridges were secretly crushed and buried in a New Mexico landfill.

What's much less well known is that Atari was also involved in the creation of an early motion-controller for home videogames, a predecessor of Microsoft's Kinect.  Since video detection technology was not sufficiently advanced at the time, the Suitmaster motion controller consisted of a bodysuit with 38 relays sewn into the lining at the joints, plus 20 mercury switches for sensing changes in position.  The suit was to be bundled with the home cartridge version of Krull, a videogame based on the science fiction movie of the same name.

A massive copromotion was arranged with the producers of Krull, and Atari made a huge advance purchase of Suitmaster bodysuits and cartridges.  Unfortunately, development was rushed, and late testing revealed two difficulties.  The first was that the suit's electromechanical components consumed about 200 watts of power, much of which was dissipated as heat.  That may not sound like much, but imagine jamming two incandescent light bulbs under your armpits and you'll get the picture.  There were also allegedly several unfortunate incidents involving mercury leaks from broken switches, but the resulting lawsuits were settled out of court and the records were sealed, so the reports cannot be verified.

The Christmas promotion was canceled, but Atari didn't give up on the Suitmaster immediately.  The next year, it was repurposed as a coin-op game accessory, allowing the user to control a game of Dig Dug through gestures.  Sadly, Atari's rushed development caught up with it again.  Due to a programming error in the port to Dig Dug, under certain obscure circumstances when Dig Dug got flamed by a Fygar the suit would electrocute the player.  (The bug was discovered by an arcade operator trying out the game after hours, in what is now memorialized in coin-op gaming circles as The Paramus Incident).  That was the last straw for Atari's corporate parent, Warner Communications.  To limit its potential liability if a Suitmaster were to fall into public hands, Warner arranged to have the entire inventory chopped up and mixed into concrete poured into a sub-basement of the Sears Tower in Chicago, which was then undergoing renovation.   A small bronze plaque in the third sub-basement of the Sears Tower is the Suitmaster's only memorial:


 

4. eSocialSite.com

Before Facebook, before MySpace, even before Friendster, the most successful social networking site on the web was eSocial.  Largely forgotten today, eSocial thrived in the late 1990s as usage of web browsers took off on PCs.  By 1998, it had reached more than 50 million users worldwide, an unheard-of success at the time.  Its Series A fundraising in 1999 raised more than $132 million from a consortium of VCs led by Sequoia Capital.  Many people still cite eSocial's Super Bowl ad in January 2000, which featured a singing yak puppet, as a classic of the dot-com bubble era.  When the company went IPO in February 2000, its stock price made it the 23rd most valuable company in North America.

Unfortunately, just two months later, it was revealed that 99.999974% of eSocial's registered users were fake people simulated algorithmically by a rogue eSocial programmer.  The other 13 were middle school students from Connecticut who were technically too young to sign up for the service.  eSocial was sued for allowing underage users, which delayed critical service upgrades for several months.  By the time the litigation was resolved, Friendster had seized the initiative, and eSocial was quickly forgotten.

eSocial found a second life overseas, though, and today it is still the leading social site in several former Soviet republics in Central Asia.  The founders of eSocial have long since left the company, and today are active in Wikidoctor.org, a promising new site that enables people to crowdsource the diagnosis of diseases and other chronic health problems.
   

3. The cardboard aeroplane

It's an unfortunate fact that wartime is a great stimulator of innovation.  Desperation leads countries to try all sorts of crazy ideas.  The successful ones become famous, while the failures are usually forgotten.   For example, you don't hear much today about Britain's World War II plan to turn icebergs into aircraft carriers (link).

Even more obscure was the effort to create an aircraft from cardboard.  One of the greatest bottlenecks in aircraft construction during the war was the shortage of aluminum feedstock.  Britain could not expand aluminum production quickly enough to meet its needs, so it attempted to substitute the output from the Empire's massive Canadian paper mills.  The idea of a cardboard airplane sounds crazy at first, but cardboard can be incredibly rigid in some directions (as you've found if you've ever tried to smash a box for recycling).  Through the proper use of corrugation in multiple directions, the British found that they could create a material with the same tensile characteristics as aluminum, with only slightly greater weight.

Early flight tests of the cardboard aircraft were not encouraging, as the first two test planes broke up suddenly in mid-flight.  Subsequent investigation revealed that water was infiltrating the corrugations, and then freezing when the plane reached altitude.  The expansion of the ice caused the cardboard to delaminate, resulting in failure of the airframe.

But the engineers persevered, sealing the cardboard with paraffin wax to waterproof it.  These new models successfully completed flight tests in the UK, and were demonstrated for Winston Churchill in 1943, who endorsed them enthusiastically. 

The new aircraft were deployed to North Africa, where another unfortunate problem appeared: the paraffin melted in the desert heat, causing the planes to wilt on the tarmac.  Needless to say, this limited their effectiveness.  The British engineers persevered, eventually creating a new waterproofing scheme utilizing used cooking oil.  This not only waterproofed the planes, but also made them smell like fish & chips, a definite plus for homesick British airmen.  Unfortunately, wartime supplies of cooking oil in Britain were limited, and by the time alternate supplies could be imported from the American South, the war was nearly over.

The cardboard airplane disappeared into history, but its spirit lives on (link).


2. The microwave hairdryer


The 1950s and 1960s were the golden age of innovation in electronics.  Companies like HP, Varian, and Raytheon created amazing new devices, often adapted from wartime technologies.  One example was the microwave oven, which was derived from radar.

But microwaves were once used for a lot more than cooking food.  My dad worked in the electronics industry at the time, and he often told me stories about the remarkable new product ideas he worked on.  One was the microwave hairdryer.

Today we're frightened of microwaves because they're "radiation," and that's assumed to be bad.  But in the 1960s people understood that microwaves had nothing to do with nuclear radiation.  They were just another tool that you could use to get things done, like arsenic or high voltage electronics.  Engineers at my dad's employer (which he asked me not to name) were looking for new ways to use microwaves to solve everyday problems.  Someone noted the number of hours women spent under rigid-hood hairdryers, used to finish the elaborate hairdos that were prevalent in the 1960s, and realized that a microwave hairdrying helmet could do the same job in just 45 seconds -- creating a massive increase in national productivity.

Unfortunately, the microwave hairdryer ran into a series of technical problems.  The first was that the microwaves caused metal bobby pins and hair clips to arc, which frightened customers and gave their hair an unattractive burned smell.  That was solved by substituting plastic clips.  The second problem was that the microwave frequency that couples best with wet hair is very close to the frequency that couples best with blood plasma.  This required some precise adjustments to the three-foot-long Klystron tubes that powered the hairdryers.  If they were jostled there was a very slight risk of causing the client's blood to boil (although this never actually happened in practice).

The technical problems were eventually resolved, but the death knell to the microwave hairdryer was something no engineer could fix: a sudden change in hairstyles in the late 1960s.  The move toward long straight hair, frequently unwashed among younger people, caused a collapse in the hairdryer market, from which it has never recovered. 

There was an abortive attempt to create a microwave blow dryer in the 1970s, but it was pulled from the market when it caused LED watches to burst into flame.


1. Apple Gravenstein


During the Dark Years when Steve Jobs was away, a rudderless and confused Apple Computer churned out a long series of failed initiatives.  Their names echo faintly in tech industry history:  CyberDog, Taligent, Kaleida, OpenDoc, HyperCard, Pippin, eWorld, eMate, A/UX, the 20th Anniversary Macintosh, Macintosh Portable, QuickTake, the G4 Cube (oh, wait, Steve did that one), Newton, and so on.

But the most catastrophic failure was the one Apple worked hardest to hush up, the project called Gravenstein.  Simply put, Gravenstein was Apple's secret project to produce an electric automobile.

In the late 1980s, Apple was growing like a weed, but the driver of its growth was the Macintosh product line initiated under Steve Jobs.  John Sculley and the rest of Apple's senior management team were concerned with securing their historical legacy by doing something completely different.  Sculley, noting the chaos caused in the world economy by the oil embargo of the 1970s, chose to focus on the creation of an all-electric car.  Michael Spindler, ironically nicknamed "Diesel," was chosen to manage the production of the vehicle.  Bob Brunner drove the overall design, but Jean-Louis Gassee was asked to do the interior, on account of he's French and has good taste.

Apple used its Cray supercomputer to craft a unique teardrop aerodynamic shape for the car.  Apple purchased all the needed parts, and planned to begin production in its Fremont, California factory.  To prepare the market for the car, Sculley started working automobile references into Apple's advertising.  The most famous of these was the "Helocar" advertisement (link).   If you watch the ad closely, you can see actual diagrams of the Gravenstein's design and aerodynamic shape, although of course the first version of the car was not intended to fly.

Unfortunately, the public response to the Helocar ad was so overwhelmingly negative that it frightened Apple's Board of Directors.  Sculley was ordered to scrap the Gravenstein project, and all documents related to it were shredded and then burned.  Although Gravenstein never came to market, its legacy affected Apple's products for decades to come.  The Macintosh Portable, for example, used bulky lead-acid batteries that were originally intended to power Gravenstein.  And many years later, Jonathan Ive reused the Helocar's aerodynamic shape in the design of the original iMac.




Those are my five top little-known tech failures of all time.  What are yours?  There are many other candidates.  Honorable mentions should include Leonardo da Vinci's steam-powered snail killer, Thomas Alva Edison's notorious electric bunion trimmer, spitr.com, and of course Microsoft Bob.

You can draw many lessons from these failures, but to me the most important lesson of all is that you can't trust blog posts written on this particular date.

Posted April 1, 2011