Sunday, July 11, 2010

From an old polymath to the new generation

Vinnie asked for a review of his book, "The New Polymath."

Though I've already posted this, that was before I obtained a hard copy.

Less Important

Now that I've been through it once, slowly, not quite looking for typos but inevitably finding them, I'll discuss one of the chapters. But first, the proofreading. Back in March 2010, I reviewed a late draft of The New Polymath and offered a few corrective suggestions. It looks like many, though not all, made it into the first edition. The ones that didn't make it will remain between Vinnie and me. I'll offer up a couple of examples, though: one that was partly fixed, and one that I missed (as did every other editor, apparently).

  • "a good a" should be "as good". [page 43]
  • "used in all kinds of games" was changed to "used many games" [page 139]
Almost no one will register the first one, as the brain typically processes tiny out-of-place words in the way the writer intended; likewise with the second.

More Important

I took much more time reading the hard copy than I did the electronic proofs: the earlier spell- and error-checking pass was under a deadline, and I didn't need to absorb the full meaning behind the paragraphs, merely check that each sentence made sense. Still, I filed away a number of mental threads that I might later use for a book review. In the meantime, numerous others have chalked up their thoughts (Vinnie has a bookmark page for the "mentions"). After letting my impressions digest for a few days, I decided to focus on one of the multitude of innovations, innovators, and concepts in the book. If I tried to cover more, I'd spend weeks collecting and expressing my thoughts.

The "80/20" vs. the "20/80"

In the introduction to Chapter 9 ("Arsonists: And Other Disruptors"), Vinnie presages the conversation with Bill Hambrecht by quoting his (Bill's, not Vinnie's, I gather) mantra "80 percent of the value for 20 percent of the price." He drops the same phrase in again a couple of pages later, just before "Learning from the Automobile Service Model," with a final sound bite in the last part of Chapter 9, discussing Verizon, AT&T, and other telcos (telephone companies, in case you're not into the industry jargon).

Chapter 10 kicks off with the question "How do you deliver 80% of the value at 20% of the price?", repeating the prelude woven into the prior chapter. What's missing from that narrative, though, is a deeper dive into where the "80/20" comes from in the first place, how it compares to a different slant, the "20/80", and, more critically, what it could mean to business executives. I think many will understand the components that are left out, just as they may understand sentences with lost or jumbled words, but I think it's necessary to riff on both ideas. After all, to me, Vinnie's book serves as a jumping-off point for many more discussions (not to mention he's given me that facilitator role in the book's epilogue).

Who was Pareto, and why didn't he make it into The New Polymath?

Those who took statistics in school (and remember what they learned), or who have applied statistics in the business world, should have heard of the Pareto Principle (the Peter Principle is somewhat related, but not what we're talking about here). That's where the Italian economist Vilfredo Pareto observed that 80% of the property (land) was owned by 20% of the population. I'm not going to debate the propriety of wealth distribution, or similar issues such as pollution sources or energy consumption, merely add this to the background of the book. Vinnie's world is full of examples where fixing 20% of the (number of) problems makes 80% of the (measured total) pain go away. And in my world, focusing on the 20% longest-running transactions or batch jobs has a tremendous impact on total system health, though everyone knows those 20% never fully go away; they're succeeded by a new generation later.

Here is another example from my work, plotting the number of objects in a database against their size:

That's the shape of a Pareto curve. The implication is that one should focus on the first 20%, or the biggest 20%, because that's a more efficient use of your analysis time.
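The concentration that a chart like that shows can be checked with a few lines of code. Here is a minimal sketch; the object sizes below are invented for illustration (the original plot used real database object counts and sizes), but the skewed shape is typical of what a DBA sees in practice.

```python
# Sketch of a Pareto-style concentration check: what share of the
# total size is held by the largest 20% of objects?

def top_share(sizes, fraction=0.20):
    """Return the share of total size held by the largest `fraction`
    of objects (e.g. 0.20 for the 'top 20%')."""
    ordered = sorted(sizes, reverse=True)
    k = max(1, int(len(ordered) * fraction))  # how many objects are "the top 20%"
    return sum(ordered[:k]) / sum(ordered)

# Synthetic, heavily skewed sizes (in MB): a few huge tables, many small ones.
sizes = [5000, 1200, 800, 90, 60, 40, 30, 20, 10, 5]

share = top_share(sizes)  # top 2 of 10 objects
print(f"top 20% of objects hold {share:.0%} of total size")
# prints "top 20% of objects hold 85% of total size"
```

With this made-up data, two objects out of ten account for roughly 85% of the total footprint, which is exactly the kind of curve that tells you where to spend your analysis time first.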

Vinnie and Bill Hambrecht look at the same ratio for a different purpose. Suppose we're plotting software price against features (or service instead of software, if you will). The larger, more expensive enterprise software packages that do it all cost the most. The smaller, off-the-shelf packages have maybe 80% of the features, yet may cost a fifth of the price. Is that a realistic hypothesis? Are companies willing to give up the promise of do-it-all software for a huge reduction in expenditure? Vinnie seems to think so. And to be fair, I think that's where Vinnie can add value to the conversation: weighing the risks of what is being dropped against the benefits of freeing all that cash flow.

As always when writing on my own time, these ideas are my own and not my employer's. I've used a couple of examples from work, but that doesn't mean I'm talking about my work, or anyone else's, specifically.

Off with their heads

In order to toss out overpriced, too-large software, companies would need to be prepared for the massive uncertainty of either going it alone, or going with the "little engine that could." That kind of disruptive risk requires a lot of nerve. Simply looking at easily replaceable commodities or services (say, printers or copiers) can be tackled with a tangible risk level. Bypassing years-long enterprise architecture plans for the latest of 41 flavors is another matter. Suppose one had put a plan in place to replace proprietary ABAP code with more widely known Java code, only to find the underlying software platform wasn't going to support future versions of Java, probably because Oracle now owns Sun Microsystems, implying an uncertain future for Java? That sort of explains why there are still markets for mainframes and COBOL programmers, right?

I'd hypothesize that identifying the 20% largest costs should be a no-brainer for any finance person with access to the right queries. Identifying the best candidates to replace those functions with a stable platform that is 80% as reliable (just-good-enough delivery) and expected to survive past the next contract renewal will be worthy of a John Grisham thriller one day.
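The "right query" part really is a no-brainer: rank the line items by spend and take the top fifth. A minimal sketch, with line items and dollar figures entirely invented for illustration:

```python
# Sketch of the finance-side query: given cost line items, list the
# top 20% by spend and the share of total they represent.

def biggest_costs(items, fraction=0.20):
    """items: dict of {line_item: annual_cost}.
    Returns (top_items, share_of_total_spend)."""
    ranked = sorted(items.items(), key=lambda kv: kv[1], reverse=True)
    k = max(1, int(len(ranked) * fraction))
    top = ranked[:k]
    share = sum(cost for _, cost in top) / sum(items.values())
    return top, share

# Invented annual costs for a hypothetical IT budget.
costs = {
    "ERP licenses":     900_000,
    "ERP maintenance":  450_000,
    "CRM suite":        200_000,
    "Middleware":        80_000,
    "Printers/copiers":  40_000,
    "Backup tooling":    25_000,
    "Misc. utilities":   15_000,
    "Training":          10_000,
    "Office software":    8_000,
    "Legacy reports":     5_000,
}

top, share = biggest_costs(costs)
for name, cost in top:
    print(f"{name}: ${cost:,}")
print(f"-- {share:.0%} of total spend sits in the top 20% of line items")
```

With these made-up numbers, two line items out of ten hold roughly 78% of the spend. Finding the candidates is the easy half; the thriller is in deciding which of those functions can survive on an 80%-as-reliable replacement.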

Then there's the flip side, which Vinnie alludes to: locking in with today's lowest-cost vendor could lead to higher and higher prices later, once the critical mass of platform-specific customizations has been done. I don't really have an answer for that, other than that people will remember your promises the next time you try to come back to the well to get the now-entrenched monkey off your back.


I'll stop here with a joke reference. Be sure to read the Discussion tab...

The Ninety-Ninety rule

