4 Reasons Naysayers Win IT Battles, Not Wars

Posted: 4/23/2012

Want to predict the best storage innovation coming down the pipe? Take a look at what traditional IT is saying won't work.

By: Mark Peters

For all its obvious scientific brilliance, frequent technical advances, and near-magical capabilities, it often seems as if the enterprise IT world is populated by obstinate naysayers rather than opportunity seekers. Put forward some new discovery or approach and you can almost guarantee a chorus of "yes, but ..." This phrase can be completed in multiple ways, linked to money, desire, suitability for purpose, etc., but they all amount to "no," or at least "not now," "not for us," and so on. It's almost a rite of passage for any successful IT product or platform: if it isn't initially derided, denigrated, and doubted, it probably has no chance of achieving dominance.

Let's look at a few examples, and try to figure out why the tech world's knee-jerk reaction is to explain why something won't work, and, in particular, why storage changes can seem to take such a long time. On the one hand this "reluctant embrace of progress" (as some might call it) is far more prevalent than you might think, tempting me to simply quote J.K. Galbraith and be done with this. As he pointed out, "faced with the choice of changing one's mind and proving that there is no need to do so, almost everybody gets busy on the proof." But, on the other hand, the "cautiously pragmatic adoption of the new" (as others would see it) is also far more logical than you might think.

The longer history of technology shows that what I'm about to describe is not new. For instance, Chester Carlson spent years trying to get someone to buy into his idea, a process we now know as xerography. But--hey--who would ever really want near-instant copies of documents?! Closer to home, we all know that Thomas Watson (of IBM fame) reputedly thought there might be a total world market of five mainframe computers! (And you get bonus points for remembering the name of the person who challenged the notion of ever needing a computing device at home.)

Even more recently, the Internet grew from a pragmatic need for access across university systems, not from someone imagining a need for email or recognizing that online shopping might be the next great idea. Why on earth would you want computers to talk to each other?

Perhaps this is our first clue: The real power and potential of something new is not always seen at the outset. Cars and telephones were merely cool toys at first, and they needed a whole infrastructure to realize their potential and achieve widespread popularity. The same is essentially true of the Web--it was something of a chicken-and-egg situation, constrained by access speeds and network costs, as well as the limited number of websites. But its value was--at first gradually, then rapidly--seen.

Whole new approaches to business took shape. The Internet wasn't so much invented per se as it coalesced and occurred; no focus groups had been sitting around demanding an Internet, or a personal computer, or an automated tape library, or an iWhatever, or Google, or email, or solid-state storage, or even deduplication. The value of such things wasn't always in fulfilling known or well-expressed needs, but in teasing out latent needs and then fueling demand.

Fine, but with so much history of successful innovation, why do storage and IT, of all worlds, remain so reluctant to embrace the new? It seems to me that there are a number of factors at play.

1. "Safety first" is, understandably, the mantra of most IT shops of any size. Few IT managers are actively rewarded for embracing the new (although that's beginning to change as IT departments begin to use incentives and MBOs to encourage the use of things like virtualization and cloud services), whereas plenty of IT managers will be chastised if anything goes wrong. This simply means that ultra-conservatism is inbred; and interoperability, certification, application-testing, and support are certainly not things to be sniffed at when you're running, say, an airline, manufacturing, or telecom system.

2. Often only one or two vendors will bring forth a particular type of technical advance. This means that the natural competitive response from the rest of the vendor ecosystem is to point to all the negatives (whether real or simply FUD) of the new thing; whether this is because they don't have an equivalent offering or if they're merely trying to buy time until they do, the effect is the same: The volume of negativity about new tools and methods gets amplified beyond what might be expected.

3. Another common issue is that small and midsize businesses and enterprise users will see something as OK for consumers--and/or maybe their "propeller-head" departments--but not for "real" computing. Unfortunately for the naysayers, however, the real value demonstrated by these rogue users shows what's actually possible; the proverbial cat is out of the bag. It's why, for instance, developers are using their corporate credit cards to access cloud systems, and in doing so they're applying pressure on traditional IT departments to adopt similarly flexible and economical solutions.

To give another example, many years ago, when Yahoo's instant messaging (IM) first launched, it was such an amateur solution that system maintenance was scheduled between 9 a.m. and 5 p.m., on the assumption that no one would be using it then. Yet IM is now a mission-critical app for some operations (including some trading floors). The consumer versions of things get proven and drive change in IT. How often, for example, do you have real problems with your personal email system--Hotmail or Gmail perhaps--compared to the corporate Exchange system?

4. Very often the prima facie attraction of something is not what makes it successful in the long term. Thus, for example, deduplication achieved success once it was understood that you could get backups done in time if you used it, not simply because of the superficial "tape sucks" messaging that was semantically attractive.

Also, very often there's a driving need to improve the economic model to meet massive, growing needs. For instance, VMware was pretty much a localized departmental solution for engineers and scientists until its value in overall consolidation was seen, which equated to better utilization and a powerful economic motivation. Solid-state adoption and the various cloud topologies are essentially similar--their underlying value, despite everything, is economic.

Let's now bring the discussion squarely back to modern data centers and data storage. The idea of using things that are now seen as standard and perfectly normal, such as SATA drives or iSCSI, in data centers (for real processing) was greeted just a few years ago with near-unanimous doubt. Equally, such servers were seen as fine for departmental use and the loony fringes, but not to be taken too seriously: "OMG, next you'll be suggesting that we run data centers on Windows!"--unthinkable at the time.

When the latest era of NAND-flash-based solid-state implementations arrived on the scene a little over four years ago, it was greeted by a universal round of "why it won't work" from all the vendors that didn't then have an offering. Since then, everyone has jumped on the bandwagon. Now the aspects of flash that were attacked--such as reliability--are actually beginning to be seen as advantages of the technology. Whether that's a matter of product progress, user experience, or perhaps just the lack of continued attacks doesn't much matter. Today the "no" in the flash world centers more on the suitability of multi-level cell (MLC) media for enterprise use. Guess what: it's cheaper, so we'll find a way to make it work.

So, overall, if you want an idea of where we might be heading in data centers, take a good look at what seems logical but is getting the negative shrug, as it's perfectly likely to be embraced eventually. Admittedly, that can take a long time; look at ILM. Its promise is only now beginning to be properly realized, yet it is usually still discussed in hushed tones because the promise was made so long ago.

Other things happen faster. The rush to the cloud is a perfect example, although, even there, it's been the adventurous blazing the trails--those who blend the challenges of consumerization and conservatism to cross an IT chasm. (Of course, many uses of the cloud are also easy to do pretty quickly, as they are not trapped in the budget and lease cycles of regular IT.)

As is typical, the silent "no" majority will likely follow the vociferous "yes" minority, claiming the idea was both theirs and entirely sensible all along. They were, of course, just waiting for the kinks to be ironed out first. 