It's undeniable that the entire technology industry is shifting to cloud computing. Just as the '80s was the era of the PC, and the '90s (and '00s, too) was the era of the Web, it's inevitable that the '10s will be the era of cloud computing.
Endless words have been written about the technology underlying cloud computing. A number of orchestration products joust, each described by its company as the most complete, best-performing product on the market. We've seen hybrid cloud products released by every vendor from Borneo to Nome, every one nonpareil in tying together distributed orchestration products. We've even seen IBM describe its mainframe products as "truly a cloud" because mainframes, well...compute, I guess.
Of course, this is understandable. Every cloud product is associated with a vendor, and every vendor has to make its numbers. If the technology trend a la mode is cloud, well, then every vendor needs to look au courant. In addition, IT groups love new technology; after all, that's what they specialize in, and every new trend and product that comes down the pike is a new chance to build expertise and cement their position as the technology nomenklatura.
But cloud computing is a curious phenomenon, because much of the uptake is by groups that traditionally didn't drive adoption or even get much involved in infrastructure decisions: application groups and software developers. They've embraced cloud computing, particularly Amazon Web Services, with gusto, so much so that central IT has developed new terminology to describe it: shadow IT, or, even more witheringly, rogue IT. Anyone who has examined AWS (as I did) can see that it's experiencing enormous growth.
Vendor and IT organization embrace of cloud computing is understandable. But why have end users so assiduously adopted it? After all, throughout most of IT history, application groups stood aloof from infrastructure involvement, seeing it as nothing more than plumbing managed by specialists. What's driving "shadow" IT?
Less is More: As Commodities Get Cheaper, Consumers Stock Up
This change in behavior is rooted in economics and can be understood as the interaction of the theories of two giants of economic thought: William Stanley Jevons and Ronald Coase.
Jevons was a Victorian-era economist who developed theories about marginal value. More specifically, he studied the then-unsettled question about whether a lower price for a commodity would motivate people to shift spending to other commodities. In other words, would they continue to consume the same amount of the commodity and use the savings for other purposes?
Jevons' test case was coal. As the use of coal became more efficient (meaning the same amount of work required less coal), overall coal use rose. This went against common sense, which said that if the price dropped, people would have more money to spend on something else. (This was so counterintuitive that it came to be known as Jevons Paradox.) Far from using less coal to perform the tasks associated with its traditional uses, people found many new tasks that beforehand could not be cost-justified but were now economic, given coal's lower effective price.
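The arithmetic behind the paradox can be sketched with a simple demand model. This is a hypothetical illustration, not Jevons' own figures: it assumes a constant-elasticity demand curve and made-up parameter values, but it shows why total consumption rises whenever demand is elastic enough (elasticity greater than 1).

```python
# Hypothetical sketch of Jevons Paradox using a constant-elasticity
# demand curve: tasks_demanded = k * (cost_per_task ** -elasticity).
# All parameter values here are illustrative assumptions.

def coal_consumed(efficiency, elasticity=1.5, coal_price=1.0, k=100.0):
    """Total coal used when each task needs 1/efficiency units of coal."""
    cost_per_task = coal_price / efficiency     # efficiency gain lowers the cost of a task
    tasks = k * cost_per_task ** -elasticity    # cheaper tasks -> more tasks demanded
    return tasks / efficiency                   # coal needed per task falls too

before = coal_consumed(efficiency=1.0)  # baseline: 100 tasks, 100 units of coal
after = coal_consumed(efficiency=2.0)   # cost per task halves; demand nearly triples

# With elasticity > 1, total coal use *rises* despite the efficiency gain:
# demand grows faster than per-task coal use shrinks.
```

Doubling efficiency halves the cost per task, but with an elasticity of 1.5 the number of tasks people want grows by a factor of about 2.8, so net coal use ends up roughly 41% higher than before the improvement.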