Short, a research scientist at UC San Diego, said that as the capacity of servers used to process the explosion of data increases, there are "unprecedented challenges and opportunities for corporate information officers."
For example, the study pointed to a sharp increase in the use of server virtualization technology beginning in 2006, as well as the more recent use of cloud computing systems, in which server-processing power is provided as a centrally administered commodity doled out on a pay-as-needed basis.
The scientists focused their analysis on server performance per dollar rather than raw processing power. They said this calculation offered "a more consistent yardstick" given the wide variety of servers used by enterprises.
For example, during the five years prior to 2008, new-server performance went up five- to eight-fold.
"While midrange servers doubled their Web processing and business application workloads every two years, they doubled their performance per dollar every 1.5 years," Bohn said.
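As a quick sanity check (not from the report itself), the doubling times Bohn cites are arithmetically consistent with the five- to eight-fold gain mentioned above: doubling every two years over five years yields roughly a 5.7-fold increase, while doubling every 1.5 years yields roughly a 10-fold increase.

```python
# Illustrative arithmetic only; the doubling times and the
# five-year window are figures reported in the article.

def growth_factor(years: float, doubling_time: float) -> float:
    """Total growth after `years` given a fixed doubling time."""
    return 2 ** (years / doubling_time)

# Workloads doubling every 2 years over the 5 years before 2008
print(round(growth_factor(5, 2.0), 1))   # ~5.7x, within the 5- to 8-fold range
# Performance per dollar doubling every 1.5 years
print(round(growth_factor(5, 1.5), 1))   # ~10.1x
```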
In 2008, not surprisingly, entry-level servers -- those that cost less than $25,000 -- processed about 65% of the world's information, while midrange servers processed 30%, and high-end servers costing $500,000 or more processed just 5%, according to the 36-page report.
The report also stated that total worldwide sales of all servers have remained stable at about $50 billion to $55 billion per year for the five years ending in 2008.