Technology market research firms are predicting huge growth for cloud computing in the very near future. Gartner, the information technology research and advisory firm, predicts that the bulk of IT spending by 2016 will be for cloud computing platforms and applications, with nearly half of large enterprises having cloud deployments by the end of 2017.
Forrester, another leading technology market research company, forecasts similar growth for cloud in the Asia-Pacific region, predicting that the region's cloud computing market will grow from $6.9 billion in 2013 to $31.9 billion in 2020. Justifiably, there is a lot of hype and excitement about the prospects of cloud computing: its return on investment (ROI) and the efficiencies and flexibility it will offer IT departments and business units.
At the same time, the explosion of mobile devices is forcing dramatic changes in the way network infrastructures scale. Amid this hype, however, it is imperative to consider how the cloud will affect the way networks are conceived, designed, and secured. The cloud era means a dramatic increase in the sheer volume of data passing into and out of networks at any given moment, a growing share of it critical to business operations. Content security in particular faces novel challenges as mobility and Web 2.0 applications make web traffic volumes highly unpredictable, with massive spikes and troughs from one moment to the next.
The traditional approach to such concerns has been to overprovision compute resources for content filtering, which raises costs and reduces efficiency while still introducing unacceptably high latency and eroding network performance. Emerging solutions take a different approach: they transfer content-filtering tasks to cloud-hosted services, where massive resources can be dynamically re-provisioned as needed to handle any volume of web traffic without bogging down the network.
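The offload pattern described above can be sketched in a few lines. This is a minimal illustration only: `CloudFilterClient`, `EdgeGateway`, and the blocklist are hypothetical names invented for this sketch, not any vendor's actual API. The idea is simply that the local device keeps a thin verdict cache while the heavy scanning lives in an elastically scaled cloud service.

```python
# Hypothetical sketch of cloud-offloaded content filtering.
# CloudFilterClient stands in for a call to a cloud-hosted filtering
# service; in a real deployment this would be a network request to an
# elastically provisioned backend rather than a local lookup.
from urllib.parse import urlparse


class CloudFilterClient:
    """Placeholder for the cloud-hosted scanning service (assumption)."""

    BLOCKED_DOMAINS = {"malware.example", "phishing.example"}

    def check(self, url: str) -> str:
        host = urlparse(url).hostname or ""
        return "block" if host in self.BLOCKED_DOMAINS else "allow"


class EdgeGateway:
    """Local appliance: keeps only a small verdict cache, no heavy scanning."""

    def __init__(self, filter_client: CloudFilterClient):
        self.filter_client = filter_client
        self.cache: dict[str, str] = {}  # cache verdicts to cut round-trips

    def handle_request(self, url: str) -> str:
        if url not in self.cache:
            self.cache[url] = self.filter_client.check(url)
        return self.cache[url]


gateway = EdgeGateway(CloudFilterClient())
print(gateway.handle_request("https://intranet.example/report"))   # allow
print(gateway.handle_request("https://malware.example/payload"))   # block
```

Because policy evaluation happens in the cloud, the on-premises gateway can stay small and cheap while the scanning tier scales with traffic.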
Ensuring security under high traffic volumes
To maintain network security, content-filtering solutions must examine all the business-critical traffic that cloud and mobile use creates. This can lead to a kind of arms race: to maintain acceptable network performance, organizations must purchase ever larger appliances and provision ever greater resources for content filtering. On top of this legitimate, business-critical traffic, the volume of non-productive traffic is also growing exponentially.
Traditional Unified Threat Management (UTM) solutions are designed to bring a variety of security functions, including web content filtering, into a single appliance. To do their job without compromising network performance, they must support massive throughput. More important, they must dedicate massive computing power to scanning content and enforcing policies, which makes them increasingly costly and less efficient given the dedicated nature of that processing power. It also means that as new threat profiles emerge, organizations are often required to upgrade their UTM hardware at significant cost. Finally, a UTM tends to bog down network traffic when its content-filtering queue grows heavy; it can interfere with connectivity to business-critical resources and applications hosted in the cloud, which can result in a significant loss of productivity or interruption of business.
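Why a fixed-capacity scanning queue "bogs down" traffic can be made concrete with a textbook queueing estimate. The M/M/1 model and the throughput figure below are assumptions introduced for illustration, not numbers from the article; the point is only that waiting time grows sharply, not linearly, as load approaches an appliance's capacity.

```python
# Illustrative M/M/1 queueing estimate (an assumption, not from the
# article): mean time in system W = 1 / (mu - lambda), where mu is the
# appliance's scanning rate and lambda is the arrival rate.
def mean_wait_ms(service_rate_rps: float, arrival_rate_rps: float) -> float:
    """Mean time in system, in milliseconds, for an M/M/1 queue."""
    assert arrival_rate_rps < service_rate_rps, "queue is unstable at/over capacity"
    return 1000.0 / (service_rate_rps - arrival_rate_rps)


capacity = 10_000  # requests/sec the appliance can scan (hypothetical figure)
for load in (0.50, 0.90, 0.99):
    print(f"{load:.0%} load -> {mean_wait_ms(capacity, load * capacity):.1f} ms")
```

Doubling traffic on a half-loaded appliance does far worse than double the delay, which is why fixed hardware either sits overprovisioned or becomes the bottleneck during spikes.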