On the convergence side, CIOs traditionally looked at three big areas in the data centre: storage, servers and networks. The lines between these areas have now blurred to the point where it is often hard to tell a data centre's object stores, analytics systems, databases and actual servers apart. Open standards have driven this blurring, because they let companies build these capabilities more cost-effectively.
The other reason is that companies have started deploying devices that do more than one of these things: switches and routers are becoming smart, and the bigger question is how companies will handle the data these devices generate. Storage is being adapted in a few ways to let big data be handled efficiently. One is intelligent tiering, where frequently used data is kept on fast storage and crunched in place, rather than being stored in the back end and brought forward into a new environment every time it is needed.
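The tiering idea described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's implementation: the tier names, the promotion threshold and the `TieredStore` class are all assumptions made for the example.

```python
# Sketch of an intelligent-tiering policy: objects accessed often are
# promoted to a fast "hot" tier, so they can be crunched in place rather
# than pulled forward from the cheap back-end tier on every access.
HOT_THRESHOLD = 3  # promote after this many accesses (illustrative value)

class TieredStore:
    def __init__(self):
        self.hot = {}    # fast tier (e.g. flash)
        self.cold = {}   # cheap back-end tier (e.g. object storage)
        self.hits = {}   # access counts per key

    def put(self, key, value):
        # New data starts in the cold tier; access patterns decide
        # where it ends up living.
        self.cold[key] = value
        self.hits[key] = 0

    def get(self, key):
        self.hits[key] = self.hits.get(key, 0) + 1
        if key in self.hot:
            return self.hot[key]
        value = self.cold[key]
        if self.hits[key] >= HOT_THRESHOLD:
            # Promote the object so future reads hit fast storage.
            self.hot[key] = self.cold.pop(key)
        return value

store = TieredStore()
store.put("sales-q3", b"...")
for _ in range(3):
    store.get("sales-q3")
print("sales-q3" in store.hot)  # the busy object has been promoted
```

Real arrays use block-level heat maps and background movers rather than per-object counters, but the policy decision is the same shape.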
Thin provisioning is another important concept: a small amount of storage is allocated up front and then scaled on demand as the application grows. Real-time compression is another aspect of dealing with big data, where large volumes of data must be decompressed in real time straight off storage; and because of the capacities involved, these solutions have become more powerful, de-duplication being one example.
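Thin provisioning can be illustrated with a sparse-volume sketch: the volume advertises a large logical capacity to the application, but physical blocks are only allocated when first written. The block size and the `ThinVolume` class are assumptions for the example.

```python
# Hedged sketch of thin provisioning: promise a big logical capacity,
# allocate physical blocks lazily on first write.
BLOCK_SIZE = 4096

class ThinVolume:
    def __init__(self, logical_blocks):
        self.logical_blocks = logical_blocks  # capacity promised to the app
        self.blocks = {}                      # physical blocks, allocated lazily

    def write(self, block_no, data):
        if not 0 <= block_no < self.logical_blocks:
            raise IndexError("write beyond logical capacity")
        self.blocks[block_no] = data[:BLOCK_SIZE]

    def read(self, block_no):
        # Unwritten blocks read back as zeros, like a sparse file.
        return self.blocks.get(block_no, b"\x00" * BLOCK_SIZE)

    @property
    def physical_bytes(self):
        return len(self.blocks) * BLOCK_SIZE

vol = ThinVolume(logical_blocks=1_000_000)  # roughly 4 GB promised
vol.write(42, b"hello")
print(vol.physical_bytes)  # prints 4096: one block actually allocated
```

The gap between promised and allocated capacity is what lets administrators over-commit storage and grow it only as applications really consume it.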
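De-duplication, mentioned above, is usually content-addressed: data is split into chunks, each chunk is hashed, and identical chunks are stored only once. The chunk size, hash choice and `DedupStore` class here are assumptions made for a minimal sketch.

```python
import hashlib

# Illustrative content-addressed de-duplication: identical chunks are
# stored once and referenced by their SHA-256 digest.
CHUNK = 8  # tiny chunk size, chosen only to keep the example small

class DedupStore:
    def __init__(self):
        self.chunks = {}   # digest -> chunk bytes, each stored once
        self.files = {}    # name -> ordered list of chunk digests

    def save(self, name, data):
        digests = []
        for i in range(0, len(data), CHUNK):
            chunk = data[i:i + CHUNK]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # duplicates are skipped
            digests.append(digest)
        self.files[name] = digests

    def load(self, name):
        # Reassemble the original data from its chunk references.
        return b"".join(self.chunks[d] for d in self.files[name])

store = DedupStore()
store.save("report", b"ABCDEFGH" * 4)  # four identical 8-byte chunks
print(len(store.chunks))               # prints 1: stored only once
print(store.load("report") == b"ABCDEFGH" * 4)  # prints True
```

Production systems use variable-size chunking and persistent indexes, but the space saving comes from the same idea: redundant chunks collapse to a single stored copy.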