Global virtual data centre
Today Amazon.com enables you to use a Web service and a credit card to buy storage and computing capacity. In effect, you could automate the end-to-end process of adding and increasing such resources in your global virtual data centre.
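To make the automation idea concrete, here is a minimal sketch of provisioning storage and compute through an API. The `CloudClient` class and its methods are invented stand-ins for a real provider's web service, not an actual API:

```python
# Hypothetical sketch: automating end-to-end provisioning in a
# "global virtual data centre". CloudClient simulates a pay-per-use
# provider; it is an illustrative stand-in, not a real vendor API.

class CloudClient:
    """Simulated cloud provider reachable over a web service."""
    def __init__(self):
        self.buckets = {}    # provisioned storage containers
        self.instances = []  # provisioned compute instances

    def create_bucket(self, name):
        self.buckets[name] = []
        return name

    def launch_instance(self, instance_type):
        instance_id = f"i-{len(self.instances):04d}"
        self.instances.append({"id": instance_id, "type": instance_type})
        return instance_id

def scale_up(client, needed_instances, bucket_name):
    """End-to-end automation: provision storage, then add compute."""
    client.create_bucket(bucket_name)
    return [client.launch_instance("standard") for _ in range(needed_instances)]

client = CloudClient()
ids = scale_up(client, needed_instances=3, bucket_name="web-logs")
print(ids)  # three freshly provisioned instance ids
```

The point is not the toy API but the shape of the process: once provisioning sits behind a programmable interface, scaling becomes an ordinary function call rather than a procurement project.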
Now, consider what Amazon.com must have done to build this global department store. Its internal environment must be remarkably mature: the company has industrialised its infrastructure, provisioning and other processes to the point where it can resell them as a service, reducing its total cost of ownership (TCO).
In principle, every company can do this, and it should certainly be expected of service providers. Building such technical competence and process maturity takes, on the one hand, a bottom-up approach that evolves capabilities over several iterations and, on the other, clear goal setting from a smart CTO such as Werner Vogels (CTO at Amazon.com).
Application delivery advances
We are seeing great advances not only in the infrastructure space (virtualisation, process control and service enablement) but also in application delivery:
Web 3.0 is being called the Semantic Web and is probably best described by Tim Berners-Lee (inventor of the original Web, aka Web 1.0) through his Linked Data concept: essentially a simple data-description mechanism that enables related data elements to be linked together (see http://www4.wiwiss.fu-berlin.de/bizer/pub/lod-datasets_2009-03-05.html for a visual linked-data browser).
This is a crucial development. It does not matter whether Linked Data succeeds as a protocol standard; it demonstrates the degree of industrialisation within the data-processing and software layers. Just as we can write mechanisms that automatically add storage and computing power, we can write methods that query data regardless of its source and use that data according to the value it represents to us.
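The "query data regardless of its source" idea can be sketched with Linked Data's underlying model: everything is a (subject, predicate, object) triple, so data from different sources merges into one graph and is queried uniformly. The sources and triples below are invented for illustration:

```python
# Sketch of the Linked Data model: data as (subject, predicate, object)
# triples. Two independent sources merge into one graph because they
# share the same uniform shape; the facts here are invented examples.

source_a = [
    ("Berlin", "isCapitalOf", "Germany"),
    ("Berlin", "population", "3700000"),
]
source_b = [
    ("Germany", "memberOf", "EU"),
    ("Berlin", "locatedIn", "Europe"),
]

# "Linking" is just a merge, since both sources use the same triple model.
graph = source_a + source_b

def query(graph, subject=None, predicate=None, obj=None):
    """Return all triples matching the pattern (None acts as a wildcard)."""
    return [t for t in graph
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# One query spans both sources; the caller never asks where a triple came from.
print(query(graph, subject="Berlin"))
```

Real Linked Data uses RDF and SPARQL rather than Python tuples, but the principle is the same: a uniform data model makes the source of a fact irrelevant to the query.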
The next level of industrialisation is happening within software development itself. New tools (open source and commercial) let us draw processes and use drag-and-drop editing to paint user interfaces that tie back to pre-built (service) logic. The industrialised way, and the logical next step, is to take these models and generate source code or Web service definitions from them.
The technology and approach for this is called Model Driven Architecture. Essentially, there is a drawn model (similar to a process chart) that also contains configuration items. These models can be mapped to pre-built source code or service artifacts, which can then be linked by workflows.
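A minimal illustration of the model-to-code step: a declarative model (here a plain dict, standing in for a drawn diagram) is transformed into a runnable service artifact. The model schema and the generator are invented for this sketch, not part of any MDA toolchain:

```python
# Hypothetical sketch of Model Driven Architecture's core move:
# a declarative model is mechanically transformed into source code.
# The model format and generator below are invented examples.

model = {
    "service": "OrderService",
    "operations": [
        {"name": "create_order", "params": ["customer_id", "items"]},
        {"name": "cancel_order", "params": ["order_id"]},
    ],
}

def generate_stub(model):
    """Generate Python source for a service-class stub from the model."""
    lines = [f"class {model['service']}:"]
    for op in model["operations"]:
        params = ", ".join(["self"] + op["params"])
        lines.append(f"    def {op['name']}({params}):")
        lines.append(f"        raise NotImplementedError('{op['name']}')")
    return "\n".join(lines)

source = generate_stub(model)
namespace = {}
exec(source, namespace)  # the generated artifact is real, loadable code
OrderService = namespace["OrderService"]
print(hasattr(OrderService, "create_order"))  # True
```

Production MDA tools target UML or BPMN models and emit far richer artifacts, but the pattern is identical: change the model, regenerate, and the code stays aligned with the drawing.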
Some of these tools can already be used by power users. For now, though, they remain in their early stages: some require a lot of manual work to create process flows or models, others rely on highly proprietary technology that prevents broad uptake, and still others are simply not accurate. As a result, these frameworks are used only in targeted deliveries.