Hewlett Packard Enterprise has given a prototype of its new memory-centric computer architecture, dubbed 'The Machine', its first public airing in Washington DC.
It has 160TB of addressable memory, but HPE says it expects the architecture could easily scale to "an exabyte-scale single-memory system and, beyond that, to a nearly-limitless pool of memory - 4,096 yottabytes... 250,000 times the entire digital universe today."
The Machine abandons the traditional computer architecture of having a central processor with peripheral storage and replaces it with a vast fabric of non-volatile semiconductor memory that simultaneously fulfills the functions of long term storage and traditional computer memory, and that makes that data it holds available to multiple processors.
According to HPE "By eliminating the inefficiencies of how memory, storage and processors interact in traditional systems today, memory-driven computing reduces the time needed to process complex problems from days to hours, hours to minutes, minutes to seconds - to deliver real-time intelligence."
HPE claims to have already achieved 1,000-fold improvements over traditional architectures with certain types of analysis.
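The core idea of the architecture described above - a single persistent memory pool that is simultaneously long-term storage and working memory, visible to multiple processors - can be loosely illustrated on a conventional machine with a memory-mapped file. This is a conceptual sketch only: the file name, layout and values are invented for illustration, and The Machine's actual fabric uses dedicated non-volatile memory and photonic interconnects, not a file system.

```python
import mmap
import os
import struct

# Stand-in for a byte-addressable, persistent memory pool.
# (Names and sizes here are illustrative, not from HPE.)
POOL_FILE = "shared_pool.bin"
POOL_SIZE = 4096

# Create the backing file that plays the role of non-volatile memory.
if not os.path.exists(POOL_FILE) or os.path.getsize(POOL_FILE) < POOL_SIZE:
    with open(POOL_FILE, "wb") as f:
        f.write(b"\x00" * POOL_SIZE)

# "Processor A": writes a value directly into the pool. There is no
# separate save-to-storage step, because in a memory-driven design
# memory and storage are the same medium.
with open(POOL_FILE, "r+b") as f:
    pool = mmap.mmap(f.fileno(), POOL_SIZE)
    pool[0:8] = struct.pack("<q", 42)
    pool.flush()  # persist; loosely analogous to flushing caches to NVM
    pool.close()

# "Processor B": a second mapping of the same pool sees the data
# in place, with no deserialisation from a file system or database.
with open(POOL_FILE, "r+b") as f:
    pool = mmap.mmap(f.fileno(), POOL_SIZE)
    value = struct.unpack("<q", pool[0:8])[0]
    pool.close()

print(value)  # -> 42
os.remove(POOL_FILE)
```

The point of the sketch is the access pattern, not the mechanism: both "processors" address the same bytes directly, which is the behaviour HPE says eliminates the copy-and-translate overhead between storage and memory in traditional systems.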
Jaap Suermondt, vice president of software and analytics at HPE, who is responsible for The Machine's software stack and applications, told Computerworld in August 2016 that The Machine was being designed to cater for the growing requirement to perform analytics on ever-larger datasets, and in the belief that traditional computing architectures would no longer be able to scale to meet demand.
In a briefing ahead of the prototype's unveiling, Kirk Bresniker, chief architect at Hewlett Packard Labs, told Computerworld that HPE would show it had succeeded not just in realising the individual parts of The Machine's architecture, but in combining them at scale and using the prototype to solve real-world, highly data-intensive problems.
"Back in December we had everything working, but only one of each. What we will demonstrate next week is achieving the scale we set out to achieve," he said.
'More capacity than anyone expected'
"We have a prototype with more capacity than anyone would have expected us to achieve. We have 160 TB of memory on the fabric, 1280 cores of ARM compute on the fabric," he said.
"We have a photonic interconnect connecting a rack scale infrastructure that consists of 100 gigabit four colour coarse wavelength division multiplexing and it is all running a pretty interesting security analysis workload looking for subtle advanced persistent threats in the enterprise DNS architecture; threats that we experience at HPE every day."
Central to commercial realisation of The Machine will be the development of non-volatile memory. Suermondt told Computerworld last year that there were a number of technologies for non-volatile memory approaching commercialisation that HPE was examining.