Internally, Heritage analyzes data to verify not only the identity, but also the financial worthiness of bidders. "We want to know the potential exposure at any given time," says Shipman. "How recently did they sign up with us? How much have they spent with us previously? Do they owe us money for previous transactions? Has their address recently changed?" All of that data is scored in real time and sent to an analyst, who uses a 30-point scale to determine whether a bidder is suspicious. If there are questions about a bidder's background, the individual can be banned from the auction until their identity is verified.
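Heritage hasn't published its scoring logic, but the questions Shipman lists suggest a simple additive model. The sketch below is purely illustrative: the field names, point weights, and review threshold are all assumptions, chosen only so the factors he mentions sum to the 30-point scale described above.

```python
# Hypothetical sketch of a real-time bidder risk score on a 30-point scale.
# Weights and threshold are invented for illustration, not Heritage's actual values.

def risk_score(days_since_signup, prior_spend, owes_balance, address_changed_recently):
    """Return a 0-30 risk score; higher means more suspicious."""
    score = 0
    if days_since_signup < 30:        # brand-new accounts carry more risk
        score += 10
    if prior_spend == 0:              # no purchase history with the house
        score += 8
    if owes_balance:                  # money owed on previous transactions
        score += 7
    if address_changed_recently:      # recent address change
        score += 5
    return score                      # maximum possible: 30

def needs_review(score, threshold=15):
    """Flag the bidder for an analyst when the score crosses a threshold."""
    return score >= threshold
```

A new bidder with no purchase history and a fresh address change would score 23 under these made-up weights and be routed to an analyst before being allowed to bid.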
Heritage's infrastructure includes auction software from Iasi, Romania-based MigratoryData, whose system is capable of handling 12 million concurrent users and mitigating latency issues that can arise when concurrent bidders use different client devices. The MigratoryData tool also uses hierarchical storage management, Shipman says, so that frequently accessed data is kept on more expensive flash memory while data that isn't accessed as often resides on slower storage, such as hard drives. "This is a far less expensive solution than using all-flash storage, while giving us the benefit of data that must always be as fast as possible," Shipman says.
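The core idea behind hierarchical storage management is straightforward: demote data that hasn't been touched recently to the cheap tier, and promote it back on access. The sketch below models the two tiers as in-memory dictionaries; the tier names, the one-week demotion threshold, and the promote-on-read policy are assumptions for illustration, not MigratoryData's implementation.

```python
# Illustrative hot/cold tiering sketch. HOT_TIER stands in for flash,
# COLD_TIER for hard drives; thresholds and policy are hypothetical.
import time

HOT_TIER = {}                          # key -> (value, last_access_time)
COLD_TIER = {}                         # key -> value
COLD_AFTER_SECONDS = 7 * 24 * 3600     # demote data untouched for a week

def write(key, value):
    """New data always lands on the fast tier."""
    HOT_TIER[key] = (value, time.time())

def read(key):
    """Reads refresh the access time; cold data is promoted back to flash."""
    if key in COLD_TIER:
        value = COLD_TIER.pop(key)
    else:
        value, _ = HOT_TIER[key]
    HOT_TIER[key] = (value, time.time())
    return value

def demote_cold_data(now=None):
    """Periodic sweep: move stale entries from the fast tier to slow storage."""
    now = now if now is not None else time.time()
    for key, (value, last_access) in list(HOT_TIER.items()):
        if now - last_access > COLD_AFTER_SECONDS:
            COLD_TIER[key] = value
            del HOT_TIER[key]
```

This is the trade-off Shipman describes: only the working set pays for flash, while everything else sits on disk until it is needed again.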
It's still early days for fast data, as indicated by the number of users building fast data systems themselves. But Ovum's Baer says he has seen a lot of progress. For example, "the introduction of YARN in Hadoop 2.0 allows you to schedule different workloads, so different parts of a hardware cluster can handle workloads logically rather than physically," he says.
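The logical partitioning Baer describes is typically configured through YARN's CapacityScheduler, which divides a cluster's resources into queues by percentage rather than by assigning physical machines. A minimal `capacity-scheduler.xml` fragment might look like this (the queue names and percentages are illustrative):

```xml
<configuration>
  <!-- Two logical queues sharing one physical cluster -->
  <property>
    <name>yarn.scheduler.capacity.root.queues</name>
    <value>prod,analytics</value>
  </property>
  <!-- Production workloads get 70% of cluster capacity -->
  <property>
    <name>yarn.scheduler.capacity.root.prod.capacity</name>
    <value>70</value>
  </property>
  <!-- Batch analytics workloads get the remaining 30% -->
  <property>
    <name>yarn.scheduler.capacity.root.analytics.capacity</name>
    <value>30</value>
  </property>
</configuration>
```

Jobs submitted to either queue can run anywhere on the cluster; the scheduler enforces the split logically, which is the shift from physical partitioning that Baer highlights.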
Baer estimates that big data and fast data are in approximately the same place as data warehouses and business intelligence were in 1996. "In the next two years, you'll see a lot of new [analysis] tools coming out," he says. Before long, he adds, fast data won't just be a nice thing for companies to have -- it will be commonplace.