By 2025, enterprises are projected to manage 50 times more data than they do today. With that increase comes a transition from “big data” to “fast data” as companies seek actionable insights from the tidal wave, reveals new research from data storage firm Condusiv Technologies.
“What it boils down to is the difference between being able to do something challenging, and being able to do it accurately and in a hurry,” says James D’Arezzo, CEO of Condusiv Technologies, in a news release. “It’s a little like tennis. Making a good clean serve against your local player is one thing. Making it when someone like Roger Federer is serving at you is something else.”
One industry being forced to move quickly from big data to fast data is retail.
Omnichannel retailing, the ability to provide a seamless customer experience whether in-store or online, has gone in the past decade from virtually impossible to a competitive necessity. Although U.S. omnichannel customers make up only 7 percent of all consumers, they account for 27 percent of all retail sales.
To deliver the service these customers have come to expect, a retailer must know where a given item is at all times, so the customer gets what she wants, how and when she wants it. Average inventory accuracy in retail is about 65 percent; accurate omnichannel fulfillment requires 95 percent or better, plus the ability to integrate, on the fly, data from the supply chain, customer relationship management, credit and collections, and sensor networks.
To stay in business, retailers must do all this not only quickly but profitably.
One major challenge to retail profitability is customer returns of purchased merchandise for credit or refund: return rates of 30 percent or more are common for merchandise bought online, and clothing returns can approach 40 percent. To improve inventory turnaround and reduce the cost of returns, innovative retailers are using fast data to link logistics and ecommerce systems and immediately add in-process returns to available inventory.
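The returns-to-inventory link described above can be sketched in a few lines. This is a minimal illustration, not anything from Condusiv or a specific retailer; the `Inventory` class and `record_return` method are hypothetical names invented for the example:

```python
# Sketch: when logistics flags a return as in-process, make the units
# sellable again immediately rather than waiting for physical restocking.
# All names here (Inventory, record_return, the SKU) are hypothetical.

from dataclasses import dataclass, field


@dataclass
class Inventory:
    # available-to-sell units per SKU, as the ecommerce front end sees them
    available: dict = field(default_factory=dict)

    def record_return(self, sku: str, qty: int) -> int:
        """Add an in-process return straight back to available inventory."""
        self.available[sku] = self.available.get(sku, 0) + qty
        return self.available[sku]


inv = Inventory({"SHIRT-M-BLU": 12})
inv.record_return("SHIRT-M-BLU", 3)  # a customer return enters processing
print(inv.available["SHIRT-M-BLU"])  # the 3 units are sellable again
```

In a real deployment this update would be driven by an event from the logistics system rather than a direct method call, which is where the "fast data" integration the article describes comes in.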
While “data at the speed of insight” can be invaluable to innovative businesses, it also greatly increases the performance requirements for those organizations’ IT systems. In many cases, the on-the-fly database integrations characteristic of fast data are carried out by components that are already a pinch point in many enterprises. Condusiv’s most recent I/O Performance Survey found that 28 percent of all organizations are getting user complaints about sluggish performance from their Microsoft SQL applications. And that, notes D’Arezzo, is in 2018, seven years before the predicted 50-fold increase in the data organizations will have to manage.
“Throwing hardware at the problem—adding newer, faster storage, for example—won’t fix it,” D’Arezzo said. “Data handling aside, factors like non-application I/O overhead, data pipelines, and file system overhead—all by themselves—can cause degradation to your applications of 30 percent to 50 percent or more. We’ve developed software solutions aimed at application performance problems like MS-SQL workloads, Oracle, ERP, VDI, EHR, business intelligence apps, CRM, Exchange, SharePoint, file servers, and backup. We can unjam the data flow and enable big-data operations to become fast—without going broke in the process.”