Brands and businesses are facing a perfect storm of cost, complexity, and talent shortages as they work to support critical analytics and AI initiatives, new research from consulting firm Boston Consulting Group (BCG) reveals.
The firm’s latest report, A New Architecture to Manage Data Costs and Complexity, sponsored by open source solutions provider Red Hat and data analytics engine Starburst, explores organizational challenges related to exponential growth in data volumes and rapid innovation across the data stack.
The study highlights three main trends reshaping the data landscape:
- The volume and velocity of data are increasing;
- Data use cases are becoming more accessible, fueling the growth of “citizen data scientists”; and
- Technology advancements have shifted the pricing model.
According to the new report, these trends are creating challenges that have put immense pressure on today’s architectures:
- Today’s enterprise architectures are stretched thin, with more than 50 percent of data leaders saying architectural complexity is a significant pain point.
- Vendor proliferation across all data categories is a major issue. For larger companies with a more mature data stack, the total number of unique data vendors has nearly tripled in the last decade—up from about 50 to over 150 today.
- 56 percent of managers said managing data operating costs is a pain point, yet they continue to increase their investments in modernizing and building new data architectures.
“Accessibility will continue to increase as data literacy and a basic understanding of programming languages such as SQL become more widespread among nontechnical employees,” said Pranay Ahlawat, partner and associate director at BCG, in a news release. “According to our research, almost three-quarters (73 percent) of survey respondents expect the number of non-technical consumers of data to increase in the next three years.”
The research details the growth of “citizen data scientists” and juxtaposes that growth with the increased sophistication of AI-related use cases. The gap between increasingly advanced analytics use cases and technologies and the skill sets required to execute them is currently limiting the business outcomes AI can drive. “Only 54 percent of managers believe that their company’s AI initiatives create tangible business value,” said Ahlawat.
“The survey responses confirm that many enterprises are struggling with adapting to increasing data volumes across multi-cloud and edge while also maintaining legacy data architectures. This is compounded by increasing data privacy regulations, pressure on IT and data spend, and a shortage of highly skilled talent,” said Steven Huels, senior director, cloud services for AI and machine learning at Red Hat, in the release. “Red Hat believes that the solution to managing these challenges will be to implement data architectures that are agile—built for today’s requirements with the flexibility to evolve quickly in the future.”
The research indicates that given the rapid growth of data and use-case volume, increasing complexity, and skyrocketing costs, more organizations are reaching a breaking point. For those willing to take this on, the report offers a few key lessons to keep in mind:
Data architectures are evolving to be more federated and service-oriented
According to the survey, 68 percent of companies aspire to implement a more federated and distributed architectural paradigm (i.e., data mesh or data products) in the next three years.
Pay close attention to overall data TCO
To keep costs under control, establish baseline spending and de-average customer segments to understand key drivers—such as people, data transfer and movement, data storage, and software. Drive shorter-term tactical cost improvements by exploring multiple approaches.
Economics will drive architecture choices—open source and hyperscalers will continue to influence technology choices
The research shows that cloud and open source are expected to continue playing a significant role in the future of enterprise data architectures as enterprises aim to manage costs. In fact, BCG estimates that open source can reduce the total cost of the data stack by 15 to 40 percent for some organizations.
“One of the most significant takeaways from this study is the need for organizations to invest in a decoupled and federated data architecture,” said Justin Borgman, Starburst CEO and co-founder, in the release. “This approach meets today’s reality that data is everywhere, and companies can’t afford the time, cost, and architectural complexity to centralize it. It allows companies to bring analytics to the data, making it accessible for decision-making without data movement complexities and costs. It is the only viable approach that will allow companies to meet increased demands for data storage and analytics workloads, while getting costs under control.”
Boston Consulting Group carried out the A New Architecture to Manage Data Costs and Complexity study in collaboration with Red Hat and Starburst from September 2022 to November 2022. The research involved surveying approximately 300 mid-size and large organizations in the U.S. and Western Europe, including several Starburst and Red Hat customers, across a broad range of industry verticals, including consumer, industrial goods, financial, healthcare, technology, education, media, energy, and telecom. Respondents held titles ranging from the C-suite through SVP, VP, director, and senior manager, and the study also included interviews with customers, buyers, and industry leaders.