“Intelligent Storage” Market Survey Shows Growing Problem in Moving Large Data Sets for a Variety of Applications

G2M Research, an analyst firm covering the Non-Volatile Memory Express® (NVMe) marketplace, released the results of its recent survey on the need for “intelligent storage” for applications with large data sets. The survey, sponsored by NGD Systems, was conducted across 112 respondents from organizations involved in Big Data, artificial intelligence/machine learning (AI/ML), and Internet of Things (IoT) applications.

The purpose of the study was to gauge whether the movement of large data sets across existing processing and storage architectures negatively impacts the cost and usability of the data by applications. The results of the survey show that existing compute and storage architectures adversely impact the performance and cost of these applications, and that new architectures are needed if these applications are to continue to scale in size and capabilities.

“Data sets for applications such as Big Data, AI/ML, and IoT continue to grow at an exponential rate,” said Mike Heumann, Managing Partner, G2M Research. “Our research study shows that the majority of users in these application spaces are very concerned about how this growth will impact their ability to use these applications over the next 12 months. The majority of these end users also believe that new approaches, such as processing data within storage devices, will be necessary to overcome existing data movement bottlenecks.”

The movement of very large data stores is increasingly critical for real-time analytics in a variety of applications. However, this data movement is not without cost or impact. The key findings of the survey include the following:

  1. 52% of respondents believe that the movement of large data stores between storage systems, storage devices, and servers is or will be a significant problem for their organization, either today or within the next 12 months.
  2. 92% of respondents expect that data movement will adversely impact their organization, with 62% responding that it will impact server, networking, or storage costs, 48% saying it will impact application performance, and 29% saying it will limit the way data can be used.
  3. Over 79% of respondents believe that current processing/storage architectures will not be able to handle the amount of data in their industry in the next 5 years.
  4. 64% of respondents believe that processing or preprocessing data inside storage systems/devices could help solve the data movement problem.

“As the capacity of SSDs and the number of SSDs within servers continue to increase, moving the data out of these drives into the CPUs will become exponentially harder and more cumbersome,” said Nader Salessi, President and CEO of NGD Systems. “The G2M Research survey clearly illustrates the issues that large data sets present to application architects for Big Data, IoT, and AI/ML, among others. In-situ processing like that of the NGD Systems Catalina 2 SSD provides a compelling alternative to moving large amounts of data between storage systems, storage devices, and servers/CPU complexes.”

One of the most promising concepts for addressing the storage-CPU bottleneck is in-situ processing within storage devices. In-situ processing promises to transform the deployment of a variety of applications that today require huge clusters of expensive multi-socket servers with large amounts of RAM. By significantly reducing the amount of data that must be moved between storage systems/devices and servers/CPUs/GPUs, in-situ processing within NVMe flash solid-state drives (SSDs) can significantly reduce network size/complexity, CPU/GPU workload, and power consumption for applications with high I/O demands.
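To make the idea concrete, here is a minimal sketch of the contrast between the two models. This is not an NGD Systems API; the ComputationalSSD class, its methods, and the sensor data are hypothetical, used only to illustrate that in the conventional model every record crosses the NVMe link to be filtered by the host CPU, whereas in the in-situ model the drive applies the filter locally and only matching records are moved to the host.

```python
# Conceptual sketch of in-situ processing (hypothetical API, not a real driver).

from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class ComputationalSSD:
    """Hypothetical drive that can run a filter next to the flash."""
    records: List[dict] = field(default_factory=list)

    def read_all(self) -> List[dict]:
        # Host-side model: every record crosses the PCIe/NVMe link,
        # and the host CPU does all of the filtering work.
        return list(self.records)

    def query_in_situ(self, predicate: Callable[[dict], bool]) -> List[dict]:
        # In-situ model: the predicate runs on the drive's own processor,
        # so only matching records are transferred to the host.
        return [r for r in self.records if predicate(r)]


if __name__ == "__main__":
    drive = ComputationalSSD(
        records=[{"sensor": i, "temp": 20 + i % 30} for i in range(1_000_000)]
    )

    # Host-side: transfer 1,000,000 records, then filter on the CPU.
    host_hot = [r for r in drive.read_all() if r["temp"] > 45]

    # In-situ: the same filter, but only the matches leave the drive.
    device_hot = drive.query_in_situ(lambda r: r["temp"] > 45)

    assert host_hot == device_hot
    print(f"matching records returned either way: {len(device_hot)}")
```

In this toy example both paths return the same result; the difference is how much data has to traverse the storage-to-CPU link to get there, which is the bottleneck the survey respondents identified.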

NGD’s Catalina 2 NVMe SSD enables in-situ processing and is the first product aimed at closing the CPU/GPU-storage gap and improving data center TCO. NGD’s NVMe SSDs also offer the industry’s highest capacities and lowest power per terabyte (W/TB).
