A recent report from Gartner research offers insight into the “data lakes” craze – Gartner Says Beware of the Data Lake Fallacy. Andrew White, vice president at Gartner, and Nick Heudecker, research director at Gartner, wrote an interesting review of the next big step in big data management technology. While White and Heudecker acknowledge that the “need for increased agility and accessibility for data analysis is the primary driver for data lakes,” they warn that there are major risks associated with using data lakes, including security and data quality control.
Another issue White and Heudecker raise is that data stored in lakes, rather than silos, is effectively inaccessible to non-data scientists. Without proper governance, business users lack the sophistication required to extract value from the content, and developing the necessary skills is next to impossible.
Ben Werther of Platfora disagrees with this analysis. He believes that only by un-siloing data into lakes can businesses derive the value that big data is supposed to provide. With Forrester Research estimating that only 10–12 percent of stored data is ever analyzed, access to data cannot be reserved for data scientists alone. As data agility becomes key to deriving insight, Werther sees data lakes as the natural progression for the technology.