The Shift to Turn-Key Big Data Intelligence

In this special guest feature, Alex Henthorn-Iwane, Vice President of Marketing at Kentik, suggests that we're still in the early innings of big data, and that two major aspects of maturation are defining the future of the industry: the journey from open source to SaaS, and the journey from bespoke business intelligence to real-time operational use cases. These two journeys are related and intertwined. Henthorn-Iwane has more than 20 years of experience bringing new technologies in networking, security and software to global markets. He leads the global marketing strategy for Kentik and helps the company bring its innovative story and solutions to network-dependent organizations around the world. Prior to joining Kentik, Henthorn-Iwane led worldwide marketing at QualiSystems, the industry's leading supplier of DevOps cloud orchestration software for hybrid IT infrastructure, and at Packet Design Inc., a provider of network management software founded by serial entrepreneur Judy Estrin. He has written for Embedded Computing, Virtual Strategy Magazine, Datamation, SDN Central, Datacenter Knowledge and InformationWeek.

Big data is growing up. Most large enterprises and carriers have run big data projects and reported significant impacts on diverse areas of their businesses. Yet we're still in the early innings, and two major aspects of maturation are defining the future of big data: the journey from open source to SaaS, and the journey from bespoke business intelligence to real-time operational use cases. These two journeys are related and intertwined.

The Journey from Open Source to SaaS

For most of its history, big data has been a "develop your own toolset" journey for IT organizations. That's fitting, since the roots of big data technology lie in the applied research and development teams of webscale giants, who are very much into "rolling their own." Open source projects such as Hadoop have become broadly adopted. New career paths in data science have emerged, along with major consulting practices. And a proliferation of newer open source projects like Druid and ELK has made big data increasingly accessible for many IT teams to innovate on.

As the vendor world began to understand the scope of the commercial opportunity, new companies like Cloudera, Hortonworks and Pentaho sprang up to package the open source platforms, wrapping them with vertical and use-case solution frameworks and consulting services so that enterprises could gain value from big data without having to build their own development organizations. These companies capitalized on big data platforms much the way Red Hat, Canonical, IBM and Oracle capitalized on Linux.

While these platform companies will dominate many enterprise big data deployments for a while, the future of big data is SaaS. Enterprises are looking to move not just infrastructure, but the whole burden of application vision, development, integration and scaling to the cloud. Big data will increasingly become an integrated, enabling component of a new generation of SaaS applications that leverage it in a multi-tenant fashion and deliver far more powerful variants of existing applications (whether enterprise software or SaaS).

Why, for example, shouldn't SaaS CRM and marketing automation leverage big data engines to deliver far more powerful, ad-hoc, multi-field analyses? There's no reason why the infamous separation between leads and accounts, opportunities and contacts in Salesforce.com *must* exist. It's simply an artifact of a legacy relational database schema that is severely limited in how it can relate its various clusters of tables through key fields. Big data could solve that problem and unify these sets of data, making CRM and marketing automation far more powerful than they are today. Marketers and sales operations folks around the world would bow down and give thanks.
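
To make that concrete, here's a minimal sketch (in Python with pandas) of the kind of ad-hoc, multi-field analysis a big data-backed CRM could offer. The file names and fields below (email, source, account_id, stage, amount) are hypothetical illustrations, not Salesforce's actual schema or API:

```python
# A minimal sketch of ad-hoc, multi-field CRM analysis on unified data.
# All file and field names here are hypothetical, not Salesforce's schema.
import pandas as pd

# Imagine leads, contacts and opportunities exported as flat record sets.
leads = pd.read_csv("leads.csv")                  # email, source, ...
contacts = pd.read_csv("contacts.csv")            # email, account_id, ...
opportunities = pd.read_csv("opportunities.csv")  # account_id, stage, amount

# Join the three object types on shared keys instead of keeping them in
# separate silos, then pivot across fields from all of them at once.
unified = (
    leads.merge(contacts, on="email", how="outer")
         .merge(opportunities, on="account_id", how="outer")
)

# Ad-hoc pivot: pipeline value by original lead source and deal stage --
# a cross-object view that siloed CRM reporting makes painful.
report = unified.pivot_table(
    index="source", columns="stage", values="amount", aggfunc="sum"
)
print(report)
```

A big data engine would run this same kind of join-and-pivot interactively across billions of records, rather than across a one-off CSV export.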

The Journey from Bespoke Business Intelligence to Real-Time Operational Use Cases

Big data has mostly been about business intelligence. The stereotypical example is pivoting massive amounts of retail or web-visit data to gain insights into browsing, keyword and buying patterns, but the industries where big data-powered business intelligence is providing value are multiplying. As industrial IoT sensors create massive amounts of time-series data from operations, and as more industrial assets are tracked more closely than ever, manufacturing, oil & gas, healthcare and other industries face both an explosion of data and a huge opportunity. For example, advanced analytics powered by big data is one of the reasons cost-efficiency in energy exploration keeps improving, allowing oil and gas companies to produce profitably at ever-lower commodity prices.

As valuable as business intelligence is, it's not where big data's value ends. The ability to leverage big data sets in real time to aid real-time operations is where big data goes next. Harking back to our sales and marketing operations example, what if average businesses could pivot and crunch datasets in real time to aid operational decision making within minutes, rather than the hours or days it often takes to gain planning-oriented insights from business intelligence?

There are plenty of domains where existing operational data is already voluminous enough to qualify as big data, and where real-time capabilities can give a measurable boost in effectiveness.

Take network management, for example. Nearly every business has a network now, and network infrastructure is a multi-billion-dollar industry annually. With increasing amounts of commerce, music, gaming and other data transiting the internet, traffic volume is growing astronomically. It's not unusual these days for businesses with digital initiatives to have tens to hundreds of Gbps of internet bandwidth, let alone within and between large datacenters.

The telemetry data generated by tracking network traffic to support essential operational visibility is massive. Even with aggressive sampling techniques, traffic flow data alone can easily amount to many billions of records daily for a decent-sized network.
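
A back-of-envelope calculation shows how quickly the records pile up. The export rate and device count below are illustrative assumptions, not measurements from any particular network:

```python
# Back-of-envelope estimate of daily flow-record volume.
# Both inputs are illustrative assumptions, not real measurements.
RECORDS_PER_SEC_PER_ROUTER = 20_000  # sampled NetFlow/sFlow export rate
ROUTERS = 50                         # flow-exporting devices

daily_records = RECORDS_PER_SEC_PER_ROUTER * ROUTERS * 86_400
print(f"{daily_records:,} flow records per day")  # 86,400,000,000
```

Even at a tenth of those rates, such a network would still produce billions of records a day.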

The technology behind most network management was developed in the early 2000s, with single-server appliances as the primary form factor. Since those servers can't retain much of the generated data, they provide only summary views. Yet full details, retained by big data systems and scanned for anomalies or made rapidly analyzable, can revolutionize the effectiveness of the networks that so much of our modern economy depends on.
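
As a sketch of what that full-fidelity retention enables, consider scanning retained flow records for traffic anomalies. Everything here is an illustrative assumption (the Parquet file name, the timestamp, dst_ip and bytes fields, and the z-score threshold), not any vendor's actual implementation:

```python
# A minimal sketch of anomaly scanning over retained flow records.
# File name, fields and threshold are illustrative assumptions.
import pandas as pd

flows = pd.read_parquet("flows.parquet")  # hypothetical retained records

# Roll raw records up to per-destination, per-minute traffic volumes --
# a rollup that a summary-only appliance can't reconstruct after the fact.
per_min = flows.groupby(
    ["dst_ip", pd.Grouper(key="timestamp", freq="1min")]
)["bytes"].sum()

# Score each minute against that destination's own history; a large
# z-score flags a sudden surge, e.g. a possible DDoS target.
z = per_min.groupby("dst_ip").transform(lambda s: (s - s.mean()) / s.std())
print(per_min[z > 5])
```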

Real-Time, SaaS and the Virtuous Cycle

These two journeys are converging. There's no way that most enterprises are going to want to build and maintain operationally oriented, big data-based applications on their own, or even with a consulting firm. So real-time, operational big data will mostly be embodied as SaaS.

There's an upside for business intelligence as big data becomes more operationally relevant: the line between the two will start to blur. When it takes significant work and time to get at insights, only some planning and strategy use cases are high enough priority to justify the effort and cost. As operationally relevant big data applications become SaaS-ified and highly accessible, business intelligence use cases that sit adjacent to the already existing operational data will become apparent and easy to reach. The result is a virtuous cycle: SaaS applications extend their datasets to meet these adjacent use cases, while pure business intelligence platforms increase their usability to compete. The breadth and depth of big data-enabled intelligence available to enterprises will be all the richer for it.
