In this special guest feature, Jordan Cardonick, Director of Media Analytics at Merkle, discusses the common pitfalls and missteps that companies fall into when trying to democratize data and insights across their organizations. It isn’t just about what data you have access to, but about ensuring that data is effectively communicated and embraced, not merely used to create a chart, graph, or model. Jordan has spent the last decade of his career transforming data and information into actionable, achievable insights for world-class brands, totaling over a quarter billion dollars in spend. While primarily focused on digital media, he and his team combine that knowledge with marketing savvy to help grow the accounts they support. Jordan has an undergraduate degree and an MBA from the University of Pittsburgh, but remains a dedicated Philadelphia sports fan.
In this new data-first environment, companies are diving headfirst into artificial intelligence (AI), machine learning, data lakes, and cloud-based solutions to drive their marketing efforts, while hiring every resume that carries a Data Scientist title. But these organizations need to do some self-reflection to ensure they are ready not just to ingest the mountain of information becoming increasingly accessible to them, but to make it accessible to those who need it most: the people who use the data to make informed decisions. The evolution of big data has made it possible to provide insightful, comprehensible information for stakeholders to act upon. But while the technology now exists to crunch through millions of records in mere seconds and visualize the results in an appealing way, it still falls to smart, business-savvy minds to craft the story that delivers true value to the business, as well as to the consumer. All the models and reports that can be built will not result in successful people-based marketing unless there are analytically literate individuals who can translate those numbers and data points into powerful insights.
The concept of big data has been around for some time. But its current role in solving core marketing problems emerged when search, social, display, and other forms of digital marketing started to mature with respect to the data that could be captured across channels and media. The ability to predict “right customer, right message, right time” finally seemed within reach. The issue now was the technology: we weren’t quite ready to manage and leverage this influx of data. Data storage was relatively cheap, but processing power was lacking, and we couldn’t easily and quickly access and structure the data. The cost of purchasing additional servers may have been feasible for larger firms, but middle-of-the-road players and smaller shops did not have the capital to bring their capabilities up to snuff. It seemed as if all this data was simply a tease of bigger and better things.
Within the last few years, though, companies like Google and Amazon began to open their vaults and make available the technology that powered their organizations, and many of the other large technology providers quickly followed suit. These cloud-based platforms also enabled open source tools like R and Python to tackle massive data sets, develop models, and derive insights. BI tools such as Tableau and Domo, while already growing in popularity, became even more valuable for visualizing all this rich information. Everything seemed to be falling into place, except that companies couldn’t share this information in a way that was conducive to running the business. This can still be seen across the marketplace, with firms struggling to take the data that last half mile and deliver actionable insights, rather than p-values from regression models with fancy interaction effects and complex machine learning algorithms.
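To give a sense of the kind of heavy lifting those open source tools now make routine, here is a minimal Python sketch. The channel names, record fields, and data volumes are all invented for illustration; the point is simply that aggregating a million records, once a job for dedicated servers, now runs in seconds on commodity hardware:

```python
import random
from collections import defaultdict

# Synthetic stand-in for the record volumes described above.
# Channel names and fields are invented for illustration.
random.seed(42)
CHANNELS = ["search", "social", "display"]
records = [
    {"channel": random.choice(CHANNELS), "spend": random.uniform(0.5, 5.0)}
    for _ in range(1_000_000)
]

# Roll up total spend per channel in a single pass.
spend_by_channel = defaultdict(float)
for rec in records:
    spend_by_channel[rec["channel"]] += rec["spend"]

for channel in CHANNELS:
    print(f"{channel}: ${spend_by_channel[channel]:,.0f}")
```

Of course, as the article argues, producing this rollup is the easy part; the hard part is turning it into a story the business can act on.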
Those actionable insights, formed at the intersection of analytics and business goals and values, are difficult to bring to light. Much of the analytics world is focused on answering questions like “How do I access and integrate the data?” or “How do I find insights within the data?” This was very much the focus during the early stages of big data. Marketers were challenged with the difficult task of juggling all this information and turning it into something that they deemed valuable. But they missed the business focus. They weren’t always thinking about, or necessarily guided toward answering, questions such as “What is the actual business problem I’m looking to solve?” and “Once insights are created, how do I disseminate that information to people or systems in a digestible fashion?” The typical response to this set of questions is “Well, there is a Tableau report” or “Here is the data you asked for that shows spend is down.” There is often little context or relevance for the business owner. This creates one of the worst possible outcomes in an organization: a lack of trust in the data, because end users don’t understand it, potentially leverage it incorrectly, or simply lose faith because they lack clarity about how it was derived.
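That “last half mile” of translation can be made concrete with a hedged sketch. The data below is synthetic and the spend-to-conversions relationship is invented for illustration, but the closing step is the one the article calls for: restating a regression coefficient as a plain-language business insight rather than a line in a model summary:

```python
import random

# Illustrative only: synthetic weekly data relating media spend to conversions.
random.seed(7)
weeks = 52
spend = [random.uniform(10_000, 50_000) for _ in range(weeks)]
# Assume roughly 2 conversions per $1,000 of spend, plus noise (invented).
conversions = [0.002 * s + random.gauss(0, 5) for s in spend]

# Ordinary least squares slope and intercept via the closed-form formulas.
n = weeks
mean_x = sum(spend) / n
mean_y = sum(conversions) / n
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(spend, conversions))
    / sum((x - mean_x) ** 2 for x in spend)
)
intercept = mean_y - slope * mean_x

# The translation step: a sentence a business owner can act on,
# not a p-value in a regression table.
print(
    f"Every additional $1,000 of weekly spend is associated with "
    f"roughly {slope * 1000:.1f} more conversions."
)
```

The statistics here are deliberately simple; the value comes from the final sentence, which any stakeholder can read and act on.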
Culturally, it is difficult for many organizations to quickly and efficiently transform their approach. But there are ways to close this divide and bring the groups together in a way that creates success. Items like simple documentation and KPI definitions can help ensure everyone is on the same page and able to speak the same language with regard to the metrics being studied. A universal ability to read and comprehend a report, versus simply looking at the numbers in a chart, also puts the team in a common mindset, in terms of how to develop insights. Being able to speak to what this pie chart conveys and why it is important creates an environment where people are empowered to use the data instead of just getting it.
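A shared KPI definition can be as lightweight as a glossary plus one agreed-upon calculation per metric. The metric names, formulas, and figures below are hypothetical, but they show the idea: if every team computes CPA, CTR, and ROAS from the same definitions, everyone reads the same report the same way:

```python
# Hypothetical shared KPI glossary: one agreed definition per metric.
KPI_DEFINITIONS = {
    "CPA": "Cost per acquisition: total spend / total conversions",
    "CTR": "Click-through rate: clicks / impressions",
    "ROAS": "Return on ad spend: revenue / spend",
}

def compute_kpis(spend, impressions, clicks, conversions, revenue):
    """Compute every KPI from the same raw inputs, per the shared definitions."""
    return {
        "CPA": spend / conversions if conversions else None,
        "CTR": clicks / impressions if impressions else None,
        "ROAS": revenue / spend if spend else None,
    }

kpis = compute_kpis(spend=10_000, impressions=500_000,
                    clicks=7_500, conversions=200, revenue=30_000)
print(kpis)  # {'CPA': 50.0, 'CTR': 0.015, 'ROAS': 3.0}
```

Keeping the plain-English definition next to the calculation is what lets non-technical readers speak the same language as the analysts producing the numbers.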
For larger companies, investing time and money in data governance, process, and guidance can help prevent a lack of trust in the data: when the company understands how the data is acquired and managed, it gains confidence that it is working with the best data possible. It isn’t necessary to document every minute detail for everyone, or to have non-technical people make decisions on critical data transformations. But giving them a voice and helping them feel close to the data provides a sense of control.
There is also something to be said for the attributes that are valuable to hire for. Instead of hiring data scientists with PhDs and the ability to create exceedingly complex models, you might look for quantitatively focused individuals with curious minds and the ability to translate intricate analytical concepts into something more digestible to the layperson. Those individuals may not be as statistically savvy; however, less complex approaches can sometimes bring just as much value, if not more, because they are easier to comprehend for those who will ultimately implement the solution.
With technology, tools, and data gaining in potency and becoming easier to manage and leverage, it is the people and companies that are holding back the potential value that all this information has to offer. Only the marketers who can operate where those capabilities intersect with defined business objectives – with clear and open communication across the enterprise – will succeed.