The Future of IoT Analytics

In this special guest feature, Michael Hummel of ParStream discusses what the future holds for IoT analytics in terms of four major changes we'll see. Michael Hummel is Co-founder and CTO of ParStream, a real-time Big Data business analytics and solutions company. He previously co-founded Empulse, a portal solutions and software consultancy now specializing in Web 2.0 projects.

Today’s IoT data is analyzed at near real-time speeds due in part to the growth of complex event processing systems. But what does the future hold for IoT analytics? I have been closely watching the trends and foresee some changes in the area. In fact, I believe there are at least four major changes we will see in IoT analytics:

Big Data, Fast Data and More Analytics

The good news is that we will generally see more demand for IoT analytics. Under increasing competitive pressure, organizations are forced to optimize processes and products, and the key to identifying optimization opportunities and tracking improvements lies in the IoT data itself. As sensors and data processing get cheaper every day, far more data will become available, with the expectation that it will be processed in near real-time. Beyond analyzing data for learning, near-real-time analytics lets companies react quickly: avoid problems, cure them before they become serious, offer help, or simply prepare for an emerging issue to reduce its impact.
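To make this concrete, here is a minimal Python sketch of reacting to sensor readings as they arrive rather than after a batch run; the sensor names, threshold, and alert handler are hypothetical illustrations, not any particular product's API.

    # Minimal sketch: evaluate each reading as it arrives and react immediately.
    # Sensor IDs, the threshold, and the alert handler are made-up examples.
    from dataclasses import dataclass
    from typing import Callable, Iterable

    @dataclass
    class Reading:
        sensor_id: str
        value: float

    def watch_stream(readings: Iterable[Reading],
                     threshold: float,
                     on_alert: Callable[[Reading], None]) -> None:
        """React to each reading the moment it arrives."""
        for r in readings:
            if r.value > threshold:
                on_alert(r)  # e.g. schedule maintenance before the fault grows

    # Flag any temperature reading above 90 degrees.
    stream = [Reading("temp-01", 72.0), Reading("temp-02", 95.5)]
    watch_stream(stream, threshold=90.0,
                 on_alert=lambda r: print(f"ALERT {r.sensor_id}: {r.value}"))

In a real deployment the iterable would be a live stream (for example a message-queue consumer), but the shape of the logic stays the same.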

Horizontal Integration, Vertical Application

We still find many purpose-built solution stacks, from sensors to applications. Although purpose-built stacks can deliver an ideal fit for one specific use case, they are expensive, inflexible, and do not easily integrate or share data. Today's business world demands ever faster responses to market changes, which calls for a flexible solution that can be adapted on the fly, ideally by business people, without complex tasks such as IT support, coding, or deployment.

I believe we will see horizontal components achieve tremendous market success. Organizations will use building blocks optimized for specific infrastructural tasks, including device management, data collection, storage, analytics, and application management. Not surprisingly, standardization will play a big role in enabling such horizontal platform components to integrate and work well with each other. However, we will also see vertical applications, because they are the best way to put data in context and convey insight to a user group. A vertical application, such as condition monitoring of welding robots, shows the relevant information in a way the target audience easily understands. Analytic front ends, as beautiful as they may be, are only good for analytically minded people. In my experience, not everyone enjoys "surfing data"; most prefer a context-specific presentation of just the most relevant data.
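As an illustration of this split, the following Python sketch composes a hypothetical horizontal storage block into a vertical condition-monitoring view for welding robots; the interfaces, field names, and the 5% defect-rate threshold are assumptions invented for the example.

    # Horizontal building block: a storage interface any backend could implement.
    from typing import Protocol

    class Storage(Protocol):
        def query(self, sql: str) -> list: ...

    class InMemoryStorage:
        """Stand-in for a real query engine; horizontal and use-case agnostic."""
        def __init__(self, rows: list):
            self.rows = rows
        def query(self, sql: str) -> list:
            return self.rows

    # Vertical application: presents only what a maintenance engineer needs.
    class WeldingRobotMonitor:
        def __init__(self, storage: Storage):
            self.storage = storage
        def robots_needing_service(self) -> list:
            rows = self.storage.query("SELECT * FROM welds")
            return [r for r in rows if r["defect_rate"] > 0.05]

    monitor = WeldingRobotMonitor(InMemoryStorage(
        [{"robot": "R1", "defect_rate": 0.02},
         {"robot": "R2", "defect_rate": 0.08}]))
    print(monitor.robots_needing_service())  # only R2 is flagged

The storage block knows nothing about welding, and the monitor knows nothing about storage internals; either side can be swapped without touching the other.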

Decentralization

We will see strong growth in data volume from disparate devices that require both real-time and post-mortem analysis. Knowing the importance of efficiency, do we really want to transfer all of the (low-value) log data from every sensor, machine, and switch to a central analytics installation? The cost of transferring data from all over the world to a central location in real time far outweighs the economies of scale a centralized solution offers. Further, network latency and interruptions rule out purely centralized solutions. I learned a fundamental principle for fast and efficient data processing: move the query to the data!

With IoT, where data is created by geographically distributed devices, we will see a decentralization of data storage, processing, and analytics. There will simply not be enough network bandwidth in the future to transfer all the data in real time.
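A minimal Python sketch of the "move the query to the data" principle might look like this; the node names, readings, and the average query are invented for illustration. Each edge node aggregates locally and ships back a few numbers instead of its raw stream.

    # Minimal sketch: each hypothetical edge node evaluates the query locally
    # and returns a small aggregate; only aggregates cross the network.
    from statistics import mean

    class EdgeNode:
        def __init__(self, name: str, readings: list[float]):
            self.name, self.readings = name, readings
        def run_query(self) -> dict:
            """Aggregate locally; return a few numbers, not the raw data."""
            return {"node": self.name,
                    "count": len(self.readings),
                    "avg": mean(self.readings)}

    def global_average(nodes: list[EdgeNode]) -> float:
        """Combine per-node aggregates into one global answer."""
        results = [n.run_query() for n in nodes]
        total = sum(r["avg"] * r["count"] for r in results)
        return total / sum(r["count"] for r in results)

    nodes = [EdgeNode("berlin", [1.0, 2.0, 3.0]),
             EdgeNode("tokyo", [4.0, 5.0])]
    print(global_average(nodes))  # 3.0, computed without moving raw readings

Five readings stayed where they were produced; only two small result records travelled to the central site.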

Integration of Advanced Analytics and Machine Learning

There is so much talk about advanced analytics (AA), artificial intelligence (AI), and machine learning (ML) that most people have a hard time understanding it all. The good news is that the majority of people do not have to: they should care about their operational processes and products, not about the math. My view on AA (including AI, ML, etc.) is that it delivers hints that people might otherwise have missed due to the volume, speed, and complexity of the IoT data available. If AA is done well, for example focused on a specific set of questions, these hints are valuable starting points for more detailed analysis. AA is not a magic weapon, and it does not act on its own; it is a tool that points us at potentially overlooked effects, dependencies, and trends.
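As a toy example of analytics as a hint generator, this Python sketch uses a simple z-score to flag readings that deviate unusually far from the rest; the vibration values and the threshold of two standard deviations are made-up assumptions, and the output is a starting point for closer analysis, not a verdict.

    # Minimal sketch: flag values a person might have overlooked in a large,
    # fast stream. Data and threshold are illustrative assumptions.
    from statistics import mean, stdev

    def hints(values: list[float], z_threshold: float = 2.0) -> list[int]:
        """Return indices of values unusually far from the mean."""
        mu, sigma = mean(values), stdev(values)
        return [i for i, v in enumerate(values)
                if sigma > 0 and abs(v - mu) / sigma > z_threshold]

    vibration = [0.30, 0.31, 0.29, 0.30, 0.95, 0.30, 0.31]
    print(hints(vibration))  # [4]: a hint to investigate, not a conclusion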

Conclusion

Solutions will have to address these requirements to help organizations get value from their IoT data, now and in the future. Let me know if I missed something; I am curious to hear your thoughts on the future of IoT analytics.

 


Comments

  1. An excellent summary Daniel, and completely aligned with my own thoughts on the matter. We need to be able to model and construct AA & ML services centrally and push them out to the edge based on vertical application information-processing needs and contextual policies, in near real-time. Right now, though, this also requires a detailed understanding of edge compute capability, along with all the specific communications protocol details to make it happen.