Interview: Data Analytics and the Ubiquitous Internet of Things


In a world where the Internet of Things (IoT) produces massive amounts of data from mobile devices, vehicular systems and environmental sensors, data scientists will be tasked with deciding what to do with all of this information. We sat down with Cristian Borcea, PhD, of the New Jersey Institute of Technology to discuss the IoT and Big Data applications.

insideAI News: It seems we can’t turn on the television or read a newspaper without encountering the phrase “Internet of Things”. Aside from being a marketing term, what does this really mean?

Cristian Borcea: The emerging Internet of Things (IoT) consists of heterogeneous computing and wireless communication systems embedded in the physical environment. Examples of such systems are smart meters, RFID/smart tags on many familiar objects, wearable health monitoring sensors, sensors and actuators to manage the infrastructure of smart cities, and intelligent video cameras. Many researchers consider smart phones, intelligent vehicles, and many types of robots to be part of IoT. Some application examples: manage power and water distribution infrastructure in cities, with special attention paid to fault detection during disaster situations; improve healthcare, including controlling, slowing and stopping the spread of epidemic diseases; and develop services for road traffic congestion avoidance.

insideAI News: As a computer scientist, what excites you about this movement?

Cristian Borcea: It’s always exciting to see our work moving from the lab into the real world. IoT could have an even bigger impact on society than the Internet because computing and wireless communication will be embedded everywhere in our environment and in the objects we are using daily. As I point out below, there are still many challenges to overcome, which makes IoT research intellectually challenging.

insideAI News: What are its challenges?

Cristian Borcea: IoT will have massive scale, with estimates ranging from 50 billion systems by 2020 to trillions once most daily-use objects are equipped with embedded computing, sensing, and communication capabilities. Additionally, IoT will be highly dynamic because many of these systems are mobile and have energy limitations, which make them switch between on and off states frequently. Unlike the current Internet, where most content is delivered by big content providers with plenty of resources, a huge amount of data will be produced by IoT devices with significantly fewer resources. How to deal with this new communication pattern is challenging. Finally, open standards are needed to ensure that IoT devices and networks can communicate with each other. (Currently, most IoT developments use proprietary protocols and do not support interoperability.)

insideAI News: How does one harness all this data in a technological way and make sense of it?

Cristian Borcea: The massive scale and highly dynamic nature of the IoT, the huge amounts of data streamed from the physical world, and new communication patterns demand novel programming, content delivery, and network management approaches if we are to achieve the full potential of IoT. Furthermore, data mining and data analytics techniques need to be developed to make sense of this deluge of data.

One potential solution involves leveraging the cloud for efficient IoT service and data management, enhanced performance and availability, lower latency, and energy savings on IoT devices.
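The energy-savings side of this cloud-offloading idea can be illustrated with a toy sketch (all names and numbers below are hypothetical, not from any specific IoT platform): a resource-constrained device buffers sensor readings locally and uploads them to the cloud in batches, so its radio wakes up far less often than it would uploading each reading individually.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class BatchingUploader:
    """Buffer sensor readings on the device and flush them to the cloud
    in batches, reducing how often the energy-hungry radio is used."""
    flush: Callable[[List[Tuple[float, float]]], None]  # cloud upload callback
    batch_size: int = 10
    buffer: List[Tuple[float, float]] = field(default_factory=list)
    uploads: int = 0  # number of radio wake-ups so far

    def record(self, timestamp: float, value: float) -> None:
        self.buffer.append((timestamp, value))
        if len(self.buffer) >= self.batch_size:
            self.flush(self.buffer)   # one upload covers batch_size readings
            self.uploads += 1
            self.buffer = []

# Simulate 25 temperature readings: 2 uploads instead of 25.
sent_batches = []
uploader = BatchingUploader(flush=sent_batches.append, batch_size=10)
for t in range(25):
    uploader.record(float(t), 20.0 + 0.1 * t)
```

The trade-off, as the answer above suggests, is latency versus energy: larger batches save more power on the device but delay the data's arrival in the cloud.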

insideAI News: What does Big Data mean to you? What can be gleaned from it?

Cristian Borcea: We are already generating petabytes of data and soon will create exabytes of data. (For example, Facebook stores more than 300 petabytes of data generated by its users.) IoT will also generate huge amounts of data. Many medical instruments (medical imaging, for instance) and scientific instruments (telescopes, etc.) also generate large amounts of data. New machine learning techniques could help us extract knowledge from these data. This matters especially for knowledge that we don’t expect and don’t even know exists; we cannot search for something that we don’t know exists, but machine learning techniques can identify patterns in the data and extract new knowledge. Examples: identify how epidemic diseases spread by monitoring people’s locations and physical symptoms; discover why traffic congestion happens on certain roads at certain times; uncover unknown relations between climate change and human migration around the world.
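The traffic-congestion example can be sketched with a minimal unsupervised technique: clustering finds structure in unlabeled readings without being told what to look for. The snippet below (a toy illustration with made-up numbers, not a production method) runs a tiny 1-D k-means over synthetic road-speed readings and recovers the "congested" and "free-flow" regimes on its own.

```python
import random

def kmeans_1d(values, k=2, iters=20, seed=0):
    """Minimal 1-D k-means: partition readings into k groups, no labels needed."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # assign each reading to its nearest center
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # move each center to the mean of its cluster
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Synthetic sensor data: congested traffic (~15 km/h) and free flow (~90 km/h).
rng = random.Random(42)
speeds = ([rng.gauss(15, 3) for _ in range(50)] +
          [rng.gauss(90, 5) for _ in range(50)])
centers = kmeans_1d(speeds, k=2)  # ends up near [15, 90]
```

Nothing told the algorithm that two traffic regimes exist; the pattern emerges from the data, which is the point Borcea makes about discovering knowledge we didn’t know to search for.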

insideAI News: This is exciting, fast-moving stuff. Where’s it all going? What’s in store in 2-3 years from now?

Cristian Borcea: IoT will become more mainstream in the next 2-3 years. We will move closer to the vision of smart cities. Crowdsensing (e.g., collecting sensor data from citizens and their mobile devices) will have a major role in this development. I expect people will see the benefits more clearly. (For example, during hurricanes or other disasters, IoT sensors and actuators will allow our infrastructure to cope better.) In homes, we will start to see higher energy efficiency due to various sensors embedded in our devices or in the homes themselves. In terms of big data, it’s difficult to say what we’ll learn in the next 2-3 years. That’s the beauty of big data: you don’t know what patterns will be uncovered from studying the data.

The New Jersey Institute of Technology is offering an online Master of Science in Computer Science (MSCS) program. We encourage you to check it out.
