What’s the best place to see “futuristic” technology in action? Sometimes, you’ll see a demo or paper that explains it well, but often Hollywood (and science fiction in general) is the best source. In the case of IoT (Internet of Things), there are two essential movies: “Terminator” and “Avatar”. (If you haven’t seen either movie yet, shame on you! Your entire future career in technology is in jeopardy.)
In “The Terminator”, Skynet, a global intelligent machine with sensors everywhere, tries to destroy mankind. In “Avatar”, Eywa, a global being centered at the Tree of Souls with biological sensors everywhere, repels and defeats a destructive, militaristic corporation. Before we parse the lessons from those films, however, let’s get a few principles straight.
Mining Hollywood movies for technological insights requires that you remember two things. First, movies, like long-term product roadmaps, are stories. Don’t get too tied down to the particular details the storyteller chose. The truth and insights you’re after here are in the big picture (pun intended).
Second, look past the moral aspects of the characters in the story. The writer just picked those to make the movie interesting. For our purposes, it doesn’t matter that Skynet is evil and that Eywa is good. We are more interested in what the portrayed sensor-rich network is and what it could do. The former gives its structure, while the latter gives its function. Those two things are what we most want to understand about IoT.
Now, let’s step back from Hollywood for a second and review some facts.
“It’s really hard to stimulate your brain with no [sensation of] light. It’s blanking me. I can feel my brain just not wanting to do anything.” Adam Bloom, sensory deprivation subject in BBC documentary “Total Isolation” (2008) [1]
Without sensory input, the brain has little to do. Imagine yourself without visual, auditory, tactile, olfactory, taste, and other inputs. What is there for your brain to do?
Brains need sensory inputs to work properly. You see from the Adam Bloom quote that, with limited sensory input, some minds just stop. This can be positive. Some psychotherapists think a short period of sensory isolation refreshes the brain. But everyone agrees the senses are important to a functioning brain.
But sensors also need processing to be useful. If the light patterns detected by the eye didn’t go to the brain for processing but just terminated, there would be no vision. If the audio signals detected by the ear didn’t go to the brain for processing but just terminated, there would be no hearing.
This lesson applies to the technological world as well. If your product includes sensors, it necessarily includes some form of processing. Otherwise, the product is useless.
Now with that basic set of comments, let’s grab the popcorn and return to the movies.
Skynet is not just a disembodied disk of facts and Eywa is not just a single tree. No, what makes Skynet and Eywa powerful is that they have, or connect to, sensors everywhere. Those sensors provide real-time information about people, the environment, resources, and so on. Without that real-time data, Skynet is just a database and Eywa is just a single tree.
IoT, like Skynet and Eywa, starts with sensors. In cellphones and tablets alone, we see worldwide annual shipments of over three billion core motion sensors. If you add in other sensor types and other market applications, some observers predict shipments of 100 billion sensors in just a few years. Since sensors don’t immediately wear out, this translates to a world with well over a trillion sensors in use.
Those sensors are networked, at least locally, and increasingly over a wider area than that. Networking is central to Skynet, Eywa, and the IoT. As we noted above, though, those networked sensors are paired with networked processing.
Each local sensor (or group of sensors) necessarily requires some level of processing. Processing begins with configuration, calibration, sampling, and reading the data. However, in most cases, this processing also includes what is called sensor fusion. Sensor fusion synthesizes improved data and intelligence by viewing the sensor data holistically. The resulting intelligence can be about events, contexts, and more.
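To make “sensor fusion” concrete, here is a minimal, illustrative sketch (not any particular product’s algorithm) of a classic complementary filter. It blends a gyroscope, which integrates smoothly but drifts, with an accelerometer, which is noisy but drift-free, into a single tilt estimate that is better than either sensor alone:

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into one tilt estimate.

    The gyroscope term tracks fast motion; the accelerometer term slowly
    pulls the estimate back toward the true gravity-referenced angle,
    cancelling gyro drift. This blending is sensor fusion in miniature.
    """
    gyro_angle = angle + gyro_rate * dt          # integrate angular rate
    accel_angle = math.atan2(accel_x, accel_z)   # tilt from gravity vector
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Example: a stationary device whose gyro drifts slightly (0.01 rad/s)
# while the accelerometer correctly reports level (gravity on z only).
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.01,
                                 accel_x=0.0, accel_z=9.81, dt=0.01)
```

Pure gyro integration would accumulate 0.01 radians of error over this run; the fused estimate stays well under half that, because the accelerometer keeps correcting it.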
Now, notice something interesting. The IoT looks very much like a brain. The local device node resembles a neuron. The networking between the nodes (including higher level processors) resembles the pathways in the brain. Finally, the distributed processing resembles the overall structure and function of the brain. But, in another example of fractal structure, zoom out and you find that human brains and IoT devices network similarly too. In fact, one could easily describe our overall human interactions in IoT terms but I’ll spare you that in this short article.
Consider the figure below. It’s a high level view of the overall IoT where each node is shown as either a device or a human brain. Because the IoT device and the brain share a similar architecture, they are interchangeable, for purposes of our analogy. Inputs can either be from biological sensors like ears and eyes or from man-made ones like microphones and cameras. Outputs can also be either from biological actuators like the mouth/vocal tract or from man-made ones like speakers. Of course, in the case of devices, data itself can be either input or output over network links.
Individual nodes can locally network with other nodes (referred to in the figure as the “local network brain”) and/or interconnect to the “cloud” (referred to in the figure as the “global network brain”). It’s really a beautiful and simple structure. The important thing to remember, though, is that it starts with sensors (including actuators) and processing.
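As an illustration only (the class and method names here are invented for this sketch, not a real API), a node in that picture reduces to three roles: take sensor input, process it locally, and share the result with a network:

```python
class IoTNode:
    """One node in the 'IoT brain': sensors in, local processing, network out."""

    def __init__(self, name):
        self.name = name
        self.readings = []

    def sense(self, value):
        """Input from a local sensor (biological or man-made)."""
        self.readings.append(value)

    def process(self):
        """Local processing: here, simply average the recent readings."""
        return sum(self.readings) / len(self.readings) if self.readings else None

    def publish(self, network):
        """Share processed intelligence with the local or global network brain."""
        network.append((self.name, self.process()))

# A node sensing locally, then publishing its processed result upstream
network = []
node = IoTNode("thermostat")
for reading in (20.5, 21.0, 21.5):
    node.sense(reading)
node.publish(network)
```

The point of the sketch is the division of labor: raw readings stay local, and only processed intelligence travels over the network.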
Remember, sensors and sensor processing go together. Under-provisioning either one can impair the whole. A system with limited sensors has little to do, while a system with limited sensor processing does little. Poor calibration of a sensor reduces the quality of the data it provides, and that dumbs down the whole network. High-quality calibration and great sensor fusion improve the intelligence of the whole network.
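To see why calibration matters, here is a toy two-point calibration, with made-up numbers. Measuring a sensor against two known references lets you correct both its offset and its gain error:

```python
def two_point_calibration(raw_lo, raw_hi, true_lo, true_hi):
    """Derive a linear correction from two known reference points
    (e.g. ice water at 0 C and boiling water at 100 C for a thermometer)."""
    scale = (true_hi - true_lo) / (raw_hi - raw_lo)
    return lambda raw: (raw - raw_lo) * scale + true_lo

# A hypothetical thermometer that reads 1.5 at 0 C and 103.0 at 100 C
correct = two_point_calibration(raw_lo=1.5, raw_hi=103.0,
                                true_lo=0.0, true_hi=100.0)
reading = correct(52.25)   # corrected to the true temperature, 50.0 C
```

Skip this step, and every downstream consumer of the data, all the way up the “network brain,” inherits the sensor’s offset and gain errors.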
In fact, one could say that good sensor processing raises the IQ of the whole IoT brain. Higher IQ sensor networks then allow for better decisions. And, whether it’s recommending you take your car in for maintenance or deciding which flying object is an enemy missile, poor decisions have bad consequences. Remember Skynet’s role in the movie Terminator? It’s important we get it right.
But just what can you do with this evolving worldwide brain? Well, lots of things but I’ll cover that in my next post.
A quick request to our readers: Hillcrest was one of the earliest users of MEMS sensors for consumer electronics. We are a leading independent sensor processing software vendor. We also approach everything with an eye to the user experience because we believe that is the real end product the consumer buys. In this column, we will write about news and trends we find interesting, relevant and important. However, we’d really love to hear what you’d like us to write about. Do you have questions, concerns, or a controversial opinion that you’d like to see addressed in this column? Let us know in the comments or by email.
[1] “Alone in the Dark”, by Huw Jones, https://news.bbc.co.uk/2/hi/uk_news/magazine/7199769.stm
[2] Partially an adaptation and repurposing of a human brain diagram in a fantastic article at goo.gl/hNfKJS .