Terminology that loosely ring-fences a group of related technologies is often very helpful in engineering discussions – until the hype machine gets hold of it. “Cloud” is a fairly obvious victim. Initially conceived to describe large-scale, highly available, geo-redundant, and professionally managed Internet-based services that are “up there and far away”, with the user neither knowing nor caring about particular machines or even datacenter locations, the term has now been stretched so far that a hard drive manufacturer sells a network-attached drive as a “cloud” that lets you store content “safely at home”. Thank you very much. For “cloud”, the dilution of the term’s usefulness took probably a few years, with milestones like the labeling of datacenter virtualization as “private cloud” and, more recently, the broad relabeling of practically all managed hosting services – or even outsourced datacenter operations – as “cloud”.
The term “Internet of Things” is being diluted into near nonsense even faster. It was initially meant to describe, as a sort of visionary lighthouse, the interconnection of sensors and physical devices of all kinds into a network much like the Internet, in order to gain new insights about, and enable new automated interaction with, the physical world – juxtaposed with today’s Internet, which is primarily oriented towards human-machine interaction. What we’ve ended up with in today’s discussions is a term that has become synonymous with what I have started to call the “Thing on the Internet”.
A refrigerator with a display and a built-in browser that lets you browse the nearest supermarket’s special offers, including the ability to order them, may be cool (at least on the inside, even when the gadget novelty has worn off), but it’s conceptually and even technically no different from a tablet or phone – and that would remain true even if it had a bar code scanner with which one could obsessively check the milk and margarine in and out (in which case professional help may be in order). The same is true for the city guide or weather information functions in a fancy connected-car multimedia system, or today’s top news headline being burnt into a slice of bread by the mythical Internet toaster. Those things are things on the Internet. They’re the long-oxidized fuel of the 1990s dotcom boom and bust. Technically and conceptually boring. Islands. Solved problems.
The challenge is elsewhere.
“Internet of Things” ought to be about internetworked things, about (responsibly) gathering and distributing information from and about the physical world, about temperature and pollution, about heartbeats and blood pressure, about humidity and mineralization, about voltages and amperes, about liquid and gas pressures and volumes, about seismic activity and tides, about velocity, acceleration, and altitude – it’s about learning about the world’s circumstances, drawing conclusions, and then acting on those conclusions, often again affecting the physical world. That may include the “Smart TV”, but not today’s.
The “Internet of Things” isn’t really about things. It’s about systems. It’s about gathering information in certain contexts, or even discovering new contexts, and then improving the system as a result. You could, for instance, run a bus line from a suburb into town on a sleepy Sunday morning with a promise that no passenger ever waits more than, say, 10 minutes – instead of running on a fixed schedule of every 60–90 minutes that morning – and make public transport vastly more attractive, if only the bus system knew where the prospective passengers were and could dynamically dispatch and route a few buses along a loose route.
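The dispatch idea above can be sketched in a few lines. This is a deliberately naive greedy assignment – all names (`StopRequest`, `dispatch`, the bus ids) are hypothetical, and a real dispatcher would solve a vehicle-routing problem rather than assign one stop per bus – but it shows the shape of the decision the system has to make against the 10-minute promise:

```python
from dataclasses import dataclass

@dataclass
class StopRequest:
    stop_id: str
    waiting_minutes: float   # longest current wait at this stop
    travel_minutes: dict     # minutes each candidate bus needs to reach this stop

def dispatch(requests, buses, max_wait=10.0):
    """Serve the longest-waiting stops first; for each, pick the bus that
    can arrive soonest, and flag stops where the promise is in danger."""
    assignments, at_risk = {}, []
    for req in sorted(requests, key=lambda r: -r.waiting_minutes):
        best = min(buses, key=lambda b: req.travel_minutes[b])
        if req.waiting_minutes + req.travel_minutes[best] <= max_wait:
            assignments[req.stop_id] = best
        else:
            at_risk.append(req.stop_id)   # needs escalation: reroute or add a bus
    return assignments, at_risk
```

The `at_risk` list is the interesting output: it is the signal that the current fleet position can no longer honor the service promise and the system must act.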
“Let’s make an app” is today’s knee-jerk approach to realizing such an idea. I would consider it fair if someone called that discriminatory and elitist, as it excludes people too poor to afford a $200 pocket computer with a service plan, as well as many children, and the very many elderly people who went through their lives without always-on Internet and have no interest in dealing with it now.
It’s also an unnecessary complication, because the bus stop itself can, with a fairly simple (thermographic) camera setup, tell the system whether anyone is waiting – and whether they actually stay around or wander off – and the system can feed the currently projected arrival time back to a display at the bus stop, which can be reasonably protected against vandalism by shock and glass-break sensors that trigger alarms and have the camera record any such incident. The thermographic camera won’t tell us which bus line a prospective passenger wants to take, but a simple button might – and the camera also makes it easy to tell when a rambunctious ten-year-old pushes all the buttons and runs away.
Projecting the bus’s arrival time and planning the optimal route can be aided by city-supplied traffic information, collected by induction loops and camera systems in streets and at traffic-light crossings, which yields statistical projections by day of week and time of day as well as ad-hoc data about current traffic disturbances, diversions, and street conditions due to rain, ice, or fog – the latter also supplied by the buses themselves (“floating car data”) as they move along in traffic. The route is further informed by the bus driver’s shift information, the legally and contractually required rest times during the day, and the bus’s fuel or battery level and other operational health parameters that may require a stop at a depot.
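The core of such an arrival projection is combining the statistical baseline with the live signal. A minimal sketch under strong simplifying assumptions – segments keyed by a single id rather than by weekday and hour, and a single congestion factor standing in for all the loop, camera, and floating-car inputs; all names are hypothetical:

```python
def project_arrival_minutes(route_segments, baseline_minutes, congestion_factor):
    """Sum the statistical baseline travel time for each route segment,
    scaled by the current congestion factor for that segment (1.0 = normal),
    as reported by induction loops, cameras, and floating-car data."""
    return sum(baseline_minutes[seg] * congestion_factor.get(seg, 1.0)
               for seg in route_segments)
```

This is what gets pushed back out to the bus-stop displays and refreshed as the live factors change.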
All that data informs the computation of the optimal route, which is provided to the bus stops, to the bus (and its driver), and to those lucky passengers who can afford a $200 pocket computer with a service plan and have asked to be notified when it’s time to leave the corner coffee shop to catch the next bus in comfort. What we have in this scenario is a set of bidirectional communication paths from and to the bus, the bus driver, the bus stop, and the passengers, aided by sensor data from streets and traffic lights, all connecting up to an interconnected set of control and information systems that make decisions based on a combination of current input and past experience. Such systems need to ingest, process, and distribute information from and to tens of thousands of sources at the municipal level, and to be economically viable for operators and vendors they need to scale across thousands of municipalities. And the scenario I just laid out here is just one slice of one particular vertical.
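The communication pattern underneath – many publishers, many subscribers, decoupled by named channels – is publish/subscribe. A toy in-memory sketch of that layer (the class and topic names are invented for illustration; a real deployment would use a managed broker with authentication, persistence, and backpressure, not a Python dict):

```python
from collections import defaultdict

class Broker:
    """Tiny in-memory stand-in for the ingest/distribute layer: stops, buses,
    and passenger phones publish to and subscribe on named topics."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, message):
        for handler in self._subs[topic]:
            handler(message)
```

The hard part the paragraph above points at is not this pattern but running it at tens of thousands of sources per city, across thousands of cities, securely.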
Those systems are hard and complex, and they pose challenges in system capacity, scalability, reliability, and – towering above all – security that are at, or often still beyond, the cutting edge of the combined manufacturing and IT industries’ current abilities and maturity.
“Internet of Things” is not about having a little Arduino box fetch the TV schedule and sound an alarm when your favorite show is coming on.
That is cool, but it’s just a thing on the Internet.
© Copyright 2016, Clemens Vasters