What Is the Proper Level of Standardization for the IoT?

    The Internet of Things (IoT) is taking shape at a rapid clip, and, as with any other cooperative technology initiative, the question of standards is starting to draw serious attention.

    While it seems obvious that millions of sensors distributed around the globe would need some sort of interoperable framework, what about the rest of the IoT infrastructure? When we delve into wireless networks, backhaul and even analytics, where will standards help overall IoT functionality and where will they hurt?

    At the recent Enterprise IoT Summit in Austin, Texas, InterDigital Executive Vice President Jim Nolan pointed out that when it comes to municipal IoT endeavors like smart cities, the need for standardization will be quite broad. Without commonality in M2M communications and other facets, the development of smart cities could see a collective cost overrun of some $341 billion, or about 30 percent of the cost of implementation, according to a recent study by Machina Research. When we consider that the World Economic Forum estimates that the IoT could drive some two-thirds of global GDP over the next decade, this represents a huge, and utterly avoidable, expense for the world economy.

    Already, multiple organizations are looking to standardize specific pieces of the IoT, says Maciej Kranz. The IEEE has launched multiple initiatives aimed at bringing industry, academia, entrepreneurs and investors to the table over basic architectural frameworks and the means to merge policy decisions with technological developments. In addition, the oneM2M Consortium is working on a common M2M service layer, while the AVnu Alliance is building standards for time-sensitive networking. Meanwhile, various industry consortia are looking into Industrial IoT standards covering interconnects, analytics and even employee retraining for the new work environment.

    Still, the shadow of Mac vs. PC and VHS vs. Beta is starting to creep over the IoT as well. According to the Wall Street Journal, two industry giants, GE and Siemens, have developed competing cloud platforms that aim to link devices, software and processes in what will become the next-generation manufacturing sector. GE has so far drawn about 300 partners to its Predix system, while Siemens is said to have about 100 partners for MindSphere. It is difficult to say whether segmentation on this level would help or hurt the broader IoT – is there any reason why, say, rival shoe companies would need interoperable factories? But from a technological perspective, anything that impedes the smooth flow of information is generally considered a net negative.

    And while certain highly regulated industries, like health care, require a fairly open framework for data exchange, it can be difficult to determine exactly how deep the commonality should extend. IBM has adopted SNOMED CT, an international standard for clinical terminology, for its Watson-based health care data analytics platform. This should help streamline the exchange of medical information between organizations, which in turn should improve processes like diagnostics, treatment coordination and prescription fulfillment. At the same time, the company is working with the FDA to implement the blockchain open ledger format for health data collection and other research-related functions. As a vendor solution, however, data exchange becomes more complicated between IBM users and organizations that have adopted rival platforms. (Disclosure: I provide content services for IBM.)
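    The value of a shared clinical terminology like SNOMED CT is that two systems with different local vocabularies can exchange records unambiguously by mapping their local terms to the same concept identifier. The sketch below illustrates that idea in simplified form; the dictionary structure, system names and concept-ID format are illustrative assumptions, not any vendor's actual API or the real SNOMED CT data model.

    ```python
    # Hypothetical local-term -> shared-concept mappings for two systems.
    # Each system uses its own labels, but both point at a common concept ID
    # (the ID format here is an illustrative placeholder).
    SYSTEM_A = {"MI": "SCT-22298006"}                       # local abbreviation
    SYSTEM_B = {"myocardial infarction": "SCT-22298006"}    # local full term

    def translate(term, source, target):
        """Translate a local term from the source system into the target
        system's local term via the shared concept ID; returns None if the
        two systems have no common concept for the term."""
        concept = source.get(term)
        if concept is None:
            return None
        # Invert the target mapping: concept ID -> target's local term.
        reverse = {cid: t for t, cid in target.items()}
        return reverse.get(concept)

    print(translate("MI", SYSTEM_A, SYSTEM_B))  # -> myocardial infarction
    ```

    Without the shared concept layer, every pair of systems would need its own bespoke term-to-term mapping; with it, each system only has to maintain one mapping to the common standard.
    
    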

    It’s a safe bet that few people are looking forward to a universally interoperable IoT, which would have the potential to harness the entire world economy under a single, globally distributed computer brain. But at the same time, we don’t want countless isolated data silos interfering with the IoT’s primary function: compiling large data sets for advanced analytics.

    Somewhere there is a middle ground, but it will probably take a few years’ worth of trial and error in real-world production environments to find it.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
