Slow Progress Toward IoT Standardization

    The crucial element in the establishment of a functioning IoT environment is connectivity. All devices must connect with newly deployed enterprise infrastructure on the edge, and many devices will need to connect with each other as well.

    The problem is that connectivity is a tricky business: it involves multiple layers of data coordination, each utilizing one or more protocols, file formats, packet structures and the like. So if the IoT is going to get off the ground at all, the data community as a whole needs to come up with some pretty powerful standards, and quickly.

    The current lack of standards, in fact, is emerging as the one thing that could curtail what is otherwise expected to be a robust growth market, says research house MarketsandMarkets. The company estimates that the entire connected enterprise industry, which includes everything from smart devices to centralized systems and infrastructure, could nearly quadruple to $400 billion by 2021, a compound annual growth rate of more than 31 percent. Without a set of clearly defined, uniform standards, however, we can expect to see this growth restrained, particularly in areas like manufacturing and utilities, which require close cooperation across a number of systems in order to effectively leverage the IoT.
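    The growth figures above can be sanity-checked with a little arithmetic. A quick sketch, assuming a five-year horizon ending in 2021 (the article does not state the base year):

```python
# Sanity check on the MarketsandMarkets figures quoted above:
# a ~31% compound annual growth rate over an assumed five years
# should roughly quadruple the market, consistent with the
# "nearly quadruple to $400 billion" estimate.
cagr = 0.31
years = 5
growth_factor = (1 + cagr) ** years
print(round(growth_factor, 2))       # ~3.86x, i.e. "nearly quadruple"

implied_base = 400 / growth_factor   # implied starting market size, in $B
print(round(implied_base, 1))        # ~103.7
```

    The implied ~$104 billion base is an inference from the quoted numbers, not a figure from the report itself.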

    A quick look at the range of IoT standards already in place gives a good idea of the challenge facing IoT connectivity, says Black Duck VP of Research Baljeet Malhotra. From the M2M communications protocol being developed by the OPC Foundation to solutions for messaging, web transfer, queueing, and a host of other functions, the IoT connectivity market is a veritable alphabet soup of standards. Making sure you select the right protocol for the right task is nothing new to IT, of course, but this becomes extremely difficult in an environment that is not only highly distributed but is designed to be largely self-functioning.
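    The task-to-protocol matching problem Malhotra describes can be sketched as a simple lookup. The pairings below are illustrative assumptions drawn from commonly cited IoT protocols (MQTT for messaging, CoAP for constrained web transfer, AMQP for queueing, OPC UA from the OPC Foundation for industrial M2M), not recommendations from the article:

```python
# Toy illustration of the "right protocol for the right task" problem.
# The task names and pairings are assumptions for illustration only.
PROTOCOL_FOR_TASK = {
    "pub/sub messaging": "MQTT",         # lightweight publish/subscribe
    "constrained web transfer": "CoAP",  # REST-like, designed for tiny devices
    "message queueing": "AMQP",          # broker-based reliable queueing
    "industrial M2M": "OPC UA",          # the OPC Foundation's protocol
}

def pick_protocol(task: str) -> str:
    """Return a commonly cited protocol for a task, or flag the gap."""
    return PROTOCOL_FOR_TASK.get(task, "no established standard")

print(pick_protocol("pub/sub messaging"))  # MQTT
print(pick_protocol("device discovery"))   # no established standard
```

    The real difficulty, of course, is that no such authoritative table exists: in a highly distributed, largely autonomous environment the mapping itself is what the competing standards bodies are still fighting over.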

    And this is only dealing with the narrow focus of connectivity, says Diginomica’s Cath Everett. When we start to contemplate full interoperability, we have to coordinate activity across multiple layers of the IoT stack, plus make sure that all of this back-and-forth can take place across national and international jurisdictions. To accomplish this, the IoT industry will first have to model itself on the mobile communications industry but then figure out a way to push this coordinated ecosystem to generations of legacy equipment, most of which has only rudimentary interoperability capabilities. Without this, IoT infrastructure will suffer from poor performance, security vulnerabilities and unexpected costs, perhaps to the point where both the public and the commercial entities looking to capitalize on the IoT will become disillusioned with the whole idea.

    Fortunately, this necessity for broad interoperability is not lost on the designers and developers of IoT infrastructure. Earlier this week, the OpenFog Consortium, which has the backing of some 50 organizations, released its first reference architecture aimed at setting baseline standards for product design, security and other elements. The goal is to establish a common framework that provides reliable communications from the IoT end node to the cloud service layer, with minimal backhaul and ultra-low latency. The full spec is expected to cover eight technical areas, including SoCs, gateways and applications, as well as testing procedures and certification measures to gauge compliance. Now that the baseline architecture has been defined, the group is turning to low-level functions and the requirements for a minimally viable set of interfaces.

    Despite the rudimentary state of standardization, there is every reason to expect the IoT to make its way into commercial operation sooner rather than later. This means that as the initial IoT ecosystem evolves, we can expect it to become more standardized, more connected and more interoperable.

    After all, it would be sad indeed if instead of getting an Internet of Things, we got multiple Internets of Some Things.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.

