Enterprise executives are already reeling from the sea changes taking place in virtual, cloud and now software-defined infrastructure. But while most of these developments are framed in terms of how they will improve IT environments for human users, the fact is that the enterprise is rapidly becoming more machine-centric and will likely become predominantly so in the near future.
Machine-to-machine (M2M) communication is nothing new, but the speed at which it has become a major piece of the IT tapestry has taken some observers by surprise. According to Infonetics, the number of M2M connections is on pace to nearly triple to more than 4 billion by 2017, with the M2M services market more than doubling to $31 billion. Much of this growth is coming from personal area network (PAN) topologies that incorporate wireless technologies like Wi-Fi and Bluetooth, but about 16 percent comes from backhaul services – largely automated systems that interact with each other with little or no human involvement.
Indeed, enterprises that embrace wireless networking and BYOD architectures will have no choice but to accommodate increased M2M operations and develop policies to manage them. Analysts at Berg Insight say wireless M2M will grow by 24.4 percent per year to hit 489.2 million connections in 2018. And we’re not just talking about cell phones and tablets: the group says that as everything from cars to washing machines becomes more connected, M2M activity will be a major driver of Big Data traffic going forward.
At first blush this may seem like nothing more than a data management problem, but it goes deeper than that. M2M traffic has a tendency to introduce latency between IT components that can hamper application performance even when traditional management and monitoring tools report that all systems are functioning normally. This has led to a number of vendor partnerships aimed at boosting visibility into complex transactional environments, such as the recent tie-up between Splunk and Inetco that integrates the latter’s NetStream data transformation tool into the Splunk Enterprise platform. The combination gives Splunk real-time visibility into application payloads by converting network data into rich transactional streams that can be analyzed and managed to improve utilization, boost security, and augment a number of other functions.
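To make the idea concrete, here is a minimal Python sketch of the general technique: correlating raw per-hop network events by a shared transaction ID so that end-to-end latency becomes visible even when each component looks healthy on its own. The event fields, hop names and threshold are hypothetical illustrations, not Inetco’s or Splunk’s actual data model or APIs.

```python
# Minimal sketch: stitch per-hop network events into transactions and
# total their latency. All names here are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class NetEvent:
    txn_id: str      # correlation ID extracted from the payload
    hop: str         # e.g. "lb->app", "app->db"
    ts_sent: float   # epoch seconds
    ts_recv: float

def build_transactions(events):
    """Group per-hop events by transaction ID and total their latency."""
    txns = {}
    for ev in events:
        txns.setdefault(ev.txn_id, []).append(ev)
    for txn_id, hops in txns.items():
        total = sum(h.ts_recv - h.ts_sent for h in hops)
        yield {"txn": txn_id, "hops": len(hops), "latency_s": total}

THRESHOLD_S = 0.5  # hypothetical SLA: flag transactions slower than 500 ms

events = [
    NetEvent("t1", "lb->app", 100.000, 100.020),
    NetEvent("t1", "app->db", 100.021, 100.750),  # the slow hop hides here
    NetEvent("t2", "lb->app", 101.000, 101.015),
]

for txn in build_transactions(events):
    if txn["latency_s"] > THRESHOLD_S:
        print(f"SLOW {txn['txn']}: {txn['latency_s']:.3f}s over {txn['hops']} hops")
```

Per-device metrics would show every hop above as healthy; only the assembled transaction reveals that t1 spent nearly three-quarters of a second between the application and the database.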
The need for enhanced M2M security in particular has not yet sunk in at many organizations, it seems. As Tatu Ylönen, developer of the Secure Shell (SSH) protocol, told our sister publication Enterprise Networking Planet, poor SSH key management is a “ticking time bomb” that could blow up in the enterprise at any moment. SSH is widely used in router, virtualization and cloud service management systems, where it handles login authorization for increasingly heavy M2M environments. But while interactive user access is generally well governed, with requisite passwords and key updates, machine access is largely unmanaged. In some instances, this has resulted in 80 or even 90 percent of system access going to automated M2M operations, giving hackers a huge window of opportunity to steal vital secrets or plant malicious code.
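A concrete starting point is simply inventorying who, and what, holds key-based access. Below is a minimal Python sketch of such an audit, assuming keys live in the conventional /home/&lt;user&gt;/.ssh/authorized_keys files; a real key management program would add discovery across hosts, rotation and removal on top of this.

```python
# First-pass inventory of SSH authorized keys: list every entry and
# flag keys that carry no from= or command= restriction. A minimal
# sketch; real key management needs far more than this.
import glob
import os

KEY_TYPES = ("ssh-rsa", "ssh-ed25519", "ssh-dss",
             "ecdsa-sha2-nistp256", "ecdsa-sha2-nistp384")

def parse_key_line(line):
    """Return (comment, restricted) for one authorized_keys entry."""
    fields = line.split()
    if fields[0] in KEY_TYPES:
        # Plain entry: type, base64 blob, optional comment. With no
        # from= or command= option, any holder of the private key gets in.
        comment = fields[2] if len(fields) > 2 else "(no comment)"
        return comment, False
    # Options such as from= or command= precede the key type.
    return "(options present)", True

def audit(pattern="/home/*/.ssh/authorized_keys"):
    for path in glob.glob(pattern):
        owner = path.split(os.sep)[2]  # /home/<owner>/.ssh/...
        with open(path) as f:
            for lineno, raw in enumerate(f, 1):
                line = raw.strip()
                if not line or line.startswith("#"):
                    continue
                comment, restricted = parse_key_line(line)
                flag = "" if restricted else "  <-- unrestricted"
                print(f"{owner}: line {lineno}, {comment}{flag}")

audit()
```

Even a crude census like this tends to surprise administrators: keys with no identifying comment, no source restriction and no known owner are exactly the unmanaged machine credentials Ylönen warns about.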
Some people may be unnerved by the thought of devices using and manipulating data for their own designs, but this is in fact a natural by-product of the cloud-based infrastructure that the enterprise has been pursuing so doggedly for the past decade. Advanced automation is not merely a desirable tool for easing operators’ management burden; it is a vital component of modern infrastructure now that virtualized systems and architectures push operations beyond the scale and speed of human comprehension.
The key, though, is not to simply let machines run amok, but to ensure that M2M interactions take place within a well-defined framework – one that is flexible enough to handle rapidly changing circumstances but robust enough to ensure the proper oversight and security of behind-the-scenes data activity.
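In miniature, such a framework might look like the following Python sketch, in which every machine-initiated action is checked against an explicit policy and logged before it runs. The identities, actions and policy table are hypothetical; a production system would rest on cryptographically verified machine identities and centrally managed policy.

```python
# Sketch of a policy gate for M2M actions: explicit allowlist plus an
# audit trail. Identities, actions and the policy table are hypothetical.
import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

POLICY = {
    # (machine identity, action) -> allowed
    ("billing-batch-01", "read:transactions"): True,
    ("billing-batch-01", "write:ledger"): True,
    ("report-gen-07", "read:transactions"): True,
}

def m2m_call(identity, action, fn, *args, **kwargs):
    """Run fn only if (identity, action) is explicitly allowed; log both outcomes."""
    if not POLICY.get((identity, action), False):
        logging.warning("DENIED %s -> %s", identity, action)
        raise PermissionError(f"{identity} may not {action}")
    logging.info("ALLOW %s -> %s", identity, action)
    return fn(*args, **kwargs)

# Example: an automated reporting job may read transactions but not write.
m2m_call("report-gen-07", "read:transactions", lambda: "ok")
try:
    m2m_call("report-gen-07", "write:ledger", lambda: "ok")
except PermissionError as e:
    print("blocked:", e)
```

The point is not the particular mechanism but the posture: machine access is granted deliberately, denied by default and always observable after the fact.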