The Internet of Things (IoT) has a potential transformational effect on the data center market, its customers, technology providers, technologies, and sales and marketing models, according to Gartner, Inc. Gartner estimates that the IoT will include 26 billion units installed by 2020, and by that time, IoT product and service suppliers will generate incremental revenue exceeding $300 billion, mostly in services.
"IoT deployments will generate large quantities of data that need to be processed and analyzed in real time," said Fabrizio Biscotti, research director at Gartner. "Processing large quantities of IoT data in real time will increase as a proportion of workloads of data centers, leaving providers facing new security, capacity and analytics challenges."
The IoT connects remote assets and provides a data stream between the asset and centralized management systems. Those assets can then be integrated into new and existing organizational processes to provide information on status, location, functionality, and so on. Real-time information enables a more accurate understanding of status, and it enhances utilization and productivity through optimized usage and better decision support. Business and data analytics provide insight into the requirements that the IoT data feed places on the business, and will help predict fluctuations in IoT-generated data and information.
"The enormous number of devices, coupled with the sheer volume, velocity and structure of IoT data, creates challenges, particularly in the areas of security, data, storage management, servers and the data center network, as real-time business processes are at stake," said Joe Skorupa, vice president and distinguished analyst at Gartner. "Data center managers will need to deploy more forward-looking capacity management in these areas to be able to proactively meet the business priorities associated with IoT."
The magnitude of network connections and data associated with the IoT will accelerate a distributed data center management approach that calls for providers to offer efficient system management platforms.
"IoT threatens to generate massive amounts of input data from sources that are globally distributed. Transferring the entirety of that data to a single location for processing will not be technically or economically viable," said Mr. Skorupa. "The recent trend to centralize applications to reduce costs and increase security is incompatible with the IoT. Organizations will be forced to aggregate data in multiple distributed mini data centers where initial processing can occur. Relevant data will then be forwarded to a central site for additional processing."
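The aggregate-at-the-edge pattern described above can be sketched as follows. This is an illustrative assumption, not anything Gartner specifies: the function name, the summary fields, and the threshold that decides which readings count as "relevant" are all hypothetical.

```python
# Hypothetical sketch of edge aggregation: summarize raw IoT readings
# locally at a mini data center, and forward only the "relevant" subset
# (here, readings above an assumed threshold) to the central site.

from statistics import mean

def process_at_edge(readings, threshold=100.0):
    """Summarize raw sensor readings locally; flag outliers for forwarding.

    Returns a compact summary (kept at the edge) plus only the readings
    that exceed the threshold, standing in for the 'relevant data'
    forwarded to a central site for additional processing.
    """
    summary = {
        "count": len(readings),
        "mean": mean(readings) if readings else None,
    }
    relevant = [r for r in readings if r > threshold]
    return summary, relevant

summary, relevant = process_at_edge([12.0, 98.5, 150.2, 101.3])
# Only the two readings above the threshold would be forwarded centrally.
```

The point of the sketch is the bandwidth trade-off: the edge keeps (or discards) the bulk of the raw stream and ships a far smaller payload upstream.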
This new architecture will present operations staffs with significant challenges, as they will need to manage the entire environment as a homogeneous entity while still being able to monitor and control individual locations. Furthermore, backing up this volume of data will present potentially insoluble governance and capacity issues: network bandwidth and remote storage bandwidth will be strained, and backing up all raw data is likely to be unaffordable. Consequently, organizations will have to automate selective backup of the data they believe will be valuable or required. This sifting and sorting will itself generate additional Big Data processing loads, consuming further processing, storage and network resources that will have to be managed.
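The selective-backup automation mentioned above might look like the following minimal sketch. The record fields and the example policy are assumptions for illustration only; in practice the policy would encode an organization's own governance rules.

```python
# Hypothetical sketch of automated selective backup: instead of backing up
# all raw IoT data, apply a policy (a predicate) that keeps only records
# the organization deems valuable or required.

def select_for_backup(records, policy):
    """Return only the records the backup policy accepts."""
    return [r for r in records if policy(r)]

records = [
    {"sensor": "a1", "value": 20.1, "alert": False},
    {"sensor": "a2", "value": 87.9, "alert": True},
]

# Example policy (an assumption): only back up records that raised an alert.
to_back_up = select_for_backup(records, lambda r: r["alert"])
```

Note that running the policy over every record is exactly the extra "sifting and sorting" workload the article warns about: the filter itself consumes processing, storage and network resources that must be capacity-managed.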
"Data center operations teams and providers will need to deploy more forward-looking capacity management platforms that include a data center infrastructure management (DCIM) approach, aligning IT and operational technology (OT) standards and communications protocols, so they can proactively provision the production facilities needed to process IoT data points according to business priorities and needs. Already in the data center planning phase, throughput models derived from statistical capacity management platforms or infrastructure capacity toolkits will include business applications and their associated data streams," said Mr. Biscotti. "These comprehensive scenarios will drive design and architecture changes, moving toward virtualization as well as cloud services. This will reduce complexity and boost on-demand capacity to deliver reliability and business continuity."