How the Internet of Things Will Transform the Data Center


Gartner has identified seven challenges the Internet of Things will bring to the data center.

The Internet of Things (IoT) has a potential transformational effect on the data center market, its customers, technology providers, technologies, and sales and marketing models, according to Gartner, Inc. Gartner estimates that the IoT will include 26 billion units installed by 2020, and by that time, IoT product and service suppliers will generate incremental revenue exceeding $300 billion, mostly in services.

"IoT deployments will generate large quantities of data that need to be processed and analyzed in real time," said Fabrizio Biscotti, research director at Gartner. "Processing large quantities of IoT data in real time will increase as a proportion of data center workloads, leaving providers facing new security, capacity and analytics challenges."

The IoT connects remote assets and provides a data stream between each asset and centralized management systems. Those assets can then be integrated into new and existing organizational processes to provide information on status, location, functionality, and so on. Real-time information enables a more accurate understanding of status, and it enhances utilization and productivity through optimized usage and more accurate decision support. Business and data analytics provide insight into the business requirements of the data fed from the IoT environment and will help predict fluctuations in IoT-enriched data and information.
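To make the asset-to-management data stream concrete, here is a minimal sketch of the kind of telemetry record a remote asset might emit. The record shape, field names, and asset identifier are illustrative assumptions, not part of Gartner's report.

```python
from dataclasses import dataclass, asdict
import json
import time

# Hypothetical telemetry record streamed from a remote asset to a
# centralized management system; all field names are assumptions.
@dataclass
class AssetTelemetry:
    asset_id: str      # which remote asset is reporting
    status: str        # e.g. "running", "idle", "fault"
    location: str      # site or geographic identifier
    utilization: float # fraction of capacity in use, 0.0 - 1.0
    timestamp: float   # Unix time of the reading

reading = AssetTelemetry("pump-017", "running", "plant-berlin", 0.82, time.time())
payload = json.dumps(asdict(reading))  # serialized for transport upstream
print(payload)
```

A centralized system would ingest many such records per second, which is the volume-and-velocity pressure the analysts describe.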

"The enormous number of devices, coupled with the sheer volume, velocity and structure of IoT data, creates challenges, particularly in the areas of security, data storage management, servers and the data center network, as real-time business processes are at stake," said Joe Skorupa, vice president and distinguished analyst at Gartner. "Data center managers will need to deploy more forward-looking capacity management in these areas to be able to proactively meet the business priorities associated with IoT."

The magnitude of network connections and data associated with the IoT will accelerate a distributed data center management approach that calls for providers to offer efficient system management platforms. 

"IoT threatens to generate massive amounts of input data from sources that are globally distributed. Transferring the entirety of that data to a single location for processing will not be technically or economically viable," said Mr. Skorupa. "The recent trend to centralize applications to reduce costs and increase security is incompatible with the IoT. Organizations will be forced to aggregate data in multiple distributed mini data centers where initial processing can occur. Relevant data will then be forwarded to a central site for additional processing."
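The two-tier flow Skorupa describes can be sketched as follows: each mini data center aggregates raw readings locally and forwards only a summary, plus any anomalous raw values, to the central site. The threshold, site names, and summary fields are illustrative assumptions.

```python
from statistics import mean

ANOMALY_THRESHOLD = 90.0  # illustrative sensor limit, an assumption

def process_at_edge(site: str, readings: list[float]) -> dict:
    """Initial processing at a distributed mini data center."""
    anomalies = [r for r in readings if r > ANOMALY_THRESHOLD]
    return {
        "site": site,
        "count": len(readings),
        "mean": mean(readings),
        "anomalies": anomalies,  # only these raw values travel onward
    }

def forward_to_central(summaries: list[dict]) -> dict:
    """The central site sees aggregates, not the full raw stream."""
    return {
        "total_readings": sum(s["count"] for s in summaries),
        "sites_with_anomalies": [s["site"] for s in summaries if s["anomalies"]],
    }

edge = [
    process_at_edge("berlin", [70.1, 95.3, 80.0]),
    process_at_edge("osaka", [60.2, 61.0]),
]
print(forward_to_central(edge))
# → {'total_readings': 5, 'sites_with_anomalies': ['berlin']}
```

The design choice here mirrors the article's point: raw data stays near its source, and only compact, relevant results cross the wide-area network to the central site.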

This new architecture will present operations staffs with significant challenges: they will need to manage the entire environment as a homogeneous entity while still being able to monitor and control individual locations. Furthermore, backing up this volume of data will raise potentially insoluble governance and logistical issues; network and remote storage bandwidth will be constrained, and the capacity to back up all raw data is likely to be unaffordable. Consequently, organizations will have to automate selective backup of the data they believe will be valuable or required. This sifting and sorting will generate additional Big Data processing loads that consume further processing, storage and network resources, all of which will have to be managed.

"Data center operators and providers will need to deploy more forward-looking capacity management platforms that include a data center infrastructure management (DCIM) approach, aligning IT and operational technology (OT) standards and communications protocols, so that they can proactively provision facilities to process IoT data points according to business priorities and needs. As early as the data center planning phase, throughput models derived from statistical capacity management platforms or infrastructure capacity toolkits will include business applications and their associated data streams," said Mr. Biscotti. "These comprehensive scenarios will drive design and architecture changes, moving toward virtualization as well as cloud services. This will reduce complexity and boost on-demand capacity, delivering reliability and business continuity."
