IBM Monitors Social Media to Provision Cloud Capacity at Australian Open

    The trouble with managing any application these days is that there is rarely much warning before demand for compute capacity suddenly spikes. This is especially true for applications running on cloud computing platforms that can be accessed by hordes of people who suddenly whip out their smartphones or tablets because an unexpected event has occurred.

    But as is the case with all things, to be forewarned is to be forearmed. Taking that premise to heart, IBM is analyzing social media feeds from services such as Twitter and Facebook at the Australian Open this week to better understand where demand for specific applications may come from next.

    According to John Kent, IBM worldwide sponsorship lead, IBM already has a pretty good idea which match-ups will drive the most demand. But there is always an unexpected event that starts to drive buzz, which usually results in more people suddenly wanting to access a particular application that IBM, as a sponsor of the Australian Open, runs on its cloud computing platform in the U.S.

    By monitoring that social media activity, Kent says, IBM can literally see interest in a particular event building in real time, and it now provisions additional server capacity using IBM Tivoli Provisioning Manager software before that traffic hits. To accomplish that, IBM developed what it describes as a predictive cloud provisioning capability that uses that information to dynamically provision additional capacity as needed.
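    The predictive step Kent describes can be sketched roughly as follows. This is an illustrative assumption, not IBM's actual implementation or the Tivoli Provisioning Manager API: treat mentions above a baseline rate as a leading indicator of incoming traffic, and translate the expected surge into a server count ahead of time. All function names, thresholds, and conversion factors here are hypothetical.

    ```python
    def servers_to_provision(mentions_per_min, baseline_mentions=100,
                             requests_per_mention=5, capacity_per_server=1000):
        """Estimate extra servers needed from current social media buzz.

        Mentions above the baseline are treated as a leading indicator of
        application traffic that has not arrived yet, so capacity can be
        provisioned before the spike hits.
        """
        surge = max(0, mentions_per_min - baseline_mentions)
        expected_extra_requests = surge * requests_per_mention
        # Round up: a partially loaded server still has to be provisioned.
        return -(-expected_extra_requests // capacity_per_server)

    # Quiet period: buzz at baseline, no extra capacity needed.
    print(servers_to_provision(100))   # 0
    # An upset match drives a spike in tweets; provision ahead of the traffic.
    print(servers_to_provision(1300))  # 6
    ```

    The point of the sketch is the ordering: the social signal arrives minutes before the traffic does, so acting on mentions rather than on server load buys the provisioning system its lead time.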

    That’s significant because rather than over-provisioning every resource in the data center to accommodate an infinite number of possibilities, IBM can now make better use of standby capacity. The end result, says Kent, is not only lower cost but also better application performance, because the amount of time it takes to provision servers to meet additional demand has been sharply reduced.

    The days when capacity planning was something IT organizations sketched out over weeks and months are over. The data center of today is expected to dynamically scale up and down to meet any requirement, regardless of how little warning there may be. Ultimately, that is going to rely not only on more IT automation, but also on a lot more intelligence about what’s happening in the world at large outside the data center.

    Mike Vizard
    Michael Vizard is a seasoned IT journalist, with nearly 30 years of experience writing and editing about enterprise IT issues. He is a contributor to publications including Programmableweb, IT Business Edge, CIOinsight and UBM Tech. He formerly was editorial director for Ziff-Davis Enterprise, where he launched the company’s custom content division, and has also served as editor in chief for CRN and InfoWorld. He also has held editorial positions at PC Week, Computerworld and Digital Review.
