IBM Aggressively Moves to Software-Defined Everything

    Today, IBM announced the release of its workload management software, Spectrum Computing, with the underlying goal of a software-defined data center. The apparent aim is to make the data center, both on premises and in the cloud, fully defined by software. This isn’t just IBM’s goal, of course; it’s also the natural outcome of software-defined efforts from other vendors, which today span software-defined storage, software-defined networking and software-defined infrastructure. IBM, arguably, has been the most aggressive to date on that last component, software-defined infrastructure.

    Targeting workloads that include both traditional analytics and the IBM Watson-defined area of cognitive computing, this announcement is an interesting indication of where the enterprise IT industry is aggressively going.

    Let’s explore that a bit more.

    Software-Defined Everything

    Overall, Spectrum Computing is an effort to give users an unmatched ability to dynamically select the performance they need for particular tasks. And with the new blended capability of choosing on-premises resources for control and security, or cloud resources for cost savings, all overlaid by policy restrictions and guidelines, the result should be an unprecedented level of control and capability that better optimizes both the existing and future resources of the related IT shop.

    Spectrum is an appropriate name for the offering because it represents the utilization of the full range, or spectrum, of IT resources without the inherent limitations of either cloud-only or on-premises-only approaches. It can also cover both x86 and POWER resources on top of IT’s storage and networking resources.

    So this latest announcement from IBM is as much a statement of direction as it is an announcement of current enhancements and offerings.


    There are three components to this latest announcement.

    1. IBM Spectrum Conductor is designed to bridge a company’s operating environments, on premises and in the cloud, and to decrease the time to results across a variety of complex legacy and scale-out applications, while protecting both the data and the results throughout the lifecycle.
    2. IBM Spectrum Conductor with Spark is an adaptation specifically targeting Apache Spark, the popular open source Big Data analytics framework, with the claim that it can provide up to 60 percent faster results.
    3. IBM Spectrum LSF is the easy-to-use workload management component created to accelerate research and design by a claimed 150 times while containing costs through advanced resource sharing and improved utilization of existing infrastructure.
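    The common thread across these components is policy-driven placement: each workload lands on the cheapest eligible resource pool, with policy (for example, data security) able to force work on premises. A minimal sketch of that idea follows; all names and fields here are illustrative assumptions, not IBM Spectrum APIs.

```python
# Hypothetical sketch of policy-driven workload placement, the core idea
# behind software-defined resource management: match each job to the
# cheapest resource pool that satisfies its policy constraints.
# None of these names are IBM Spectrum APIs; they are illustrative only.
from dataclasses import dataclass


@dataclass
class Pool:
    name: str
    cost_per_hour: float   # relative cost of running a job in this pool
    secure: bool           # True for on-premises pools under local control


@dataclass
class Job:
    name: str
    needs_secure_data: bool  # policy flag: this job must stay on premises


def place(job: Job, pools: list[Pool]) -> Pool:
    """Return the cheapest pool that satisfies the job's policy."""
    eligible = [p for p in pools if p.secure or not job.needs_secure_data]
    if not eligible:
        raise ValueError(f"no pool satisfies the policy for {job.name}")
    return min(eligible, key=lambda p: p.cost_per_hour)


pools = [
    Pool("on-prem-cluster", cost_per_hour=2.0, secure=True),
    Pool("public-cloud", cost_per_hour=0.5, secure=False),
]

# A job bound by a data-security policy is forced on premises; an
# unconstrained job goes to the cheaper cloud pool.
print(place(Job("risk-model", needs_secure_data=True), pools).name)
print(place(Job("batch-render", needs_secure_data=False), pools).name)
```

    In a real scheduler the policy and cost inputs would be far richer, but the blended on-premises/cloud decision described above reduces to this kind of constrained cost minimization.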

    Proof Points

    Early users of the software include Red Bull Racing, which is using IBM Spectrum LSF to manage its design simulations and race analytics.

    Other key examples, with the vendors choosing to remain anonymous, came from medical research and finance, both industries known for directing massive resources at problems critical to their customer base. From finding cures and identifying new and existing illnesses to modeling financial trends for investors, the need for a tool that aggressively manages and optimizes all of a firm’s IT resources against complex analytical tasks is unprecedented. It is that need this offering targets.

    Wrapping Up: The Future of Software-Defined Everything

    As this IBM solution evolves, it should shift from enabling resources like IBM’s SoftLayer and Watson to dynamically integrating them. I think the future of this effort isn’t just giving users the tools they need to provision and manage an increasingly flexible software-defined set of resources. I think it is also providing a cognitive resource that dynamically suggests, then proactively implements, the appropriate structure for a task, leaving the user far more time for research and far less time spent operating the tool itself.

    The eventual power of a tool like this lies in an AI engine doing the work itself, dynamically and automatically defining everything. We aren’t there yet, but this IBM Spectrum Computing announcement is a huge step toward enabling an AI-defined future.

    Rob Enderle is President and Principal Analyst of the Enderle Group, a forward-looking emerging technology advisory firm.  With over 30 years’ experience in emerging technologies, he has provided regional and global companies with guidance in how to better target customer needs; create new business opportunities; anticipate technology changes; select vendors and products; and present their products in the best possible light. Rob covers the technology industry broadly. Before founding the Enderle Group, Rob was the Senior Research Fellow for Forrester Research and the Giga Information Group, and held senior positions at IBM and ROLM. Follow Rob on Twitter @enderle, on Facebook and on Google+.
