    The Enterprise as Platform Developer

    Competition is heating up in the enterprise market, particularly when it comes to building the infrastructure to handle Big Data and the Internet of Things. But unlike past eras, when vendor titans duked it out for IT supremacy, the pressure these days is just as likely to come from the users of technology as from its developers.

    The primary example here is Facebook’s Open Compute Project, which seeks to build an end-to-end open source infrastructure suitable for either hyperscale deployments or highly converged platforms that can replace current data center environments. Facebook has developed much of its own hardware, primarily servers and networking, but has no interest in becoming a technology vendor, so it is releasing its designs as reference architectures to the wider enterprise community. The latest release is the “Big Sur” compute module, which the company has earmarked for the artificial intelligence workloads that will supposedly manage these ultra-complex data ecosystems far more effectively than humans can.

    The Big Sur announcement comes on the heels of Google’s release of the TensorFlow machine learning platform, which is used to power speech and image recognition functions for Google Now and Google Photos. The system can operate across data center infrastructure or within a smartphone, according to Google officials, and will likely play a key role in establishing neural networks that can self-configure according to changing data and traffic conditions. And like Facebook’s contributions, systems like TensorFlow have already gotten a fairly vigorous workout in production environments, albeit the proprietary ones of their hyperscale parents.
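
    That portability comes from TensorFlow’s core abstraction: a computation is defined once as a dataflow graph and then executed on whatever hardware is at hand, from a GPU cluster down to a phone. Here is a minimal sketch using TensorFlow’s original graph-and-session API (the toy linear model is purely illustrative, not anything Google has shipped):

        import tensorflow as tf

        # Define the computation as a graph, independent of where it will run.
        x = tf.placeholder(tf.float32, shape=[None, 3], name="features")
        W = tf.Variable(tf.zeros([3, 1]), name="weights")
        b = tf.Variable(tf.zeros([1]), name="bias")
        y = tf.matmul(x, W) + b  # a toy linear model

        # Execute the same graph on whatever device is available: a laptop
        # CPU, a GPU server in the data center or a mobile runtime.
        with tf.Session() as sess:
            sess.run(tf.initialize_all_variables())
            print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))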

    Perhaps the most significant advancement in database analytics also got its start not with a commercial developer but with a web-facing company that simply needed a better way of managing data than what was commercially available at the time. I’m talking about Hadoop, of course, which emerged from inside Yahoo when the company was having trouble matching its scale-out storage infrastructure with the highly parallel nature of its application processing. The Hadoop Distributed File System (HDFS) was quickly supplemented by the MapReduce parallel programming framework, plus metadata management solutions like HCatalog and the Hive query language, to form the underpinnings of a Big Data analytics stack that multiple third-party contributors continue to evolve in a variety of ways.
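
    The division of labor in that stack is easy to see in miniature: a mapper emits key-value pairs, the framework shuffles and sorts them by key across the cluster, and a reducer aggregates each key’s values, with HDFS distributing the data underneath. Here is the classic word-count illustration written as a pair of Hadoop Streaming scripts in Python (an instructional sketch, not Yahoo’s original code):

        # mapper.py -- read raw text from stdin, emit "word<TAB>1" pairs.
        import sys

        for line in sys.stdin:
            for word in line.split():
                print("%s\t%d" % (word, 1))

        # reducer.py -- Hadoop delivers mapper output sorted by key, so all
        # counts for a given word arrive consecutively and can be summed in
        # a single pass.
        import sys

        current_word, count = None, 0
        for line in sys.stdin:
            word, value = line.rstrip("\n").split("\t")
            if word != current_word:
                if current_word is not None:
                    print("%s\t%d" % (current_word, count))
                current_word, count = word, 0
            count += int(value)
        if current_word is not None:
            print("%s\t%d" % (current_word, count))

    Submitted through the Hadoop Streaming jar (hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py, plus -input and -output HDFS paths), the framework handles the partitioning, scheduling and fault tolerance, which is precisely the plumbing no single application team wants to rebuild.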

    All of this has not gone unnoticed by traditional enterprise vendors, which are realigning their business models to suit the emerging reality. Some, like HP, are splitting themselves up along high-performance and traditional systems lines, while others, like Dell and EMC, are merging into new industry powerhouses to better shield themselves from low-cost commodity rivals. Storage in particular looks to be the next wave of so-called “white box” solutions, with IDC noting recently that while sales of traditional storage arrays are declining, ODM (original design manufacturer) systems are on the way up and are largely responsible for both the revenue growth of the overall storage market and the increase in total capacity shipped worldwide.

    Large enterprises have long had internal development shops tasked with solving problems that off-the-shelf products do not address, but by and large those applications had to live within a defined infrastructure. With the rise of software-defined networking (SDN) and the software-defined data center (SDDC), however, organizations will be better able to customize their own data environments, making it harder for both hardware designers and software developers to produce generic platforms aimed at meeting broad industry needs.

    Like today’s digital consumer, the enterprise is starting to warm up to the “anything, anytime, anywhere” ethos that drives modern app development. If enterprises can’t buy exactly what they want from a traditional vendor, the only alternative is to create a basic physical layer from low-cost commodity hardware and then design the higher layers of the architecture on their own terms.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
