    Is It Time to Consider Open Source?

    It’s somewhat ironic these days that even as hardware is becoming denser and more modular, infrastructure is becoming larger and more distributed. Both trends are feeding off one another, of course, driven by the decreasing cost of commodity systems and the emerging needs of Big Data and the data economy.

    But this is also propelling the need for more open architectures, both in the data center and across third-party infrastructure, which is both a blessing and a curse for the enterprise. Everyone loves the idea of extending abstract data environments without regard to the underlying infrastructure, but open systems also tend to be more operationally complex than proprietary ones, and they do not always produce the most cost-effective solution.

    Open platforms are already taking firm hold in Big Data environments, according to Research and Markets. Its latest Global Big Data Infrastructure Market report has the industry expanding at a compound annual rate of 33.15 percent for the rest of the decade, driven largely by open solutions like Hadoop, R, Cassandra and MongoDB. This makes a lot of sense given the volumes involved and the fact that most of the analytics are taking place on unstructured data. But it also means the enterprise will need to quickly acquire the expertise to integrate the various components of Big Data analytics and storage in order to derive true value from the technology.
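    To make that integration burden concrete, here is a minimal sketch of the kind of glue code involved, assuming Python with the pymongo driver, a local MongoDB instance and a hypothetical “analytics.events” collection of unstructured documents (all names are illustrative, not drawn from the report):

        # Minimal sketch: rolling up unstructured event documents in MongoDB.
        # Assumptions: a local mongod instance and a hypothetical
        # "analytics.events" collection; the names are illustrative only.
        from pymongo import MongoClient

        client = MongoClient("mongodb://localhost:27017")
        events = client["analytics"]["events"]

        # Group events by type and count them -- the sort of ad hoc rollup
        # that open stacks leave to in-house integration work.
        pipeline = [
            {"$group": {"_id": "$event_type", "count": {"$sum": 1}}},
            {"$sort": {"count": -1}},
        ]
        for row in events.aggregate(pipeline):
            print(row["_id"], row["count"])

    Even a rollup this simple has to be written, tested and maintained in-house, which is precisely the expertise gap the report points to.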

    Open systems are also making their way into the cloud in general, which should make it easier for organizations to provision the resources they need at scale. But again, these architectures won’t arise by themselves (not yet, anyway), so knowledgeable IT technicians will still have a fair amount of integration work to do. The new Mitaka release of OpenStack, for example, which is designed to address the complexity issues that have hampered the platform’s deployment, offers a newly streamlined client interface, simplified setup of crucial components like Nova and Keystone, and “intent-based” configuration tools that allow providers to tailor environments more accurately to user needs. In the end, though, OpenStack still demands a lot of hands-on work from anyone looking to deploy their own open cloud.
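    As a rough illustration of what that hands-on work looks like in practice, here is a minimal sketch using the openstacksdk Python library to boot a single Nova instance. It assumes a “mycloud” entry in clouds.yaml, and the image, flavor and network names below are placeholders rather than specifics from the article:

        # Minimal sketch: booting a server on an OpenStack cloud via
        # openstacksdk. Assumptions: a "mycloud" profile in clouds.yaml;
        # image/flavor/network names are illustrative placeholders.
        import openstack

        conn = openstack.connect(cloud="mycloud")

        image = conn.compute.find_image("ubuntu-16.04")
        flavor = conn.compute.find_flavor("m1.small")
        network = conn.network.find_network("private")

        server = conn.compute.create_server(
            name="demo-node",
            image_id=image.id,
            flavor_id=flavor.id,
            networks=[{"uuid": network.id}],
        )
        # Block until Nova reports the instance ACTIVE.
        server = conn.compute.wait_for_server(server)
        print(server.status)

    Note that even this happy-path example presumes Keystone credentials, images and networks have already been set up correctly, which is where much of the real integration effort goes.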

    Still, it’s not as if anyone else is offering a plug-and-play cloud environment, if such a thing is even possible. As Gina Longoria, senior analyst for Moor Insights & Strategy, notes, OpenStack has come a long way in the six years since its inception and is now ready for enterprises eager to build scale-out cloud environments without tying themselves to a single vendor. Core stability, for one, has markedly improved, and the user base has expanded to the point where troubleshooting and issue resolution are handled promptly. As well, the skills shortage is easing thanks to stepped-up training and certification.

    At the same time, the Open Compute Project (OCP) is working to overcome early impressions of a lack of focus and an overall architectural design ill-suited to the needs of the average enterprise. According to Hyperscale IT’s James Bailey, OCP now contains a comprehensive portfolio of products designed to enable scale-out infrastructure at a range of sizes. Those concerns are being addressed through multiple rack configurations, improved warranty support from ODMs (original design manufacturers) and better energy efficiency, which benefits deployments of any size.

    The open vs. proprietary debate started long before the cloud and Big Data came along, and it will probably continue long after they’ve gone mainstream. In the end, it comes down to a fundamental decision: Do you want someone else to bear the brunt of provisioning and optimizing your data environment, or do you want the freedom to do it yourself?

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
