An open source environment has long been enticing in theory to the enterprise but rather difficult to implement in practice. The idea of compiling your own data environment from legions of low-cost, interoperable components is indeed compelling, particularly when support is lacking from a proprietary vendor. But integration issues and the fairly substantial in-house expertise required to support an open environment are not to be dismissed.
But that might not be the case for much longer. Along with the increased prevalence of open source solutions in the IT market today, there is also an accelerated trend toward greater automation and intelligent management that just might remove many of the headaches that accompany open architectures.
Datamation blogger Andy Patrizio recently posted some of the pros and cons of open source, and many of the cons look like the kinds of operational difficulties that automation platforms are designed to solve. For instance, open source shifts the burden of altering source code, and of supporting those alterations over time, onto the user, work that can be more easily accomplished through automated processes. Likewise, the ongoing cost of open systems is usually attributable to labor, which, again, can be addressed by automating the rote, repetitive steps of software development and infrastructure management. And ultimately, an automated architecture will have greater capacity to monitor and incorporate code generated by massive development communities and to oversee deployments as they scale into the cloud.
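The maintenance burden above can be sketched in a few lines. This is a hypothetical illustration (the config keys, values, and function names are invented, not drawn from any particular project): local alterations are recorded declaratively, so each upstream update re-applies them automatically rather than by hand.

```python
# Hypothetical sketch: keep local alterations as a declarative,
# replayable list so every upstream release re-applies them
# automatically instead of requiring manual rework.

# Defaults as shipped by the (imaginary) upstream project.
UPSTREAM_DEFAULTS = {"max_connections": 100, "tls": False}

# Our local alterations, recorded once as (key, value) overrides.
LOCAL_PATCHES = [
    ("max_connections", 500),   # capacity tuning for our workload
    ("tls", True),              # local security policy
]

def rebuild_config(upstream: dict) -> dict:
    """Replay every local override on top of the current upstream defaults."""
    cfg = dict(upstream)
    for key, value in LOCAL_PATCHES:
        cfg[key] = value
    return cfg

# After every upstream release, the same call regenerates the patched config.
cfg = rebuild_config(UPSTREAM_DEFAULTS)
```

The point is not the trivial merge but the shift in labor: the alterations live in one replayable artifact instead of in an engineer's memory.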
This is part of the reason why large enterprises are moving away from traditional IT vendors and toward open platforms, says Battery Ventures’ Dharmesh Thakker. For one thing, nearly all of the Fortune 1000 is ramping up Big Data and hyperscale infrastructure modeled on Facebook and Google, both of which use either internally developed architectures or open systems built on commodity hardware. To date, no open source vendor has shown the potential to become the next Microsoft or Cisco in the IT market, but with companies like Docker and Redis Labs pursuing a more viral marketing strategy aimed at business line managers and coders rather than the CIO-facing approach of traditional vendors, it might not be long before the entire industry sees seismic shifts in its supplier models.
The rise of DevOps is also changing the calculus behind open source. As XebiaLabs VP Patrick Bishop noted recently, the relatively simple task of scripting becomes a major challenge as new products are scaled into production environments. The best way to handle it is to standardize the process by encapsulating the repeatable steps under a common deployment model. Once you’ve done that, of course, you’ve laid the groundwork for an automated deployment stack to take over day-to-day operations, allowing managers and administrators to work on optimizing the models toward more strategic objectives.
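The encapsulation described above can be sketched in ordinary code. The model below is a minimal, hypothetical illustration (not XebiaLabs' product or API): repeatable steps are registered once under a common deployment model, then replayed identically for every release, which is exactly the groundwork an automated deployment stack needs.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class DeploymentModel:
    """A common deployment model: named, ordered, repeatable steps."""
    name: str
    steps: List[Callable[[Dict], Dict]] = field(default_factory=list)

    def step(self, fn: Callable[[Dict], Dict]) -> Callable[[Dict], Dict]:
        """Register a step; decorator form keeps the model declarative."""
        self.steps.append(fn)
        return fn

    def run(self, context: Dict) -> Dict:
        # Each step receives and returns the shared context, so the
        # same model can be replayed unchanged for every release.
        for fn in self.steps:
            context = fn(context)
        return context

model = DeploymentModel("web-app")

@model.step
def build(ctx: Dict) -> Dict:
    ctx["artifact"] = f"{ctx['app']}-{ctx['version']}.tar.gz"
    return ctx

@model.step
def deploy(ctx: Dict) -> Dict:
    ctx["deployed"] = True
    return ctx

result = model.run({"app": "frontend", "version": "1.4.2"})
```

Once the steps live in a model rather than ad hoc scripts, an automation layer can run, audit, and optimize them without human intervention.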
Automation is also making it easier to integrate multiple open source products into a cohesive environment, although this is by no means universal. Last summer, Chef announced a new way to combine disparate open source solutions into automated, software-defined data ecosystems. The platform pairs the core Chef project with the new Habitat system, which embeds automation functions directly into applications. This creates a common dashboard that eliminates the need to acquire automation tools one by one and enables greater visibility into virtual architectures for improved workflow management and resource optimization.
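To make the "automation embedded in the application" idea concrete, here is a hypothetical Python sketch (this is an illustration of the concept, not Habitat's actual API): the application object carries its own reconfiguration hooks, so lifecycle automation travels with the app instead of living in an external tool.

```python
from typing import Callable, Dict, List

class SupervisedApp:
    """Illustrative only: an application that carries its own automation
    hooks, in the spirit of app-centric systems like Habitat (NOT its API)."""

    def __init__(self, name: str, config: Dict[str, str]):
        self.name = name
        self.config = config
        self._on_reconfigure: List[Callable[[Dict[str, str]], None]] = []

    def on_reconfigure(self, hook: Callable[[Dict[str, str]], None]):
        # Hooks are attached to the app itself, so any environment that
        # runs the app gets the same automation for free.
        self._on_reconfigure.append(hook)
        return hook

    def apply_config(self, new_config: Dict[str, str]) -> None:
        """Merge new settings, then fire every embedded hook."""
        self.config.update(new_config)
        for hook in self._on_reconfigure:
            hook(self.config)

app = SupervisedApp("cache", {"port": "6379"})
events: List[str] = []

@app.on_reconfigure
def restart_listener(cfg: Dict[str, str]) -> None:
    events.append(f"restarted on {cfg['port']}")

app.apply_config({"port": "6380"})
```

Because the hook is part of the application, a central dashboard only needs to push configuration; it never has to know how each app reacts.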
It should be noted that automation provides these same benefits to proprietary systems as well, but since most vendor solutions are already integrated to a high degree and require less hands-on configuration from the user, the impact is less dramatic.
It is also important to remember that programming the automated processes to enable this kind of seamless operation is not an easy task, and the skillsets needed to accomplish this are only just starting to trickle down to the general knowledge workforce.
But the change is coming, and once automation gains a stronger foothold in the data center, employing open source solutions should become significantly less complicated and less costly.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.