    Disk or Solid State Is Only the First of Many Options

    Solid-state storage comes in many forms, so even after an enterprise decides to forgo traditional tape or hard disk, it still faces a host of questions about which technology to employ and how to integrate it into existing infrastructure.

    Quite often, however, decisions regarding storage are driven not by the form factor itself but by the application in need of support and the broader evolution of the data center from a collection of discrete components to an integrated, modular entity.

    The solid-state disk (SSD), for example, is in a peculiar position when it comes to modern data architectures. High-capacity SATA drives like the new Micron 5100 series deliver a significant improvement in speed and scale over hard disk and even early solid-state solutions, yet they still cater mostly to traditional distributed storage infrastructure. The device scales up to 8 TB and can be tuned to various performance characteristics under the company’s FlexPro architecture, while its SATA interface allows it to slot seamlessly into existing storage arrays to either supplement hard disks or replace them altogether.

    Still, organizations should take care when making this kind of change, warns AWS’s Jeffrey Layton. SATA has a much higher data error rate than SAS, so you could be doing more harm than good to high-performance applications unless you also incorporate advanced corruption detection capabilities in the file system. This, of course, adds cost to the overall storage environment, so you must make sure the performance benefits are worthwhile. As well, solid state in general has lower write endurance than hard disk and is subject to write amplification, since flash cells can only be updated a block at a time.
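    To make Layton’s point about corruption detection concrete, here is a minimal sketch of the idea in Python. It is an application-level stand-in for the end-to-end checksumming that file systems such as ZFS and Btrfs perform at the block level; the paths and record format are hypothetical illustrations, not anything from the article.

    ```python
    import hashlib
    from pathlib import Path

    def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
        """Stream a file through SHA-256 so large files never load fully into memory."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def is_intact(path: Path, expected: str) -> bool:
        """Compare against the checksum recorded at write time.

        A mismatch signals silent corruption (a bit flip that slipped past the
        drive's own error correction); an ordinary I/O failure raises instead.
        """
        return sha256_of(path) == expected

    # Usage sketch: record checksums when data is written, verify before trusting it.
    # record = {p: sha256_of(p) for p in Path("/data").rglob("*.db")}
    # corrupted = [p for p, h in record.items() if not is_intact(p, h)]
    ```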

    Nevertheless, expect SSDs to continue to eat into the storage market, says Enterprise Storage Forum’s Drew Robb, primarily because the technology offers both the speed and capacity required of emerging data loads, with small form factors and energy-efficient performance to boot. Major developments are taking place in 3D NAND technology, which is driving capacity up to 16 TB of non-volatile data on a 2.5-inch drive. In fact, says HPE’s Ivan Iannaccone, Flash technology is outpacing Moore’s Law by doubling capacity every 12 months, rather than 18.
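    Iannaccone’s comparison is easy to put in numbers. The short sketch below projects drive capacity under a 12-month doubling period versus a Moore’s Law-style 18 months, starting from the 16 TB figure cited above; the three-year horizon is an arbitrary assumption for illustration.

    ```python
    def projected_capacity(start_tb: float, months: int, doubling_period: float) -> float:
        """Capacity after `months`, doubling every `doubling_period` months."""
        return start_tb * 2 ** (months / doubling_period)

    START_TB = 16  # 2.5-inch 3D NAND drive capacity cited above

    for months in (12, 24, 36):
        flash = projected_capacity(START_TB, months, doubling_period=12)
        moore = projected_capacity(START_TB, months, doubling_period=18)
        print(f"{months:>2} months: flash pace {flash:6.1f} TB vs Moore's Law pace {moore:6.1f} TB")
    # 12 months: flash pace   32.0 TB vs Moore's Law pace   25.4 TB
    # 24 months: flash pace   64.0 TB vs Moore's Law pace   40.3 TB
    # 36 months: flash pace  128.0 TB vs Moore's Law pace   64.0 TB
    ```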

    But since storage decisions are more about the use case than the technology, wonders The Register’s Danny Bradbury, which applications are most suitable for solid state? In the old days, only the most demanding functions, such as high-speed stock trading and medical research, could justify the cost. With costs coming down, it is now common for organizations to run OLTP workloads on Flash, and the trend is quickly expanding to active and near-active datasets such as those found in SharePoint and Exchange environments.

    Solid state is also likely to become more at home as hyperscale infrastructure takes hold in the cloud and the enterprise. With Big Data and IoT workloads becoming common, few organizations will be able to afford the operational costs of spinning media, even if they have the room to scale it up to production levels. At the same time, solid state – either in a separate disk array or in a board-level in-memory configuration – is the only viable way to provide the quick turnaround demanded by high-speed analytics and machine-to-machine communications.
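    To give a rough sense of that gap, the sketch below compares order-of-magnitude random-read rates across the storage hierarchy. The latency figures are generic ballpark assumptions, not numbers from the article, and real devices vary widely; the orders of magnitude are what matter for analytics and machine-to-machine traffic.

    ```python
    # Assumed, order-of-magnitude latencies for one small random read.
    # These are generic ballpark figures, not measurements from any vendor.
    LATENCY_SECONDS = {
        "DRAM (in-memory)":    100e-9,   # ~100 ns
        "NVMe flash SSD":      100e-6,   # ~100 us
        "SATA flash SSD":      200e-6,   # ~200 us
        "7,200 RPM hard disk":  10e-3,   # ~10 ms of seek plus rotation
    }

    for medium, latency in LATENCY_SECONDS.items():
        reads_per_second = 1 / latency
        print(f"{medium:<22} ~{reads_per_second:>12,.0f} random reads/s")
    # DRAM (in-memory)       ~  10,000,000 random reads/s
    # NVMe flash SSD         ~      10,000 random reads/s
    # SATA flash SSD         ~       5,000 random reads/s
    # 7,200 RPM hard disk    ~         100 random reads/s
    ```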

    There are those who argue that tape and disk have no place in the modern digital economy, just as spinning wheels are no longer needed in today’s textile industry. This may come to pass at some point, but for the time being, there is still a wealth of legacy applications that perform just fine with magnetic media and will likely provide value to the enterprise for a while longer.

    Regardless of which medium is under consideration, however, the enterprise should base its decisions on a long-term strategic vision rather than immediate problem-solving. The ultimate goal is a cohesive environment that lends itself to continuous improvement, software-defined architectures, and high degrees of autonomy and user self-service. Solid state has a lot to offer, but it is not necessarily the best choice for every need.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
