The idea of high-performance NAS technology is starting to gain ground as data and speed requirements of even mid-level enterprises continue to push current systems to their limits.
And although no one really agrees as to exactly what constitutes a high-performance NAS, the overarching goal is an environment that delivers all the benefits of file-based storage while remaining robust enough to accommodate mission-critical data.
HP has been making a lot of noise in this area of late. The company's 2007 purchase of PolyServe is said to be the foundation of an advanced NAS offering under the StorageWorks EVA label. HP is hoping to release a product this fall, preferably before NetApp brings out a system of its own.
However, both companies will have to play catch-up with Hitachi, which launched two high-performance NAS systems last month. The NAS Platform 2000 and 2000 Nearline are stocked with data protection, replication, clustering and virtual server tools. The 2000 Nearline scales up to 2 petabytes. As for file management, the company recently brought in Brocade's StorageX file virtualization and management software to handle data migration and namespace oversight for both local and distributed heterogeneous environments.
While investing in a more powerful NAS may seem like a no-brainer, there's actually quite a lot to consider, according to this article on SearchStorage.com. Depending on your existing configuration, a new NAS may require specialty host software or drivers, while certain applications may need additional tuning and load balancing. In short, understand your workload first, then optimize the platform to fit your needs.
High-performance NAS is still in its infancy, so the systems available today are light on advanced features like snapshots, migration and replication. But if demand is strong enough, it shouldn't be long before enhanced versions begin to show up.