Ralph Wynn, FalconStor’s Sr. Product Marketing Manager, had this to say about the win:
IT outages and downtime are unacceptable. Companies need reliable data protection and disaster recovery systems in place that will ensure operations continue normally even in the event of a hardware failure or natural disaster. The FalconStor NSS VS Series Appliance with RecoverTrac technology makes downtime a thing of the past, and the 2012 Editors’ Best Award demonstrates its strategic importance.
FalconStor NSS is a scalable, highly available solution that enables storage virtualization, data migration, and data protection in heterogeneous environments. The solution’s ability to provide rapid automated disaster recovery allows you to restore Any Service, Any Time, Any Place – addressing your most demanding disaster recovery and business continuity needs. It’s no wonder FalconStor NSS is an award-winning product!
For more information on how FalconStor NSS provides complete data protection against disasters, check out one of our recent case studies. This 2-page success story details how one customer uses FalconStor NSS to cost-effectively reduce downtime and recovery time objectives.
As a storage virtualization company literally at its core, FalconStor has been one of the pioneers of this technology since the early 2000s. Very few of those who first took on that challenge are still standing, and while server virtualization has taken over the data center like wildfire, storage virtualization hasn’t seen massive adoption in IT organizations.
Reflecting on 12 years of our company’s history may provide some answers, but I’ll save that for another blog. The fact is that through these 12 years, FalconStor grew to be one of the most respected and recognized names in the storage industry by staying faithful to its primary mission: simplifying very complex storage management processes while containing IT costs. And our storage virtualization technology has been at the heart of everything we do.
As I’ve mentioned in previous blogs, the way I think about things changed after business school. Life was far simpler as an engineer. I enjoyed and appreciated products for their technology with no consideration for financial implications. I see an analogy between my current car situation and many of the customers I speak with on a regular basis, with regard to SANs. Should you buy a new SAN, or enhance and maintain your existing, paid-for SAN?
Now that server virtualization technologies have been proven in many environments, more people are looking at virtualization to improve the efficiency of their primary workloads in the data center. Despite the benefits realized from virtualizing non-mission-critical applications, two questions remain on the minds of IT professionals. First, since traditional backup doesn’t work in virtual environments, how can I effectively protect virtualized workloads? We are talking mission-critical applications here! Second, I know how much I reduced my server infrastructure with virtualization, but I also know how much my storage costs went up as a result. So how can I reduce my storage costs while implementing server virtualization?
In a recent report from ESG on the “Impact of Server Virtualization on Data Protection,” when asked about top server virtualization initiatives for 2010, most respondents placed backup, recovery, and replication right after virtualizing more workloads. It is well understood that server virtualization breaks traditional backup processes. The consolidation of servers and workloads leaves very few resources for backup applications to perform data copies. In virtual server environments, CPU utilization climbs to 60 to 70 percent or more, up from an average of 20 percent in physical environments, leaving very little headroom for the most demanding job of them all: backup. In addition, network utilization increases to such a degree that very little bandwidth remains for the massive data transfers that backup operations require.
When VMware took a look at how virtual shops were approaching data protection back in 2008, the responses from data centers indicated that, for all the benefits of virtualization, the shift away from physical machines was coming with some significant growing pains. Eighty percent of respondents to the survey said they were backing up to tape and failing to meet their backup windows. The traditional backup challenges they faced before going virtual were magnified in their new environments, and they faced shrinking CPU, I/O, and networking resources.
And over the past two years, companies have hunted for a solution that would deliver on the promise of virtualization and meet internal expectations for always-available, easily accessible data. Most of those attempts were variations on traditional backup methods with virtualized Band-Aids on top.
Today, we know that the key to success in protecting data in these environments does not lie in tweaking traditional backup methods to fit virtualized enterprises. Rather, the foundation of the solution starts with storage.
Sometimes, the best way to determine which of two options is preferable is to look at them side by side. Trying to decide between a Honda and a Toyota? Compare them directly on fuel efficiency, price, and reliability. Not sure which menu items at your favorite restaurant are healthiest? Evaluate them in terms of calories, fat content, and sodium. Wondering whether storage virtualization will really transform your data center? Read The Aberdeen Group’s latest report on companies that deploy storage virtualization versus those that do not.
In a recent piece in the July issue of Virtual Strategy Magazine, Aberdeen Senior Research Analyst Richard Csaplar shares the highlights of a recent study exploring the main reasons enterprises are adopting storage virtualization, the primary benefits they’re realizing by doing so, and the evidence that these companies are assuming leadership roles in data center transformation.
Csaplar notes that the combined pressures of increasing storage demands, lengthy backup and restore times, and dwindling space in the data center are moving organizations toward storage virtualization – and that these companies are the ones most savvy about virtualization in general. Csaplar writes, “Companies with virtualized storage show a wider range of virtualization projects across the organization than those with no storage virtualization. They are more likely to have server virtualization (95%), purchase servers optimized for virtualization (74%), have a converged network (49%), and have virtualized their desktops (35%). These organizations are clearly familiar with virtualization technologies and understand the benefits of deploying it.”
Companies in the Aberdeen study, and in practically every industry, are seeking out more efficient business processes. They have to. The proliferation of data across the enterprise has become so extreme that they can no longer ignore the pressure to change the way they handle that data. Nobody wants to get stuck managing separate islands of storage in the data center. Even the most dedicated, hard-working, knowledgeable data center managers will buckle under the complexity of such a scenario, which requires independent management and protection policies and leads to resources that are either under- or over-utilized.
So, storage virtualization is becoming an attractive option for numerous enterprises. The Aberdeen study shows that the most skilled companies are leading the way. The study illustrates that the adoption of storage virtualization is not a random or impulsive move by inexperienced businesses. It is increasingly embraced by those who have achieved success in other parts of their infrastructure through the benefits of virtual technologies. Csaplar writes that deploying virtualized storage leads to significant benefits, including reduced strain of managing storage area network (SAN) devices, reduced number of SANs, and reduced time to deploy new servers.
And storage virtualization reduces something else too – cost. Our customers report that virtualizing their storage devices and servers saves them about half a million dollars a year. It doesn’t take a careful side-by-side comparison to see that savings of that magnitude make it a good choice.
Yesterday, Searchstorage.com published a Q&A with Curtis Preston, widely known as Mr. Backup. The topic was the use of snapshot technology for data backups – basically a look at the increased use of SAN-based technology as an alternative or complement to traditional backup solutions.
Curtis brings some great points to the discussion and argues that, if you want better backup, snapshots are the way to go. I tend to agree with him. I’ll try to answer some of the questions that he leaves open, since the question of backup transformation is probably broader than I can cover in one or even a few blogs. The salient fact that Curtis points out at the end of the Q&A is this: “…the backup and recovery space moves at a glacial pace. Backup people, by nature, are paranoid.” Therefore, I think it’s up to us as a collective – vendors, subject matter experts, and industry analysts in conjunction with end users – to redefine that space.
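For readers less familiar with why snapshots are so attractive for backup, the core idea is copy-on-write: a snapshot is created instantly, and original blocks are preserved only when the live volume later overwrites them. The Python sketch below is a toy illustration of that mechanism only – the class and method names are hypothetical and do not correspond to any vendor’s actual API:

```python
# Toy copy-on-write (CoW) snapshot model. Illustrative only; names are
# hypothetical and not any real SAN or backup product's interface.

class Volume:
    """Models a volume as a list of blocks, with CoW snapshot support."""

    def __init__(self, blocks):
        self.blocks = list(blocks)
        self.snapshots = []  # each snapshot maps block index -> saved data

    def snapshot(self):
        # Creating a snapshot copies no data, which is why it is near-instant.
        snap = {}
        self.snapshots.append(snap)
        return snap

    def write(self, index, data):
        # Copy-on-write: before overwriting a block, preserve the original
        # in every snapshot that has not yet saved it.
        for snap in self.snapshots:
            snap.setdefault(index, self.blocks[index])
        self.blocks[index] = data

    def read_snapshot(self, snap):
        # Reconstruct the point-in-time view: preserved blocks where they
        # exist, live blocks everywhere else.
        return [snap.get(i, block) for i, block in enumerate(self.blocks)]


vol = Volume(["A", "B", "C"])
snap = vol.snapshot()            # instant: nothing copied yet
vol.write(1, "B2")               # old "B" is preserved in the snapshot first

print(vol.blocks)                # ['A', 'B2', 'C'] -- live volume moved on
print(vol.read_snapshot(snap))   # ['A', 'B', 'C']  -- point-in-time view intact
```

Because only changed blocks are copied, a snapshot consumes space proportional to the change rate, not the volume size – which is what makes frequent, near-instant point-in-time copies practical as a complement to traditional backup.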
Welcome to virtual reality! Now that virtualization is real, it’s time for us to talk about the reality behind its application. We are all excited about how virtualization is changing the IT world – providing better resource utilization, cost reductions, faster provisioning, automation, higher availability, and better mobility. It all sounds good – no, it actually sounds great – and the technology is completely transforming the way we design data centers and define business processes. Now that we’ve seen the benefits that server virtualization brought to the table, the next logical application is desktop virtualization. The concept brings a new approach to enterprise-wide desktop deployments, one aimed at providing a better end-user and administrator experience than physical desktops: lowering the cost of acquisition and management while offering a highly scalable, easy-to-deploy, and fully protected desktop environment. Nevertheless, this consolidation raises new challenges in terms of compute resource allocation and granular data protection and recovery processes – and this is where virtual reality starts.
Last month, I wrote a couple of blogs about the benefits and challenges of SSD, along with some advice for companies that want to take advantage of this technology’s speed without falling victim to its potential cost or management challenges.
It seems that more and more businesses are grappling with this issue and piecing together ways to adopt SSD despite its dramatically steeper price. The difference in transactional throughput and latency between solid-state memory and traditional disk is equally dramatic and makes the move to SSD attractive, even for small to mid-size companies.
Recently, in pursuit of OMB's federal data center consolidation initiative, many government agencies have focused on realizing the benefits of virtualization to better leverage existing budgets by consolidating physical infrastructure and commoditizing their server hardware through the use of virtual servers. Software and hardware vendors have been very focused on selling the benefits of server virtualization. What many agencies don't realize is that they are missing the other half of the story -- storage virtualization -- and with it a huge opportunity to increase overall efficiency and generate substantial savings while optimizing asset utilization and improving data mobility.