Storage Virtualization Takes Steps Forward

Storage Virtualization Adoption

Ever wonder why storage virtualization is still growing relatively slowly while server virtualization platforms like VMware have gone on a tear? A look at the licensing strategies these two virtualization technologies have adopted provides some insight into why.

In server virtualization, once a software feature is licensed, it is available to all of the virtual machines (VMs) hosted by that server virtualization OS. Granted, there is a caveat: software license pricing depends on the number of CPUs in the physical host. VMware, for instance, licenses its software per CPU pair, in combinations of 2, 4, 8, and 16, so the price does go up as more CPUs are added to the physical host.

Conversely, many storage virtualization software licensing models are capacity-based. Under this model, an organization may initially buy 10 TB of storage capacity along with some software features to go with it (volume management, snapshots, and possibly replication).

The catch with the storage virtualization licensing model is that if more storage capacity is purchased (say, another 10 TB), the organization must buy additional capacity licenses before its existing volume management, snapshot, or replication licenses will work with that new storage capacity.
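The cost difference between the two models can be sketched in a few lines. The fee amounts and the 10 TB block size below are hypothetical, chosen only to illustrate how a capacity-based feature license compounds as storage grows while a flat feature license does not:

```python
def capacity_based_cost(total_tb, tb_per_block=10, feature_fee_per_block=5_000):
    """Capacity-based model: feature licenses must be re-purchased
    for every block of storage capacity added (illustrative fees)."""
    blocks = -(-total_tb // tb_per_block)  # ceiling division
    return blocks * feature_fee_per_block

def flat_feature_cost(total_tb, feature_fee=5_000):
    """Server-virtualization-style model: features are licensed once
    and apply to all capacity, regardless of growth."""
    return feature_fee

for tb in (10, 20, 40):
    print(f"{tb} TB: capacity-based ${capacity_based_cost(tb):,}, "
          f"flat ${flat_feature_cost(tb):,}")
```

Under these assumed numbers, doubling capacity doubles the feature-license spend in the capacity-based model, while the flat model's software cost stays constant.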

This is not wrong, per se, but companies vote with their checkbooks, and the rapid adoption of server virtualization suggests which licensing model they clearly prefer. After all, if server virtualization and its associated costs didn't make economic sense, companies would not do it. The server virtualization licensing model simply appears more palatable to end users than the one currently offered by many storage virtualization providers.

To understand why this is and what needs to change, two questions need to be answered:

· Why is the processing-based licensing model more acceptable to companies?

· What changes are storage virtualization providers making to their licensing models?

In response to the first question, a server virtualization OS such as VMware allows organizations to create as many VMs as they want on a single physical host. As long as there is adequate processing power on the physical host to keep up with the demand of each application on each VM, the server virtualization software does not restrict them from adding more VMs or using the features found in its OS with all of these VMs.

This is where the licensing model associated with storage virtualization needs some changes. Users view new storage capacity in much the same way they view new VMs: as long as they have ample processing power, why should they be penalized for adding more storage capacity?

In the same manner, if companies purchase more storage capacity and then need more CPUs to support new features like asynchronous replication or snapshots, I suspect they will be more inclined to pay for new software licenses. However, as long as storage virtualization providers continue to tie software licensing for new features solely to growth in storage capacity, this can only slow adoption.

One storage virtualization provider looking to change this trend in licensing practices is RELDATA. It recently changed its software model so that once the software features are licensed with a base unit, all of those features remain available regardless of how much storage capacity an organization adds in the future.

For example, the MSRP for its 9240i is approximately $25K, which includes software licensing for iSCSI, unlimited snapshots, and twelve (12) 1 TB SAS drives. Because of RELDATA's new "Save as you Grow" program, organizations can now add more capacity to the 9240i without paying more to use its iSCSI or snapshot features with the newly added storage capacity. Further, if an organization opts to add another software feature, such as replication, at some point in the future, it can also be used with all storage capacity under the 9240i's management.
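The arithmetic behind "Save as you Grow" is worth spelling out. Using only the figures quoted above (the ~$25K MSRP and the 12 TB base configuration; any added-capacity hardware cost is a separate, unstated number), the software-inclusive spend per terabyte falls as capacity grows, because no new software licenses are triggered:

```python
def cost_per_tb(total_spend, total_tb):
    """Effective spend per terabyte of managed capacity."""
    return total_spend / total_tb

BASE_MSRP = 25_000  # includes software licenses + twelve 1 TB SAS drives
BASE_TB = 12

# At the base configuration:
base_rate = cost_per_tb(BASE_MSRP, BASE_TB)

# After doubling capacity, the same license spend is spread over 24 TB
# (hardware cost of the added drives excluded, since it isn't quoted):
grown_rate = cost_per_tb(BASE_MSRP, BASE_TB + 12)

print(f"${base_rate:,.0f}/TB at 12 TB vs ${grown_rate:,.0f}/TB at 24 TB")
```

Under a capacity-based model, by contrast, the license line item would grow in step with the capacity, so the per-TB software cost would stay flat or rise rather than fall.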

No matter how great a new technology is (and virtualization certainly falls into that category), its adoption is often impeded by costly licensing models. RELDATA's recent decision to remove this software licensing barrier is a step in the right direction for users. It encourages them to implement and take advantage of storage virtualization without worrying about hidden licensing costs lurking in the background should they decide to add more performance or storage capacity in the future.

More Stories By Joe Austin

Joe Austin is Vice President, Client Relations, at Ventana Public Relations. He joined Ventana PR in 2006 with more than 14 years of experience in high-tech strategic communications. His media relations experience spans both broadcast and print, and he maintains longstanding relationships with editors and reporters at business, IT, channel, and vertical publications. Austin's media work includes marquee outlets such as CNN, BusinessWeek, USA Today, Bloomberg, and the Associated Press, for clients ranging from startups to billion-dollar enterprises. His experience includes working with Maxell, McDATA (acquired by Brocade), Center for Internet Security, Securent (acquired by Cisco), Intrepidus Group/PhishMe, FireEye, Mimosa Systems, Xiotech, MOLI.com, EMC/Rainfinity, Spinnaker Networks (acquired by NetApp), ONStor, Nexsan, Asigra, Avamar (acquired by EMC), BakBone Software, Dot Hill, SANRAD, Open-E, and others. With more than a decade of strategic planning, media tours, press conferences, and media/analyst relations for companies in the data storage, security, server virtualization, IT outsourcing, and networking arenas, Austin's domain expertise helps position clients for leadership. Austin was recently recognized as a "Top Tech Communicator" for the second year in a row by PRSourceCode. The editorial community, represented by more than 300 participating IT journalists, rated each winner on overall performance and recognized those who added the most value to their editorial processes in terms of responsiveness, reliability, and overall understanding of editorial needs.


