Windows Server 2012 (Windows Server "8") – Virtual Fibre Channel

This is one of a series of posts discussing the new features in Windows Server 2012, now shipping and previously in public beta as Windows Server 8.  You can find references to other related posts at the end of this article.  This post reviews the new Hyper-V 3.0 feature, Virtual Fibre Channel.

Background

Virtual Fibre Channel (VFC) enables a Hyper-V guest to access the physical storage HBAs (host bus adaptors) installed in the Hyper-V server.  Normally, storage adaptors would be reserved for the use of the Hyper-V server itself; however, this new feature acts as a pass-through, enabling any Hyper-V 3.0 guest (at the right O/S level) to access the HBAs and so connect directly to fibre channel storage devices.

VFC is implemented through the use of NPIV, or N_Port ID Virtualisation.  This is a fibre channel standard that permits a single HBA to act as multiple nodes within a SAN environment.  Normally, a single HBA connects to the SAN and presents a physical ID known as a World Wide Port Name, or WWPN; this deals with the physical connectivity of the fabric.  At the same time, the connecting server or storage device presents a node name ID, or WWNN (World Wide Node Name).  A WWNN can be unique per adaptor, as is the case with most host-based HBAs, or can be a single node representing an entire device such as a storage array.  NPIV allows a single physical adaptor to present multiple node names to the fabric and so effectively "virtualise" the physical device.  Each new node also has to have its own virtual WWPN in order to adhere to fibre channel standards.
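As a conceptual sketch only (not vendor code), the relationship NPIV creates between one physical port and its additional virtual node/port names can be modelled like this; all WWN values below are made up for illustration:

```python
# Conceptual model of NPIV: one physical HBA port presenting
# multiple virtual node/port name pairs to the fabric.
# All WWN values here are illustrative, not vendor-assigned.

def format_wwn(value: int) -> str:
    """Render a 64-bit World Wide Name as colon-separated hex octets."""
    return ":".join(f"{(value >> shift) & 0xFF:02x}" for shift in range(56, -8, -8))

class PhysicalPort:
    def __init__(self, wwpn: int):
        self.wwpn = wwpn          # the physical port name seen by the fabric
        self.virtual_ports = []   # NPIV identities layered on the same link

    def add_virtual_port(self, wwnn: int, wwpn: int) -> None:
        """Log an additional virtual N_Port in over the same physical link."""
        self.virtual_ports.append(
            {"wwnn": format_wwn(wwnn), "wwpn": format_wwn(wwpn)}
        )

hba = PhysicalPort(0x10000000C9000001)
hba.add_virtual_port(0xC003FF0000FF0000, 0xC003FF0000FF0001)  # guest 1
hba.add_virtual_port(0xC003FF0000FF0002, 0xC003FF0000FF0003)  # guest 2
# One physical login, plus two independent virtual node identities,
# each of which can be zoned separately on the fabric.
```

The point of the sketch is simply that each virtual port carries its own WWNN/WWPN pair, which is what allows per-guest zoning later on.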

The benefit of being able to use NPIV to virtualise an HBA is that each guest in a Hyper-V environment can be assigned its own WWNN and so have a direct connection to the SAN.  It may not be immediately obvious how this helps when virtual server infrastructure is supposed to abstract the physical layer, but there are a number of distinct advantages to zoning storage devices in this way:

  • Zoning can be done to the individual guest and is therefore more secure (albeit that it still goes through the hypervisor)
  • Tape drives can be supported, so backup software can write directly to devices
  • Storage that requires failover, snapshots and other SCSI-based functionality can be supported directly, especially where non-standard SCSI commands are used

Implementation

VFC is configured in Hyper-V Manager using the new Virtual SAN Manager option (see the screenshots).  Only HBAs with firmware that supports NPIV can be used for VFC.  This means newer HBAs only – for example, Emulex HBAs at speeds of 4Gb/s and above.  Obviously the SAN fabric needs to support NPIV too.  An HBA can be assigned to only one virtual SAN, although a virtual SAN can contain multiple HBAs.  Once the virtual SAN is created, a virtual HBA can be assigned to a guest using the Add Hardware section under Settings.  Fibre channel IDs can be set to any 16-digit hexadecimal number, although it's not advisable to use values that are already reserved for vendors.  Microsoft defaults to some standard values, which can be regenerated through the "Create Addresses" button.  As yet I've not worked out why there are two sets of addresses, as only the first appears to be visible on the fabric.
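To make the "16-digit hexadecimal number" point concrete, here is a hedged sketch of what an address generator along the lines of "Create Addresses" might do: keep a fixed prefix that stays out of vendor-registered ranges and randomise only the suffix.  The C0:03:FF prefix and the prefix/suffix split below are assumptions for illustration, not Microsoft's documented scheme:

```python
import secrets

# Hypothetical WWPN generator: a fixed private prefix (assumed, for
# illustration) plus a random 40-bit suffix gives a 16-hex-digit
# address that avoids IEEE vendor-registered (OUI-based) ranges.

PREFIX = 0xC003FF0000000000   # assumed non-vendor prefix; low 40 bits zero
SUFFIX_BITS = 40              # low 40 bits left free for random values

def new_wwpn() -> str:
    """Return a random 16-hex-digit WWPN within the fixed prefix."""
    suffix = secrets.randbits(SUFFIX_BITS)
    return f"{PREFIX | suffix:016X}"
```

Calling `new_wwpn()` yields a value such as `C003FF1A2B3C4D5E` – always 16 hexadecimal digits, always inside the chosen prefix.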

As soon as a guest is started, the fabric login process begins, even if no guest O/S has been installed.  As you can see from screenshot 4, the additional node indicates the source Hyper-V server (in this case PH03) but doesn’t pass through the guest name, indicating it only as “Hyper-V VM Port”.  It would be a nice update to be able to see the VM name there.

Using VFC within the Hyper-V guest requires two things: a supported O/S – one of Windows Server 2008, Windows Server 2008 R2 or Windows Server 2012 – plus the installation of the latest Integration Services update that comes with Windows Server 2012.  This means that the virtual fibre channel adaptor is not emulated as a native device and so can't be used with other operating systems such as Linux (more on this later).  The fifth screenshot shows the virtual HBA controller and tape drive I presented to the guest.  One question that has been discussed on a number of blogs is support for tape drives.  I can confirm that tape drives do work, but I can't see any documentation from Microsoft to say whether they are officially supported.

Performance

I chose a tape drive as this is a good way of demonstrating performance.  Deploying Backup Exec 2012 onto my Windows 2008 R2 guest, writing to an LTO2 drive, I achieved around 12MB/s – better than I've managed with an emulated drive through vSphere 5.0.  This is well under the spec of the drive itself (max 40MB/s) but is certainly usable in small environments.  More testing is needed here, I think, as there appeared to be little overhead on the Hyper-V server in managing the data pass-through.

The Architects View

Virtual Fibre Channel is a great feature for providing native SAN device support.  However, there are a few restrictions on use, most notably the need for recent hardware and for Microsoft guest operating systems.  I haven't yet seen any best practices for using VFC – for example, should HBAs be placed in a single virtual SAN, or should multiple virtual SANs be configured for failover?  These are questions that need to be answered.  VFC could be massively improved on two fronts: first, drivers could be provided for other platforms, especially Linux installations; second, if vendors were able to write code against the virtual device, then virtual SAN appliances (VSAs) could use fibre channel rather than being reliant on iSCSI as they are today.

One final comment: Microsoft are doing a poor job of providing detail on these new storage features.  There is precious little to find other than high-level blog information and, as mentioned previously, no best practice documentation that I can locate.  I'd be happy to be pointed in the direction of anything useful, and I will link to it from this post.

Related Links

Comments are always welcome; please indicate if you work for a vendor as it's only fair.  If you have any related links of interest, please feel free to add them as a comment for consideration.

[Screenshots 1–5]
