Cloud Brokerage: The Market Unified

Commodity brokers don’t own the transport system for the market

This is Part III in a series by 6fusion Co-founder and CEO John Cowan on the emerging trend of Cloud Brokerage and the impact it will have on the technology industry and markets. Be sure to check out Part I of the series here and Part II here.

The feedback and fallout from Part II of this series have been quite interesting. I thought for sure the bulk of the flak would come from the cloud vendor incumbents I said would be relegated to the world of retail cloud business. But since I posted my perspective, I've found myself digging into the nature of the Total Addressable Market (TAM) for the cloud brokerage industry.

For those of you keeping score at home, I said the market for cloud brokerage is more than 10 times the market for cloud computing software and related services.

Yes, 10 times.

And it is because this market is so big that cloud brokerage will spawn the next generation of technology innovation.

But before I get to the underlying technologies that are on the horizon and necessary for the future that I, along with my collaborators, envision, let me first spend a few paragraphs to explain why I am not just pulling numbers out of my, um, ‘IaaS’.

On the 6fusion iNode Network, the median server in production in the cloud is a quad-core, dual-processor unit with an average of 4 TB of available storage. Using this standard configuration, partners and customers yield approximately $42,000 per year in net billable proceeds. I would call that number, give or take on either side of it, a reasonable annual revenue estimate.

IDC recently reported that 2011 server shipments topped out at 8.3 million units. At a $42K clip, that is a market growing by a healthy $350 billion each year.

But of course, as we all know, server shelf life is not exactly the same as what you’d expect from a box of Krusty-O’s from the Kwik-E-Mart.

A quick trip down the hall to Gary Morris’s office at 6fusion is always an educational adventure.  “Depreciation,” Gary explains, “is a systematic and rational process of distributing the cost of tangible assets over the life of those assets. US GAAP calls for depreciation of servers using the server’s cost, estimated useful life and residual value. Typically, computers, software and equipment are depreciated over a period of 1 to 5 years, with the average useful life being 3 years.”

If we take Gary's use of the GAAP average as a multiplier, it means there is an estimated $1 trillion-plus in billable utility computing presently in use around the world.
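The back-of-the-envelope arithmetic above checks out in a few lines. The $42,000-per-server annual figure and the three-year useful life are the estimates cited in this post, not independently sourced numbers:

```python
# Rough TAM estimate from the figures cited in the post.
ANNUAL_REVENUE_PER_SERVER = 42_000   # net billable proceeds per median server, USD/year
SERVERS_SHIPPED_2011 = 8_300_000     # IDC-reported 2011 server shipments
USEFUL_LIFE_YEARS = 3                # average useful life under US GAAP straight-line depreciation

# New billable capacity added to the market each year.
annual_market = ANNUAL_REVENUE_PER_SERVER * SERVERS_SHIPPED_2011
print(f"Annual market growth: ${annual_market / 1e9:.0f}B")  # ~$349B

# Installed base: roughly three shipment-years of servers still in service.
installed_base = annual_market * USEFUL_LIFE_YEARS
print(f"Billable utility computing in use: ${installed_base / 1e12:.2f}T")  # ~$1.05T
```

So the "$350 billion each year" and "over $1 trillion" figures follow directly from the stated assumptions.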

The point here is that cloud brokerage is underpinned by the availability of both private and public compute, network and storage resources.  And it is this massive untapped market that will drive the next wave of innovation.

If the origins of the cloud business belonged to the innovation of companies like Amazon, Rackspace and VMware, then the future of the cloud brokerage belongs to a new cadre of agnostic intermediaries that will enable a true utility computing marketplace to flourish.

The unification of the market is what I refer to as the point in time at which cloud computing technologies in production today can be used to interface to the commodity market.  In order for that to happen, cloud brokerage as an industry must form and deliver the underlying technologies necessary to make a true market.

Just what are these technologies? Let's take a look at three areas of innovation that will underpin the future of utility computing.

Cloud brokerage technologies are best considered in the context of supply, demand and delivery.

Universal Resource Metering: Quantification of Demand and Supply

I delivered a presentation in Asia a few weeks ago and I opened with a slide that had two simple definitions:  Utility and Commodity.

A Utility, I paraphrased, "is a service, provided by an organization, that is consumed by a public audience."

A Commodity, according to common definition, “is a class of goods or services that is supplied without qualitative differentiation.”

Theoretically, you can have a utility without it necessarily being a commodity. But it rarely works that way, because in order to have a utility in the way we think about the utilities we consume every day, you must have scale. And in order to achieve scale, the utility must be pervasive and uniform. One should not require any special skills in order to use it. It must be simple and consistent to use. Think about your interaction with things like power or water services, or subscribing to the Internet.

Utility is a word used quite often to describe the cloud. In a post a couple of months ago, Simon Wardley aptly explained the difference between the cloud and a computer utility. The difference, says Wardley, is really only that "cloud was simply a word used by people to explain something that really wasn't well understood to people who were even more confused than they were."

So is the cloud really a computer ‘utility’?  Not yet.

You see, what the cloud is missing is the factor that truly negates qualitative differentiation – common measurement. You simply cannot claim something to be a true utility if every provider measures services differently.  Common utilities all share the characteristic of universal measurement.  Think about it.  Power. Water.  Energy.  The Internet.  Whatever.

A standardized unit of measurement for the computer utility will be one of the greatest innovations to come from the emerging market for cloud brokerage, because it will establish the basis from which a commodity market can emerge.
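To make the idea concrete, here is an illustrative sketch only — not 6fusion's actual WAC formula, whose dimensions and weights are not specified in this post — of how heterogeneous resource consumption might collapse into a single comparable unit:

```python
from dataclasses import dataclass

@dataclass
class ResourceSample:
    """One metering interval's consumption, in raw resource terms."""
    cpu_ghz: float      # CPU consumed, GHz
    ram_gb: float       # memory consumed, GB
    storage_gb: float   # storage consumed, GB
    net_mbps: float     # network throughput, Mbps

# Hypothetical per-dimension weights. A real standard would fix these by
# convention, so that every provider's output is directly comparable.
WEIGHTS = {"cpu_ghz": 1.0, "ram_gb": 0.5, "storage_gb": 0.01, "net_mbps": 0.1}

def to_common_units(s: ResourceSample) -> float:
    """Collapse heterogeneous resource use into a single tradeable number."""
    return (WEIGHTS["cpu_ghz"] * s.cpu_ghz
            + WEIGHTS["ram_gb"] * s.ram_gb
            + WEIGHTS["storage_gb"] * s.storage_gb
            + WEIGHTS["net_mbps"] * s.net_mbps)

# Two providers reporting in different native units still net out to the
# same common-unit count once mapped through the shared formula.
sample = ResourceSample(cpu_ghz=2.4, ram_gb=8.0, storage_gb=500.0, net_mbps=10.0)
print(to_common_units(sample))
```

The specific weights are invented for illustration; the point is that once every provider reports through the same formula, "qualitative differentiation" in measurement disappears.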

Cloud Infrastructure Federation: Tapping Global Supply

When you buy corn or wheat or soybeans by contract on a commodity exchange today, you don't buy a brand. You buy a commodity. Cloud brokers of the future will move commodities, not brands. Today, cloud brokers form 'partnerships' with service providers. But for a true brokerage model to blossom, there can be no possibility of vendor discrimination. Anyone who brings product to market can and should trade it. Denial of interoperability cannot be allowed.

With this in mind, true cloud brokers will overcome the interoperability hurdle through collaboration and cooperation. This doesn't mean subscribing to one API framework or another, regardless of how high and mighty the leading retail cloud properties might become. It means removing oneself from the politics of the API game completely.
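One way to picture staying out of the API game: the broker trades against a thin adapter contract of its own, and every provider's native API, whatever it is, hides behind it. This is a minimal sketch; the provider name, methods, and rate below are hypothetical, not any real broker's interface:

```python
from abc import ABC, abstractmethod

class ProviderAdapter(ABC):
    """Minimal contract the broker needs from any supplier of compute."""

    @abstractmethod
    def quote(self, units: float) -> float:
        """Price, in USD, for the requested number of common compute units."""

    @abstractmethod
    def provision(self, units: float) -> str:
        """Reserve capacity; return an opaque delivery reference."""

class AcmeCloudAdapter(ProviderAdapter):
    """Hypothetical provider wired into the brokerage via its native API."""

    def quote(self, units: float) -> float:
        return round(units * 0.05, 2)  # flat illustrative rate per unit

    def provision(self, units: float) -> str:
        return f"acme-reservation-{int(units)}"

# The broker trades against the interface, never the brand.
adapter: ProviderAdapter = AcmeCloudAdapter()
print(adapter.quote(1000))      # price for 1000 common units
print(adapter.provision(1000))  # opaque delivery reference
```

Under this design, adding or dropping a provider is an adapter change, not a renegotiation of the broker's own market-facing API.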

The Underlying Transport System:  Delivering the Commodity

It doesn’t always happen, but when a commodity contract comes due, something must be delivered.   The party that holds the paper for a hundred thousand units of corn must be able to take possession of it.  Modern commodity markets are supported by an elaborate network of supply chain delivery systems – from tankers to trains and transport trucks.

The equivalent underlying transport system must exist for the cloud infrastructure market.

Commodity brokers don't own the transport system for the market. And for good reason. However, if you subscribe to the early analyst view of cloud brokerage, they do. The analysts see brokers facilitating the transaction and delivering the compute commodity itself. To me, they either don't fully grasp the potential of the broker or they are describing something altogether different.

Cloud interoperability is not a new concept.  It has been bandied about the blogosphere for several years already.  The problem to date is that such movements have been nothing more than thinly veiled product sales pitches.  The cloud brokers of the future will drive the innovation to construct the underlying transport system to “connect the clouds.”

In the final part of this series I will explore the future state of cloud computing; a world where the immovable IT asset becomes movable in a commodity exchange.

More Stories By John Cowan

John Cowan is co-founder and CEO of 6fusion. John is credited as 6fusion's business model visionary, bridging concepts and services behind cloud computing to the IT Service channel. In 2008, he, along with his 6fusion collaborators, successfully launched the industry's first single unit of measurement for x86 computing, known as the Workload Allocation Cube (WAC). John is a 12-year veteran of business and product development within the IT and Telecommunications sectors and a graduate of Queen's University at Kingston.
