Examining the True Cost of Big Data

As you start on your Big Data journey or project, be sure to ask what exactly the business requires

The good news about the Big Data market is that we generally agree on the definition of Big Data: data with volume, velocity and variety that businesses need to collect, store, manage and analyze in order to derive business value. Together with that fourth element, value, these have come to be known as the "4 V's." The problem with such a broad definition, however, is that it can mean different things to different people once you start to put real values next to those V's.

Let's be honest: volume means different things to different organizations. To some it is anything above 10 terabytes of managed data in their BI environment; to others it is petabyte scale and nothing less. Likewise, velocity can mean multiple billions of daily records coming into the enterprise from various external and internal networks. When it comes down to it, each business situation will differ, not only in size and speed but, more importantly, in the business use case or requirement. A large bank's Big Data problem can be very different from that of an online retailer or an airline. Compare what a hospital is trying to do in collecting and analyzing patient sensor data with a utilities provider running a smart grid, or with a telecommunications operator. True, all of it could be categorized as machine-generated or raw data, but the exact type of data will differ, not to mention the volume and growth rate. Probably the one common denominator across all of these industries is that everyone is keeping data for longer periods. No one is throwing it away - not even the detailed data.
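To see how quickly those V's translate into real capacity, it helps to run the arithmetic. The sketch below is a back-of-envelope sizing in Python; the record count, record size and retention period are illustrative assumptions, not figures from any particular deployment.

```python
# Back-of-envelope Big Data sizing. Every figure below is an illustrative
# assumption, not a number from any real deployment.
DAILY_RECORDS = 2_000_000_000   # "multi-billions of daily records" (assumed)
BYTES_PER_RECORD = 200          # assumed average raw record size
RETENTION_YEARS = 7             # assumed long retention ("no one throws it away")

daily_tb = DAILY_RECORDS * BYTES_PER_RECORD / 1e12
total_pb = daily_tb * 365 * RETENTION_YEARS / 1_000

print(f"Daily ingest: {daily_tb:.2f} TB/day")
print(f"Retained over {RETENTION_YEARS} years: {total_pb:.2f} PB (raw, uncompressed)")
```

Even these modest assumptions land at roughly a petabyte of raw retained data, before replication or indexing overhead.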

The Many Cost Factors to Consider
Costs will of course vary depending on the allocated IT budget, but how the company directs IT budget dollars to new Big Data initiatives needs consideration regardless. Let's face it: enterprise buyers didn't suddenly come into a bunch of newfound IT assets or new line items on their budgets, and the current world economic situation certainly doesn't suggest otherwise. More likely, existing budgets are being re-allocated: instead of spending more on, say, existing traditional data warehouses or appliances, monies are being directed to new projects running on open source technologies, including Apache Hadoop, which promises both low cost and ease of scale, not to mention an obvious fit for managing and analyzing multi-structured data sets. The difficulty then arises: how do you integrate your Hadoop environment, or have it co-exist, with the established BI or DW environment that the business has grown to love and rely upon?

Leverage What You Already Have
Let's assume you have a data warehouse or data mart in place today, you already use various ETL or data-movement tools along with BI dashboard, analytics or reporting tools, and you don't want to disrupt business users, which would mean not only impacting performance levels but also retraining them on a new set of tools. In fact, you are likely already beholden to strict SLAs around response times for the various business reports and KPIs. At the same time, the business is demanding access to new data sets in order to glean better insights, either by analyzing this data directly or by co-mingling it with existing customer data. This could take the form of web logs, clickstream data or social media data from the various interactive sites the business is now leveraging and tracking. The promise of improving profit margins and gaining a competitive edge simply cannot be ignored.
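As a concrete illustration of what raw web-log data looks like before it can be co-mingled with warehouse data, here is a minimal Python sketch that parses Apache common-log-format lines using only the standard library; the sample line is invented for illustration.

```python
import re

# Apache common-log-format pattern; field names follow the usual convention.
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)'
)

def parse_line(line):
    """Return one log line's fields as a dict, or None if it doesn't match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

# Invented sample line, purely for illustration.
sample = '203.0.113.9 - - [19/Sep/2012:10:15:32 +0000] "GET /products HTTP/1.1" 200 5120'
print(parse_line(sample))
```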

As we all know, traditional relational or columnar databases can't handle unstructured data types, so IT needs to roll out a different solution to satisfy the business demands. Evaluations can take many forms, but they typically start with which Hadoop distribution, which NoSQL or NewSQL database, and which query-access tools to use in addition to MapReduce. It is no easy task: a large number of technology solutions on the market today claim to run on or with Hadoop, providing MapReduce or SQL-like capabilities that satisfy the requirement of managing volumes of unstructured data. Some are more mature than others; some are proven; not all are low cost. Open source looks very low cost on the surface, but as soon as you require any level of support - and let's face it, once the environment is live and relied upon as business-critical, you will - you will need to allocate a line item on your budget. And the Big Data line item won't be just one line, because it must include all the components required to properly roll out a Big Data solution that truly satisfies the business demands. Just like any other IT environment, the obvious pieces include software licensing and support, hardware, skilled dedicated resources, professional services and training, and the dedicated time of business users to provide input on key requirements, including specifying the types of reports, queries and analysis, which will naturally change and evolve over time.
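To ground the MapReduce side of such an evaluation, below is the canonical word-count example as a Hadoop Streaming sketch, where any script that reads stdin and writes stdout can act as mapper or reducer. This is a minimal illustration, not a recommendation for any particular distribution.

```python
# mapper.py -- Hadoop Streaming mapper: emits "word<TAB>1" per word on stdin.
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
# reducer.py -- Hadoop Streaming reducer: input arrives sorted by key,
# so a simple running total per word is enough.
import sys

current, total = None, 0
for line in sys.stdin:
    word, count = line.rstrip("\n").split("\t", 1)
    if word != current:
        if current is not None:
            print(f"{current}\t{total}")
        current, total = word, 0
    total += int(count)
if current is not None:
    print(f"{current}\t{total}")
```

A typical job submission then looks something like `hadoop jar hadoop-streaming.jar -input /logs/raw -output /logs/wordcount -mapper mapper.py -reducer reducer.py -file mapper.py -file reducer.py`, though the streaming jar's path and name vary by distribution.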

Big Data Costs Can Quickly Creep Up
In terms of the hardware expenditure required to manage the new Big Data set, you may start out with a Hadoop cluster of, say, 10 nodes, which is certainly manageable. But if your data velocity is significant, you can quickly reach 100+ nodes, and now you face a number of other expenses: additional headcount and skilled resources to manage the environment proactively; tools for managing the cluster, including system management and alerting; and potentially add-on software, which varies by business use case but might cover real-time analytics against streaming data for, say, fraud detection or spotting unusual patterns. You may also need a business tool to provide a front-end GUI dashboard to track specific KPIs, or data visualization tools so business users can quickly understand what is going on. Very quickly the costs become less about storage and hardware and more about the software focused on getting the most value from the newly collected data set.
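That 10-to-100-node creep can be sanity-checked with simple arithmetic. The following sketch estimates node count from storage alone; the per-node capacity and compression ratio are illustrative assumptions, while the replication factor of 3 is HDFS's common default.

```python
import math

def nodes_needed(daily_ingest_tb: float, retention_days: int,
                 usable_tb_per_node: float = 20.0,  # assumed usable disk per node
                 replication: int = 3,              # common HDFS default
                 compression_ratio: float = 1.0) -> int:
    """Estimate cluster size from storage alone (ignores CPU and memory)."""
    stored_tb = daily_ingest_tb * retention_days * replication / compression_ratio
    return max(1, math.ceil(stored_tb / usable_tb_per_node))

# Half a terabyte a day kept for one year already outgrows a 10-node cluster...
print(nodes_needed(0.5, 365))                          # -> 28
# ...while a 10x compression ratio keeps the same workload comfortably small.
print(nodes_needed(0.5, 365, compression_ratio=10.0))  # -> 3
```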

There is no denying that Big Data presents great new opportunities, but reaching a quantifiable ROI in a fast time frame is still a very real challenge. Everyone is talking about Big Data and the innovative technology approaches to tackling it, but it is still difficult to find many business success stories within any one industry sector. The market is still fairly immature, but the good news is that it's moving at a much faster pace than any other IT project today, and our data warehouse and BI forefathers have provided lessons learned over the past two decades.

Big Data Is Big Business but It Comes with Strict Requirements
If we want to examine the main areas of expenditure for a Big Data project more closely, it is probably best to look through the lens of a specific type of business and use case. Take a large financial institution with a number of existing traditional data warehouse / BI environments. The business doesn't want to throw any data away (let's face it, regulations don't allow that for a number of years anyway), and it realistically wants to retain specific data sets for ongoing trending and analysis. This includes examining questions such as "what constitutes a low-risk client based on spending behavior patterns over a specific time period, cross-referenced with customer demographics?", which will help the institution better target a particular segment of the market.
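That "low-risk client" question maps naturally onto a group-and-filter analysis. Here is a hedged pandas sketch of the shape such a query might take; the column names, thresholds and the tiny inline data set are all invented for illustration.

```python
import pandas as pd

# Invented sample transactions; in practice this would be queried from the
# Hadoop cluster or warehouse. Column names and thresholds are hypothetical.
tx = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 3, 3],
    "age_band":    ["35-44", "35-44", "25-34", "25-34", "35-44", "35-44"],
    "amount":      [120.0, 80.0, 2500.0, 1900.0, 60.0, 45.0],
})

# Spending behavior per customer over the analysis window,
# cross-referenced with a demographic attribute (age band).
profile = tx.groupby(["customer_id", "age_band"], as_index=False).agg(
    total_spend=("amount", "sum"),
    avg_ticket=("amount", "mean"),
)

# "Low risk" here is an assumed rule of thumb: modest, steady spending.
low_risk = profile[(profile["total_spend"] < 500) & (profile["avg_ticket"] < 150)]
print(low_risk)
```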

Given that the IT budget doesn't allow spending to increase in line with data growth rates, the institution needs to seriously reduce costs, and so decides to go the route of a Hadoop-based environment, given its promise of low-cost scale and its ability to provide insights into customer patterns by capturing semi-structured and unstructured data. Front-ending the warehouse with a dedicated Hadoop cluster is the preferred architectural approach, but business users still want access to both the Hadoop environment and the existing traditional data warehouse.

Because we are talking about a financial institution, security and availability quickly rise to the top of the requirements list. At the same time, if business users want to access that data, SQL query access and use of the current BI tool against the new data set are also requirements. If you can avoid moving large chunks of data on a frequent basis from one environment to the other, you will reduce both costs and latency. In an ideal world, being able to leverage the skill sets you already have and avoiding duplication of work is key.
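One common way to satisfy that SQL-access requirement against Hadoop-resident data is HiveServer2, reachable for example through the PyHive client. The sketch below assumes a reachable HiveServer2 endpoint and a hypothetical weblogs table; host, port, user and schema are placeholders, not a real environment.

```python
# Minimal sketch of SQL access to Hadoop-resident data via HiveServer2,
# using the PyHive client. Host, port, username and the "weblogs" table
# are hypothetical placeholders for this example.
from pyhive import hive

conn = hive.Connection(host="hive.example.internal", port=10000,
                       username="analyst", database="default")
cursor = conn.cursor()
cursor.execute("""
    SELECT status, COUNT(*) AS hits
    FROM weblogs
    GROUP BY status
    ORDER BY hits DESC
""")
for status, hits in cursor.fetchall():
    print(status, hits)
cursor.close()
conn.close()
```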

Below is a quick table outlining the main cost factors to consider, with a comment against each area on how its costs could be reduced.

 

| Big Data on Hadoop Cost Factor | Key Consideration to Drive Down Cost |
| --- | --- |
| Storage | Look at databases that provide data compression to yield storage savings (better than GZip or LZO). |
| Hardware (nodes) | Granular data compression at the database level will reduce node count over time. |
| Data analytics - skilled resources | Examine technology solutions that provide standard SQL or BI tool access in addition to MapReduce (Pig, etc.). |
| Cluster management - skilled resources | Leverage existing DevOps staff if you deploy a SQL-compliant data environment. |
| Security | Look for database solutions that provide built-in security permissions and access controls. |
| Availability / DR | Consider a data management environment that doesn't require additional tools for replication. |
| Training | Consider solutions where you don't need to retrain or hire all-new resources; leverage what you have (standard SQL-skilled DBAs). |
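Since the storage and hardware rows above both hinge on compression ratio, the quick sketch below measures ratios with two codecs from Python's standard library (gzip and LZMA) on synthetic, repetitive machine data. The sample is illustrative only; database-level columnar compression on real data may behave quite differently.

```python
import gzip
import lzma

# Synthetic, repetitive "machine data" -- the kind of record that compresses well.
record = b"2012-09-19T10:15:32Z,sensor-042,temperature,21.5,OK\n"
payload = record * 100_000  # ~5 MB of sample data

for name, compress in (("gzip", gzip.compress), ("lzma", lzma.compress)):
    packed = compress(payload)
    print(f"{name}: {len(payload) / len(packed):.0f}x "
          f"({len(payload):,} -> {len(packed):,} bytes)")
```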

Summary: Consider All Factors and Get Business Buy-in Quickly
Big Data is fundamentally a business problem. If you begin with the question "what is the business trying to achieve by collecting, storing and analyzing this new set of data?", you will start down the right path to realizing business gains. Whether you outsource the initiative or bring in external consultants and vendors to manage the project, the same questions will arise, and if you leverage what you already have - both existing IT environments and skills - you will be better able to contain costs.

Furthermore, we all love the promise of innovative new technologies such as Hadoop and MapReduce, but without leveraging the tried and tested standards we have come to rely on, they don't make a whole lot of sense from either a technical or an economic standpoint. As you start on your Big Data journey or project, be sure to ask what exactly the business requires and how you can leverage what you already have today. As we all know, getting business-user buy-in is half the battle in a successful rollout.

More Stories By John Bantleman

John Bantleman, CEO of RainStor, has more than 20 years' experience in the management of software companies. Before overseeing RainStor, he transformed LBMS into a $45 million business prior to its successful NASDAQ flotation in 1997; LBMS's technology is now part of CA's product portfolio. The following year, John was instrumental in the launch of Evolve, and drove the company through to a successful IPO on NASDAQ.

Returning to the UK in 2003, John spent 12 months working on the advisory boards of venture capital organizations such as Apax Partners. He joined RainStor Inc. as Chairman in 2004, became CEO at the start of 2007, and relocated back to the US to head up worldwide operations in 2009.



Most Recent Comments
Vikas.Deolaliker 09/21/12 06:49:00 PM EDT

Great article. Another data point: the IT budget is up only 4% in 2013 over 2012, so don't expect everyone to rush into Big Data.

The fourth "V" is visualization. If you cannot render the analysis in an intuitive way, there is no value in that analysis. In fact, visualization should be the first step in the design of a Big Data system - it helps trim the architectural bloat down to something that is within budget and useful.

Elad Israeli 09/19/12 06:07:00 PM EDT

Fascinating post. Still waiting for someone to crack the nut that is Big Data Analytics.

douglaney 08/29/12 03:36:00 PM EDT

Great piece, John. Excellent detail. Thought you and your readers might be interested in where the "3Vs" of big data originated - in a Gartner piece I authored over 11 years ago. I recently unearthed a copy so folks can refer to and cite it.

Cheers,
Doug Laney, VP Research, Gartner, @doug_laney
