The Age of Big Data: How to Gain Competitive Advantage

The Drivers Behind Hadoop Adoption

We have entered the "Age of Big Data," according to a recent New York Times article. This comes as no surprise to most organizations, which are already struggling with an onslaught of data arriving from an increasing number of sources at an increasing rate. The 2011 IDC Digital Universe Study reported that data is growing faster than Moore's Law. This trend points to a paradigm shift in how organizations process data: isolated islands and silos are being replaced by large clusters of commodity servers that keep data and compute resources together.

Another way of looking at this paradigm shift is that the growing volume and velocity of data require a new approach to networked computing. A good example of this change is found at Google. The industry now takes Google's dominance for granted, but when Google launched its beta search engine in 1998, the company was a late entrant. At the time, Yahoo! was dominant; other contenders included Infoseek, Excite, Lycos, Ask Jeeves and AltaVista (which dominated technical searches). Within two years, Google was the dominant search provider. It wasn't until 2004, when Google published its paper on MapReduce, that the world got a glimpse into Google's back-end architecture.

Google's architecture revealed how the company was able to index significantly more data, get far better results faster, and achieve those results far more efficiently and cost-effectively than any competitor. Google's shift was to divide complex data analysis tasks into simple subtasks that could be performed in parallel on commodity servers: separate processes Map the data and then Reduce it into interim or final results. This MapReduce framework would eventually become available to organizations through distributions of Apache Hadoop.
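
To make this division of labor concrete, here is a minimal sketch of the canonical word-count job written against Hadoop's Java MapReduce API (a simplified illustration, not production code): the Map step emits a (word, 1) pair for every token in its slice of the input, and the Reduce step sums the counts that the framework has grouped by word.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: each mapper sees one slice of the input and emits (word, 1).
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: the framework groups values by word; the reducer sums them.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each node
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Packaged into a jar, a job like this would typically be launched with something like "hadoop jar wordcount.jar WordCount <input dir> <output dir>"; the framework handles splitting the input, scheduling mappers near the data, and shuffling intermediate pairs to the reducers.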

A Brief History of Hadoop
After reading Google's papers, Doug Cutting (who later joined Yahoo!) developed a Java-based implementation of MapReduce and named it after his son's stuffed elephant: Hadoop. In 2006, Hadoop became a subproject of Lucene (a popular text search library) at the Apache Software Foundation (www.apache.org), and it became its own top-level Apache project in 2008.

Essentially, Hadoop provides a way to capture, organize, store, search, share, analyze and visualize disparate data sources (structured, semi-structured and unstructured) across a large cluster of commodity computers, and is designed to scale up from dozens to thousands of servers, each offering local computation and storage.

While several elements are now part of Hadoop, two are fundamental to its operation. The first is the Hadoop Distributed File System (HDFS), which serves as the primary storage system. HDFS replicates and distributes the blocks of source data to compute nodes throughout the cluster, where they can be analyzed by one or more applications. The second is MapReduce, a software framework and programming model for writing applications that process vast amounts of distributed data in parallel on very large clusters.
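
As a rough sketch of what this looks like to an application (the path, file contents and printed fields below are illustrative assumptions; the cluster address is normally supplied by core-site.xml on the classpath), a client might write a file into HDFS, read it back, and inspect how it was replicated:

import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsExample {
  public static void main(String[] args) throws Exception {
    // The NameNode location is read from configuration (fs.defaultFS),
    // e.g. hdfs://namenode:8020 in core-site.xml.
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    Path file = new Path("/demo/hello.txt"); // illustrative path

    // Write: HDFS splits the file into blocks and replicates each block
    // across DataNodes in the cluster (three copies by default).
    try (FSDataOutputStream out = fs.create(file, true)) {
      out.writeBytes("hello, hdfs\n");
    }

    // Read the file back as an ordinary stream.
    try (BufferedReader in =
             new BufferedReader(new InputStreamReader(fs.open(file)))) {
      System.out.println(in.readLine());
    }

    // Inspect the replication factor and block size of the stored file.
    FileStatus status = fs.getFileStatus(file);
    System.out.println("replication=" + status.getReplication()
        + " blockSize=" + status.getBlockSize());
  }
}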

The open source nature of Apache Hadoop creates an ecosystem that facilitates constant advancements in its capabilities, performance, reliability and ease of use. These enhancements can be made by any individual or organization (a global community of contributors) and are then either contributed to the basic Apache library or made available in a separate (often free) commercial distribution.

In effect, Hadoop is a complete system or "stack" for data analysis. The stack includes not only the HDFS and MapReduce foundation, but also job management, development tools, schedulers, machine learning libraries, etc.

KISS: Keep It Simple, Scalable
In a paper titled "The Unreasonable Effectiveness of Data," the authors (all research directors at Google) contrast the elegant simplicity of physics (with equations like E = mc²) with other disciplines, noting that "... sciences that involve human beings rather than elementary particles have proven more resistant to elegant mathematics."

The fact that simple formulas can explain the complex natural world, yet remain elusive in explaining human behavior, is fundamental to why Hadoop is gaining in popularity. The paper notes the frustration of economists, who lack similarly simple equations or models, and explores advances in fields like natural language processing, a notoriously complex area that has been studied for years, including many attempts to apply artificial intelligence as a means of gaining insight.

The authors found that relatively simple algorithms applied to massive datasets produced stunning results. One example involves scene completion: an algorithm was used to eliminate something in a picture, a car for instance, and then to fill in the missing background based on a corpus of other photos. The algorithm performed rather poorly with a corpus of thousands of photos, but once the corpus grew to millions, the same simple algorithm performed extremely well. This need to find patterns and fill in the "missing pieces" of a puzzle is a common theme in many data analytics applications today.

Data analytics also confronts another inherent complexity: the growth in unstructured and semi-structured data. The sources of unstructured data, such as log files, social media, videos, etc., are growing in both their size and importance. But even structured data that goes through a series of changes eventually loses some or all of its structure. Traditional analytic techniques require considerable preprocessing of unstructured and semi-structured data before being able to produce results, and the results can be wrong or misleading if the preprocessing is somehow flawed.

The ability of Hadoop to employ simple algorithms and obtain meaningful results when analyzing unstructured, semi-structured and structured data in its raw form is unprecedented, and currently unparalleled. MapReduce enables data to be analyzed incrementally and in parallel, without complex data transformations, without preprocessing of the source data, and without creating schemas or aggregating data in advance. Sometimes the interim results are revealing on their own, and unexpected results can be used to fine-tune further analysis. In fact, Hadoop was designed to accommodate virtually all forms of data directly, eliminating the need for extraordinary measures before the value hidden within can be unlocked.
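
As a hypothetical illustration of analyzing data in its raw form, the mapper below counts HTTP status codes straight from unprocessed web-server access logs: no ETL step, no schema, and malformed lines are simply skipped. The log format and field position are assumptions for this sketch; pairing it with a summing reducer like the one in the word-count example above would total requests per status code.

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper: emits (status code, 1) directly from raw,
// unpreprocessed access-log lines. Assumes a common log format such as:
//   127.0.0.1 - - [10/Oct/2012:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
public class StatusCodeMapper
    extends Mapper<LongWritable, Text, Text, IntWritable> {

  private static final IntWritable ONE = new IntWritable(1);
  private final Text statusCode = new Text();

  @Override
  public void map(LongWritable offset, Text line, Context context)
      throws IOException, InterruptedException {
    String[] parts = line.toString().split(" ");
    if (parts.length >= 9) {
      statusCode.set(parts[8]);       // status code follows the quoted request
      context.write(statusCode, ONE); // a summing reducer totals the counts
    }
    // Malformed lines are skipped; the raw data itself is never rewritten.
  }
}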

The Price/Performance of Data Analytics
Not only does Hadoop deliver superior data analytics capabilities and results, it does so (as Google found) with an infrastructure that is far more cost-effective than traditional data analysis tools. The reason is that scaling traditional data analytics has long been subject to the 80/20 rule: big gains can be achieved with little initial effort (and cost), but the returns diminish as the datasets grow to become Big Data.

In stark contrast, Hadoop scales linearly, which is the key to both effective and cost-effective data analytics. As datasets grow, the cost of a traditional data analysis environment grows exponentially, and the additional expense required to gain additional insight eventually becomes prohibitive. With Hadoop, a cluster of commodity (read: inexpensive) servers with direct-attached storage scales linearly with the number and size of the datasets.
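
One illustrative way to formalize this contrast (the functional forms below are assumptions chosen for exposition, not measurements from the article):

\[
  \mathrm{cost}_{\mathrm{traditional}}(D) \;\approx\; a\,e^{bD}
  \qquad \text{vs.} \qquad
  \mathrm{cost}_{\mathrm{Hadoop}}(D) \;\approx\; c\,D
\]

where \(D\) is dataset size and \(a\), \(b\), \(c\) are environment-specific constants. The marginal cost of one more unit of data is then \(ab\,e^{bD}\) for the traditional environment, growing without bound, versus the constant \(c\) for Hadoop: adding data means adding another commodity node at roughly the same price as the last one.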

Hadoop's ability to satisfy both prerequisites, effectiveness and cost-effectiveness, is the reason for its growing popularity in Web-based businesses and data-intensive organizations, as well as at aggressive start-ups. For the former, the need to wrestle with truly Big Data justifies a data analytics environment like Hadoop. For the latter, the absence of legacy infrastructure makes it easy to benefit from Hadoop's advantages.

One major challenge to Hadoop adoption, however, remains its file system. HDFS is an append-only store: data must be batch-loaded into the Hadoop cluster and then exported after processing for use by applications that don't support the HDFS API. Big Data can be difficult and costly to move back and forth in this fashion, owing to the inherent duplication of data across the "semantic wall" between the existing and Hadoop infrastructures.

Another barrier to production adoption of Hadoop in larger organizations is the extraordinary effort required to make the environment dependable. Constant care is needed to ensure that single points of failure (especially the NameNode and the JobTracker) cannot cause a catastrophe, and that lost data can be re-loaded into the Hadoop cluster.

Breaking Through the Barriers
These problems with Hadoop are themselves becoming a thing of the past. Open source communities can be quite large, creating a vibrant ecosystem, and this is the case with Hadoop, where several companies now provide commercial distributions based on open source Hadoop.

The growing number of commercial Hadoop distributions available is systematically breaking through the barriers to widespread adoption. In general, these distributions provide enhancements that make Hadoop easier to integrate into the enterprise, as well as more enterprise-class in its operation, performance and reliability. One way of achieving these enhancements is to use existing and standard communications protocols as a foundation to enable more seamless integration between legacy and Hadoop environments.

Such a common foundation makes the paradigm shift in data analytics practical for virtually any organization. It eliminates the need to throw data back and forth over a "semantic wall" by tearing down that wall. The compatibility afforded also extends beyond the physical infrastructure into development environments and routine operating procedures, especially those involving data protection, such as snapshots and mirroring. With standards-based file access into the Hadoop cluster, existing applications and tools, and even ordinary browsers, can access the data directly and in real time (vs. Hadoop's traditional batch processing).

The End - or Just the Beginning
The data analytics paradigm is changing, and the change presents a real opportunity for established organizations to take full advantage of powerful new capabilities without sacrificing any existing ones. Hadoop makes it possible for any organization to do what Google did: gain a significant competitive edge by taking full advantage of the insight this paradigm shift provides.

Hadoop is indeed a game-changing technology, and Hadoop is now itself changing with the advent of enterprise-class commercial distributions. By making Hadoop dependable enough for mission-critical operation (potentially at the same or an even lower total cost of ownership), these "next-generation" solutions make beginning the shift to the new data analytics paradigm less risky and more rewarding than ever before.

More Stories By Jack Norris

Jack Norris is vice president, marketing, MapR Technologies. He has over 20 years of enterprise software marketing experience. He leads worldwide marketing for the industry’s most advanced distribution for Hadoop. Jack’s experience ranges from defining new markets for small companies, leading marketing and business development for an early-stage cloud storage software provider, to increasing sales of new products for large public companies. Jack has also held senior executive roles with Brio Technology, SQRIBE, EMC, Rainfinity, and Bain and Company.
