The Age of Big Data: How to Gain Competitive Advantage

The Drivers Behind Hadoop Adoption

We have entered the "Age of Big Data," according to a recent New York Times article. This comes as no surprise to most organizations already struggling with the onslaught of data arriving from a growing number of sources at an ever-increasing rate. The 2011 IDC Digital Universe Study reported that data is growing faster than Moore's Law. This trend points to a paradigm shift in how organizations process data, one in which isolated data silos are being replaced by large clusters of commodity servers that keep data and compute resources together.

Another way of looking at this paradigm shift is that the growing volume and velocity of data require a new approach to networked computing. A good example of this change is found at Google. The industry now takes Google's dominance for granted, but when Google launched its beta search engine in 1998, the company was a late entrant to the market. At the time, Yahoo! was dominant; other contenders included Infoseek, Excite, Lycos, Ask Jeeves and AltaVista (which dominated technical searches). Within two years, Google was the dominant search provider. It wasn't until 2004, when Google published its paper on MapReduce, that the world got a glimpse into Google's back-end architecture.

Google's architecture revealed how the company was able to index significantly more data, get far better results faster, and achieve all of this much more efficiently and cost-effectively than its competitors. The shift Google made was to divide complex data analysis tasks into simple subtasks that could be performed in parallel on commodity servers: separate processes first map the data, then reduce it into interim or final results. This MapReduce framework would eventually become available to organizations through distributions of Apache Hadoop.

A Brief History of Hadoop
After reading Google's papers, Doug Cutting, then working on the open source Nutch search engine and later an engineer at Yahoo!, developed a Java-based implementation of MapReduce and named it after his son's stuffed elephant, Hadoop. In 2006, Hadoop became a subproject of Lucene (a popular text search library) at the Apache Software Foundation (www.apache.org), and in 2008 it became its own top-level Apache project.

Essentially, Hadoop provides a way to capture, organize, store, search, share, analyze and visualize disparate data sources (structured, semi-structured and unstructured) across a large cluster of commodity computers, and is designed to scale up from dozens to thousands of servers, each offering local computation and storage.

While there are several elements that now make up Hadoop, two are fundamental to its operation. The first is the Hadoop Distributed File System (HDFS), which serves as the primary storage system; HDFS replicates blocks of source data and distributes them to compute nodes throughout the cluster, where they can be analyzed by one or more applications. The second is MapReduce, which provides both a software framework and a programming model for writing applications that process vast amounts of distributed data in parallel on very large clusters.
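To make these two pieces concrete, below is a minimal sketch of the canonical word-count job written against Hadoop's Java MapReduce API, the example Apache itself uses to introduce the framework. The map function emits an intermediate (word, 1) pair for every token it reads from its local block of input, and the reduce function sums the values it receives for each word; the input and output paths are supplied on the command line.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Runs once per input record; each mapper works on a local block of data.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);   // emit (word, 1)
      }
    }
  }

  // Receives each word together with all of its counts and sums them.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);   // emit (word, total count)
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);  // pre-aggregate on each node
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

The framework takes care of the rest: scheduling map tasks next to the data blocks they read, shuffling intermediate pairs to the reducers, and retrying any tasks that fail.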

The open source nature of Apache Hadoop creates an ecosystem that facilitates constant advances in its capabilities, performance, reliability and ease of use. These enhancements can be made by any individual or organization in its global community of contributors, and are then either contributed to the core Apache project or made available in a separate (often free) commercial distribution.

In effect, Hadoop is a complete system or "stack" for data analysis. The stack includes not only the HDFS and MapReduce foundation, but also job management, development tools, schedulers, machine learning libraries, etc.

KISS: Keep It Simple, Scalable
In a paper titled "The Unreasonable Effectiveness of Data," the authors (all research directors at Google) contrast the elegant simplicity of physics (with equations like E = mc²) with other disciplines, noting that "... sciences that involve human beings rather than elementary particles have proven more resistant to elegant mathematics."

The fact that simple formulas can explain the complex natural world, while comparably simple models of human behavior remain elusive, is fundamental to why Hadoop is gaining in popularity. The paper notes the frustration of economists, who lack such simple equations and models, and explores advances being made in fields like natural language processing, a notoriously complex area that has been studied for years, with many attempts to apply artificial intelligence as a means of gaining insight.

The authors found that relatively simple algorithms applied to massive datasets produced stunning results. One example involves scene completion: an algorithm was used to eliminate something in a picture, a car for instance, and then to fill in the missing background based on a corpus of photographs. The algorithm performed rather poorly with a corpus of thousands of pictures, but once the corpus was increased to millions, the same simple algorithm performed extremely well. This need to find patterns and fill in the missing pieces of a puzzle is a common theme in many data analytics applications today.

Data analytics also confronts another inherent complexity: the growth in unstructured and semi-structured data. The sources of unstructured data, such as log files, social media, videos, etc., are growing in both their size and importance. But even structured data that goes through a series of changes eventually loses some or all of its structure. Traditional analytic techniques require considerable preprocessing of unstructured and semi-structured data before being able to produce results, and the results can be wrong or misleading if the preprocessing is somehow flawed.

The ability of Hadoop to employ simple algorithms and obtain meaningful results from unstructured, semi-structured and structured data in its raw form is unprecedented, and currently unparalleled. MapReduce enables data to be analyzed incrementally and in parallel, without complex data transformations or other preprocessing of the sources, and without creating schemas or aggregating data in advance. Interim results can be quite revealing on their own, and unexpected results can be used to fine-tune further analysis. In fact, Hadoop was designed to accommodate virtually all forms of data directly, eliminating the need for extraordinary measures before unlocking the value hidden within.
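As a hedged sketch of what "raw form" analysis looks like in practice, the mapper below counts ERROR lines per component directly from log files as they sit in the cluster. The log layout assumed here (whitespace-separated fields, with a component name in the fourth field) is purely illustrative; the point is that the parsing happens at read time, inside the map function, with no upfront transformation step.

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Counts ERROR lines per component straight from raw log files.
// No schema, preprocessing, or aggregation is required in advance;
// lines that don't match the assumed layout are simply bucketed
// as "unknown" rather than breaking the job.
public class ErrorCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
  private static final IntWritable ONE = new IntWritable(1);
  private final Text component = new Text();

  @Override
  protected void map(LongWritable offset, Text line, Context context)
      throws IOException, InterruptedException {
    String raw = line.toString();
    if (!raw.contains("ERROR")) {
      return;                              // inspect the raw line as-is
    }
    String[] fields = raw.split("\\s+");   // assumed: whitespace-separated
    component.set(fields.length > 3 ? fields[3] : "unknown");
    context.write(component, ONE);         // emit (component, 1)
  }
}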

The Price/Performance of Data Analytics
Not only does Hadoop deliver superior data analytics capabilities and results, it does so (as Google found) with an infrastructure that is far more cost-effective than traditional data analysis tools. The reason is that scaling data analytics capabilities has long been subject to the 80/20 rule: Big gains can be achieved with little initial effort (and cost), but the returns diminish as the datasets grow to become Big Data.

In stark contrast, Hadoop scales linearly, which is the key to both effective and cost-effective data analytics. As datasets grow, the cost of traditional data analysis environments grows exponentially, so the additional cost required to gain additional insight eventually becomes prohibitive. With Hadoop, by contrast, a cluster of commodity (read: inexpensive) servers with direct-attached storage scales linearly with the number and size of datasets.
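To make the contrast concrete, here is a stylized cost model; the per-node cost c and per-node capacity q are illustrative symbols, not figures from the article. Adding a commodity node adds a fixed cost and a fixed amount of capacity, so a Hadoop cluster's cost stays roughly proportional to the volume of data D it holds:

C_{\text{Hadoop}}(D) \approx c \left\lceil \frac{D}{q} \right\rceil

In a traditional scale-up environment, by comparison, the marginal cost of each additional unit of capacity rises as the platform approaches its design limits, so total cost grows faster than the data itself.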

Hadoop's ability to deliver analytics both effectively and cost-effectively explains its growing popularity among Web-based businesses and data-intensive organizations, as well as at aggressive start-ups. For the former, the need to wrestle with truly Big Data justifies a data analytics environment like Hadoop; for the latter, the lack of anything legacy makes it easy to benefit from Hadoop's advantages.

One major challenge to Hadoop adoption, however, remains its file system. HDFS is an append-only file system that requires data to be batch loaded into a Hadoop cluster and then exported after processing for use by applications that don't support the HDFS API. Big Data can be difficult and costly to move back and forth in this fashion, owing to the inherent duplication of data across the "semantic wall" between the existing and Hadoop infrastructures.
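The round trip looks roughly like the sketch below, written against Hadoop's Java FileSystem API; the local and cluster paths are hypothetical.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class LoadAndExport {
  public static void main(String[] args) throws Exception {
    // Connect to the cluster's default file system (HDFS).
    FileSystem fs = FileSystem.get(new Configuration());

    // Batch-load the source data over the "semantic wall" into HDFS,
    // creating a second copy of data that already lives elsewhere.
    fs.copyFromLocalFile(new Path("/data/local/logs"),
                         new Path("/user/analytics/raw"));

    // ... MapReduce jobs process /user/analytics/raw here ...

    // Export the results back out for applications that cannot speak
    // the HDFS API, duplicating the data yet again.
    fs.copyToLocalFile(new Path("/user/analytics/out/part-r-00000"),
                       new Path("/data/local/results.txt"));
    fs.close();
  }
}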

Another barrier to production adoption of Hadoop in larger organizations is the extraordinary care required to make the environment dependable: ensuring that single points of failure (especially the NameNode and JobTracker) cannot cause a catastrophe, and that in the case of data loss, data can be reloaded into the Hadoop cluster.
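In the Hadoop generation this article describes, a common mitigation was to configure the NameNode to write its metadata to more than one directory, including a remote NFS mount, so that a single disk failure could not destroy the file system's namespace. A sketch of the relevant hdfs-site.xml entry follows; the paths are illustrative.

<!-- hdfs-site.xml: keep redundant copies of the NameNode metadata.
     Writing to a local disk and a remote NFS mount means no single
     disk failure can wipe out the namespace. Paths are illustrative. -->
<property>
  <name>dfs.name.dir</name>
  <value>/disk1/hdfs/name,/mnt/nfs/hdfs/name</value>
</property>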

Breaking Through the Barriers
These problems with Hadoop are themselves becoming a thing of the past. Open source communities can be quite large, creating a vibrant ecosystem, and this is the case with Hadoop: several companies now provide commercial distributions based on open source Hadoop.

The growing number of commercial Hadoop distributions available is systematically breaking through the barriers to widespread adoption. In general, these distributions provide enhancements that make Hadoop easier to integrate into the enterprise, as well as more enterprise-class in its operation, performance and reliability. One way of achieving these enhancements is to use existing and standard communications protocols as a foundation to enable more seamless integration between legacy and Hadoop environments.

Such a common foundation makes the paradigm shift in data analytics feasible for virtually any organization. It eliminates the need to throw data back and forth over a "semantic wall" by tearing down that wall. The compatibility afforded also extends beyond the physical infrastructure into development environments and routine operating procedures, especially those involving data protection, such as snapshots and mirroring. With standards-based file access to the Hadoop cluster, existing applications and tools, and even ordinary browsers, can access the data directly and in real time (vs. Hadoop's traditional batch processing).

The End - or Just the Beginning
The data analytics paradigm is changing, and the change presents a real opportunity for established organizations to take full advantage of powerful new capabilities without sacrificing any existing ones. Hadoop makes it possible for any organization to do what Google did: gain a significant competitive edge by taking full advantage of the insight provided by this paradigm shift.

Hadoop is indeed a game-changing technology, and Hadoop is now itself changing with the advent of enterprise-class commercial distributions. By making Hadoop more mission-critical in its operation (potentially with the same or an even lower total cost of ownership), these "next-generation" solutions make beginning the shift to the new data analytics paradigm less risky and more rewarding than ever before.

More Stories By Jack Norris

Jack Norris is vice president, marketing, MapR Technologies. He has over 20 years of enterprise software marketing experience. He leads worldwide marketing for the industry’s most advanced distribution for Hadoop. Jack’s experience ranges from defining new markets for small companies, leading marketing and business development for an early-stage cloud storage software provider, to increasing sales of new products for large public companies. Jack has also held senior executive roles with Brio Technology, SQRIBE, EMC, Rainfinity, and Bain and Company.
