Hadoop and Big Data Easily Understood - How to Conduct a Census of a City

What is Hadoop, what is Map Reduce and what is Big Data - how to count the residents of San Francisco, Los Angeles or a village.

Big Data (and Hadoop) are buzzwords and growth areas of computing; this article distills the concepts into easy-to-understand terms.

As the name implies, Big Data is literally "big data" or "lots of data" that needs to be processed. Let's take a simple example: the city council of San Francisco is required to take a census of its population - literally, how many people live at each address. City employees are tasked with counting the residents. The city of Los Angeles has a similar requirement.

Consider two methods to accomplish this task:

1. Ask all San Francisco residents to line up at City Hall to be processed by the city employees. Of course, this is very cumbersome and time consuming because the people are brought to City Hall and processed one by one - in scientific terms, the data is transferred to the processing node. The people have to wait in line for a long time, and the processing is lengthy as the employees go down the line counting and processing the residents: "How many people live at your address?" In scientific terms, the data is processed serially, one record after the next, and then aggregated at the end of the processing phase.

2. Send census forms to each address and ask the residents to complete the form on a specific date and return it to City Hall for aggregation. In scientific terms, the data is processed at the data node (the resident's address) in parallel (all forms are completed simultaneously by residents on a target date) and then aggregated at the processing node (City Hall).

These two phases, process and aggregate, are known in the Big Data community as "MapReduce": phase one maps the data, and phase two reduces the data to aggregate totals.
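To make the two phases concrete, here is a minimal sketch in plain Python (no Hadoop required). The sample addresses and resident counts are invented for illustration; the point is only that the map step handles each record independently, while the reduce step aggregates the results.

```python
from functools import reduce

# Each "census form" reports how many people live at one address.
# These records are made up purely for illustration.
census_forms = [
    {"address": "12 Market St",  "residents": 3},
    {"address": "98 Mission St", "residents": 5},
    {"address": "7 Castro St",   "residents": 2},
]

# Map phase: each form is handled independently, so in a real cluster
# all of these could be processed in parallel where the data lives.
mapped = [(form["address"], form["residents"]) for form in census_forms]

# Reduce phase: the per-address counts are aggregated into a city-wide total.
city_total = reduce(lambda total, pair: total + pair[1], mapped, 0)

print(city_total)  # prints 10
```

In a real system the map step runs on many machines at once, and only the small per-record results travel to the reduce step, not the raw data.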

So far so good.

How do we process large amounts of data, in parallel, very quickly using computers?

An open source project known as Hadoop, licensed under the Apache License, grew out of work at Yahoo! inspired by research published by Google. Hadoop performs the MapReduce function on a distributed file system known as HDFS (Hadoop Distributed File System). Why is the file system distributed? Remember, MapReduce performs the phase-one processing of data at the data node (the census form is completed at the resident's address). It is too time consuming to copy the data to a central processing node (requesting all the residents to line up at City Hall). Thus Hadoop uses a distributed file system, so that the processing takes place on many distributed servers at once (in parallel). Because Hadoop distributes the processing task, it can take advantage of cheap commodity hardware - compare this to processing all the data centrally on big, expensive hardware.
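In practice, the map and reduce steps can be written as two small scripts and handed to Hadoop Streaming, which runs them against data stored in HDFS. The sketch below is illustrative only: the file names mapper.py and reducer.py, and the input format of one tab-separated line per census form ("address<TAB>residents"), are assumptions made for this example.

```python
#!/usr/bin/env python3
# mapper.py - runs on the nodes where the census data lives.
# Reads "address<TAB>residents" lines and emits "city_total<TAB>residents"
# so every count ends up under a single key for aggregation.
import sys

for line in sys.stdin:
    _address, residents = line.rstrip("\n").split("\t")
    print(f"city_total\t{residents}")
```

```python
#!/usr/bin/env python3
# reducer.py - receives the key/value pairs emitted by the mappers
# and sums them into one aggregate total.
import sys

total = 0
for line in sys.stdin:
    _key, value = line.rstrip("\n").split("\t")
    total += int(value)

print(f"city_total\t{total}")
```

An illustrative invocation would look something like: hadoop jar $HADOOP_HOME/share/hadoop/tools/lib/hadoop-streaming-*.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input /census/forms -output /census/total, where the HDFS paths are placeholders.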

The advantage of cheap commodity hardware is that you only need as many servers as the job requires. To use the census analogy: how many computers would Hadoop require to process the census forms of San Francisco (about 47 square miles) compared to Los Angeles (400+ square miles)? If the census bureau buys one very large computer, that computer would be able to process the data for Los Angeles, but most of its compute power (and electricity) would be wasted when it processed the census data for San Francisco. Instead, the census bureau can rent, say, four computers to process the San Francisco data, and then rent perhaps another 90 to process the census data for Los Angeles. It is cheaper to rent commodity physical hardware, and the Hadoop task is also an opportunity to use an elastic cloud computing environment where compute power is consumed on demand.

More Stories By Jonathan Gershater

Jonathan Gershater has lived and worked in Silicon Valley since 1996, primarily doing system and sales engineering specializing in Web Applications, Identity and Security. At Red Hat, he provides Technical Marketing for Virtualization and Cloud. Prior to joining Red Hat, Jonathan worked at 3Com, Entrust (by acquisition), two startups, Sun Microsystems and Trend Micro.

(The views expressed in this blog are entirely mine and do not represent my employer - Jonathan).
