Hadoop and Big Data Easily Understood - How to Conduct a Census of a City

What are Hadoop, MapReduce and Big Data? Explained by counting the residents of San Francisco, Los Angeles or a village.

Big Data (and Hadoop) are buzzwords and growth areas of computing; this article distills the concepts into easy-to-understand terms.

As the name implies, Big Data is literally "big data" - lots of data that needs to be processed. Let's take a simple example: the city council of San Francisco is required to take a census of its population - literally, how many people live at each address. City employees are tasked with counting the residents. The city of Los Angeles has a similar requirement.

Consider two methods to accomplish this task:

1. Ask all San Francisco residents to line up at City Hall and be processed by the city employees. This is cumbersome and time consuming because the people are brought to city hall and processed one by one - in scientific terms, the data is transferred to the processing node. The people wait in line for a long time, and the processing is lengthy as the employees go down the line counting and questioning the residents: "How many people live at your address?" In scientific terms, the data is processed serially, one record after the next, and then aggregated at the end of the processing phase.

2. Send a census form to each address and ask the residents to complete the form on a specific date and return it to city hall for aggregation. In scientific terms, the data is processed at the data node (the resident's address) in parallel (all forms are completed simultaneously by residents on a target date) and then aggregated at the processing node (city hall).

These two phases - process and aggregate - are known in the Big Data community as "MapReduce": phase one maps the data; phase two reduces the data to aggregate totals.
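To make the two phases concrete, here is a minimal sketch in plain Python. The census forms, field names and the city key are invented for illustration; a real Hadoop job runs the same two functions across many machines rather than over a single in-memory list:

```python
# A minimal, self-contained sketch of the two MapReduce phases in plain Python.
# The census "forms" and field names below are made up for illustration.
from collections import defaultdict

# Each completed census form records an address and how many people live there.
census_forms = [
    {"address": "12 Mission St", "residents": 3},
    {"address": "98 Valencia St", "residents": 5},
    {"address": "7 Castro St", "residents": 2},
]

def map_phase(form):
    """Map: process one form where it lives and emit a (key, value) pair."""
    return ("San Francisco", form["residents"])

def reduce_phase(key, values):
    """Reduce: aggregate all values that share the same key."""
    return (key, sum(values))

# Shuffle/group step: collect mapped values by key, then reduce each group.
grouped = defaultdict(list)
for form in census_forms:
    key, value = map_phase(form)
    grouped[key].append(value)

totals = [reduce_phase(key, values) for key, values in grouped.items()]
print(totals)  # [('San Francisco', 10)]
```

The map step corresponds to each household filling in its own form; the group-and-reduce step corresponds to city hall adding up the returned forms.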

So far so good.

How do we process large amounts of data, in parallel, very quickly using computers?

An open source project known as Hadoop, licensed under the Apache License, grew out of work at Yahoo! inspired by research papers from Google. Hadoop performs the MapReduce function on a distributed file system known as HDFS (the Hadoop Distributed File System). Why is the file system distributed? Remember, the map phase processes data at the data node (the census form is completed at the resident's address). It is too time consuming to copy the data to a central processing node (asking all the residents to line up at city hall). Thus Hadoop uses a distributed file system, so that the processing takes place on many distributed servers at once (in parallel). Because Hadoop distributes the processing task, it can take advantage of cheap commodity hardware - compare this to processing all the data centrally on big, expensive hardware.
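As an illustration of how this looks in practice, the following is a hedged sketch of a mapper and reducer in the style used by Hadoop Streaming, which pipes data stored in HDFS through scripts via standard input and output. The input format (address, tab, resident count), the file name census_mr.py and the HDFS paths /census/forms and /census/totals are assumptions for this example, not details from the article:

```python
#!/usr/bin/env python3
# A hedged sketch of a Hadoop Streaming mapper and reducer for the census example.
# Assumed input: one line per form, formatted as "address<TAB>residents".
#
# An illustrative invocation with Hadoop Streaming would resemble:
#   hadoop jar hadoop-streaming.jar \
#       -input /census/forms -output /census/totals \
#       -mapper "census_mr.py map" -reducer "census_mr.py reduce" \
#       -file census_mr.py
import sys

def mapper():
    """Runs on the data nodes: emit (key, residents) for every form."""
    for line in sys.stdin:
        _address, residents = line.rstrip("\n").split("\t")
        # Emit a constant key so a single reducer can total the whole city.
        print(f"city\t{residents}")

def reducer():
    """Runs after the shuffle: Hadoop delivers the mapper output sorted by key."""
    total = 0
    for line in sys.stdin:
        _key, residents = line.rstrip("\n").split("\t")
        total += int(residents)
    print(f"total_residents\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```

The key point is that the mapper runs on the servers that already hold the data blocks, so only the small (key, value) pairs travel across the network to the reducer.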

The advantage of cheap commodity hardware is that you only use as many servers as you need. To use the census analogy: how many computers would Hadoop require to process the census forms of San Francisco (roughly 47 square miles) compared to Los Angeles (roughly 470 square miles)? If the census bureau buys one very large computer, that computer could process the data for Los Angeles, but most of its compute power (and electricity) would be wasted when it processed the census data for San Francisco. Instead, the census bureau can rent, say, four computers to process the San Francisco data, and then rent perhaps another 90 to process the census data for Los Angeles. Renting commodity physical hardware is cheaper, and the Hadoop task is also an opportunity to use an elastic cloud computing environment where compute power is consumed on demand.
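As a back-of-the-envelope illustration of that elasticity, the sketch below estimates how many commodity servers each city's workload would need; every number in it (forms per node per hour, the deadline, the form counts) is an assumed figure chosen only to mirror the 4-versus-90 comparison above, not a benchmark:

```python
# Illustrative cluster sizing; all numbers are assumptions, not measurements.
import math

FORMS_PER_NODE_PER_HOUR = 100_000   # assumed throughput of one commodity server
DEADLINE_HOURS = 1                  # assumed goal: finish within one hour

def nodes_needed(total_forms):
    """Smallest number of commodity servers that meets the deadline."""
    return math.ceil(total_forms / (FORMS_PER_NODE_PER_HOUR * DEADLINE_HOURS))

print(nodes_needed(400_000))    # San Francisco-sized workload -> 4 nodes
print(nodes_needed(9_000_000))  # Los Angeles-sized workload   -> 90 nodes
```

With an elastic cloud, those nodes exist only for the hour the job runs, which is exactly the "pay for what you use" advantage the analogy is driving at.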

More Stories By Jonathan Gershater

Jonathan Gershater has lived and worked in Silicon Valley since 1996, primarily doing systems and sales engineering specializing in web applications, identity and security. At Red Hat, he provides technical marketing for virtualization and cloud. Prior to joining Red Hat, Jonathan worked at 3Com, Entrust (by acquisition), two startups, Sun Microsystems and Trend Micro.

(The views expressed in this blog are entirely mine and do not represent my employer - Jonathan).
