By Jack Norris
August 19, 2012 08:15 AM EDT
We have entered the "Age of Big Data," according to a recent New York Times article. This comes as no surprise to most organizations already struggling with an onslaught of data arriving from a growing number of sources at an ever-increasing rate. The 2011 IDC Digital Universe Study reported that data is growing faster than Moore's Law. This trend points to a paradigm shift in how organizations process data, one in which isolated islands and silos are replaced by large clusters of commodity servers that keep data and compute resources together.
Another way of looking at this paradigm shift is that the growing volume and velocity of data require a new approach to networked computing. A good example of this change is found at Google. The industry now takes Google's dominance for granted, but when Google launched its beta search engine in 1998, the company was late entering the market. At the time, Yahoo! was dominant; other contenders included Infoseek, Excite, Lycos, Ask Jeeves and AltaVista (which dominated technical searches). Within two years, Google was the dominant search provider. It wasn't until Google published papers on the Google File System (2003) and MapReduce (2004) that the world got a glimpse into its back-end architecture.
Google's architecture revealed how the company was able to index significantly more data, get far better results faster, and achieve those superior results far more efficiently and cost-effectively than its competitors. The shift Google made was to divide complex data analysis tasks into simple subtasks that could be performed in parallel on commodity servers: separate processes Map the data and then Reduce it into interim or final results. This MapReduce framework would eventually become available to organizations through distributions of Apache Hadoop.
A Brief History of Hadoop
After reading Google's papers, Doug Cutting, who would later join Yahoo!, developed a Java-based implementation of MapReduce and named it after his son's stuffed elephant: Hadoop. In 2006, Hadoop became a subproject of Lucene (a popular text search library) at the Apache Software Foundation (www.apache.org), and it became its own top-level Apache project in 2008.
Essentially, Hadoop provides a way to capture, organize, store, search, share, analyze and visualize disparate data sources (structured, semi-structured and unstructured) across a large cluster of commodity computers, and is designed to scale up from dozens to thousands of servers, each offering local computation and storage.
While there are several elements that are now part of Hadoop, two are fundamental to its operation. The first is the Hadoop Distributed File System (HDFS), which serves as the primary storage system. HDFS replicates and distributes the blocks of source data to the compute nodes throughout the cluster of servers, where they can be analyzed by one or more applications. The second is MapReduce, which provides a software framework and a programming model for writing applications that process vast amounts of distributed data in parallel on very large clusters.
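To make the programming model concrete, below is a minimal sketch of the canonical word-count job written against Hadoop's standard Java MapReduce API; it is illustrative rather than production-ready, and the input and output HDFS paths are simply taken from the command line.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map step: emit (word, 1) for every token in this node's slice of the data.
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reduce step: sum the partial counts gathered for each word.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = new Job(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // pre-aggregate counts on each node
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Because the framework handles splitting the input, scheduling the subtasks, shuffling the intermediate (word, 1) pairs and recovering from node failures, the same few dozen lines run unchanged whether the input is a few megabytes or many terabytes spread across the cluster.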
The open source nature of Apache Hadoop creates an ecosystem that facilitates constant advancements in its capabilities, performance, reliability and ease of use. These enhancements can be made by any individual or organization in a global community of contributors, and they are then either contributed back to the core Apache project or made available in a separate (often free) commercial distribution.
In effect, Hadoop is a complete system or "stack" for data analysis. The stack includes not only the HDFS and MapReduce foundation, but also job management, development tools, schedulers, machine learning libraries, etc.
KISS: Keep It Simple, Scalable
In a paper titled "The Unreasonable Effectiveness of Data," the authors (all research directors at Google) contrast the elegant simplicity of physics (with equations like E = mc²) with other disciplines, noting that "... sciences that involve human beings rather than elementary particles have proven more resistant to elegant mathematics."
The fact that simple formulas can explain the complex natural world, while comparably simple models of human behavior remain elusive, is fundamental to why Hadoop is gaining in popularity. The paper notes the frustration of economists, who lack similarly simple equations or models, and explores the advances being made in fields like natural language processing, a notoriously complex area that has been studied for years, including many attempts to apply artificial intelligence to gain some insight.
The authors found that relatively simple algorithms applied to massive datasets produced stunning results. One example involves scene completion: an algorithm was used to eliminate something in a picture (a car, for instance) and then, drawing on a corpus of photos, fill in the missing background. The algorithm performed rather poorly when the corpus held thousands of photos, but once the corpus was increased to millions, the same simple algorithm performed extremely well. This need to find patterns and fill in the "missing pieces" of a puzzle is a common theme in many data analytics applications today.
Data analytics also confronts another inherent complexity: the growth in unstructured and semi-structured data. The sources of unstructured data, such as log files, social media, videos, etc., are growing in both their size and importance. But even structured data that goes through a series of changes eventually loses some or all of its structure. Traditional analytic techniques require considerable preprocessing of unstructured and semi-structured data before being able to produce results, and the results can be wrong or misleading if the preprocessing is somehow flawed.
The ability of Hadoop to employ simple algorithms and obtain meaningful results when analyzing unstructured, semi-structured and structured data in its raw form is unprecedented, and currently unparalleled. MapReduce enables data to be analyzed incrementally, and in parallel, without any need for complex data transformations, preprocessing, predefined schemas or advance aggregation. Sometimes the interim results are quite revealing on their own, and unexpected results can be used to fine-tune additional analysis. In fact, Hadoop was designed to accommodate virtually all forms of data directly, eliminating the need for extraordinary measures before being able to unlock the value hidden within.
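As a rough illustration of this "schema on read" approach, here is a minimal sketch of a mapper that pulls a field straight out of raw, untransformed web-server log lines at read time; the log layout and field positions are hypothetical, chosen only to show the idea.

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical example: count hits per HTTP status code from raw logs.
// No ETL step and no predefined schema: structure is imposed on each
// line only as it is read.
public class StatusCodeMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text statusCode = new Text();

    @Override
    public void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        // Assumed log layout: "<ip> <timestamp> <request> <status> <bytes>".
        String[] fields = line.toString().split("\\s+");
        if (fields.length >= 4) {      // skip malformed lines instead of failing
            statusCode.set(fields[3]);
            context.write(statusCode, ONE);
        }
    }
}

Paired with a summing reducer like the one in the word-count sketch above, this counts requests per status code across the whole cluster without any upfront transformation; if the interim results reveal something unexpected, the parsing logic can be adjusted and the job simply rerun against the same raw files.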
The Price/Performance of Data Analytics
Not only does Hadoop deliver superior data analytics capabilities and results, it does so (as Google found) with an infrastructure that is far more cost-effective than traditional data analysis tools. The reason is that scaling data analytics capabilities has long been subject to the 80/20 rule: Big gains can be achieved with little initial effort (and cost), but the returns diminish as the datasets grow to become Big Data.
In stark contrast, Hadoop can scale linearly, which is the key to both effective and cost-effective data analytics. As datasets grow, the cost of traditional data analysis environments grows exponentially, so the additional cost required to gain additional insight eventually becomes prohibitive. With Hadoop, by contrast, a cluster of commodity (read: inexpensive) servers with direct-attached storage scales linearly with the number and size of the datasets.
Hadoop's ability to satisfy both prerequisites, effectiveness and cost-effectiveness, is the reason for its growing popularity in Web-based businesses and data-intensive organizations, as well as at aggressive start-ups. For the former, the scale of truly Big Data justifies a data analytics environment like Hadoop. For the latter, the absence of any legacy infrastructure makes it easy to benefit from Hadoop's advantages.
One major challenge to Hadoop adoption, however, remains its file system. HDFS is append-only storage that requires data to be batch loaded into a Hadoop cluster and then exported after processing for use by applications that don't support the HDFS API. Big Data can be difficult and costly to move back and forth in this fashion, owing to the inherent duplication of data across the "semantic wall" between the existing and Hadoop infrastructures.
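To make that round trip concrete, here is a minimal sketch using Hadoop's standard FileSystem API (the local and HDFS paths are hypothetical) of the load-then-export shuffle described above; note that the data ends up duplicated on both sides of the wall.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Copy source data into HDFS before a job runs, then copy the results
// back out so non-HDFS applications can read them. With Big Data, both
// copies are expensive in time, bandwidth and storage.
public class HdfsRoundTrip {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());

        // Batch load: local source data -> HDFS (hypothetical paths).
        fs.copyFromLocalFile(new Path("/data/source/events.log"),
                             new Path("/user/analytics/input/events.log"));

        // ... run MapReduce jobs against /user/analytics/input ...

        // Export: HDFS results -> local filesystem for downstream tools.
        fs.copyToLocalFile(new Path("/user/analytics/output/part-r-00000"),
                           new Path("/data/results/part-r-00000"));
    }
}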
Another barrier to production adoption of Hadoop in larger organizations involves the extraordinary measures required to make the environment dependable. Constant care is needed to ensure that single points of failure (especially in the NameNode and JobTracker) cannot cause catastrophe, and that in the case of data loss, data can be re-loaded into the Hadoop cluster.
Breaking Through the Barriers
These problems with Hadoop are themselves becoming a thing of the past. Open source communities can be quite large, creating a vibrant ecosystem, and this is the case with Hadoop, where several companies now provide commercial distributions based on open source Hadoop.
The growing number of commercial Hadoop distributions available is systematically breaking through the barriers to widespread adoption. In general, these distributions provide enhancements that make Hadoop easier to integrate into the enterprise, as well as more enterprise-class in its operation, performance and reliability. One way of achieving these enhancements is to use existing and standard communications protocols as a foundation to enable more seamless integration between legacy and Hadoop environments.
Such a common foundation makes the paradigm shift in data analytics feasible in virtually any organization. It eliminates the need to throw data back and forth over a "semantic wall" by tearing down that wall. This compatibility extends beyond the physical infrastructure into development environments and routine operating procedures, especially those involving data protection, such as snapshots and mirroring. With standards-based file access into the Hadoop cluster, existing applications and tools, and even ordinary browsers, are able to access the data directly and in real time (vs. Hadoop's traditional batch processing).
The End - or Just the Beginning
The data analytics paradigm is changing, and the change presents a real opportunity for established organizations to take full advantage of powerful new capabilities without sacrificing any existing ones. Hadoop makes it possible for any organization to gain the kind of significant competitive edge that Google did by taking full advantage of the insight this paradigm shift provides.
Hadoop is indeed a game-changing technology, and Hadoop is now itself changing with the advent of enterprise-class commercial distributions. By making Hadoop dependable enough for mission-critical operation (potentially at the same or an even lower total cost of ownership), these "next-generation" solutions make beginning the shift to the new data analytics paradigm less risky and more rewarding than ever before.