Making Apache Hadoop Less Retro: Bringing Standards to Big Data

The platform's penetration into enterprises has not lived up to Hadoop's game-changing business potential.

Ten short years ago, Apache Hadoop was a small project deployed on a few machines at Yahoo; within a few years, it had become the backbone of Yahoo's data infrastructure. Today, the Apache Hadoop market is forecast to surpass $16 billion by 2020.

This might lead you to believe that Apache Hadoop is currently the backbone of data infrastructures for all enterprises; however, widespread enterprise adoption has been shockingly low.

While the platform is a key technology for gaining business insights from organizational Big Data, its penetration into enterprises has not lived up to Hadoop's game-changing business potential. In fact, "Despite considerable hype and reported successes for early adopters, 54 percent of survey respondents report no plans to invest [in Hadoop] at this time, while only 18 percent have plans to invest in Hadoop over the next two years," said Nick Heudecker, research director at Gartner.

These findings demonstrate that although the open source platform is proven and popular among seasoned developers who need a technology that can power large, complex applications, its fragmented ecosystem has made it difficult for enterprises to extract value from their Apache Hadoop investments.

Another glaring barrier to adoption is the rapid, fragmented growth of Apache Hadoop components and platform distributions, which slows the development of the Big Data ecosystem and stunts enterprise implementation.

For legacy companies, platforms like Apache Hadoop seem daunting and risky. If these enterprises aren't able to initially identify the baseline business value they stand to gain from a technology, they are unlikely to invest - and this is where the value of industry standards comes into play.

Increasing adoption of Apache Hadoop, in my opinion, will require platform distributions to stop asking legacy corporations to technologically resemble Amazon, Twitter or Netflix. Widespread interoperability standards, ensuring compatibility across platform distributions and the management and integration applications built on them, would let Big Data application and solution providers offer enterprises a guaranteed, official minimum of functionality and interoperability for their Apache Hadoop investments.
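To make that concrete, here is a minimal sketch of what a "guaranteed baseline" can look like in practice: before doing real work, an application probes the client libraries on its classpath for the Hadoop version they report. The check uses Hadoop's actual org.apache.hadoop.util.VersionInfo class; the required major version is a hypothetical certification target, not something any current standard mandates.

```java
import org.apache.hadoop.util.VersionInfo;

public class BaselineCheck {
    // Hypothetical minimum the application was certified against.
    private static final int REQUIRED_MAJOR = 2;

    public static void main(String[] args) {
        // Version reported by the Hadoop client libraries on the
        // classpath, e.g. "2.7.3".
        String version = VersionInfo.getVersion();
        int major = Integer.parseInt(version.split("\\.")[0]);
        if (major < REQUIRED_MAJOR) {
            System.err.printf("Unsupported Hadoop %s; need %d.x or later%n",
                    version, REQUIRED_MAJOR);
            System.exit(1);
        }
        System.out.printf("Hadoop %s (built %s) meets the baseline%n",
                version, VersionInfo.getDate());
    }
}
```

With an agreed-upon baseline, a vendor could run a check like this once at installation time instead of certifying separately against every distribution.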

This baseline of technological expectation will also benefit companies looking to differentiate their offerings. Likewise, standards within this open source-based Big Data technology will enable application developers and enterprises to build data-driven applications more easily: standardizing the commodity components of an Apache Hadoop platform distribution spurs the creation of more applications, which boosts the entire ecosystem.
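The application-portability argument can also be sketched in code. The fragment below writes a file through Hadoop's stable, public FileSystem API and reads its cluster settings from the standard core-site.xml/hdfs-site.xml configuration mechanism, so the same jar runs unchanged on any distribution that honors those conventions; the output path is purely illustrative.

```java
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PortableWrite {
    public static void main(String[] args) throws Exception {
        // Loads core-site.xml / hdfs-site.xml from the classpath, the
        // configuration mechanism mainstream distributions all support.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path out = new Path("/tmp/standards-demo.txt"); // hypothetical path
        try (FSDataOutputStream stream = fs.create(out, true /* overwrite */)) {
            stream.write("written via the stable FileSystem API"
                    .getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Wrote " + fs.getFileStatus(out).getLen() + " bytes");
    }
}
```

Nothing here is distribution-specific, and that is exactly the property a formal standard would guarantee rather than leave to convention.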

A real-world illustration of standardization in practice comes from the container shipping industry, which grew significantly once universal guidelines were implemented. After the International Organization for Standardization (ISO) adopted a formal shipping container standard to ensure the safe and efficient transport of containers, trade increased more than 790 percent over 20 years, an incredible case for unifying and optimizing an entire ecosystem to ensure its longevity.

To help today's growing enterprise buyers harness the estimated 4ZB of data the world is generating, the open data community will need to work together to support standardization across Apache Hadoop and give new adopters confidence in their investments, regardless of the industry they serve.

For platform distributions, application and solution providers, and system integrators alike, known standards to operate within will not only help sustain this piece of the Big Data ecosystem; they will also define how its pieces interoperate and integrate more simply, to the benefit of the ever-important enterprise.

More Stories By John Mertic

John Mertic is Director of Program Management for ODPi and the Open Mainframe Project at The Linux Foundation. Previously, he was Director of Business Development, Software Alliances at Bitnami. He comes from a PHP and open source background, having been a developer, evangelist, and partnership leader at SugarCRM, a board member at OW2, president of OpenSocial, and a frequent conference speaker around the world. An avid writer, he has published articles on IBM developerWorks, Apple Developer Connection, and PHP Architect, and authored the books The Definitive Guide to SugarCRM: Better Business Applications and Building on SugarCRM.
