Big Data Analytics: Thinking Outside of Hadoop

UIMA and Hadoop in Big Data Analytics

Big Data Predictions
In its recent '2012 Hype Cycle for Emerging Technologies,' research firm Gartner evaluated several technologies to come up with a list of those likely to dominate the future. "Big Data"-related technologies form a significant portion of that list; in particular, the following technologies revolve around the concept and usage of Big Data.

  • Social Analytics: Social analytics allows marketers to identify sentiment and spot trends in order to serve customers better.
  • Activity Streams: Activity streams are the future of enterprise collaboration, uniting people, data, and applications in real time in a central, accessible, virtual interface. Think of a company social network where every employee, system, and business process exchanges up-to-the-minute information about activities and outcomes.
  • Natural Language Question Answering: NLP is a technique for analyzing naturally occurring texts as part of linguistic processing to achieve human-like language processing.
  • Video Analytics: Forming a conceptual and contextual understanding of the content of a video.
  • Context-Enriched Services: Context-enriched services use information about a person or an object to proactively anticipate the user's needs and serve up relevant content.

These areas are just representative, but in general many of the emerging technologies revolve around the ability to process large amounts of data from hitherto unconventional sources and extract meaning from them.

A general search for Big Data on Google, or on any technical forum, shows that Big Data is almost synonymous with Hadoop. The main reason is that the storage and computational operations in Hadoop are parallel by design, which makes Hadoop highly scalable. Data can be accessed and operated upon without interdependencies until the results need to be reduced - and even the reduction itself can be performed in parallel. The result is that large amounts of data can be processed concurrently by many servers, greatly reducing the time to obtain results.

MapReduce is the core processing component of Hadoop: a parallel programming framework that processes unordered data. All data are represented as key/value pairs, and processing occurs in two main phases, the Map phase and the Reduce phase.
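
To make the two phases concrete, here is a minimal sketch of a token-counting job written against the standard Hadoop MapReduce Java API. The class names are illustrative; any key/value analysis follows the same shape.

// Minimal sketch of the two MapReduce phases using the standard Hadoop API.
// TokenCount and its inner class names are illustrative, not from the article.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class TokenCount {

    // Map phase: each input line is split into tokens, emitted as (token, 1) pairs.
    public static class TokenMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text token = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer it = new StringTokenizer(value.toString());
            while (it.hasMoreTokens()) {
                token.set(it.nextToken());
                context.write(token, ONE);
            }
        }
    }

    // Reduce phase: all values sharing a key arrive together and are aggregated.
    public static class TokenReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }
}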

However, careful analysis of the above technologies, such as Social Analytics, Video Analytics, and NLP, reveals that while all of them need the massively parallel processing power of Hadoop, they also need a level of intelligence that extracts meaningful information so that Hadoop's Map and Reduce functions can actually deliver the required insight.

UIMA and Big Data Analytics
Hadoop in its raw form will not provide all the insight required for Big Data processing in technologies like those mentioned earlier. This is evident from the fact that most tutorials and examples of Hadoop Big Data processing are about counting words or parsing text streams for specific words. These examples are good from an academic perspective but may not solve the real-life needs of Big Data processing and the associated insight enterprises require.

UIMA stands for Unstructured Information Management Architecture. It is a component software architecture for the development, discovery, composition and deployment of multi-modal analytics for the analysis of unstructured information and its integration with search and knowledge management technologies.

The architecture supports analytics over multiple modalities, including text, audio, and video.

Once the initial processors understand video, audio, and other media documents such as email and derive textual meaning from them, such as strings of tokens or other patterns, these values can be parsed by the various annotators of the UIMA pipeline.

The parser subcomponent is responsible for converting the crawled document from its native format into text that the downstream annotators can process.

UIMA is an architecture in which basic building blocks called Analysis Engines (AEs) are composed to analyze a document and infer and record descriptive attributes about the document as a whole, and/or about regions therein. This descriptive information, produced by AEs, is generally referred to as analysis results. Analysis results typically represent metadata about the document content. One way to think about AEs is as software agents that automatically discover and record metadata about original content.

Analysis Engines are constructed from building blocks called Annotators. An annotator is a component that contains analysis logic. Annotators analyze an artifact (for example, a text document) and create additional data (metadata) about that artifact. It is a goal of UIMA that annotators need not be concerned with anything other than their analysis logic - for example, the details of their deployment or their interaction with other annotators.
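
As an illustration, here is a minimal sketch of an annotator built on the core Apache UIMA Java SDK. It does nothing more than mark every occurrence of a fixed keyword as a generic Annotation in the CAS; the class name and keyword are illustrative, not something prescribed by UIMA.

// A minimal sketch of a UIMA annotator, assuming the core Apache UIMA Java SDK.
// It marks every occurrence of a fixed keyword with a generic Annotation;
// the class name and keyword below are illustrative only.
import org.apache.uima.analysis_component.JCasAnnotator_ImplBase;
import org.apache.uima.analysis_engine.AnalysisEngineProcessException;
import org.apache.uima.jcas.JCas;
import org.apache.uima.jcas.tcas.Annotation;

public class KeywordAnnotator extends JCasAnnotator_ImplBase {

    private static final String KEYWORD = "acquisition"; // illustrative only

    @Override
    public void process(JCas jcas) throws AnalysisEngineProcessException {
        String text = jcas.getDocumentText();
        if (text == null) {
            return;
        }
        // The annotator contains only analysis logic: find spans of interest
        // and record them as metadata (annotations) in the CAS index.
        int pos = text.indexOf(KEYWORD);
        while (pos >= 0) {
            Annotation a = new Annotation(jcas, pos, pos + KEYWORD.length());
            a.addToIndexes();
            pos = text.indexOf(KEYWORD, pos + KEYWORD.length());
        }
    }
}

Note that the analysis logic lives entirely in process(); how the annotator is deployed and combined with other annotators is handled outside the class, as described next.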

An Analysis Engine (AE) may contain a single annotator (this is referred to as a Primitive AE), or it may be a composition of others and therefore contain multiple annotators (this is referred to as an Aggregate AE).
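
Composition is classically expressed in XML descriptors; for brevity, the sketch below uses uimaFIT, an Apache UIMA subproject, to build an aggregate AE programmatically. KeywordAnnotator is the illustrative annotator sketched above, and the commented-out dictionary-lookup component is hypothetical.

// A minimal sketch of composing a primitive annotator into an aggregate AE,
// assuming uimaFIT for programmatic composition.
import org.apache.uima.analysis_engine.AnalysisEngineDescription;
import org.apache.uima.fit.factory.AggregateBuilder;
import org.apache.uima.fit.factory.AnalysisEngineFactory;
import org.apache.uima.fit.factory.JCasFactory;
import org.apache.uima.fit.pipeline.SimplePipeline;
import org.apache.uima.jcas.JCas;

public class AggregatePipeline {
    public static void main(String[] args) throws Exception {
        // Primitive AE: wraps exactly one annotator.
        AnalysisEngineDescription keyword =
                AnalysisEngineFactory.createEngineDescription(KeywordAnnotator.class);

        // Aggregate AE: composes one or more primitives into a single engine.
        AggregateBuilder builder = new AggregateBuilder();
        builder.add(keyword);
        // builder.add(AnalysisEngineFactory.createEngineDescription(DictionaryLookupAnnotator.class)); // hypothetical
        AnalysisEngineDescription aggregate = builder.createAggregateDescription();

        // Run the aggregate over a small document.
        JCas jcas = JCasFactory.createJCas();
        jcas.setDocumentText("The acquisition was announced on Monday.");
        SimplePipeline.runPipeline(jcas, aggregate);
    }
}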

Some examples of Annotators include:

  • Language Identification annotator
  • Linguistic Analysis annotator
  • Dictionary Lookup annotator
  • Named Entity Recognition annotator
  • Pattern Matcher annotator
  • Classification Module annotator
  • Custom annotators

While a detailed description of annotators for processing unstructured data is beyond the scope of this article, you can appreciate their power with one specific example below.

The OpenCalais Annotator component wraps the OpenCalais web service and makes the OpenCalais analysis results available in UIMA. OpenCalais can detect a large variety of entities, facts, and events, such as persons, companies, acquisitions, and mergers.

Summary
As is evident from the above, frameworks like UIMA extend Big Data processing toward much more meaningful insights and map them to real-world scenarios. While the massively parallel processing ability of Hadoop will be a key factor in Big Data initiatives, it alone is not enough, and frameworks like UIMA will play a much larger part.

More Stories By Srinivasan Sundara Rajan

Highly passionate about utilizing Digital Technologies to enable next generation enterprise. Believes in enterprise transformation through the Natives (Cloud Native & Mobile Native).
