The Three ‘ilities’ of Big Data

Part 1: Portability, usability & quality converge to define how well the processing power of Big Data platforms can be harnessed

When talking about Big Data, most people talk about numbers: speed of processing and how many terabytes and petabytes the platform can handle. But deriving deep insights with the potential to change business growth trajectories relies not just on quantities, processing power, and speed, but also on three key ‘ilities’: portability, usability, and quality of the data.

Portability, usability, and quality converge to define how well the processing power of the Big Data platform can be harnessed to deliver consistent, high-quality, dependable, and predictable enterprise-grade insights.

Portability: Ability to transport data and insights in and out of the system

Usability: Ability to use the system to hypothesize, collaborate, analyze, and ultimately to derive insights from data

Quality: Ability to produce highly reliable and trustworthy insights from the system

Portability
Portability is measured by how easily data sources (or providers) as well as data and analytics consumers (the primary "actors" in a Big Data system) can send data to, and consume data from, the system.

Data sources can be internal systems and data sets, external data providers, or the apps and APIs that generate your data. A measure of high portability is how easily data providers and producers can send data to your Big Data system, as well as how effortlessly they can connect to the enterprise data system to deliver context.

Analytics consumers are the business users and developers who examine the data to uncover patterns. Consumers expect to be able to inspect their raw, intermediate, or output data, not only to define and design analyses but also to visualize and interpret results. A measure of high portability for data consumers is easy access, both manual and programmatic, to raw, intermediate, and processed data. Highly portable systems enable consumers to readily trigger analytical jobs and receive notification when data or insights are available for consumption.
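
To make this concrete, here is a minimal sketch of what portable, programmatic access could look like from a consumer's side, assuming a hypothetical HTTP API at analytics.example.com; the endpoints, dataset names, and job states are illustrative, not any specific product's interface.

```python
import time
import requests

BASE = "https://analytics.example.com/v1"  # hypothetical Big Data platform API

# 1. A data provider pushes a batch of records into the system.
records = [{"user_id": 42, "event": "checkout", "amount": 19.99}]
requests.post(f"{BASE}/datasets/clickstream/records", json=records).raise_for_status()

# 2. A consumer triggers an analytical job over that dataset.
job = requests.post(f"{BASE}/jobs",
                    json={"dataset": "clickstream", "analysis": "daily_revenue"}).json()

# 3. The consumer polls until insights are ready, then fetches them.
while True:
    status = requests.get(f"{BASE}/jobs/{job['id']}").json()["status"]
    if status in ("SUCCEEDED", "FAILED"):
        break
    time.sleep(5)  # in practice, a push notification would replace polling

if status == "SUCCEEDED":
    insights = requests.get(f"{BASE}/jobs/{job['id']}/results").json()
    print(insights)
```

In a highly portable system the same operations are equally available manually through a UI, and notifications replace the polling loop.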

Usability
The usability of a Big Data system is the largest contributor to the perceived and actual value of that system. That's why enterprises need to consider whether their Big Data analytics investment provides functionality that not only generates useful insights but is also easy to use.

Business users need an easy way to:

  • Request analytics insights
  • Explore data and generate hypotheses
  • Self-serve and generate insights
  • Collaborate with data scientists, developers, and business users
  • Track and integrate insights into business critical systems, data apps, and strategic planning processes

Developers and data scientists need an easy way to:

  • Define analytical jobs
  • Collect, prepare, preprocess, and cleanse data for analysis
  • Add context to their data sets
  • Understand how, when, and where the data was created, how to interpret it, and who created it (see the sketch after this list)
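
As one illustration of the last two needs, here is a minimal sketch, assuming a pandas-style workflow, of a cleansing step that also attaches lineage context to the prepared data set; the column names, file paths, and lineage fields are assumptions for the example.

```python
from datetime import datetime, timezone
import pandas as pd

raw = pd.read_csv("events.csv")  # hypothetical export from one data source

# Cleanse: drop rows missing required keys, normalize types, deduplicate.
clean = (raw.dropna(subset=["user_id", "timestamp"])
            .astype({"user_id": "int64"})
            .drop_duplicates())
clean["timestamp"] = pd.to_datetime(clean["timestamp"], errors="coerce", utc=True)
clean = clean.dropna(subset=["timestamp"])

# Add context: record how, when, and by whom this data set was produced,
# so later consumers can interpret it without guessing.
clean.attrs["lineage"] = {
    "source_file": "events.csv",
    "produced_by": "etl/clean_events.py",
    "produced_at": datetime.now(timezone.utc).isoformat(),
    "row_count": len(clean),
    "notes": "timestamps coerced to UTC; rows missing user_id dropped",
}
```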

Quality
The quality of a Big Data system is dependent on the quality of input data streams, data processing jobs, and output delivery systems.

Input Quality: As the number, diversity, frequency, and format of data channel sources explode, it is critical that enterprise-grade Big Data platforms track the quality and consistency of data sources. The platform can then alert downstream consumers to changes in the quality, volume, velocity, or configuration of their data streams.
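
A minimal sketch of such tracking, with assumed thresholds and a stand-in alert hook, might compare each incoming batch against a baseline profile:

```python
EXPECTED_COLUMNS = {"user_id", "event", "timestamp"}
BASELINE_ROWS_PER_BATCH = 10_000   # assumed historical average
MAX_NULL_RATE = 0.05               # assumed tolerance

def alert(message: str) -> None:
    # Stand-in for a real notification channel (email, pager, message bus).
    print(f"QUALITY ALERT: {message}")

def check_batch(rows: list[dict]) -> None:
    # Volume drift: batches far below baseline suggest an upstream outage.
    if len(rows) < 0.5 * BASELINE_ROWS_PER_BATCH:
        alert(f"volume drop: {len(rows)} rows vs baseline {BASELINE_ROWS_PER_BATCH}")

    # Schema drift: a changed source configuration often shows up as missing fields.
    if rows and not EXPECTED_COLUMNS.issubset(rows[0]):
        alert(f"schema change: missing {EXPECTED_COLUMNS - set(rows[0])}")

    # Completeness: rising null rates signal degraded source quality.
    nulls = sum(1 for r in rows for c in EXPECTED_COLUMNS if r.get(c) is None)
    if rows and nulls / (len(rows) * len(EXPECTED_COLUMNS)) > MAX_NULL_RATE:
        alert("null rate above tolerance")
```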

Analytical Job Quality: A Big Data system should track and notify users about the quality of the jobs (such as MapReduce or event processing jobs) that process incoming data sets to produce intermediate or output data sets.
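
One simple way to approach this is to wrap every job so that its outcome and basic output statistics are recorded and users are notified of failures or suspiciously small results. The notify hook and the record-count heuristic below are illustrative assumptions:

```python
import time

def notify(message: str) -> None:
    # Stand-in for the platform's real notification mechanism.
    print(f"JOB NOTICE: {message}")

def run_tracked(job_name, job_fn, min_expected_records=1):
    """Run a processing job (job_fn returns its output records), track its
    outcome, and flag suspicious results."""
    started = time.time()
    try:
        output_records = job_fn()  # e.g. a MapReduce or event-processing step
    except Exception as exc:
        notify(f"{job_name} FAILED after {time.time() - started:.1f}s: {exc}")
        raise
    if len(output_records) < min_expected_records:
        # A job can "succeed" yet silently produce too little data.
        notify(f"{job_name} produced only {len(output_records)} records")
    notify(f"{job_name} SUCCEEDED in {time.time() - started:.1f}s")
    return output_records
```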

Output Quality: Quality checks on the outputs from Big Data systems ensure that transactional systems, users, and apps offer dependable, high-quality insights to their end users. Output from Big Data systems needs to be checked for delivery predictability, statistical significance, and accessibility within the constraints of the consuming transactional system.
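
As an example of one such check, an output gate might hold back an insight whose supporting sample is too small or too noisy to be statistically meaningful; the two-sided z-test and cutoff values below are one reasonable choice for the sketch, not a prescribed method:

```python
import math

def significant_lift(control_mean, test_mean, pooled_std, n, z_cutoff=1.96):
    """Two-sided z-test sketch: is the observed lift distinguishable from noise?"""
    if n < 30 or pooled_std == 0:          # assumed minimum sample for a z-test
        return False
    se = pooled_std * math.sqrt(2.0 / n)   # standard error of the difference
    return abs(test_mean - control_mean) / se > z_cutoff

# Release the insight to downstream transactional systems only if it passes.
if significant_lift(control_mean=4.1, test_mean=4.6, pooled_std=1.2, n=500):
    print("insight cleared for delivery")
else:
    print("held back: not statistically significant")
```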

Though we've explored how portability, usability, and quality separately influence the consistency, quality, dependability, and predictability of your data systems, remember it's the combination of the ilities that determines if your Big Data system will deliver actionable enterprise-grade insights.

This piece is the first in a three-part series on how businesses can squeeze maximum business value out of their Big Data analysis.

More Stories By Kumar Srivastava

Kumar Srivastava is the product management lead for Apigee Insights and Apigee Analytics products at Apigee. Before Apigee, he was at Microsoft where he worked on several different products such as Bing, Online Safety, Hotmail Anti-Spam and PC Safety and Security services. Prior to Microsoft, he was at Columbia University working as a graduate researcher in areas such as VOIP Spam, Social Networks and Trust, Authentication & Identity Management systems.
