Predictive Real-Time Analytics Is the Big Data Lifeline

SAP is not the only firm driving this space, but its efforts to publicize its competencies in this arena are among the most visible

Big Data is everywhere. Predictive analytics and real-time in-memory computing aren't everywhere.

This truth (if we can accept it to be so) represents something of an imbalance.

As a subset of data mining, predictive analytics driven by in-memory computing efficiencies now has an opportunity to bring real-time analysis and insight to fast-moving live transactional data flows. Or to put it another (rather shorter) way, we can now start to manage and understand Big Data better than ever.

Application use cases here might typically include:

  • Meteorology
  • Genetics
  • Economics
  • Climate simulation
  • Oil exploration
  • Financial analysis and scientific research
  • Telecommunications, to name but seven good examples
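In each of these fields the underlying pattern is the same: a model trained on historical data is applied to records as they arrive, rather than in an overnight batch. As a purely illustrative sketch (the toy linear model, field names and threshold below are hypothetical, not any vendor's API), real-time predictive scoring of a transactional feed might look something like this in Python:

    from dataclasses import dataclass

    @dataclass
    class Transaction:
        customer_id: str
        amount: float
        channel: str

    def score(txn, weights):
        """Toy linear model: a higher score means a higher predicted propensity or risk."""
        return weights["amount"] * txn.amount + weights.get(txn.channel, 0.0)

    def process_stream(stream, weights, threshold=100.0):
        """Scan incoming transactions and yield those whose score crosses the threshold."""
        for txn in stream:
            if score(txn, weights) > threshold:
                yield txn  # candidate for a real-time alert, offer or block

    # In-memory stand-in for a live transactional feed
    feed = [Transaction("c1", 250.0, "online"), Transaction("c2", 40.0, "branch")]
    weights = {"amount": 0.5, "online": 10.0}
    for flagged in process_stream(feed, weights):
        print("flagged:", flagged.customer_id)

In a production setting the toy model would be replaced by whatever the data scientists have actually trained, and the in-memory list by a genuine event stream; the shape of the problem, though, stays the same.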

If we combine contemporary approaches to predictive analytics with the newly arrived Intel Xeon Phi coprocessor, which is claimed to deliver over one teraflop of computational power for highly parallel workloads, then CIOs can start to think about what "50-processor core computing" will mean for us in the very near future.
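The reason core counts matter for this kind of analytics is that scoring and aggregation are usually embarrassingly parallel: the data can be split into partitions and processed on every core at once. A minimal, hypothetical sketch of that pattern, with plain Python multiprocessing standing in for a many-core coprocessor runtime:

    from multiprocessing import Pool
    import os

    def score_chunk(chunk):
        # Stand-in for a compute-heavy scoring kernel applied to one data partition
        return sum(x * x for x in chunk)

    def parallel_score(data, workers=None):
        workers = workers or os.cpu_count() or 1
        size = max(1, len(data) // workers)
        chunks = [data[i:i + size] for i in range(0, len(data), size)]
        with Pool(workers) as pool:
            return sum(pool.map(score_chunk, chunks))

    if __name__ == "__main__":
        print(parallel_score(list(range(1_000_000))))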

Pushing forward in this space is SAP with its HANA in-memory computing appliance and platform. The firm is openly partnering with HP, Oracle, Cognizant and a variety of other big (and smaller specialist) players to form strategic alliances that will help further the uptake of this kind of technology.

Why Is Analytics Working Better?
Part of the reason that in-memory predictive analytics is now becoming so important, and so much easier to bring to bear, is hardware-related; part of it is software-related.

On the hardware end, Intel has worked to sharpen data throughput between memory and processor cores. This means that work going on right in the heart of the machine, which might have traveled at around five gigabytes per second (GB/s) five years ago, now has a chance to move at 100 GB/s. On the software end, firms like HP and SAP have been working to produce what are typically referred to as "business process solutions" that can deliver "context-aware experiences" to enable sense-and-respond scenarios and, therefore, faster and more personalized interactions with customers.
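A quick back-of-the-envelope calculation shows why that throughput jump matters; the working-set size below is an assumed figure for illustration only:

    # Assumed 500 GB in-memory working set; rates taken from the figures above
    dataset_gb = 500
    for rate_gb_per_s in (5, 100):
        print(f"{rate_gb_per_s} GB/s -> full scan in {dataset_gb / rate_gb_per_s:.0f} seconds")
    # 5 GB/s  -> full scan in 100 seconds
    # 100 GB/s -> full scan in 5 seconds

The difference between 100 seconds and 5 seconds is roughly the difference between a batch report and an interactive, ask-again-immediately style of analysis.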

As a matter of interest, HP also works to provide datacenter services to SAP that support its enterprise-wide hosting solutions for e-business applications - but that's another story.

In terms of actual application solutions running on SAP HANA, new products include the SAP Liquidity Risk Management application, the SAP Accelerated Trade Promotion Planning application and SAP Operational Process Intelligence software.

"These innovations show how SAP is rapidly delivering real-time, data-centric and industry-specific applications on the SAP HANA platform," said Dr Vishal Sikka, member of the SAP executive board for technology and innovation.

To take one example, SAP Liquidity Risk Management aims to provide banks with the ability to perform real-time, high-speed liquidity risk management and reporting on very large volumes of cash flows. In future, then, banks will be able to instantly measure key liquidity risk ratios (such as the Basel III liquidity coverage ratio) and cash flow gaps to resolve potential liquidity bottlenecks. The application aims to allow banks to apply different stress scenarios, such as adjusted run-off rates and bond haircuts, to gain a deeper understanding of how market volatility can impact liquidity positions.
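At its core, the liquidity coverage ratio compares a bank's stock of high-quality liquid assets with its net cash outflows over a 30-day stress window, with haircuts applied to assets and run-off rates applied to funding. The following is a deliberately simplified sketch of that calculation; the asset classes, haircuts and run-off rates are invented for illustration and bear no relation to regulatory values or to the SAP application's actual data model:

    def stressed_lcr(liquid_assets, outflows, inflows, haircuts, runoff_rates):
        """LCR = high-quality liquid assets / net cash outflows over a 30-day stress window."""
        hqla = sum(amount * (1 - haircuts.get(kind, 0.0))
                   for kind, amount in liquid_assets.items())
        gross_out = sum(amount * runoff_rates.get(kind, 1.0)
                        for kind, amount in outflows.items())
        net_out = max(gross_out - inflows, 0.25 * gross_out)  # simplified cap on inflow relief
        return hqla / net_out

    assets = {"sovereign_bonds": 800.0, "covered_bonds": 300.0}        # figures in millions
    outflows = {"retail_deposits": 2000.0, "wholesale_funding": 500.0}
    haircuts = {"covered_bonds": 0.15}                                 # bond haircut under stress
    runoff = {"retail_deposits": 0.10, "wholesale_funding": 0.40}      # adjusted run-off rates

    print(f"Stressed LCR: {stressed_lcr(assets, outflows, 150.0, haircuts, runoff):.2f}")

The point of running this sort of calculation in-memory is that the stressed parameters can be varied and the ratio recomputed across millions of cash flow records interactively, rather than overnight.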

Lessons for CIOs
Now you don't have to be a bank CIO or financial analyst to understand the wider importance of this technology: this is the "harnessing of Big Data" catchline that you've already heard bandied about by countless IT firms' press departments, except now it's really happening.

The lesson for CIOs and the software application developers serving them is that we now have a route to predictive real-time analysis and the power to view billions of stored records and live transactional data at the same time. CIOs should look to their solution architects and business process experts to compose the analytical data models that will drive the next phase of their technology growth.
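What such an analytical data model boils down to, at its simplest, is a query that treats stored historical records and fresh transactions as one logical dataset. A minimal, hypothetical sketch using pandas, with table and column names invented for illustration:

    import pandas as pd

    historical = pd.DataFrame({
        "customer_id": ["c1", "c2", "c1"],
        "amount": [120.0, 80.0, 45.0],
    })
    live = pd.DataFrame({
        "customer_id": ["c1"],
        "amount": [300.0],
    })

    # Treat the store of past rows and the latest transactions as one logical table
    combined = pd.concat([historical, live], ignore_index=True)
    summary = combined.groupby("customer_id")["amount"].agg(["count", "sum", "mean"])
    print(summary)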

SAP is not the only firm driving this space, but its efforts to publicize its competencies in this arena are among the most visible. The fact that SAP pushes many of the interfaces for managing the results of its data analysis to both Apple iPad and (now) Windows 8 format mobile devices may have confounded Steve Jobs at the time, but this is the firm's proof point for showing off its Big Data number-crunching applications. One day, quite soon, none of this will be a surprise.

This post first appeared on Enterprise CIO Forum.

More Stories By Adrian Bridgwater

Adrian Bridgwater is a freelance journalist and corporate content creation specialist focusing on cross-platform software application development, as well as all related aspects of software engineering, project management and technology as a whole.
