Simplify Data Backup and Protection in Complex Enterprise IT Environments

Improving speed and confidence

The latest BriefingsDirect IT trends discussion targets enterprise backup, why it’s broken, and how to fix it.

Nowadays the backup of enterprise information and associated data protection are fragmented, complex, and inefficient. But new approaches are helping to simplify the data-protection process, keep costs in check, and improve recovery speed and confidence.

Joining us to share insights on how data protection became such a mess -- and how new techniques are being adopted to gain comprehensive and standard control over the data lifecycle -- are John Maxwell, Vice President of Product Management for Data Protection at Quest Software, now part of Dell, and George Crump, Founder and Lead Analyst at Storage Switzerland, an analyst firm focused on the storage market. The chat is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.  [Disclosure: Quest Software is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:

Gardner: Why has something seemingly as straightforward as backup become so fragmented and disorganized?

Maxwell: Dana, I think it’s a perfect storm, to use an overused cliché. If you look back 20 years ago, we had heterogeneous environments, but they were much simpler. There were NetWare and UNIX, and there was this new thing called Windows. Virtualization didn’t even really exist. We backed up data to tape, and data was measured in terabytes, not petabytes.

Flash forward to 2012, and there’s more heterogeneity than ever. You have stalwart databases like Microsoft SQL Server and Oracle, but then you have new apps being built on MySQL. You now have virtualization, and, in fact, we're at the point this year where we're surpassing the 50 percent mark on the number of servers worldwide that are virtualized.

Now we're even starting to see people running multiple hypervisors, so it’s not even just one virtualization platform anymore, either. So the environment has gotten bigger, much bigger than we ever thought it could or would. We have numerous customers today that have data measured in petabytes, and we have a lot more applications to deal with.

And last, but not least, we now have more data that’s deemed mission critical, and by mission critical, I mean data that has to be recovered in less than an hour. Surveys 10 years ago showed that in a typical IT environment, 10 percent of the data was mission critical. Today, surveys show that it’s 50 percent and more.

Crump: I would dovetail into what he just mentioned about mission criticality. There are definitely more platforms, and that’s a challenge, but the expectation of the user is just higher. The term I use for it is IT is getting "Facebooked."

High expectations

I've had many IT guys say to me, "One of the common responses I get from my users is, 'My Facebook account is never down.'" So there is this really high expectation on availability, returning data, and things of that nature that probably isn’t really fair, but it’s reality.

One of the reasons that more data is getting classified as mission critical is just that the expectation that everything will be around forever is much higher.

The other thing that we forget sometimes is that the backup process, especially a network backup, probably unlike any other, stresses every single component in the infrastructure. You're pulling data off of a local storage device on a server, it’s going through that server CPU and memory, it’s going down a network card, down a network cable, to a switch, to another card, into some sort of storage device, be it disk or tape.

So there are 15 things that happen in a backup and all 15 things have to go flawlessly. If one thing is broken, the backup fails, and, of course, it’s the IT guy’s fault. It’s just a complex environment, and I don’t know of another process that pushes on all aspects of the environment in one fell swoop like backup does.
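
To make Crump's point concrete, here is a minimal sketch, with hypothetical stage names not taken from the discussion, of a network backup modeled as a chain of stages where a failure at any one link fails the whole job:

```python
# A minimal sketch (hypothetical stage names, not from the discussion):
# a network backup as a chain of stages, all of which must succeed.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Stage:
    name: str
    run: Callable[[bytes], bytes]  # each stage hands the data stream along

def run_backup(data: bytes, stages: List[Stage]) -> bool:
    for stage in stages:
        try:
            data = stage.run(data)
        except Exception as err:
            # One broken link fails the whole job, and in practice finding
            # which link broke is exactly the hard part.
            print(f"Backup FAILED at '{stage.name}': {err}")
            return False
    print("Backup succeeded")
    return True

# Illustrative chain; a real job traverses many more components.
pipeline = [
    Stage("read source disk", lambda d: d),
    Stage("server CPU/memory", lambda d: d),
    Stage("NIC, cable, and switch", lambda d: d),
    Stage("write to disk or tape target", lambda d: d),
]
run_backup(b"example-data", pipeline)
```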

Gardner: So the stakes are higher, the expectations are higher, the scale and volume and heterogeneity are all increased. What does this mean, John, for those that are tasked with managing this, or trying to get a handle on it as a process, rather than a technology-by-technology approach?

Maxwell: There are two issues here. One, you expect today's storage administrator, or sysadmin, to be a database administrator (DBA), a VMware administrator, a UNIX sysadmin, and a Windows admin. That’s a lot of responsibility, but that’s the reality.

A lot of people think that they are going to have as deep a level of knowledge of how to recover a Windows server as they would an Oracle database. That’s just not the case, and it's the same thing from a product perspective, from a technology perspective.

Is there really such thing as a backup product, the Swiss Army knife, that does the best of everything? Probably not, because being the best of everything means different things to different accounts. It means one thing for the small to medium-size business (SMB), and it could mean something altogether different for the enterprise.

We've now gotten into a situation where we have the typical IT environment using multiple backup products that, in most cases, have nothing in common. They have a lot of hands in the pot trying to manage data protection and restore data, and it has become a tangled mess.

Gardner: Before we dive a little bit deeper into some of these major areas, I'd like to just visit another issue that’s very top of mind for many organizations, and that’s security, compliance, and business continuity types of issues, risk mitigation issues. George Crump, how important is that to consider, when you look at taking more of a comprehensive or a holistic view of this backup and data-protection issue?

Disclosure laws

Crump: It's a really critical issue, and there are two ramifications. Probably the one that strikes fear in the heart of every CEO on the planet is all the disclosure laws that exist now that say that, when you lose a customer’s data, you have to let them know. Unfortunately, probably the only effective way to do that is to let everybody know.

I'm sure everybody listening to this podcast has gotten more than one letter already this year saying their Social Security number has been exposed, things like that. I can think of three or four I've already gotten this year.

So there is the downside of legally having to admit you made a mistake, and then there are the legal requirements of retaining information in case of a lawsuit. The traditional thing was that if I got a discovery motion filed against me, I needed to be able to pull this information back, and that was one motivator. But the bigger motivator is having to disclose that we did lose data.

And there's a new one coming in. We're hearing about big data, analytics, and things like that. All of that is based on being able to access old information in some form, pull it back from something, and be able to analyze it.

That is leading many, many organizations to not delete anything. But if you don't delete anything, how do you store it? A disk-only solution kept forever, as an example, is pretty expensive. I know disk has gotten a lot cheaper, but forever is a really long time to keep the lights on, so to speak.
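
A back-of-the-envelope sketch shows why "disk forever" gets expensive; every figure below (starting size, growth rate, cost per terabyte) is an assumption chosen only to illustrate the shape of the curve, not a number from the discussion:

```python
# Back-of-the-envelope sketch; all figures below are assumptions chosen
# only to illustrate the shape of the curve.
data_tb = 100.0           # assumed starting data set, in TB
growth_per_year = 0.40    # assumed 40 percent annual data growth
cost_per_tb_year = 300.0  # assumed $/TB/year for disk, power, and admin

total_cost = 0.0
for year in range(1, 11):
    total_cost += data_tb * cost_per_tb_year
    print(f"Year {year:2d}: {data_tb:9.0f} TB retained, cumulative ${total_cost:,.0f}")
    data_tb *= 1 + growth_per_year  # nothing is ever deleted
```

With compounding growth, most of the decade's cost lands in the last few years, which is why "keep everything on disk" decisions made today bite later.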

Gardner: Let's look at this a bit more from the problem-solution perspective. We have multiple platforms, we have operating systems, hypervisors, application types, even appliances. What's the solution?

Maxwell: The problem is we need to step back, take inventory of what we've got, and choose the right solution to solve the problem at hand, whether you're an SMB or an enterprise.

But the biggest thing we have to address is, with the amount and complexity of the data, how can we make sysadmins, storage administrators, and DBAs productive, and how can we get them all on the same page? Why do each one of these roles in IT have to use different products?

George and I were talking earlier. One of the things that he brought up was that in a lot of companies, data is getting backed up over and over by the DBA, the VMware administrator, and the storage administrator, which is really inefficient. We have to look at a holistic approach, and that may not be one-size-fits-all. It may be choosing the right solutions, yet providing a centralized means for administration, reporting, monitoring, etc.

Gardner: Is there anything different and specific about backup that makes this even harder to move from that point solution, best-of-breed mentality, into more of a comprehensive process standardization approach?

Demands and requirements

Crump: It really ties into what John said. Every line of business is going to have its own demands and requirements. To expect not even a backup administrator, but an Oracle administrator that’s managing an Oracle database for a line of business, to understand the nuances of that business and how they want to keep things is a lot to ask.

When backup is broken, the default survival mechanism is to throw everything out, buy the latest enterprise solution, put the stake in the ground, and force everybody to centralize on that one item. That works to a degree, but in every project we've been involved with, there are always three or four exceptions. That means it really didn’t work. You didn't really centralize.

Then there are covert operations of backups happening, where people are backing up data and not telling anybody, because they still don't trust the enterprise application. Eventually, something new comes out. The most immediate example is virtualization, which spawned the birth of several different virtualization-specific applications. So bringing all that back in again becomes very difficult.

I agree with John. What you need to do is give the users the tools they want. Users are too sophisticated now for you to say, "This is where we are going to back it up and you've got to live with it." They're just not going to put up with that anymore. It won't work.

So give them the tools that they want. Centralize the process, but not the actual software. I think that's really the way to go.

Gardner: So we recognize that one size fits all probably isn’t going to apply here. We're going to have multiple point solutions. That means integration at some level or multiple levels. That brings us to our next major topic. How do we integrate well without compounding the complexity and the problem set? John?

Maxwell: We've been working on this now for almost two years here at Quest, and now at Dell, and in November we are launching something called NetVault XA. “XA” stands for Extended Architecture. We have a portfolio of very rich products that span SMBs and the enterprise, with focus on virtual backup, heterogeneous backup, instantaneous snapshots, and deep application recovery, and we’re keenly interested in leveraging those technologies for the DBAs and sysadmins in ways that make their lives easier and make sure they are more productive.

NetVault XA solves some really big issues. First of all, it unifies the user experience across products, and by user, I mean the sysadmin, the DBA, and the storage administrator. The initial release of NetVault XA will support both our vRanger and NetVault Backup products, as well as our NetVault SmartDisk product, and next year we'll be adding even more of our products under NetVault XA.

So now we've provided a common means of administration. We have one UI. You don’t have to learn something different. Everyone can work on the same product, yet based on your login ID, you will have access to different things, whether it's data or capabilities, such as restoring an Oracle or SQL Server database, or restoring a virtual machine (VM).

That's a common UI. A lot of vendors right now have a lot of solutions, but they look like they're from three, four, or five different companies. We want to provide a singular user experience, but that's just really the icing on the cake with NetVault XA.
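
As a hypothetical sketch of that login-scoped model, with invented roles and actions rather than the actual NetVault XA API, the idea looks something like this:

```python
# Hypothetical sketch of role-scoped access (invented roles and actions,
# not the NetVault XA API): one console, different reach per login.
ROLE_CAPABILITIES = {
    "dba":           {"restore_oracle_db", "restore_sql_server_db"},
    "vm_admin":      {"restore_vm"},
    "storage_admin": {"manage_backup_jobs", "view_capacity_reports"},
}

def can(role: str, action: str) -> bool:
    """Return True if the logged-in role is allowed to perform the action."""
    return action in ROLE_CAPABILITIES.get(role, set())

# Same UI for everyone; capabilities differ by who logged in.
assert can("dba", "restore_oracle_db")
assert not can("dba", "restore_vm")
```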

If we go down a little deeper into NetVault XA, once it is installed alongside vRanger, NetVault Backup, or both, it's going to self-identify that vRanger or NetVault environment, and it's going to let you manage it the way you have already set it up.

New approach

We're really delivering a new approach here, one we think is going to be unique in the industry. That's the ability to logically group data and applications within lines of business.

You gave an example earlier of Oracle. Oracle is not an application. Oracle is a platform for applications, and sometimes applications span databases, file systems, and multiple servers. You need to be looking at that from a holistic level, meaning what makes up application A, what makes up application B, C, D, etc.?

Then, what are the service levels for those applications? How mission critical are they? Are they in that 50 percent of data that we've seen from surveys, or are they data where a restore from a week ago wouldn't matter? Then again, it's having one tool that everyone can use. So you now have a whole different user experience, and you're taking a whole different approach to data protection.
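
A minimal sketch of that grouping idea, with invented names and structures rather than NetVault XA's actual data model, might look like this:

```python
# Illustrative data model (names invented): applications grouped by line
# of business, each carrying its own recovery service level.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Application:
    name: str
    components: List[str]  # the databases, file systems, and servers it spans
    rto_minutes: int       # recovery time objective

@dataclass
class LineOfBusiness:
    name: str
    applications: List[Application] = field(default_factory=list)

    def mission_critical(self, threshold_minutes: int = 60) -> List[Application]:
        # "Mission critical" per the discussion: recoverable in under an hour.
        return [a for a in self.applications if a.rto_minutes < threshold_minutes]

finance = LineOfBusiness("Finance", [
    Application("billing", ["oracle:BILL", "nfs:/exports/bill", "vm:app01"], rto_minutes=30),
    Application("archive", ["sql:ARCH"], rto_minutes=1440),
])
print([a.name for a in finance.mission_critical()])  # -> ['billing']
```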

Gardner: There really seems to be a drilling down into these technologies and surfacing information to such a degree that it strikes me as similar to what IT service management (ITSM) did for managing IT systems at a higher level. We're now bringing that to a discrete portion of IT: backup and recovery. Does that sound about right, George, or did I overstate it?

Crump: No, that's dead-on. The benefits of that type of architecture are going to be substantial. Imagine if you were the vRanger programmer when all this started. Instead of having to write half of the backend, you could just plug into a framework that already existed and then focus most of your attention on the particular application or environment that you are going to protect.

You could be releasing the equivalent of vRanger 6 as vRanger 1, because you wouldn’t have to go write a backend that already existed. Also, if you think about it, you end up with a much more reliable software product, because now you're building on a class library that has been well tested and proven.

Say you want to implement deduplication in a new version of the product or a new product. Instead of having to rewrite your own deduplication engine, just leverage the engine that's already there.
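
Here is a toy sketch of the framework pattern Crump describes, with invented interfaces: plugins write only the application-specific capture step, while shared services such as a deduplicating store come from the tested backend.

```python
# Toy sketch of the framework pattern (interfaces invented): plugins write
# only application-specific capture; shared services come from the backend.
import hashlib
from abc import ABC, abstractmethod

class BackupFramework:
    """Shared, well-tested backend that every plugin builds on."""
    def __init__(self):
        self._chunks = {}  # hash -> chunk: a toy deduplicating store

    def store(self, data: bytes, chunk_size: int = 4096) -> None:
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            # Identical chunks are stored once; that is the dedupe engine
            # a plugin author gets for free.
            self._chunks.setdefault(hashlib.sha256(chunk).hexdigest(), chunk)

class BackupPlugin(ABC):
    def __init__(self, framework: BackupFramework):
        self.framework = framework

    @abstractmethod
    def capture(self) -> bytes:
        """The only part a plugin author must write."""

class VMwarePlugin(BackupPlugin):
    def capture(self) -> bytes:
        return b"vmdk-snapshot-bytes"  # stand-in for a real VM snapshot

fw = BackupFramework()
fw.store(VMwarePlugin(fw).capture())
```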

One common means

Maxwell: By having one common means -- whether you're a DBA, a sysadmin, a VMware administrator, or a storage administrator -- you are all on the same page. You can have people all buying into one way of doing things, so we don't have this data being backed up two or three times.

But the other thing that you get, and this is a big issue now, is protecting multiple sites. When we talk about multiple sites, people sometimes assume we mean multiple data centers, but what about all those remote offices and branch offices? That right now is a big issue that we see customers running into.

The beauty of NetVault XA is I can now have various solutions implemented, whether it's vRanger running remotely or NetVault in a branch office, and I can be managing it. I can manage all aspects of it to make sure that those backups are running properly, or make sure replication is working properly. It could be halfway around the country or halfway around the world, and this way we have consistency.

Speaking of reporting, as you said earlier, what about a dashboard for management? One of our early users of NetVault XA is a large multinational company with 18 data centers and 250,000 servers. They have had to dedicate people to write service-level reports for their backups. Now, with NetVault XA, they can literally give their IT management, meaning their CIO and their CTOs, login IDs to NetVault XA, and they can see a dashboard that’s been color coded.

It can say, "Well, everything is green, so everything is protected," whether it's the Linux servers, Oracle databases, Exchange email, whatever the case may be. So being able to reduce that level of complexity into a single pane of glass -- I know it's a cliché, but it fits here -- is very powerful for organizations large and small.
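
A minimal sketch of that kind of rollup, with invented sites and statuses, could be as simple as reducing per-site job results to a single color:

```python
# Minimal sketch (invented sites and statuses) of a color-coded rollup:
# per-site job results reduced to the green/yellow/red a CIO would see.
from collections import Counter
from typing import List

def site_color(job_results: List[str]) -> str:
    counts = Counter(job_results)
    if counts["failed"]:
        return "red"
    if counts["warning"]:
        return "yellow"
    return "green"

sites = {
    "NYC":    ["ok", "ok", "ok"],
    "London": ["ok", "warning"],
    "Tokyo":  ["ok", "failed", "ok"],
}
for name, results in sites.items():
    print(f"{name:8s} -> {site_color(results)}")
```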

Even if you have only two or three locations and 500 employees, wouldn’t it be nice to have the ability to look at your backups, your replicas, and your snapshots, whether they're in the data center or in branch offices, and, whether you're a sysadmin, DBA, or storage administrator, to be using one common interface and one common set of rules, so that everyone is basically on the same page?

Dispersed operations

So it's having a means to take an inventory and ensure that the servers are being maintained, that everything is being protected, because next to your employees, your data is the most important asset that you have.

Data is everywhere now. It’s in mobile devices. It certainly could be in cloud-based apps. That's one of the things that we didn’t talk about. At Quest we use seven software-as-a-service (SaaS)-based applications that run big parts of our business, whether it's Salesforce.com, our help desk systems, or even Office 365. This is mission-critical corporate data that doesn’t run in our own data center. How am I protecting that? Am I even cognizant of it?

The cloud has made things even more interesting, just as virtualization has made it more interesting over the past couple of years. With NetVault XA, we give you that one single pane of glass with which you can report, analyze, and manage all of your data.

Mobile devices

Gardner: Just to be clear, John, this console is something you can view as a web interface, and I'm assuming therefore also through mobile devices. I'm going to guess that at some point there will perhaps even be a more native application for some of the prominent mobile platforms.

Maxwell: It’s funny that you mentioned that. This is an HTML5-based application. So it's very new, very fresh, and very graphical. If you look at the UI, it was designed with tablets and laptops in mind. It's gotten to where you can do controls with your thumbs, assuming you're running this on a tablet.

In-house, and with early-support customers, you can log into this remotely via laptops or tablets. We even have some people using it on mobile phones, even though we're not quite there yet -- I'm talking about the form factor of how the screens render -- but we will definitely be going that way. So a sysadmin or storage administrator can have at their fingertips the status of what’s going on in the data-protection environment.

What's nice is because this is a thin client, a web UI, you can define user IDs not only for the sysadmins and DBAs and storage administrators, but like I said earlier, IT management.

So if your boss, or your boss’ boss, wants to dial in and see the health of things -- how much data you’re protecting, how much data is being replicated, what data is being protected up in the cloud and which is on-prem, all of that sort of stuff -- they can now have a dashboard approach to seeing it all. That’s going to make everyone more productive, and it's going to give them a better sense that this data is being protected, so they can sleep at night.

If you don’t have a way to manage and see all of your data protection assets, it's really just a lot of talk.

Gardner: Is there anything here going forward that will make having a process approach to a data lifecycle and backup and recovery even more important?

Maxwell: Dana, you hit on something that's really near and dear to my heart, which is data deduplication. We have a very broad strategy. We offer our own software-based dedupe, we support every major hardware-based dedupe appliance out there, and we're now adding support for Dell’s DR Series DR4000 dedupe appliances. But we're still very much committed to tape, and we're building initiatives based on storing data in the cloud: backing up, replicating, failing over, and so forth.

One of the things that we built into NetVault XA that's separate from the policy management and online monitoring is that we now have historical data. This is going to give you the ability to do some capacity management and capacity planning and see what the utilization is.

How much storage are your backups taking? What's the most optimum number of generations? Where are you keeping that data? Is some data being kept too long? Is some data not being kept long enough?
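
As an illustration of the capacity questions such historical data is meant to answer, here is a small sketch; every parameter in it (change rate, generations kept, dedupe ratio) is an assumption for illustration, not a figure from the discussion:

```python
# Illustrative capacity estimate; every parameter here is an assumption,
# not a figure from the discussion.
def backup_storage_tb(full_tb: float, daily_change_rate: float,
                      fulls_kept: int, incrementals_kept: int,
                      dedupe_ratio: float) -> float:
    """Rough backup footprint for a given retention policy."""
    raw = full_tb * fulls_kept + full_tb * daily_change_rate * incrementals_kept
    return raw / dedupe_ratio

# 50 TB protected, 2% daily change, 4 weekly fulls plus 28 incrementals,
# and a 10:1 dedupe ratio.
print(f"{backup_storage_tb(50, 0.02, 4, 28, 10.0):.1f} TB of backup storage")
```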

By offering a broad strategy that says we support a plethora of backup targets, whether it's tape, special-purpose backup appliances, software-based dedupe, or even the cloud, we're giving customers flexibility, because they have unique needs and they have different needs, based on service levels or budgets. We want to make them flexible, because, going back to our original discussion, one size doesn’t fit all.

Crump: Just to tie in with what John said, we need flexibility that doesn’t add complexity. Almost everything we've done in the environment up to now has added flexibility, but for every ounce of flexibility, it feels like we have added two ounces of complexity, and that's something we just can't afford to deal with. So that's really the key thing.

Looking forward, at least on the horizon, I don't see a big shift, something like virtualization that we need to be overly concerned with. What I do see is the virtual environment becoming more and more challenging, as we stack more and more VMs on it. The amount of I/O and the amount of data protection process that will surround every host is going to continue to increase. So the time is now to really get the bull by the horns and institute a process that will scale with the business long-term.
