
Databricks Unveils Spark-Based Cloud Platform to Simplify Big Data Processing, Announces Close of Series B Financing

Databricks, the company founded by the creators of Apache Spark—the powerful open-source processing engine that provides blazingly fast and sophisticated analytics—announced today the launch of Databricks Cloud, a cloud platform built around Apache Spark. In addition to this launch, the company is announcing the close of $33 million in Series B funding led by New Enterprise Associates (NEA) with follow-on investment from Andreessen Horowitz.

“Getting the full value out of their Big Data investments is still very difficult for organizations. Clusters are difficult to set up and manage, and extracting value from your data requires you to integrate a hodgepodge of disparate tools, which are themselves hard to use,” said Ion Stoica, CEO of Databricks. “Our vision at Databricks is to dramatically simplify big data processing and free users to focus on turning data into value. Databricks Cloud delivers on this vision by combining the power of Spark with a zero-management hosted platform and an initial set of applications built around common workflows.”

Databricks Cloud is powered by Spark, a unified processing engine that eliminates the need to stitch together a disjoint set of tools. Spark provides support for interactive queries (Spark SQL), streaming data (Spark Streaming), machine learning (MLlib) and graph computation (GraphX) natively, with a single API across the entire pipeline. Additionally, Databricks Cloud reaps the benefit of the rapid pace of innovation in Spark, driven by the 200+ contributors who have made it the most active project in the Hadoop ecosystem.
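To illustrate what a single API across the pipeline looks like in practice (this is generic Spark code, not code from Databricks Cloud itself), here is a minimal PySpark sketch that runs a Spark SQL query and feeds the result directly into an MLlib model within one program. The file name, column names, and binary label are hypothetical, and the snippet assumes the current SparkSession/DataFrame API rather than the interfaces available at the time of this announcement.

```python
# Minimal sketch of Spark's single-API pipeline: SQL and MLlib in one program.
# The input file and column names (events.csv, user_id, amount, label) are
# hypothetical placeholders; label is assumed to be a binary 0/1 column.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("unified-pipeline").getOrCreate()

# Interactive query with Spark SQL over a structured file.
df = spark.read.csv("events.csv", header=True, inferSchema=True)
df.createOrReplaceTempView("events")
recent = spark.sql("SELECT user_id, amount, label FROM events WHERE amount > 0")

# Feed the same DataFrame straight into MLlib -- no export step or separate tool.
features = VectorAssembler(inputCols=["amount"], outputCol="features").transform(recent)
model = LogisticRegression(labelCol="label").fit(features)
print(model.summary.areaUnderROC)

spark.stop()
```

The point of the sketch is that the query result and the model training share one runtime and one data structure, which is the "unified engine" claim made above.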

The hosted platform also dramatically reduces the pain of provisioning a Spark cluster. Users simply specify the desired capacity of a new cluster, and the platform handles all the details: provisioning servers on the fly, streamlining import and caching of data, handling all elements of security, and continually patching and updating Spark—freeing users from the typical headaches and allowing them to explore and harness the power of Spark. The platform is currently available on Amazon Web Services, though expanding to additional cloud providers is on the roadmap.

Databricks Cloud comes with a set of built-in applications for those eager to immediately begin using Spark to access and analyze data to better compete in the marketplace:

  • Notebooks. A rich interface that allows users to perform data discovery and exploration, plot results interactively, execute entire workflows as scripts, and take advantage of advanced collaboration features.
  • Dashboards. Create and host dashboards quickly and easily. Users can pick any outputs from previously created notebooks, assemble these outputs into a one-page dashboard with a WYSIWYG editor, and publish the dashboard to a broader audience. The data and queries underpinning these dashboards can be regularly updated and refreshed.
  • Job Launcher. Enables anyone to run arbitrary Apache Spark jobs and trigger their execution on demand, simplifying the process of building data products (a minimal job of this kind is sketched after this list).
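To make the Job Launcher item concrete, below is a sketch of the sort of standalone Spark job such a launcher could run. This is not Databricks Cloud's own API, just a plain PySpark script; the S3 paths and application name are hypothetical placeholders, and the same file could equally be submitted with spark-submit on any Spark cluster.

```python
# wordcount_job.py -- a minimal standalone Spark job of the kind a job launcher
# could schedule. Input and output paths below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("wordcount-job").getOrCreate()

# Read raw text, split it into words, and count occurrences of each word.
lines = spark.read.text("s3://my-bucket/input/*.txt")
counts = (lines
          .select(F.explode(F.split(F.col("value"), r"\s+")).alias("word"))
          .groupBy("word")
          .count())

# Persist the results where downstream dashboards or reports can pick them up.
counts.write.mode("overwrite").parquet("s3://my-bucket/output/wordcount")

spark.stop()
```

Run outside a hosted launcher, the equivalent invocation would be `spark-submit wordcount_job.py`; the value of a job launcher is that scheduling and cluster details are handled for you.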

“One of the common complaints we heard from enterprise users was that big data is not a single analysis; a true pipeline needs to combine data storage, ETL, data exploration, dashboards and reporting, advanced analytics, and creation of data products. Doing that with today’s technology is incredibly difficult,” continued Stoica. “We built Databricks Cloud to enable the creation of end-to-end pipelines out of the box while supporting the full spectrum of Spark applications for enhanced and additional functionality. It was designed to appeal to a whole new class of users who will adopt big data now that many of the complexities of using it have been alleviated.”

Beyond the built-in applications, Databricks Cloud enables users to seamlessly deploy and leverage the rapidly growing ecosystem of third-party Spark applications. Databricks Cloud is powered by the 100 percent open source Apache Spark, meaning that it will support all current and future “Certified on Spark” applications out of the box, and that all applications developed on Databricks Cloud will work across any of the “Certified Spark Distributions.”

“Databricks remains committed to developing and expanding Apache Spark fully in the open and continuing to add to the capabilities that made it a vital big data platform,” said Matei Zaharia, CTO of Databricks. “We will continue to commit significant resources to drive open-source innovation in Spark alongside the community. Furthermore, we look forward to enabling a whole new set of users and developers to experience and leverage the power of Spark to drive enterprise value.”

Databricks Cloud is currently in limited availability with several beta users, and Databricks is gradually opening up more capacity. Visit www.Databricks.com to learn more about the platform and to join the waiting list for access to the platform that is redefining how enterprises utilize big data.

About Databricks

Databricks (databricks.com) was founded by the creators of Apache Spark and is using technology based on years of research to build an advanced platform for analyzing and extracting value from big data. The company believes big data is a tremendous opportunity that is still largely untapped and is working to revolutionize what enterprises can do with it. Databricks is venture-backed by Andreessen Horowitz and NEA.

