
New Editions of Moab HPC Suite Enhance HPC System Efficiency

Adaptive Computing, the largest provider of private cloud management and High-Performance Computing (HPC) workload management software, today announced the release of two new editions of its Moab HPC Suite: Application Portal Edition and Remote Visualization Edition. These two new solutions are designed to leverage next-generation access models to simplify the collection and interpretation of data, reducing the time it takes to achieve meaningful results.

Moab HPC Suite – Application Portal Edition

With an integrated NICE EnginFrame application portal, Moab gives users a single point of access to applications, data, resources and job submissions, keeping costs to a minimum throughout the design and research processes. It supports the most common ISV and open-source applications used in a variety of disciplines, including manufacturing, energy, life science, government and education, without requiring users to undergo specialized HPC training. Data is also stored efficiently to minimize the need for file transfers.
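
For illustration, the sketch below approximates what the portal automates behind the browser: submitting a batch job to Moab through its msub command. This is a minimal, hypothetical example rather than Adaptive Computing's implementation; the script name and resource values are placeholders, and portal users would never run this directly.

# Illustrative sketch only: roughly what the portal does on a user's behalf
# when a job is submitted. The script name and resource values are hypothetical.
import subprocess

def submit_job(script="solver_run.sh", nodes=2, ppn=8, walltime="04:00:00"):
    """Submit a batch script to Moab via msub and return the assigned job ID."""
    cmd = ["msub", "-l", f"nodes={nodes}:ppn={ppn},walltime={walltime}", script]
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout.strip()  # msub prints the new job ID on success

if __name__ == "__main__":
    print("Submitted job:", submit_job())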

It also makes it easier for users to collaborate on jobs, while maintaining security through standard browser access. This allows users both inside and outside the organization to access the data, which is encrypted for transmission. Access can be granted based on user groups, applications or resource parameters.

This edition of Moab also reduces costs by improving the efficiency of accessing and sharing the centralized pool of resources. Its advanced scheduling policies allow maximum HPC resource utilization by more users. Application licenses are also pooled to reduce costs and improve service to users. In addition, policies and usage budgets are enforced to ensure that service level agreements are met and project priorities are maintained. This enables optimized use of the centralized HPC storage capacity.
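
As a hedged illustration of how pooled application licenses surface to users, the snippet below builds a resource request that asks Moab for a floating license token alongside compute resources, using the generic-resource (gres) convention commonly used to model license pools. The license name, token count and core counts are hypothetical; the actual pools, policies and usage budgets are defined server-side in the site's Moab configuration and accounting setup.

# Illustrative only: requesting a pooled (floating) application license as a
# generic resource alongside CPU resources. "abaqus" and the counts shown are
# hypothetical; real license pools depend on the site's Moab configuration.
def license_request(app_license="abaqus", tokens=1, nodes=1, ppn=16):
    """Build an msub resource string that reserves pooled license tokens."""
    return f"nodes={nodes}:ppn={ppn},gres={app_license}:{tokens}"

# The resulting string would be passed to msub, for example:
#   msub -l nodes=1:ppn=16,gres=abaqus:1 job_script.sh
print(license_request())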

Moab HPC Suite – Remote Visualization Edition

The new Remote Visualization Edition of Moab allows organizations to consolidate individual workstations into centralized resources, saving money and making specialized computing resources available to more users. This reduces management overhead and keeps network congestion to a minimum by removing the need to transmit full data sets, which means fewer data-transfer delays for users. Less data transmission in turn reduces energy usage, and centralizing storage lowers storage costs. Keeping data sets close to the centralized computing resources also minimizes security risks, and the shared pool of resources decreases software licensing costs.

The Remote Visualization Edition also provides guaranteed shared visualization resource access for users, with automated policies and budgets that enforce priorities and service levels. Moab’s allocation and scheduling policies enable workload packing, allowing optimized utilization of GPUs and other accelerators, as well as multiple Linux user sessions on a single machine. Changing demands within workloads can also be better managed by Moab’s ability to dynamically re-provision the operating system as needed.
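
To make the workload-packing idea concrete, here is a hedged sketch of a visualization-session request sized small enough that the scheduler can co-locate several such sessions on one GPU node, with an optional request to re-provision the node's operating system image. The gpus and os resource keywords follow common TORQUE/Moab conventions, and the values, script name and image name are hypothetical, not taken from the product documentation.

# Illustrative only: a small visualization-session request sized so the
# scheduler can pack several sessions onto one GPU node. Values, the session
# script and the OS image name are hypothetical.
import subprocess

def start_viz_session(cores=4, gpus=1, walltime="02:00:00", os_image=None):
    """Request a modest session so multiple sessions can share a node."""
    resources = f"nodes=1:ppn={cores}:gpus={gpus},walltime={walltime}"
    if os_image:
        # Ask for dynamic re-provisioning to a specific OS image, if the site supports it
        resources += f",os={os_image}"
    cmd = ["msub", "-l", resources, "start_viz_session.sh"]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return out.stdout.strip()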

Moab HPC Suite – Remote Visualization Edition also includes an integrated NICE EnginFrame Views portal, as well as NICE Desktop Cloud Visualization, allowing secure remote access to cloud sessions with ideal compression, in Linux, Windows or virtual environments.

Moab HPC Suite – Application Portal Edition and Moab HPC Suite – Remote Visualization Edition are available in November from Adaptive Computing as well as through our partner, HP, and HP’s reseller and integrator partners.

Supporting Quotes

“In order for our customers to get the most out of our cluster technology, it’s important that they be able to schedule and allocate resources as effectively as possible,” said Jan Wallenberg, CEO of Go Virtual. “That’s where Moab comes in. Its efficiency makes a significant difference in the user experience and overall performance of the system.”

“We’re excited to see NICE EnginFrame and Desktop Cloud Visualization included in these latest versions of Moab,” said Andrea Rodolico, CTO of NICE. “Integrating our leading-edge HPC portal and remote 3D visualization technology into the framework of Moab makes it an even more capable product, enabling greater usability and productivity for HPC users.”

“These latest editions of Moab demonstrate our continued commitment to improving the user experience of Moab as well as the back-end functionality,” noted Michael Jackson, president of Adaptive Computing. “By integrating the latest technology from other industry leaders into our solutions, we are making HPC systems run more effectively, which means manufacturers and researchers can more quickly bring their discoveries to the world.”

About Adaptive Computing

Adaptive Computing is the largest provider of High-Performance Computing (HPC) workload management software and manages the world’s largest cloud computing environment with Moab, a self-optimizing dynamic cloud management solution and HPC workload management system. Moab®, a patented multidimensional intelligence engine, delivers policy-based governance, allowing customers to consolidate and virtualize resources, allocate and manage applications, optimize service levels and reduce operational costs. Adaptive Computing offers a portfolio of Moab cloud management and Moab HPC workload management products and services that accelerate, automate, and self-optimize IT workloads, resources, and services in large, complex heterogeneous computing environments such as HPC, data centers and cloud. Our products act as a brain on top of existing and future diverse infrastructure and middleware to enable it to self-optimize and deliver higher ROI to the business with its:

Moab Cloud Suite for self-optimizing cloud management

Moab HPC Suite for self-optimizing HPC workload management

For more information, call (801) 717-3700 or visit www.adaptivecomputing.com.

NOTE TO EDITORS: If you would like additional information on Adaptive Computing and its products, please visit the Adaptive Computing Newsroom at http://www.adaptivecomputing.com/category/news/. All prices noted are in U.S. dollars and are valid only in the United States.


Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
