
Achieving Large Scale In-Memory Computing in the Federal Sector

By Bob Gourley

The term “In-Memory Computing” is seeing growing use among federal technologists. It refers to storing and processing data directly in the main Random Access Memory (RAM) of a computer rather than on slower devices like hard disks. By using in-memory techniques, very large quantities of data can be held by a computer all at once and processed quickly, enabling three major benefits:

  1. Legacy systems can scale up their performance almost instantly through a relatively minor architectural change, enabling even older capabilities to serve many more users than they were first designed for.
  2. New data can be rapidly compared against all existing data holdings to generate actionable results with immediate impact on ongoing missions. This matters for any mission that must act on new information, including emergency response, law enforcement, military, intelligence and space exploration.
  3. New systems leveraging in-memory computing can be designed with much smarter data replication, backup and security features, and can do so at relatively low cost, since backup and recovery are handled by the in-memory data layer itself.

The most widely used in-memory capability is Ehcache, the open source Java distributed cache, and nearly all new enterprise solutions in the federal sector are probably either already leveraging it to some extent or are ready to. Ehcache is used by Java developers all over the world to provide a powerful in-memory data management platform, scalable in-process memory storage, smart data synchronization across application clusters, and elegant cache management, all within a very simple and ubiquitous API. It is pure open source, available under the Apache license. If the open-source features are not enough for your requirements, the Enterprise Ehcache offering, an in-memory data management platform offered by Software AG, the parent company of developer Terracotta, provides the enterprise support CIOs like to see, as well as advanced capabilities such as TB-scale storage all in memory, high availability, SQL querying, data center replication, and more.
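
Configuring a cache in Ehcache is typically done declaratively. The fragment below is an illustrative sketch of an ehcache.xml in the Ehcache 2.x style; the cache name and sizing values are hypothetical, not taken from any deployment described here:

```xml
<!-- Illustrative ehcache.xml sketch (Ehcache 2.x style).
     "recordCache" and the sizing/TTL values are hypothetical. -->
<ehcache>
  <cache name="recordCache"
         maxEntriesLocalHeap="100000"
         eternal="false"
         timeToLiveSeconds="300"
         memoryStoreEvictionPolicy="LRU"/>
</ehcache>
```

An application would then look caches up by name through the Ehcache API, keeping sizing and eviction policy out of the code.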

Federal Computer Week’s Frank Konkel just did a great write-up of Software AG’s Enterprise Ehcache in a piece titled “Feds tiptoe toward in-memory computing.”

In it he cites Software AG Government Solutions VP Bill Lochten discussing a use case for an agency’s 500 Gigabyte in-memory data store. The system now receives more than 100 Gigabytes, about 1 million new records, per day and handles them all in memory, in a way designed to let analysts and agents in the field access the data immediately. From Frank’s reporting:
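
As a back-of-envelope sanity check on those figures, 100 GB spread over roughly a million records works out to about 100 KB per record, and a 500 GB in-memory store would hold about five days of intake at that rate:

```java
// Back-of-envelope check on the reported ingest figures
// (100 GB/day, ~1M records/day, 500 GB in-memory store).
public class IngestEstimate {
    public static long avgRecordBytes(long bytesPerDay, long recordsPerDay) {
        return bytesPerDay / recordsPerDay;
    }

    public static void main(String[] args) {
        long gb = 1_000_000_000L;
        long avg = avgRecordBytes(100 * gb, 1_000_000L);
        System.out.println(avg + " bytes per record");  // ~100 KB each
        long daysOfIntake = (500 * gb) / (100 * gb);    // store size / daily intake
        System.out.println(daysOfIntake + " days of intake fit in memory");
    }
}
```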

“What we’re opening up for this customer is the ability for them to have a tremendous amount of information at their disposal, in-memory, and that means real-time,” said Bill Lochten, vice president of Software AG Government Solutions.

“We’re able to improve performance in that we can provide information back to agents in the field in the moment,” Lochten said. “You can think of an agent literally having to verify the status of an individual or pieces of information about a potential criminal matter – we provide a 360-degree view of the situation in microseconds rather than minutes.”

The system can handle requests to its central repository from thousands of users at once. If requested information isn’t available in the in-memory system, the IMC database pulls the data from main storage for further processing or access, said Fabien Sanglier, solution architect for Software AG Government Solutions.
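
The lookup behavior Sanglier describes, serving from memory when possible and falling back to main storage on a miss, is the classic read-through cache pattern. A minimal stdlib sketch of the pattern (not Software AG’s implementation; the backing-store loader here is a stand-in):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Minimal read-through cache sketch: hits are served from RAM;
// misses invoke a loader that stands in for the slower backing store.
public class ReadThroughCache<K, V> {
    private final Map<K, V> memory = new ConcurrentHashMap<>();
    private final Function<K, V> backingStore;

    public ReadThroughCache(Function<K, V> backingStore) {
        this.backingStore = backingStore;
    }

    public V get(K key) {
        // computeIfAbsent calls the backing store only on a cache miss
        return memory.computeIfAbsent(key, backingStore);
    }

    public int size() {
        return memory.size();
    }
}
```

Once a record has been loaded, every subsequent request for it is answered from memory, which is what turns minutes of storage latency into the microsecond responses Lochten describes.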

Our recommendations:

  1. If you have systems in use that were designed more than five years ago, have your engineers consider a quick upgrade to in-memory Enterprise Ehcache, a fast and economical modernization that lets your application keep scaling to new demands.
  2. If you have missions that require fast access to new information, or rapid correlation between new data and existing stores, architect for speed from the beginning. Big Data in memory is probably the way to go.


More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder of Crucial Point and publisher of CTOvision.com.
