What Is In-Memory Computing?
In-Memory Computing delivers very fast performance (often cited as thousands of times faster than disk-based processing), scales to ever-growing quantities of data, and simplifies access to an increasing number of data sources. By storing data in RAM and processing it in parallel, it supplies real-time insights that enable businesses to deliver immediate actions and responses. That is what makes it ideal for transactional and analytical applications sharing the same data infrastructure (point-of-decision HTAP) and for transaction processing guided by real-time analytics (in-process HTAP).
The Rise of In-Memory Computing
Adoption of In-Memory Computing (IMC) is on the rise. This can be attributed to the growing demand for faster processing and analytics on big data, the need to simplify architectures as the number of data sources increases, and technology enhancements that optimize total cost of ownership (TCO).
In Forrester’s report for enterprise architecture professionals, the case for In-Memory Computing is clear:
“Data is the lifeblood of your entire organization. It should enlighten every function of the business, including CX, operations, marketing, sales, service, and finance. A data management (DM) strategy is critical. The goal should be clear: Provide all business functions with quick and complete access to all of the data and analytics that they need, both now and in the future.”
Gartner further reinforces this in its In-Memory Computing Technologies 2018 Predictions, noting that:
“The digitalization of business generates an inexhaustible demand for faster performance, greater scalability and deeper real-time insight, which is boosting innovation around IMC technologies.”
The fact that In-Memory Computing is the preferred technology for unifying transactional and analytical processing for real-time insights and closed-loop analytics is also driving IMC innovation and growth. In its Market Guide for HTAP-Enabling In-Memory Computing Technologies, Gartner states that:
“Enabled by IMC technologies, HTAP architectures support the needs of many new application use cases in the digital business era, which require scalability and real-time performance. Data and analytics leaders facing such demands should consider HTAP solutions supported by IMC.”
Gartner’s HTAP, also called HOAP (hybrid operational and analytical processing) by 451 Research and Translytical Platforms by Forrester, enables organizations to carry out analytics on incoming transactions, taking advantage of the transaction window and triggering instant actions.
Furthermore, the availability of SSD and persistent memory technologies is also driving costs down. Consequently, IMC platforms that intelligently support these data storage tiers are helping organizations optimize their TCO while delivering in-memory performance.
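The tiering idea can be illustrated with a minimal sketch, assuming a toy two-tier store (the `TieredStore` class and its capacity rule are hypothetical, not any vendor's API): hot records stay in RAM, and colder records are demoted to disk, with a temporary directory standing in for SSD or persistent memory.

```python
import json
import os
import tempfile

class TieredStore:
    """Toy two-tier store: a RAM dict backed by a disk directory."""

    def __init__(self, hot_capacity):
        self.hot_capacity = hot_capacity     # max records kept in RAM
        self.hot = {}                        # RAM tier: fastest access
        self.cold_dir = tempfile.mkdtemp()   # disk tier: cheaper, slower

    def put(self, key, value):
        self.hot[key] = value
        if len(self.hot) > self.hot_capacity:
            # Demote the oldest hot entry to the disk tier
            # (dicts preserve insertion order in Python 3.7+).
            old_key, old_value = next(iter(self.hot.items()))
            del self.hot[old_key]
            with open(os.path.join(self.cold_dir, old_key), "w") as f:
                json.dump(old_value, f)

    def get(self, key):
        if key in self.hot:                  # RAM hit: in-memory speed
            return self.hot[key]
        with open(os.path.join(self.cold_dir, key)) as f:
            return json.load(f)              # RAM miss: read from disk

store = TieredStore(hot_capacity=2)
store.put("a", {"price": 10})
store.put("b", {"price": 20})
store.put("c", {"price": 30})   # capacity exceeded: "a" is demoted to disk
hot_keys = sorted(store.hot)    # only "b" and "c" remain in RAM
demoted = store.get("a")        # transparently served from the disk tier
```

A production platform adds eviction policies, background promotion and persistence guarantees; the sketch only shows why a hot/cold split lets most reads run at RAM speed while the full data set exceeds RAM.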
Why In-Memory Computing?
To maintain a competitive edge and meet today’s demands for optimal customer experience, enterprises must deal with the constant upsurge of available data and the never-ending demands for better and faster performance.
This is fueling the development of In-Memory Computing technologies, because In-Memory Computing is ultimately about how much data can be ingested and analyzed, and how fast that analysis can be performed.
In-Memory Computing has evolved because traditional solutions, typically based on disk storage and relational databases queried with SQL, are inadequate for today’s business intelligence (BI) needs – namely, very fast computation and real-time scaling of data.
In-Memory Computing: Basic Principles and Significance
Figure 1: In-Memory Computing Basic Principles and Significance
In-Memory Computing is based on two main principles: the way data is stored and scalability – the ability of a system, network or process to handle constantly growing amounts of data, or its potential to be elastically enlarged to accommodate that growth. This is achieved by leveraging two key technologies: random-access memory (RAM) and parallelization.
High Speed and Scalability: To achieve high speed and performance, In-Memory Computing stores and indexes data in RAM. This allows data to be processed and queried orders of magnitude faster than with disk-based solutions, delivering strong performance and scalability for any given task.
For scalability – which is essential for big data processing – In-Memory Computing relies on parallelized distributed processing. In contrast to a single, centralized server providing processing capabilities to all connected systems, distributed processing spreads both data and computation across multiple machines, each of which works on its own partition of the data in parallel.
Real-time Insights: In-Memory Computing allows for the collocation of business logic, analytics and data that can be ingested from multiple sources (multi-model store). In this way, In-Memory Computing is about much more than just producing an analysis faster than before; it is about making the analysis itself predictive.
By simultaneously addressing massive amounts of streaming, hot and historical data (as in GigaSpaces’ solution), In-Memory Computing supports running real-time advanced analytics and machine learning for instant insights that are immediately leveraged by business logic collocated with the in-memory fabric. When something happens that can affect business operations, customer actions, regulatory compliance and more, an immediate understanding of the impact and consequences is made available, enabling an appropriate real-time response and decision.
Furthermore, continuous predictive analysis, which leverages the ability to ingest millions of events per second and analyze the data as it arrives, helps prevent undesired occurrences such as equipment breakdowns, customer churn and cyber attacks.
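A minimal sketch of this continuous, in-stream analysis, assuming a toy detection rule (a rolling-mean deviation check standing in for a trained model; the `detect_anomalies` function and its thresholds are hypothetical): each event is evaluated against an in-memory window of recent readings the moment it arrives, so an alert can fire before the data ever lands on disk.

```python
from collections import deque

def detect_anomalies(stream, window=5, threshold=2.0):
    """Flag readings that deviate sharply from the rolling mean."""
    recent = deque(maxlen=window)   # in-memory window of hot data
    alerts = []
    for i, reading in enumerate(stream):
        if len(recent) == window:
            mean = sum(recent) / window
            if abs(reading - mean) > threshold * mean:
                # React while the event is still in the transaction window.
                alerts.append((i, reading))
        recent.append(reading)
    return alerts

# Steady readings around 10, with one spike simulating an equipment fault.
readings = [10, 11, 9, 10, 10, 10, 95, 10, 11, 10]
alerts = detect_anomalies(readings)   # the spike at index 6 is flagged
```

The same loop shape scales to millions of events per second when the window state is partitioned across an in-memory grid instead of living in one process.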
Abundant Range of Use Cases: In-Memory Computing is, of course, applicable for companies dealing with large volumes of data, particularly when there’s a consumer dimension such as retail, financial services, insurance, transportation, telco and utilities. Typical examples include risk and transaction management in banks/financial institutions, fraud detection for payments and in insurance, trade promotion simulations in consumer product companies and real-time/personalized advertising. But In-Memory Computing is applicable for any industry or market where real-time analysis, insights and predictions based on streaming and historical data offer business value, such as geospatial data analysis, predictive maintenance and route optimization in transportation.
Enabling Technology: Many of today’s applications and technologies would not be possible without the integration of In-Memory Computing. Typical examples of this enabling technology role include applications implementing blockchain technology (which allows digital information to be distributed but not copied), or applications involving geospatial/GIS processing for transportation (such as real-time directions on traffic congestion, recommended routes and traffic hazards).
The Hybrid Transactional and Analytical Processing (HTAP) Use Case
In contrast to the traditional computing paradigm of moving data to a separate database, processing it and then saving it back to the data store, with In-Memory Computing everything can be placed in an in-memory data grid and distributed across a horizontally scalable architecture. This is accomplished at low latency, because the disk I/O that prevents mixed, heterogeneous workloads from running in real time has been eliminated.
The In-Memory Computing concept powers a new unified paradigm. Instead of separating transactional databases from analytics databases, which leads to disk I/O and disk bottlenecks, working with in-memory data stores enables the easy elimination of bottlenecks and handles mixed workloads within the same architecture.
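The unified paradigm can be illustrated with a minimal sketch, assuming a toy in-memory "grid" (a single dict; the `record_order` and `revenue_by_customer` functions are hypothetical, not GigaSpaces' API): the transactional write path and an analytical aggregation operate on the same live store, with no separate analytics database or ETL step in between.

```python
grid = {}   # one in-memory store shared by both workloads

def record_order(order_id, customer, amount):
    # Transactional path: the write goes straight into the in-memory grid.
    grid[order_id] = {"customer": customer, "amount": amount}

def revenue_by_customer():
    # Analytical path: aggregate directly over the same live data,
    # seeing every transaction the instant it is recorded.
    totals = {}
    for order in grid.values():
        totals[order["customer"]] = totals.get(order["customer"], 0) + order["amount"]
    return totals

record_order(1, "acme", 120)
record_order(2, "acme", 80)
record_order(3, "globex", 50)
totals = revenue_by_customer()   # analytics reflects all writes immediately
```

A real HTAP platform adds concurrency control, partitioning and fault tolerance, but the essential point survives the simplification: because both workloads share one in-memory store, there is no copy-and-transform delay between a transaction and its visibility to analytics.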
To leverage the full value of the unified analytical and transactional processing paradigm, an In-Memory Computing platform must support:
- Applications with polyglot persistence (microservices, multiple data sources).
- Analytics that are mostly real-time streaming and predictive, in addition to historical reporting.
- Data science with modeling against live data for continuous machine and deep learning. This is based on an intuitive fail-fast/recover-fast workflow model that needs to be as close as possible to transaction data in order to act in the moment and adjust properly.
In-Memory Computing: GigaSpaces’ Approach
Figure 2: Customer Challenges that are addressed with GigaSpaces In-Memory Computing Platforms
At GigaSpaces, we’ve been delivering In-Memory Computing solutions for over a decade, powering mission-critical applications for leading enterprises worldwide. We’ve taken our core In-Memory Computing competence and developed the fastest in-memory real-time analytics platform, InsightEdge. This platform is geared for operationali