Next Generation of Intelligent Applications Requires Real-Time Analytics

Thanks to the rise of in-memory computing platforms from both Oracle and SAP, IT organizations have had the ability to process transactions and analytics simultaneously in real time for some time now. But having a capability and being able to exploit it are not always one and the same thing. Only now are we starting to see the emergence of enterprise applications, such as the latest version of the SAP S/4HANA suite of ERP applications, that are designed from the ground up to let end users take advantage of real-time analytics.

Sven Denecken, senior vice president of product management, co-innovation and packaging for SAP, says SAP has taken advantage of its new Fiori user interface to surface real-time analytics alongside the latest transaction data inside an ERP application. That capability is nothing short of critical, says Denecken, because it means that for the first time an end user can make an informed decision using analytics derived from the latest transaction data.

Historically, business leaders have made decisions based on a finite amount of data residing in a data warehouse. Unfortunately, those data warehouses have always been out of sync with the latest transaction data being processed in a separate production environment; before the arrival of in-memory computing platforms, it simply wasn’t feasible to cost-effectively process transactions and analyze massive amounts of data in real time. As a result, the data made available via the data warehouse could be out of date by anywhere from a few days to several weeks.

Because the SAP HANA platform can function as both a row-oriented and a column-oriented (columnar) data store, Denecken says it’s now feasible to run analytics in memory right alongside the transactions. In an age of digital business transformation, Denecken contends, being able to act on data analyzed in real time is going to be the new table stakes for enterprise applications. In fact, Denecken says real-time analytics coupled with advances in machine learning algorithms will drive a massive wave of enterprise application upgrades in the months ahead. Businesses that employ real-time analytics to make more accurate decisions faster will simply have a competitive edge that rivals cannot match. Before long, anything less than real-time analytics embedded inside every application will not be acceptable.
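
Neither SAP nor Denecken spells out the mechanics here, so the following is only a minimal Python sketch of why that dual layout matters: a row store keeps each record contiguous, which favors transactional writes, while a column store keeps each attribute contiguous, which favors analytic scans. The table and values are hypothetical, not SAP HANA’s actual internals.

```python
# Illustrative sketch of row-oriented vs. column-oriented storage.
# Hypothetical order data; not SAP HANA's actual in-memory format.

# Row store: each record is stored contiguously, so a transaction that
# inserts or updates one order touches a single compact unit (good for OLTP).
row_store = [
    {"order_id": 1, "region": "EMEA", "amount": 120.0},
    {"order_id": 2, "region": "APJ",  "amount": 80.0},
    {"order_id": 3, "region": "AMER", "amount": 310.0},
]

# Column store: each attribute is stored contiguously, so an analytic
# aggregate scans one dense array instead of every full record.
column_store = {
    "order_id": [1, 2, 3],
    "region":   ["EMEA", "APJ", "AMER"],
    "amount":   [120.0, 80.0, 310.0],
}

# OLTP-style write lands in the row store as a single record append.
row_store.append({"order_id": 4, "region": "EMEA", "amount": 55.0})

# Analytics-style query reads only the one column it needs.
total_revenue = sum(column_store["amount"])
print(total_revenue)  # 510.0
```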

Of course, what makes all that real-time intelligence possible are the latest Intel Xeon-class servers coupled with DDR4 memory and all-flash arrays such as the Kaminario K2. While there is more memory on servers than ever, enterprise applications running across multiple servers require high-speed access to shared persistent data. That can only be accomplished when all-flash arrays are employed; otherwise, access to DRAM and flash memory is constrained to whatever applications happen to be running on a specific server.

The I/O performance issue that IT organizations need to be aware of is the different I/O attributes of transactional and analytics applications. Online transaction processing (OLTP) applications tend to generate small blocks of data and involve a large proportion of writes relative to reads. In contrast, analytics workloads are usually characterized by I/Os with a large block size. The Kaminario K2 architecture is designed from the ground up to concurrently support both types of workloads in a way that enables real-time analytics to run against the latest copy of transaction data.
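
As a rough illustration of those two access patterns, the Python sketch below issues many small writes at random offsets (OLTP-like) and then reads the same file back in large sequential chunks (analytics-like). The file name, block sizes, and counts are arbitrary assumptions for the sketch, not measurements of any particular array.

```python
# Rough illustration of the two I/O patterns described above; the file
# name, sizes, and counts are arbitrary assumptions for the sketch.
import os
import random

PATH = "workload.dat"    # hypothetical scratch file
FILE_SIZE = 64 * 2**20   # 64 MiB

# Create the scratch file once.
with open(PATH, "wb") as f:
    f.truncate(FILE_SIZE)

# OLTP-like pattern: many small (4 KiB) writes at random aligned offsets.
with open(PATH, "r+b") as f:
    for _ in range(1000):
        f.seek(random.randrange(0, FILE_SIZE, 4096))
        f.write(os.urandom(4096))

# Analytics-like pattern: fewer large (1 MiB) sequential reads.
with open(PATH, "rb") as f:
    while f.read(2**20):
        pass

os.remove(PATH)
```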

In addition, Shai Maskit, director of technical marketing for Kaminario, notes that the Kaminario Clarity application-aware QoS service embedded in Kaminario K2 arrays makes it possible to prioritize I/O by block size, as well as by reads versus writes, to further optimize performance regardless of the number or types of workloads deployed. “That can only be achieved by being able to gain insights into different parts of the IT stack,” says Maskit.
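
Kaminario has not published Clarity’s internals, so the following is only a hypothetical Python sketch of the general idea: classifying each I/O request by operation type and block size, then servicing higher-priority classes first via a priority queue. The classification policy shown is an invented example, not Kaminario’s actual algorithm.

```python
# Hypothetical sketch of block-size- and operation-aware I/O prioritization.
# This is NOT Kaminario Clarity's actual implementation, which is proprietary.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class IORequest:
    priority: int
    seq: int                                 # tie-breaker preserves arrival order
    op: str = field(compare=False)           # "read" or "write"
    block_size: int = field(compare=False)   # bytes

def classify(op: str, block_size: int) -> int:
    """Assumed policy: favor small OLTP writes over large analytic reads."""
    if op == "write" and block_size <= 8 * 1024:
        return 0   # latency-sensitive transactional write
    if op == "read" and block_size <= 8 * 1024:
        return 1   # small point read
    return 2       # large sequential analytics I/O, throughput-oriented

queue, seq = [], 0
for op, size in [("read", 1 << 20), ("write", 4096), ("read", 4096)]:
    heapq.heappush(queue, IORequest(classify(op, size), seq, op, size))
    seq += 1

while queue:
    req = heapq.heappop(queue)
    print(f"servicing {req.op} of {req.block_size} bytes (class {req.priority})")
```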

Obviously, IT organizations could load every server with the maximum amount of DDR4 and flash memory possible. But it quickly becomes apparent how prohibitively expensive that approach would be for all but the most financially well-heeled organizations.

The good news is that magnetic storage devices are no longer required to drive application performance. Instead of over-provisioning magnetic storage and painstakingly making sure data lands on the optimal platters of a disk drive, all-flash arrays guarantee application performance across multiple classes of enterprise applications. The days when storage administrators spent their time manually squeezing every ounce of performance out of archaic magnetic storage devices are over.

For most IT organizations, replacing magnetic storage systems that consume far too much space and energy can’t happen soon enough. Reducing the footprint of the data center alone goes a long way toward deferring the expense of IT infrastructure upgrades. ERP applications such as SAP S/4HANA are only the tip of the proverbial iceberg: thousands of intelligent applications that exploit real-time analytics will be rolled out in the months ahead. Of course, if those intelligent applications wind up starved for memory, the whole purpose of deploying them is defeated. Next-generation intelligent applications require access to modern IT infrastructure. Otherwise, neither the IT professionals responsible for managing those applications nor the end users who rely on them will be nearly as smart as they all now need to be.
