Intel’s DC Persistent Memory is a revolutionary technology that will power the next wave of innovation in business intelligence. Intel DC Persistent Memory is byte-addressable, non-volatile 3D XPoint memory. Intel’s latest product lets you insert it into DIMM slots on a server. However, what does that mean for performance and business capability? I think SAP HANA is the best example of what’s possible.
Consider the point of an in-memory database like SAP HANA. HANA allows business leaders to ask questions of their data. With the database held in-memory, the processor remains unencumbered by a relatively slow storage subsystem.
I deployed SAP Business Warehouse (BW) on HANA a couple of years ago. Queries that previously took hours took minutes. Queries that used to time out now completed in an hour or less. The increase in business capability was incredible.
What did we find as a result? The business wanted answers to bigger questions. Having transactional data inside the data warehouse wasn’t enough. Data from the accounting systems and third parties could provide deeper insights. The result: ever-larger HANA databases, which required more memory.
An SAP HANA database has always been able to grow larger than the memory in a system. Basic computer science gives us caching to disk, which allows for larger working datasets. However, that comes at a performance cost. Even the world’s fastest storage creates a bottleneck for an in-memory database.
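To make that trade-off concrete, here is a minimal, purely illustrative sketch (not HANA’s actual mechanism) of a cache that keeps a bounded hot set in memory and spills the least recently used items to a slower tier, simulated here by an ordinary dict standing in for disk:

```python
from collections import OrderedDict

class SpillingCache:
    """Illustrative only: keeps up to `capacity` hot items in a fast
    in-memory tier and evicts the least recently used to a slower
    'disk' tier (simulated with a plain dict)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.memory = OrderedDict()   # fast tier (DRAM)
        self.disk = {}                # slow tier (storage), simulated

    def put(self, key, value):
        self.memory[key] = value
        self.memory.move_to_end(key)          # mark as most recently used
        if len(self.memory) > self.capacity:
            cold_key, cold_val = self.memory.popitem(last=False)
            self.disk[cold_key] = cold_val    # spill the coldest item

    def get(self, key):
        if key in self.memory:                # fast hit: DRAM speed
            self.memory.move_to_end(key)
            return self.memory[key]
        value = self.disk.pop(key)            # slow path: fault in from "disk"
        self.put(key, value)                  # promote back to memory
        return value
```

Every `get` that lands on the slow path is exactly the bottleneck the article describes: correctness is preserved, but the access runs at storage speed rather than memory speed.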
Enter Intel DC Persistent Memory. By leveraging less expensive yet fast 3D XPoint media, Intel can expand the memory of a system to 24TB. The system mixes 3D XPoint with traditional DRAM. Intel isn’t the first manufacturer to introduce the concept; SanDisk previewed UltraDIMMs over five years ago. However, Intel leverages its control of processor design to push the innovation further.
While 3D XPoint is indeed fast, it is still slower than DRAM. Intel placed the persistent-memory controller within its new Xeon CPU design. An on-CPU controller means that SAP and Intel could work together to determine which parts of the HANA database should reside in DRAM versus persistent memory.
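SAP’s actual placement logic isn’t public in this post, but as an illustration of the idea, a tiering decision might weigh access frequency against scarce DRAM capacity along these lines (all names, units, and thresholds are hypothetical):

```python
# Hypothetical sketch of a two-tier placement policy: frequently
# accessed ("hot") table columns stay in scarce DRAM, while colder
# data lands in the larger persistent-memory tier.

def place_columns(columns, dram_capacity):
    """columns: list of (name, size_bytes, accesses_per_hour) tuples.
    Returns (dram, pmem): lists of column names per tier."""
    # Rank hottest data first, so DRAM fills with what the CPU touches most.
    ranked = sorted(columns, key=lambda c: c[2], reverse=True)
    dram, pmem, used = [], [], 0
    for name, size, _accesses in ranked:
        if used + size <= dram_capacity:
            dram.append(name)         # hot column fits in DRAM
            used += size
        else:
            pmem.append(name)         # overflow goes to persistent memory
    return dram, pmem
```

The design choice worth noting: because persistent memory is still CPU-addressable, a “miss” on the cold tier is merely slower memory access, not a trip through the storage stack, which is what makes this kind of greedy placement tolerable in practice.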
The promise? A balance between CPU-addressable persistent memory and DRAM. From an SAP HANA perspective, the memory of a system increases to 24TB without any significant changes to the HANA interface. The result should be a seamless increase in in-memory database capacity.
I haven’t spoken with a customer that has implemented a system based on Intel’s DC Persistent Memory. However, I expect customers will see incredible increases in their ability to answer questions. At the Intel Data-Centric event where the product launched, Intel shared a healthcare use case: a provider delivering near real-time MRI results by combining Intel’s new CPUs with Persistent Memory technology.
Facing the mountain of data in the enterprise, 24TB is a drop in the bucket. Enterprises must still intelligently filter data to determine which parts of their datasets should load into the in-memory database. Intel and SAP’s innovation doesn’t eliminate that challenge, but it does reduce some of the friction in getting the job done.
In-memory technology moves fast, so I still recommend limiting the time commitment to the related infrastructure. From a CapEx perspective, in most cases, I recommend taking as short a depreciation cycle as possible on these systems. While IT infrastructure teams could repurpose these systems, they are unlikely to match the standard virtualization platforms already in the data center.
To reduce the barrier to entry, I’d also highly recommend looking at cloud implementations of the technology. Microsoft Azure announced a 24TB bare-metal instance for HANA, while AWS offers an 18TB instance. As technology continues to march forward, enterprises must maintain an agile infrastructure to keep pace.
Disclaimer: I was invited to the Intel event by Gestalt IT. Neither Gestalt IT nor Intel provided consideration for this post. You can find out more about Gestalt IT and their Tech Field Day events at https://techfieldday.com.