In-memory is finally in vogue!

In-memory analytics and databases are reaching a disruptive tipping point. Why? Why now? And how can enterprise IT put them to use effectively?

If you are looking for the coolest IT trend of 2012, and a hot topic within it, it is an easy bet that ‘Big Data’ and ‘In-memory Analytics/DBs’, respectively, could vie for the honors. By the same token, any such attention and momentum brings a level of skepticism that prompts us to write both off as buzz and hype. A more balanced view, however, suggests that something serious may be shaping up here, especially if you are long on IT’s commitment to constant innovation and its track record of delivering it. In support of that viewpoint, here is an analysis arguing that innovations in in-memory computing may indeed have brought us to an inflection point, where the game of mainstream databases and data warehouses as we know them, with their spinning disks and related I/O overheads, is about to change.
Although much of the discussion will draw on data points from SAP’s HANA (High Performance ANalytic Appliance, or HAsso’s New Architecture, as it is sometimes fondly called in reference to SAP’s co-founder, chairman, and chief software advisor, Hasso Plattner), arguably a torchbearer of this (r)evolution, this is meant to be an analysis of the broader trend and ecosystem of in-memory computing.

The numbers
Let us begin with some numbers, and for that let us see what the Wall Street Journal’s Christopher Lawton had to say earlier this year about SAP’s financial results, in his January 26, 2012 article “Inside SAP’s Skunkworks” (a must-read for its inspiring account of Mr. Plattner’s efforts to create the next big thing for SAP and other business applications: a mainstream in-memory DB appliance).
Built into SAP’s impressive 2011 sales of €14.23 billion (up 14%) were HANA’s own €160 million in sales, well ahead of the €100 million goal for 2011.
A more recent data point, from IDC on June 12, 2012, says that SAP became the fastest-growing DBMS vendor thanks to its acquisition of Sybase (together with Sybase’s ASE database and its in-memory capabilities) and the development of HANA. The report estimates that SAP’s database revenues grew from $697.2 million in 2010 to $1 billion in 2011.

The excitement
Market analysts are unanimously predicting that in-memory computing is about to unleash major disruptions in the coming years, as a combination of new technologies and offerings and customer demand shakes up the market. The market for the broader category of High Performance Computing (HPC) solutions is estimated to reach $220 billion by 2020, according to a new study by Market Research Media, which lists in-memory computing among the fastest-growing components of HPC.

From the excitement and the traction across the ecosystem, it is clear that in-memory technologies will be given due consideration as we move deeper into transforming our businesses to be more real-time, more responsive, and more insights-driven.

As an enthused member of the ecosystem, SAP has, with HANA (since its formal launch pilot with Procter & Gamble in 2010), definitely drawn the needed attention to the advantages of, and the need for, in-memory technologies in mainstream business applications, in addition to the established niche areas.

The roadmap
SAP’s roadmap, and the execution thereof, is a clear indication of the acceptance and support that in-memory computing has been gaining in the market. A few related developments include:

- SAP is expected to extend HANA DB support to the entire SAP Business Suite through 2015, moving HANA from its current ‘analytics’ play to an ‘analytics plus DB’ play.

- SAP has enhanced its existing Business Warehouse so that current deployments can be converted to run on HANA.

- Treating the HANA database as an application engine is also in play; predictive and statistical engines are candidates for the HANA in-memory platform.

- The latest service pack for HANA comes with integration support for Hadoop, opening the door to reading and processing vast volumes of distributed (and mostly unstructured) big data from HDFS and performing fast batch uploads into HANA; a minimal sketch of such a load follows this list.

- To showcase its high-performance computing and scalability features, SAP demonstrated a 100-node HANA cluster on IBM hardware, with 100 TB of RAM and 4,000 Intel cores.

- HANA development services are now available on Amazon EC2.

- SAP is also developing a PaaS platform based on HANA and new NetWeaver technologies. Code-named Project River, it will support a set of programming environments including River Description Language, Spring, and Rails, and will also come with HANA DBaaS.
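
To make the Hadoop-to-HANA path above a bit more concrete, here is a minimal sketch of a batch load followed by an in-database aggregation. Everything specific in it is an assumption rather than anything prescribed by SAP: it presumes the HDFS extract has already been copied to the local filesystem (for example with hdfs dfs -get), that SAP's hdbcli Python driver for HANA is available, and that the host, credentials, table, and file names are placeholders.

```python
# Minimal sketch: bulk-load rows exported from HDFS into a HANA table, then
# push an aggregation down into the database instead of pulling raw rows out.
# Assumptions (not from the article): the HDFS extract was copied locally first
# (e.g. "hdfs dfs -get /data/sales_events.csv hdfs_extract/"), SAP's hdbcli
# driver is installed, and host/credentials/table/file names are placeholders.
import csv

from hdbcli import dbapi  # SAP's Python DB-API driver for HANA

conn = dbapi.connect(
    address="hana-host.example.com",  # placeholder HANA host
    port=30015,                       # placeholder SQL port
    user="DEMO_USER",
    password="********",
)
cursor = conn.cursor()

# Batch the inserts; executemany keeps client/server round-trips low for bulk loads.
BATCH_SIZE = 10_000
insert_sql = "INSERT INTO SALES_EVENTS (EVENT_TS, REGION, AMOUNT) VALUES (?, ?, ?)"

with open("hdfs_extract/sales_events.csv", newline="") as f:
    batch = []
    for event_ts, region, amount in csv.reader(f):
        batch.append((event_ts, region, float(amount)))
        if len(batch) >= BATCH_SIZE:
            cursor.executemany(insert_sql, batch)
            batch = []
    if batch:
        cursor.executemany(insert_sql, batch)
conn.commit()

# Let the in-memory engine do the heavy scanning and return only the small result.
cursor.execute(
    "SELECT REGION, SUM(AMOUNT) AS TOTAL FROM SALES_EVENTS "
    "GROUP BY REGION ORDER BY TOTAL DESC"
)
for region, total in cursor.fetchall():
    print(region, total)

cursor.close()
conn.close()
```

The final query illustrates the design point behind the "analytics plus DB" play: keep the scanning and aggregation next to the data in memory, and ship back only the summarized result.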

For more discussion on this topic, please read the detailed report at InformationWeek.

About The Author

Founder at Adunik Inc

Sreedhar Kajeepeta is the founder of Adunik Inc, a consulting firm specializing in research, consulting, and solutions related to business transformations powered by cloud computing, big data, social networking, and mobility. Sreedhar is based in West Bloomfield, MI, USA, and can be reached at sreedhar@adunik.us.com. You can read his blogs on "Big Data Cloud" here.

