Enterprise Applications Consulting – Josh Greenbaum


SAP Gets Transactional with In-Memory Database: Changing the Game for Oracle, IBM, and Microsoft

Joshua Greenbaum · August 24, 2009

SAP has spent the last few years discussing, theorizing, and otherwise showing off numerous versions of the same idea: using in-memory databases to replace the “not getting any younger” relational database model. Last week at a conference hosted by SAP to showcase its research work with the academic community, supervisory board chairman and permanent SAP visionary Hasso Plattner took the concept to its ultimate conclusion: an open attack on the sacred relational database that powers three of SAP’s major rivals/partners and forms the technology core of every SAP installation on the planet.

Commercialization of the in-memory concept has already begun in the analytics/BI space with SAP’s Explorer, but the real holy grail is moving the in-memory concept into the day-to-day transactional, ERP world of SAP’s Business Suite. That concept moved a step closer to reality at the SAP Academic Conference, where Plattner discussed research efforts aimed at showing that in-memory databases, running on low-cost, multi-core systems, can work as well in a transactional environment as they do in an analytics environment, or better.

The results are exactly what was expected as in-memory, column-based databases have moved into the limelight in recent years. In a nutshell, research at Plattner’s own academic research center in Germany, the Hasso Plattner Institute in Potsdam, showed that the remarkable advances in throughput – and an equally remarkable reduction in TCO – that have been obtained in the analytics space with in-memory can be replicated when used for transactional systems.

And once this concept – in-memory, column-based databases on low-cost hardware – is applied to the enterprise, all sorts of things change. For starters, much of what we think of in terms of the typical data center design – massive arrays of disk storage backing up huge databases running on top-of-the-line hardware – can largely go away. It turns out that much of the traditional data center was created to force-fit the relational model developed by Codd and Date onto an architecture that was ill-suited for optimizing a relational system. Because hardware and storage were relatively slow – especially 20+ years ago, much less so today – the relational model included an intensive, and excessive, dependence on table structures and indexing to compensate.

But in an era when RAM can be measured in tens of gigabytes and arrays of multi-core processors can provide levels of throughput that were inconceivable at the commercial birth of the relational database back in the early 1980s, the force-fitting of the relational model no longer makes sense. This doesn’t mean that thinking about – and accessing – corporate data in a relational format is obsolete: SAP’s TREX in-memory engine, like most other column-based systems, can still use good old SQL. But all that overcompensation in the form of huge storage systems, armies of DBAs, and mind-numbing overhead costs can largely go away.
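The row-versus-column distinction behind these gains can be sketched in a few lines of Python. This is an illustrative toy, not SAP’s implementation; the table and field names are invented for the example:

```python
# The same tiny "orders" table held two ways in memory. An analytic
# aggregate over one attribute touches every field of every record in
# the row layout, but only one contiguous list in the column layout --
# the locality advantage that column stores exploit.

# Row-oriented layout: one dict per record.
rows = [
    {"id": 1, "region": "EU", "amount": 120.0},
    {"id": 2, "region": "US", "amount": 75.5},
    {"id": 3, "region": "EU", "amount": 200.0},
]

# Column-oriented layout: one dense list per attribute.
columns = {
    "id": [1, 2, 3],
    "region": ["EU", "US", "EU"],
    "amount": [120.0, 75.5, 200.0],
}

# Row layout: scan every record to total a single attribute.
total_rows = sum(r["amount"] for r in rows)

# Column layout: the attribute is already a dense array.
total_cols = sum(columns["amount"])

assert total_rows == total_cols == 395.5
```

In a real column store the per-column arrays are also heavily compressed, which is what lets very large tables fit in RAM in the first place.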

It’s rather ironic that the conference where Plattner talked about blending transactional and analytical systems in the same in-memory database took place at the Computer History Museum in Mountain View, CA. If SAP’s intentions regarding the relational database are realized, that data-management stalwart will eventually be relegated to the status of a historical monument, one that in retrospect will be missed as much as many of the other relics, like the punch card and the CRT, whose time has come and gone.

All that remains is to see what Oracle, Microsoft, and IBM will do when the hegemony of their respective relational database offerings is challenged by an in-memory, column-based alternative. My guess: kick and scream, and then eventually get on the bandwagon. This kind of technical advance will be too hard to fight against.



Comments

  1. Simon G says

    August 25, 2009 at 11:53 am

    Interesting that column-based databases are beginning to see the light of day. In 1995 at Redbrick Systems in Los Gatos, CA, I first heard about the change from row-oriented to column-oriented databases; Redbrick was pioneering the relational DBMS for data warehousing at that time.

  2. Krishna Moorthy says

    August 26, 2009 at 3:04 pm

    No doubt, in-memory databases are a game changer. As you’ve noted, the need for separate analytical systems will (and should) go away.

  3. Dilip says

    September 9, 2009 at 3:32 pm

    But what happens if the app crashes for some reason? Since the in-memory database lives in the same space as the app, both would crash; what would the recovery process look like? What if the in-memory database gets corrupted? What is the DR plan for this?

    Regards.
    Dilip

    • ematters says

      September 9, 2009 at 4:02 pm

      Both are important questions. As long as the app doesn’t crash due to a memory hardware failure, the operating system isolates the app’s processes from the database’s process, so a crash should not take down everything in memory. As for corrupted databases, my understanding is that the master database is stored on disk, as are the changes that occur in-memory, so that the in-memory database can be restored from disk (where the bulk of a very large database would reside anyway).
      Josh

  4. Theo Stolker says

    September 10, 2009 at 6:28 am

    It is exciting to see how in-memory databases add new possibilities and improve performance.

    However, it is ridiculous to say that that removes the need for physical storage and backups, as you will still need a way to persist the data. Products like Oracle Coherence provide that capability and combine the best of both worlds (in-memory data for maximum performance, a backing store for reliability).

    Also, stating that DBAs are no longer necessary is of course foolish. Yes, databases can work with ill-designed data, but I will not buy SAP applications built on this concept if they stop designing their databases.

    It’s obvious SAP tries to get some positive buzz out of it. But please avoid the hype; looking at R/3, it will take at least three releases to get it right!

    • ematters says

      September 10, 2009 at 4:15 pm

      Theo — Not sure where your references to removing the need for physical storage and backups, or the idea that DBAs are no longer necessary, are coming from; certainly not my blog. The fact is that massive in-line storage and an army of DBAs will not be needed — we’ll still need storage and DBAs, just not as many as before. As for your comment about R/3 and SAP’s positive buzz: SAP isn’t the only proponent of the in-memory database. Oracle bought TimesTen, which pioneered this concept, several years ago. As TimesTen had a commercial in-memory product on the market before its acquisition, I don’t think SAP or anyone else has to defend the viability of the concept.

      • Theo Stolker says

        September 11, 2009 at 6:41 am

        Hi,

        Thanks for putting things back in perspective with your reply. Based on your post, it seemed to be black-and-white.

        Based on the title of the post it looks like SAP is doing something that has never been done before, but you are right, Oracle has had similar technology for a long time already.

        But I am excited and do believe that in-memory databases will change the way we view the database.

      • ZenCushion says

        May 13, 2010 at 6:48 pm

        ematters, you said:

        “Not sure where your references to removing the need for physical storage and backups, or the idea that DBAs are no longer necessary, are coming from; certainly not my blog”

        Hello ?? from YOUR article:

        “But all that overcompensation in the form of huge storage systems, armies of DBAs, and mind-numbing overhead costs can largely go away.”

        Use of higher-performance in-memory data structures in no way removes or significantly reduces the need for data definition, modeling, and design — all jobs of the DBA — and your proclamation that in-memory databases would somehow magically eliminate the need to manage, define, and understand data representations is a bit like saying that application-tailoring wizards eliminate the need for software engineering — overly simplistic, and incorrect!

        • ematters says

          May 13, 2010 at 7:41 pm

          Hello ?? back to you: the words “overcompensation”, “huge”, “armies”, and “mind-numbing” were meant to express a degree of complexity and expense that is no longer needed. I realize these terms are not as explicit as saying “in-memory will greatly reduce the overhead costs of traditional RDBMS technology”, so for your benefit I have said it. We obviously disagree on the amount of the reduction in overhead, but every big DBMS shop that I have spoken to that has just moved from an RDBMS to a column-based database has experienced that “significant” reduction in overhead that you claim doesn’t exist. I would argue that when you move a column-based database in-memory you get an additional, and significant, further reduction in overhead.
          Josh

      • chirag says

        September 14, 2011 at 8:04 am

        I think what is important is that there should be only one database for both OLTP and OLAP systems.

        Ultimately, both the technical design of the database and our own domain knowledge for data modelling should be focused on integrating OLAP and OLTP.

        Certainly in-memory DBs are useful, but the question is about RAM technology.

        This does not mean that data should be modelled in just any way.

  5. Tony Brown says

    September 23, 2009 at 11:32 pm

    I don’t know if I said it already, but… excellent site, keep up the good work. I read a lot of blogs on a daily basis, and for the most part people lack substance, but I just wanted to make a quick comment to say I’m glad I found your blog. Thanks 🙂

    A definite great read. Tony Brown

  6. Vaidhya says

    December 21, 2009 at 7:29 am

    Not sure if Josh recollects his good old post from many years ago. SAP has been using an in-memory DB in APO for 10+ years.

    http://www.netweavermagazine.com/archive/Volume_03_(2007)/Issue_01_(Winter)/v3i1a02.cfm?session

  7. Paul Hamerman says

    February 10, 2010 at 11:10 pm

    Anyone mention Workday? Their app runs in-memory today.

  8. Trevor Miles says

    May 26, 2010 at 11:36 am

    Josh

    As we have discussed before, we have in-memory technology for supply chain management planning and short-term response to unexpected events. We have many customers that can run a full MRP through several tiers in less than 5 minutes. The same run in an ERP system takes 8 or more hours. In one instance of multiple ERPs and multiple instances of the same ERP, which requires cascaded MRP, the MRP takes over 24 hours for the 100 sites. We bring that complete supply chain into a single instance of RapidResponse (our on-demand application), including data from suppliers and customers, and run the full-horizon MRP in 59 seconds.

    While this is a narrower application than a more generic ERP system, I would argue that the operations planning aspects are the most data- and CPU-intensive aspect of an ERP system. Clearly we have demonstrated the benefits of in-memory technology coupled with analytics.

    As previously discussed, we would be only too happy to give you a deep dive.

    Regards
    Trevor Miles
    http://www.kinaxis.com
    cell: 647.248.6269

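The disk-backed recovery model discussed in the comments above (a master copy on disk plus persisted changes, replayed after a crash) can be sketched as follows. This is a minimal toy, assuming a snapshot-plus-change-log design; the `Store` class and its method names are illustrative, not any vendor’s API:

```python
# A toy in-memory store backed by a simulated disk snapshot and a
# persisted change log. On recovery, memory is rebuilt by loading the
# snapshot and replaying the logged changes.

class Store:
    def __init__(self):
        self.data = {}        # volatile in-memory state
        self.snapshot = {}    # simulated disk-resident master copy
        self.log = []         # simulated persisted change log

    def put(self, key, value):
        self.log.append((key, value))  # persist the change first
        self.data[key] = value         # then apply it in memory

    def checkpoint(self):
        # Fold current state into the master copy and truncate the log.
        self.snapshot = dict(self.data)
        self.log = []

    def recover(self):
        # Rebuild memory: load snapshot, then replay post-checkpoint log.
        self.data = dict(self.snapshot)
        for key, value in self.log:
            self.data[key] = value

store = Store()
store.put("a", 1)
store.checkpoint()
store.put("b", 2)
state_before = dict(store.data)

store.data = {}        # simulate a crash: volatile memory is lost
store.recover()
assert store.data == state_before == {"a": 1, "b": 2}
```

Real systems write the log and snapshot to durable storage, of course, which is why the bulk of a very large database still resides on disk even when the working copy lives entirely in RAM.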

