This article is an extract from the forthcoming white paper entitled Security in SAP HANA by Layer Seven Security. The paper is scheduled for release in November 2013. Please follow this link to download the publication.
According to research by the International Data Corporation (IDC), the volume of digital information in the world is doubling every two years. The digital universe is projected to reach 40,000 exabytes by 2020. This equates to 40 trillion gigabytes, or 5,200 gigabytes for every human being in the world in 2020. As much as 33 percent of this information is expected to contain analytic value, yet at present only half of one percent of available data is analysed by organisations.
The extraction of business intelligence from the growing digital universe requires a new generation of technologies capable of analysing large volumes of data rapidly and economically. Conventional approaches rely upon clusters of databases that separate transactional and analytical processing and interact with records stored in secondary or persistent memory formats such as hard disks. Although such formats are non-volatile, they create a relatively high level of latency, since CPUs lose considerable time during I/O operations waiting for data from remote mechanical drives. Contemporary persistent databases use complex compression algorithms to maximise the amount of data held in primary or working memory and thereby reduce latency. Nonetheless, latency times can still range from several minutes to days in high-volume environments. Persistent databases therefore fail to deliver the real-time analysis of big data demanded by organisations experiencing significant data growth, a rapidly changing competitive landscape, or both.
In-memory databases promise the technological breakthrough needed to meet the demand for real-time analytics at reduced cost. They leverage faster primary memory formats such as flash and Random Access Memory (RAM) to deliver far superior performance: primary memory can be read up to 10,000 times faster than secondary memory, producing near-zero latency. While in-memory technology is far from new, it has been made more accessible to organisations by the decline in memory prices, the widespread use of multi-core processors and 64-bit operating systems, and software innovations in database management systems.
The SAP HANA platform includes a database system that performs both analytical (OLAP) and transactional (OLTP) processing entirely in memory. In performance tests conducted by SAP on a 100 TB data set compressed to 3.78 TB in a 16-node cluster of IBM X5 servers with 8 TB of combined RAM, response times varied from a fraction of a second for simple queries to almost 4 seconds for complex queries spanning the entire data range. Such performance underlies the appeal and success of SAP HANA. Since its launch in 2010, SAP HANA has been deployed by 2,200 organisations across 25 industries, making it SAP’s fastest-growing product release.
SAP HANA has emerged against a backdrop of rising concern over information security resulting from a series of successful, targeted and well-publicized data breaches. This anxiety has made information security a focal point for business leaders across all industry sectors. Databases are the vessels of business information and are therefore the most important component of the technology stack. Database security represents the last line of defense for enterprise data. It should comprise a range of interdependent controls across the dual domains of prevention and detection.
The most advanced persistent databases are the product of almost thirty years of evolution. As a result, today’s persistent databases include a complete suite of controls across both domains, presenting organisations with a high degree of protection against internal and external threats. In-memory databases are, by comparison, a nascent technology and most do not yet deliver the range of security countermeasures provided by conventional databases. These include:
Label-based access control;
Data redaction capabilities to protect the display of sensitive data at the application level;
Utilities to apply patches without shutting down databases; and
Policy management tools to detect database vulnerabilities or misconfigurations against generally-accepted security standards.
The performance edge enjoyed by in-memory database solutions should be weighed against their security disadvantages vis-à-vis persistent database systems. However, these disadvantages may be short-lived: security in in-memory databases has advanced significantly over a relatively short period of time. The most recent release of SAP HANA (SPS 06), for example, introduced a number of security enhancements over SPS 05, released a mere seven months earlier. These include support for a wider range of authentication schemes, the binding of internal IP addresses and ports to the localhost interface, a secure store for credentials required for outbound connections, and more granular access control for database users.
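To illustrate the direction of such access control improvements, the sketch below shows how object-level privileges can be granted to a reporting user through the SAP HANA client's Python driver (hdbcli). It is a minimal, illustrative example of least-privilege grants rather than a reference to a specific SPS 06 feature; the hostname, port, credentials, schema and view names are placeholders, and the Python driver is assumed to be installed.

# Illustrative sketch: least-privilege, object-level grants for a reporting
# user. The host, port, credentials and object names are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana-host.example.com",   # placeholder hostname
    port=30015,                        # SQL port for instance 00
    user="SECURITY_ADMIN",             # placeholder administrative user
    password="********"
)
cursor = conn.cursor()

# Create a reporting user and grant SELECT on a single view only,
# rather than broad schema- or system-level privileges.
cursor.execute('CREATE USER REPORT_USER PASSWORD Initial1234')
cursor.execute('GRANT SELECT ON "SALES"."V_REVENUE_SUMMARY" TO REPORT_USER')

cursor.close()
conn.close()

Restricting each database user to the specific objects it requires, rather than to entire schemas or system privileges, is the general pattern behind the more granular access control noted above.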
The most crucial challenge to database security presented by the introduction of in-memory databases is not the absence of specific security features but architectural concerns. Server separation is a fundamental principle of information security enshrined in most control frameworks, most notably the Payment Card Industry Data Security Standard (PCI DSS). According to this principle, servers must be single-purpose and must not perform competing functions such as application and database services. Such functions should be performed by separate physical or virtual machines located in independent network zones, since their differing security classifications require distinct host-level configuration settings for each component. This architecture also supports layered defense strategies designed to forestall intrusion attempts by increasing the number of obstacles between attackers and their targets. Implementation scenarios that use in-memory databases such as SAP HANA as the technical infrastructure for native applications challenge the principle of server separation. In contrast to the conventional 3-tier architecture, this scenario leverages the application and Web server capabilities built directly into SAP HANA through Extended Application Services (XS). Unfortunately, there is no simple solution to the issue of server separation, since the optimum levels of performance delivered by in-memory databases rely upon the sharing of hardware resources between application and database components.
Aside from such architectural concerns, the storage of large quantities of data in volatile memory may amplify the impact of RAM-based attacks. Although widely regarded as one of the most dangerous security threats, attacks such as RAM scraping remain relatively rare. They are, however, becoming more prevalent, since attackers increasingly target volatile memory to circumvent the encryption of data held in persistent memory. RAM-based attacks are also growing in popularity because they leave virtually no footprint and are therefore extremely difficult to detect. This relative anonymity makes them the preferred weapon of advanced attackers motivated by commercial or international espionage.
This paper presents a security framework for SAP HANA SPS 06 across the areas of network and communication security, authentication and authorization, data encryption, and auditing and logging. It also provides security-related recommendations for the SAP HANA appliance and SAP HANA One. Taken together, the recommendations in this paper should support the confidentiality, integrity and availability of data in the SAP HANA in-memory database.