

Data Encryption Using DBMS OBFUSCATION TOOLKIT

Izunna Okpala


This article describes the data encryption package (DBMS_OBFUSCATION_TOOLKIT), which allows you to encrypt data stored in an Oracle database. Data encryption topics are presented in the following sections:

  • Securing Sensitive Information
  • Principles of Data Encryption
  • Solutions For Stored Data Encryption in Oracle9i
  • Data Encryption Challenges
  • Example of Data Encryption PL/SQL Program

Securing Sensitive Information
The Internet poses new challenges in information security, especially for those organizations seeking to become e-businesses. Many of these security challenges can be addressed by the traditional arsenal of security mechanisms:

  • Strong user authentication to identify users
  • Granular access control to limit what users can see and do
  • Auditing for accountability
  • Network encryption to protect the confidentiality of sensitive data in transmission

Encryption is an important component of several of the above solutions. For example, Secure Sockets Layer (SSL), an Internet-standard network encryption and authentication protocol, uses encryption to strongly authenticate users by means of X.509 digital certificates. SSL also uses encryption to ensure data confidentiality, and cryptographic checksums to ensure data integrity. Many of these uses of encryption are relatively transparent to a user or application. For example, many browsers support SSL, and users generally do not need to do anything special to enable SSL encryption.
Oracle has provided network encryption between database clients and the Oracle database since Oracle7. Oracle Advanced Security, an option to Oracle9i, provides encryption and cryptographic integrity check for any protocol supported by Oracle9i, including Net8, Java Database Connectivity (JDBC) (both “thick” and “thin” JDBC), and the Internet Intra-Orb Protocol (IIOP). Oracle Advanced Security also supports SSL for Net8, “thick” JDBC and IIOP connections.
While encryption is not a security cure-all, it is an important tool used to address specific security threats, and will become increasingly important with the growth of e-business, especially in the area of encryption of stored data. For example, while credit card numbers are typically protected in transit to a web site using SSL, the credit card number is often stored in the clear (un-encrypted), either on the file system, where it is vulnerable to anyone who can break into the host and gain root access, or in databases. While databases can be made quite secure through proper configuration, they can also be vulnerable to host break-ins if the host is mis-configured. There have been several well-publicized break-ins, in which a hacker obtained a large list of credit card numbers by breaking into a database.
Encryption of stored data thus represents a new challenge for e-businesses, and can be an important tool in dealing with specific types of security threats.

Principles of Data Encryption
While there are many good reasons to encrypt data, there are also many bad reasons to do so. Encryption does not solve all security problems, and may even make some problems worse. The following sections describe some common misconceptions about encrypting stored data. They cover these topics:

  • Principle 1: Encryption Does Not Solve Access Control Problems
  • Principle 2: Encryption Does Not Protect Against a Malicious DBA
  • Principle 3: Encrypting Everything Does Not Make Data Secure

Principle 1: Encryption Does Not Solve Access Control Problems
Most organizations need to limit access to data to those who have a “need to know.” For example, a human resources system may limit employees to reviewing only their own employment records, while managers of employees may see the employment records of those employees working for them. Human resources specialists may also need to see employee records for multiple employees.
This type of security policy, limiting data access to those with a need to see it, is typically addressed by access control mechanisms. The Oracle database has provided strong, independently evaluated access control mechanisms for many years. More recently, Oracle9i added the ability to enforce access control at an extremely fine level of granularity through its Virtual Private Database capability.
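For illustration, the following is a minimal sketch of a Virtual Private Database policy; the table, column, and function names are hypothetical and not taken from this article. The policy function returns a predicate that Oracle appends to queries against the protected table, so each employee sees only rows matching his or her own database username:

    CREATE OR REPLACE FUNCTION emp_own_rows (
      p_schema IN VARCHAR2,
      p_object IN VARCHAR2
    ) RETURN VARCHAR2
    IS
    BEGIN
      -- Predicate appended to every SELECT against the protected table
      RETURN 'username = SYS_CONTEXT(''USERENV'', ''SESSION_USER'')';
    END;
    /

    BEGIN
      DBMS_RLS.ADD_POLICY(
        object_schema   => 'HR',
        object_name     => 'EMPLOYEE_RECORDS',   -- hypothetical table
        policy_name     => 'emp_own_rows_policy',
        function_schema => 'HR',
        policy_function => 'EMP_OWN_ROWS',
        statement_types => 'SELECT');
    END;
    /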
Because human resources records are considered sensitive information, it is tempting to think that this information should all be encrypted “for better security.” However, encryption cannot enforce the type of granular access control described above, and may actually hinder data access. In the human resources example, an employee, his manager, and the HR clerk all need to access the employee’s record. If employee data is encrypted, then each person also has to be able to access the data in un-encrypted form. Therefore, the employee, the manager and the HR clerk would have to share the same encryption key to decrypt the data. Encryption would therefore not provide any additional security in the sense of better access control, and the encryption might actually hinder the proper functioning of the application. An additional issue is that it is very difficult to securely transmit and share encryption keys among multiple users of a system.
A basic principle behind encrypting stored data is that it must not interfere with access control. For example, a user who has SELECT privilege on EMP should not be limited by the encryption mechanism from seeing all the data he is otherwise allowed to see. Similarly, there is little benefit to encrypting part of a table with one key and part of a table with another key if users need to see all encrypted data in the table; it merely adds to the overhead of decrypting the data before users can read it. Provided that access controls are implemented well, encryption provides little additional security within the database itself. Any user who has the privilege to access data within the database has neither more nor less privilege as a result of encryption. Therefore, encryption should never be used to solve access control problems.

Principle 2: Encryption Does Not Protect Against a Malicious DBA
Some organizations are concerned that a malicious user can gain elevated (DBA) privilege through guessing a password. These organizations would like to encrypt stored data to protect against this threat. However, the correct solution to this problem is to protect the DBA account, and to change default passwords for other privileged accounts. The easiest way to break into a database is through an unchanged password for a privileged account (for example, SYS/CHANGE_ON_INSTALL).
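As a simple illustration (the account names are real Oracle defaults, but the passwords are obviously placeholders), locking down default accounts is a matter of a few statements:

    -- Change the well-known default passwords immediately after installation
    ALTER USER sys IDENTIFIED BY a_strong_unguessable_pwd1;
    ALTER USER system IDENTIFIED BY another_strong_pwd2;

    -- Lock and expire accounts that are not needed
    ALTER USER dbsnmp ACCOUNT LOCK PASSWORD EXPIRE;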
Note that there are many other destructive things a malicious user can do to a database once he gains DBA privilege. Examples include corrupting or deleting data, or exporting user data to the file system and mailing it to himself in order to run a password cracker against it. Encryption will not protect against these threats.
Some organizations are concerned that database administrators (DBAs), because they typically have all privileges, are able to see all data in the database. These organizations feel that the DBAs should merely administer the database, but should not be able to see the data that the database contains. Some organizations are also concerned about the concentration of privilege in one person, and would prefer to partition the DBA function, or enforce two-person access rules.
It is tempting to think that encrypting all data (or significant amounts of data) will solve the above problems, but there are better ways to protect against these threats. First of all, Oracle does support limited partitioning of DBA privilege. Oracle9i provides native support for SYSDBA and SYSOPER users. SYSDBA has all privileges, but SYSOPER has a limited privilege set (such as startup and shutdown of the database). Furthermore, an organization can create smaller roles encompassing a number of system privileges. A JR_DBA role might not include all system privileges, but only those appropriate to a more junior database administrator (such as CREATE TABLE, CREATE USER, and so on). Oracle also enables auditing the actions taken by SYS (or SYS-privileged users) and storing that audit trail in a secure operating system location. Using this model, a separate auditor who has root on the OS can audit all actions by SYS, enabling the auditor to hold all DBAs accountable for their actions.
See Also:
The audit_sys_operations parameter in Oracle9i Database Administrator’s Guide
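A brief sketch of the partitioned-privilege approach described above follows; the role name, grantee, and file path are illustrative only:

    -- A junior DBA role with a deliberately limited privilege set
    CREATE ROLE jr_dba;
    GRANT CREATE SESSION, CREATE TABLE, CREATE USER TO jr_dba;
    GRANT jr_dba TO some_junior_dba;

    -- Audit actions taken as SYS and write the trail outside the database
    -- (initialization parameters; the directory is illustrative):
    --   audit_sys_operations = TRUE
    --   audit_file_dest      = /secure/oracle/audit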

Furthermore, the DBA function by its nature is a trusted position. Even organizations with the most sensitive data, such as intelligence agencies, do not typically partition the DBA function. Instead, they vet their DBAs strongly, because it is a position of trust. Periodic auditing can help to uncover inappropriate activities.
Encryption of stored data must not interfere with the administration of the database; otherwise, larger security issues can result. For example, if by encrypting data you corrupt the data, you have created a security problem: the data is no longer meaningful and may not be recoverable.
Encryption can be used to mitigate the ability of a DBA, or other privileged user, to see data in the database. However, it is not a substitute for vetting a DBA properly, or for limiting the use of powerful system privileges. If an untrustworthy user has significant privilege, there are multiple threats he can pose to an organization, and these may be far more significant than viewing un-encrypted credit card numbers.
Principle 3: Encrypting Everything Does Not Make Data Secure
There is a pervasive tendency to think that if encrypting some stored data strengthens security, then encrypting everything makes all data secure.
As described above, encryption does not address access control issues well. Consider the implications of encrypting an entire production database. All data must be decrypted to be read, updated, or deleted, and the encryption must not interfere with normal access controls. Encryption is inherently a performance-intensive operation, and encrypting all data will significantly affect performance. Availability is also a key aspect of security: if encrypting data makes it unavailable, or the performance overhead adversely affects availability, then you have created a new security problem.
Encryption keys must be changed regularly as part of good security practice, which necessitates that the database be inaccessible while the data is being decrypted and re-encrypted with a new key or keys. This also adversely affects availability.
While encrypting all or most of the data in a production database is clearly a problem, there may be advantages to encrypting data stored off-line. For example, an organization may store backups for a period of six months to a year off-line, in a remote location. Of course, the first line of protection is to secure the data in a facility to which access is controlled, a physical measure. However, there may be a benefit to encrypting this data before it is stored. Since it is not being accessed on-line, performance need not be a consideration. While Oracle9i does not provide this facility, there are vendors who can provide such encryption services. Before embarking on large-scale encryption of backup data, organizations considering this approach should thoroughly test the process. It is essential that any data encrypted before off-line storage can be decrypted and re-imported successfully.
Solutions For Stored Data Encryption in Oracle9i
DBMS_OBFUSCATION_TOOLKIT provides several means for addressing the security issues that have been discussed. This section includes these topics:

  • Oracle9i Data Encryption Capabilities
  • Data Encryption Challenges

Oracle9i Data Encryption Capabilities
While there are many security threats that encryption cannot address well, it is clear that an additional measure of security can be achieved by selectively encrypting sensitive data before storage in the database. Examples of such data could include:

  • Credit card numbers
  • National identity numbers
  • Passwords for applications whose users are not database users

To address these needs, Oracle9i provides a PL/SQL package to encrypt and decrypt stored data. The package, DBMS_OBFUSCATION_TOOLKIT, is provided in both the Standard Edition and Enterprise Edition of Oracle9i. This package currently supports bulk data encryption using the Data Encryption Standard (DES) algorithm, and includes procedures to encrypt (DESEncrypt) and decrypt (DESDecrypt) using DES. DBMS_OBFUSCATION_TOOLKIT also includes functions to encrypt and decrypt using 2-key and 3-key DES, in outer cipher block chaining mode. These require key lengths of 128 and 192 bits, respectively.
The package also includes cryptographic checksumming capabilities (MD5) and the ability to generate a secure random number (GetKey). Secure random number generation is an important part of cryptography; predictable keys are easily guessed keys, and easily guessed keys may lead to easy decryption of data. Most cryptanalysis is done by finding weak or poorly stored keys, rather than through brute-force analysis (cycling through all possible keys).
________________________________________
Note:
Do not use DBMS_RANDOM as it is unsuitable for cryptographic key generation.
________________________________________
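As a small, hedged illustration of the checksumming capability (named parameters are used because the package overloads MD5 for both RAW and VARCHAR2 input; the input value is arbitrary):

    DECLARE
      l_checksum VARCHAR2(16);   -- MD5 produces a 128-bit (16-byte) digest
    BEGIN
      l_checksum := DBMS_OBFUSCATION_TOOLKIT.MD5(
                        input_string => 'sensitive row contents');
      DBMS_OUTPUT.PUT_LINE('Checksum length: ' || LENGTH(l_checksum));
    END;
    /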

Key management is programmatic. That is, the application (or caller of the function) must supply the encryption key, which means that the application developer must find a way of storing and retrieving keys securely. The relative strengths and weaknesses of various key management techniques are discussed below. The DBMS_OBFUSCATION_TOOLKIT package, which can handle both string and raw data, requires the caller to submit a 64-bit key. The DES algorithm itself has an effective key length of 56 bits.
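The following is a minimal round-trip sketch using the DESEncrypt and DESDecrypt procedures; the key value and input are illustrative only, and a real application would retrieve the key from a secure store rather than hard-coding it:

    DECLARE
      l_key       VARCHAR2(8)    := 'k8Qz92pX';  -- 64-bit (8-byte) key, placeholder only
      -- DES input must be a multiple of 8 bytes, so pad before encrypting
      l_input     VARCHAR2(16)   := RPAD('4111111111111111', 16);
      l_encrypted VARCHAR2(2048);
      l_decrypted VARCHAR2(2048);
    BEGIN
      DBMS_OBFUSCATION_TOOLKIT.DESEncrypt(
          input_string     => l_input,
          key_string       => l_key,
          encrypted_string => l_encrypted);

      DBMS_OBFUSCATION_TOOLKIT.DESDecrypt(
          input_string     => l_encrypted,
          key_string       => l_key,
          decrypted_string => l_decrypted);

      DBMS_OUTPUT.PUT_LINE('Round trip successful: ' ||
          CASE WHEN l_decrypted = l_input THEN 'YES' ELSE 'NO' END);
    END;
    /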
The DBMS_OBFUSCATION_TOOLKIT is granted to PUBLIC by default. Oracle strongly recommends that this grant be revoked. In general, there is no reason why users should be able to encrypt stored data outside the context of an application.
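Revoking the default grant and re-granting execute only to the owning application schema (the schema name here is illustrative) looks like this:

    REVOKE EXECUTE ON dbms_obfuscation_toolkit FROM PUBLIC;
    GRANT EXECUTE ON dbms_obfuscation_toolkit TO app_owner;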



The proliferation of Digital currencies in Africa calls for stronger security measures

Izunna Okpala


The advantages of cryptocurrencies in Africa are also being exploited as cyber criminals become increasingly technically skilled.

Virtual currency related crime has increased in Africa, especially in South Africa, with hackers using people's phones to mine cryptocurrencies. As the popularity of crypto-assets such as bitcoin grows, various agencies have begun reviewing the impact of the crypto-sector on consumers and on personal financial security.

Paxful handles more than 50,000 trades a day across three million wallets worldwide, with South Africa being one of its top markets. The fluctuating value of the world's most popular digital currency has not dissuaded South Africans either: this month Paxful reported a 2,800 percent increase in South African trades compared to October 2018. Johannesburg, Pretoria, and Cape Town are the three cities contributing the most users. Overall, Paxful also noted that the number of trades across the African continent has risen by 64 percent over the same period.

The importance of increased security and transparency

While the vast majority of cryptocurrency trades are secure, the market has not been entirely immune to scams and fraudulent transactions in recent years. With the cryptocurrency industry still quite new, scammers prey on users who lack the appropriate awareness in this space.

Marius Reitz, South Africa country manager for cryptocurrency wallet Luno, said while cryptocurrencies like Bitcoin were used for illicit activity, the situation is improving as law enforcement agencies get better at tracking transactions, but online scams are still common.

The most popular form of digital currency scam is phishing, in which criminals use a legitimate-looking (but fake) website, email or SMS to trick you into providing passwords or other detailed information, said Reitz.

“Bitcoin is secure, but sometimes the way users use it is dangerous. Users must treat Bitcoin as cash and protect their personal information and passwords just as they do when using an ATM or shopping online. When your bitcoin is in the possession of hackers and scammers the loss is irreversible, so it's important to have the required safeguards in place,” adds Ray Youssef, Paxful's co-founder and CEO.

On Paxful, users are protected from scammers through various security measures, including Paxful's escrow service, which keeps the bitcoin safe until a transaction is completed.

Protecting Your Cryptocurrency Assets

Youssef concludes: “For more and more people around the world, P2P finance is their only hope of financial inclusion and empowerment. While the risk of crypto fraud and theft cannot be completely eliminated, you can significantly reduce your chances of becoming a target by following Paxful’s top tips for safe trading and taking a few simple precautions.”



Understanding Oracle Database Firewall

Izunna Okpala



Despite what Oracle calls it, this is a Database Activity Monitoring product at its core. Just one with more of a security focus than audit/compliance, and based on network monitoring (it lacks local activity monitoring, which is why it’s weaker for compliance).

  1. Many other DAM products can block, and Secerno can monitor. I always thought it was an interesting product.
  2. Most DAM products include network monitoring as an option. The real difference with Secerno is that they focused far more on the security side of the market, even though historically that segment is much smaller than the audit/monitoring/compliance side. So Oracle has more focus on blocking, and less on capturing and storing all activity.
  3. It is not a substitute for Database Activity Monitoring products, nor is it “better” as Oracle claims. It is a form of DAM, but – as mentioned by competitors in the article – you still need multiple local monitoring techniques to handle direct access. Network monitoring alone isn’t enough. I’m sure Oracle Services will be more than happy to connect Secerno and Oracle Audit Vault to do this for you.
  4. Secerno basically whitelists queries (automatically) and can block unexpected activity. This appears to be pretty effective for database attacks, although I haven’t talked to any pen testers who have gone up against it. (They do also blacklist, but the whitelist is the main secret sauce).
  5. Secerno had the F5 partnership before the Oracle acquisition. It allowed you to set WAF rules based on something detected in the database (e.g., block a signature or host IP). I’m not sure if they have expanded this post-acquisition. Imperva is the only other vendor that I know of to integrate DAM/WAF.
  6. Oracle generally believes that if you don’t use their products you are either a certified idiot or criminally negligent. Neither is true, and while this is a good product I still recommend you look at all the major competitors to see what fits you best. Ignore the marketing claims.
  7. Odds are your DBA will buy this when you aren’t looking, as part of some bundle deal. If you think you need DAM for security, compliance, or both… start an assessment process or talk to them before you get a call one day to start handling incidents.

In other words: a good product with advantages and disadvantages, just like anything else. More security than compliance, but like many DAM tools it offers some of both. Ignore the hype, figure out your needs, and evaluate to figure out which tool fits best. You aren’t a bad person if you don’t buy Oracle, no matter what your sales rep tells your CIO.

And seriously – watch out for the deal bundling. If you haven’t learned anything from us about database security by now, hopefully you at least realize that DBAs and security don’t always talk as much as they should (the same goes for Guardium/IBM). If you need to be involved in any database security, start talking to the DBAs now, before it’s too late.



Understanding and Selecting DSP: Data and Event Collection

Izunna Okpala


One of the central aspects of this evolution is the progression of event collection mechanisms from native audit, to network activity monitoring, to agent-based activity monitoring. These are all database-specific information sources. The evolution of DAM has been framed by these different methods of data collection. That’s important, because what you can do is highly dependent on the data you can collect. For example, the big reason agents are the dominant collection model is that you need them to monitor administrators – network monitoring can’t do that (and is quite difficult in distributed environments).

The development of DAM into DSP also entails examination of a broader set of application-related events. By augmenting the data collection agents we can examine other applications in addition to databases – even including file activity. This means that it has become possible to monitor SAP and Oracle application events – in real time. It’s possible to monitor user activity in a Microsoft SharePoint environment, regardless of how data is stored. We can even monitor file-based non-relational databases. We can perform OS, application, and database assessments through the same system.

A slight increase in the scope of data collection means much broader application-layer support. Not that you necessarily need it – sometimes you want a narrow database focus, while other times you will need to cast a wider net. We will describe all the options to help you decide which best meets your needs.

Let’s take a look at some of the core data collection methods used by customers today:

Event Sources

Local OS/Protocol Stack Agents: A software ‘agent’ is installed on the database server to capture SQL statements as they are sent to the databases. The events captured are returned to the remote Database Security Platform. Events may optionally be inspected locally by the agent for real-time analysis and response. The agents are either deployed into the host’s network protocol stack or embedded into the operating system, to capture communications to and from the database. They see all external SQL queries sent to the database, including their parameters, as well as query results. Most critically, they should capture administrative activity from the console that does not come through normal network connections. Some agents provide an option to block malicious activity – either by dropping the query rather than transmitting it to the database, or by resetting the suspect user’s database connection.

Most agents embed into the OS in order to gain full session visibility, and so require a system reboot during installation. Early implementations struggled with reliability and platform support problems, causing system hangs, but these issues are now fortunately rare. Current implementations tend to be reliable, with low overhead and good visibility into database activity. Agents are a basic requirement for any DSP solution, as they are a relatively low-impact way of capturing all SQL statements – including those originating from the console and arriving via encrypted network connections.

Performance impact these days is very limited, but you will still want to test before deploying into production.

Network Monitoring: An exceptionally low-impact method of monitoring SQL statements sent to the database. By monitoring the subnet (via network mirror ports or taps) statements intended for a database platform are ‘sniffed’ directly from the network. This method captures the original statement, the parameters, the returned status code, and any data returned as part of the query operation. All collected events are returned to a server for analysis. Network monitoring has the least impact on the database platform and remains popular for monitoring less critical databases, where capturing console activity is not required.

Lately the line between network monitoring capabilities and local agents has blurred. Network monitoring is now commonly deployed via a local agent monitoring network traffic on the database server itself, thereby enabling monitoring of encrypted traffic. Some of these ‘network’ monitors still miss console activity – specifically privileged user activity. On a positive note, installation as a user process does not require a system reboot or cause adverse system-wide side effects if the monitor crashes unexpectedly. Users still need to verify that the monitor is collecting database response codes, and should determine exactly which local events are captured, during the evaluation process.

Memory Scanning: Memory scanners read the active memory structures of a database engine, monitoring new queries as they are processed. Deployed as an agent on the database platform, the memory scanning agent activates at pre-determined intervals to scan for SQL statements. Most memory scanners immediately analyze queries for policy violations – even blocking malicious queries – before returning results to a central management server. There are numerous advantages to memory scanning, as these tools see every database operation, including all stored procedure execution. Additionally, they do not interfere with database operations.

You’ll need to be careful when selecting a memory scanning product – the quality of the various products varies. Most vendors only support memory scanning on select Oracle platforms – and do not support IBM, Microsoft, or Sybase. Some vendors don’t capture query variables – only the query structure – limiting the usefulness of their data. And some vendors still struggle with performance, occasionally missing queries. But other memory scanners are excellent enterprise-ready options for monitoring events and enforcing policy.

Database Audit Logs: Database Audit Logs are still commonly used to collect database events. Most databases have native auditing features built in; they can be configured to generate an audit trail that includes system events, transactional events, user events, and other data definitions not available from any other sources. The stream of data is typically sent to one or more locations assigned by the database platform, either in a file or within the database itself. Logging can be implemented through an agent, or logs can be queried remotely from the DSP platform using SQL.
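As a hedged sketch of what native-audit collection can look like on an Oracle target (the object names are illustrative, and the database must be started with AUDIT_TRAIL pointing at a database location), a collector might enable object auditing and then poll the audit view over a read-only connection:

    -- Enable standard auditing on a sensitive table
    AUDIT SELECT, INSERT, UPDATE, DELETE ON hr.employee_records BY ACCESS;

    -- Periodically pull recent events into the DSP platform
    SELECT username, obj_name, action_name, returncode, timestamp
      FROM dba_audit_trail
     WHERE timestamp > SYSDATE - 1/24
     ORDER BY timestamp;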

Audit logs are preferred by some organizations because they provide a series of database events from the perspective of the database. The audit trail reconciles database rollbacks, errors, and uncommitted statements – producing an accurate representation of changes made. But the downsides are equally serious. Historically, audit performance was horrible. While the database vendors have improved audit performance and capabilities, and DSP vendors provide great advice for tuning audit trails, bias against native auditing persists. And frankly, it’s easy to mess up audit configurations. Additionally, the audit trail is not really intended to collect SELECT statements – viewing data – but is focused on changes to data or the database system. Finally, as the audit trail is stored and managed on the database platform, it competes heavily for database resources – much more than other data collection methods. But given the accuracy of this data, and its ability to collect internal database events not available to network and OS agent options, audit remains a viable – if not essential – event collection option.

One advantage of using a DSP tool in conjunction with native logs is that it is easier to securely monitor administrator activity. Admins can normally disable or modify audit logs, but a DSP tool may mitigate this risk.

Discovery and Assessment Sources

Network Scans: Most DSP platforms offer database discovery capabilities, either through passive network monitoring for SQL activity or through active TCP scans of open database ports. Additionally, most customers use remote credentialed scanning of internal database structures for data discovery, user entitlement reporting, and configuration assessment. None of these capabilities are new, but remote scanning with read-only user credentials is the standard data collection method for preventative security controls.

There are many more methods of gathering data and events, but we’re focusing on the most commonly used. If you are interested in more depth on the available options, our blog post on Database Activity Monitoring & Event Collection Options provides much greater detail. For those of you who follow our stuff on a regular basis, there’s not a lot of new information there.

Expanded Collection Sources

A couple of new features broaden the focus of DAM. Here’s what’s new:

File Activity Monitoring: One of the most intriguing recent changes in event monitoring has been the collection of file activity. File Activity Monitoring (FAM) collects all file activity (read, create, edit, delete, etc.) from local file systems and network file shares, analyzes the activity, and – just like DAM – alerts on policy violations. FAM is deployed through a local agent, collecting user actions as they are sent to the operating system. File monitors cross reference requests against Identity and Access Management (e.g., LDAP and Active Directory) to look up user identities. Policies for security and compliance can then be implemented on a group or per-user basis.

This evolution is important for two reasons. The first is that document and data management systems are moving away from strictly relational databases as the storage engine of choice. Microsoft SharePoint, mentioned above, is a hybrid of file management and relational data storage. FAM provides a means to monitor document usage and alert on policy violations. Some customers need to address compliance and security issues consistently, and don’t want to differentiate based on the idiosyncrasies of underlying storage engines, so FAM event collection offers consistent data usage monitoring.

Another interesting aspect of FAM is that most of the databases used for Big Data are non-relational file-based data stores. Data elements are self-describing and self-indexing files. FAM provides the basic capabilities of file event collection and analysis, and we anticipate the extension of these capabilities to cover non-relational databases. While no DSP vendor offers true NoSQL monitoring today, the necessary capabilities are available in FAM solutions.

Application Monitoring: Databases are used to store application data and persist application state. It’s almost impossible to find a database not serving an application, and equally difficult to find an application that does not use a database. As a result, monitoring the database is often considered sufficient to understand application activity. However, most of you in IT know database monitoring is actually inadequate for this purpose. Applications use hundreds of database queries to support generic forms, connect to databases with generic service accounts, and/or use native application code to call embedded stored procedures rather than direct SQL queries. Their activity may be too generic, or inaccessible to traditional Database Activity Monitoring solutions. We now see agents designed and deployed specifically to collect application events, rather than database events. For example, SAP transaction codes can be decoded, associated with a specific application user, and then analyzed for policy violations. As with FAM, much of the value comes from better linking of user identity to activities. But extending scope to embrace the application layer directly provides better visibility into application usage and enables more granular policy enforcement.

This article has focused on event collection for monitoring activity.

