Monday, February 2, 2015

The Vicious Cycle of “Assuming Compromise”




Walk the floor of any industry trade show and talk to security vendors, and you will hear one of the most prevalent endpoint security myths: "assume you will be compromised." This axiom is a fallacy, but because of it the security industry has become obsessed with detection at the expense of protection.

Unfortunately, a security model based on detection has serious shortcomings. Take the Target data breach, for example: by all accounts, Target had deployed technology that did detect the attacks against it, yet the company did nothing to remediate the situation.

The reason this myth persists is that "assume you will be compromised" is a self-fulfilling prophecy. If you believe you will be compromised, you invest in detection and remediation instead of considering more effective forms of endpoint protection. It is a vicious cycle: assume compromise, invest in detection, get compromised because protection is inadequate, detect the threats, validate the incorrect belief, and repeat in the next budget cycle.


As a result, organizations believe that deploying a multitude of security solutions provides "defense in depth." Bromium Labs has taken to calling this "Layers on Layers," because LOL makes hackers "laugh out loud." The catch is that each layer has its own limitations, and if those limitations are shared across layers, then the number of layers no longer matters. In a recent example from Bromium Labs, the focus was on exploiting the kernel, because that was the common weak link across all of the widely used legacy endpoint technologies.
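A minimal toy model makes the "Layers on Layers" point concrete. This sketch is my own illustration, not from Bromium, and the 90% per-layer catch rate is an invented assumption: stacking independent layers drives the block rate for ordinary attacks toward 100%, but an attack aimed at a weakness every layer shares sails past all of them, no matter how many are stacked.

```python
import random

# Toy model (invented numbers): each defense layer independently blocks 90%
# of ordinary attacks, but every layer shares one blind spot -- a kernel
# exploit that none of them can see.

LAYERS = ["sandbox", "AV", "HIPS", "EMET", "SMEP"]
CATCH_RATE = 0.90        # assumed per-layer catch rate, for illustration only
TRIALS = 100_000

def blocked(uses_shared_weakness: bool) -> bool:
    """Return True if any layer stops the attack."""
    if uses_shared_weakness:
        return False     # the shared kernel weakness bypasses every layer at once
    return any(random.random() < CATCH_RATE for _ in LAYERS)

ordinary = sum(blocked(False) for _ in range(TRIALS)) / TRIALS
kernel = sum(blocked(True) for _ in range(TRIALS)) / TRIALS

print(f"ordinary attacks blocked: {ordinary:.3%}")       # ~99.999% with 5 layers
print(f"shared-weakness attacks blocked: {kernel:.3%}")  # 0.000%, regardless of layers
```

Adding a sixth or seventh layer improves only the first number; the second stays at zero, which is exactly the shared-weakness problem.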

Common endpoint security solutions focus on sandboxes, antivirus (AV), host-based intrusion prevention systems (HIPS), exploit mitigation (EMET), and hardware-assisted security (SMEP), yet a single public exploit for a Windows kernel vulnerability bypasses all of these solutions, even when they are stacked one upon another.

This highlights the weakness of a "defense in depth" architecture: the simultaneous deployment of multiple solutions that share the same weakness is not satisfactory. The issue is far from theoretical. Modern malware (e.g., TDL4) already uses this particular exploit to escalate privileges. Windows kernel vulnerabilities are frequent, and that is not going to change any time soon; we have to live with them and be able to defend against them.

Sophisticated attacks present a significant hurdle for endpoint protection. They may incorporate malicious websites, email, or documents crafted to evade detection, so even diligent security teams may never be alerted to a compromise. This is the shortcoming of "assuming compromise."

Additionally, emerging technology trends, such as cloud computing and mobile workforces, are relocating corporate assets beyond the corporate perimeter, increasing the need for effective endpoint protection. When a mobile user connects to an untrusted network, it is imperative that attacks do not slip through the cracks.

Beyond the sophistication of attacks, there is also a balance to strike between security and operations. Operations is primarily concerned with keeping applications running, while security is concerned with compensating for vulnerable technology. For example, an organization may have built a legacy application in-house that requires an outdated, unpatched version of Java to run.

Therefore, an effective endpoint protection solution must securely support both legacy applications and new computing models, protecting them from sophisticated new attacks without breaking them. Protection is not enough if we do not also maintain a great user experience.

The reason endpoint security seems like a losing battle is that the current security model is broken. For example, the NIST Cybersecurity Framework is organized around five basic functions: identify, protect, detect, respond and recover. Three-fifths of that framework (detect, respond and recover) assume compromise will occur. Similarly, industry analysts promote an advanced threat protection model of prevention, detection and remediation.

For the past two decades, threat detection has been a Band-Aid on a bullet wound. The good news is that the security industry finally seems to be realizing that reactive solutions, such as antivirus, are incapable of detecting and protecting against unknown threats. Even Symantec has admitted that antivirus is dead.

Threat detection systems rely on signatures to catch cyber-attacks, but the more signatures an organization enables, the more performance suffers. Organizations face a dilemma balancing performance and security, which typically results in partial coverage, as some signatures are disabled to maintain performance.
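A short sketch shows the shape of that dilemma. All numbers here are invented for illustration; the point is only that a linear per-signature matching cost pushes teams to disable signatures, and every disabled signature is lost coverage.

```python
# Toy illustration (invented numbers): per-file scan cost grows with the
# number of enabled signatures, so teams disable signatures to keep scans
# fast -- trading detection coverage for performance.

COST_PER_SIGNATURE_MS = 0.002   # assumed matching cost per signature, per file
TOTAL_SIGNATURES = 50_000       # assumed size of the full signature set

def scan_cost_ms(enabled: int) -> float:
    """Approximate per-file scan time as linear in enabled signatures."""
    return enabled * COST_PER_SIGNATURE_MS

def coverage(enabled: int) -> float:
    """Fraction of known threats still detectable."""
    return enabled / TOTAL_SIGNATURES

for enabled in (50_000, 30_000, 10_000):
    print(f"{enabled:>6} signatures: {scan_cost_ms(enabled):5.1f} ms/file, "
          f"{coverage(enabled):.0%} of known threats covered")
```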

To stay ahead of unknown threats, organizations must adopt a proactive architectural model. For example, micro-virtualization delivers hardware isolation, which isolates user tasks from one another and, in turn, protects the system from any changes attempted by malware.

A robust endpoint protection solution should address the hurdles discussed earlier, securely supporting legacy applications and new technology initiatives while protecting them from sophisticated new attacks. We can conclude that detection has failed because it is a reactive defense that attackers have learned to evade. Ironically, these reactive defenses, such as signature-based detection, actually demand quite a lot of activity, with their constant updates and new signatures.

Instead, we should consider endpoint protection solutions that are passive in operation yet proactive by design. One example is hardware-isolated micro-virtualization, which provides a secure, isolated container for each task a user performs on an untrusted network or document. Micro-virtualization can protect against known and unknown threats without the need for constant signatures and updates. This approach to containerization on the endpoint also enables superior introspection with real-time threat intelligence, providing insight into attempted attacks that can be fed into other security solutions.
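The following sketch illustrates the task-isolation idea in miniature. It is not Bromium's implementation; the class, the "golden image" dictionary, and the malicious task are all hypothetical stand-ins for hardware-enforced copy-on-write isolation and introspection.

```python
import copy

# Illustrative sketch only -- not Bromium's actual implementation. Each
# untrusted task runs against a throwaway copy of system state; attempted
# changes are captured for threat intelligence, then discarded with the
# container, leaving the host untouched.

GOLDEN_IMAGE = {"hosts_file": "clean", "registry_run_key": "clean"}

class MicroVM:
    """A disposable, isolated container for a single untrusted task."""

    def __init__(self):
        # stand-in for hardware-enforced copy-on-write isolation
        self.state = copy.deepcopy(GOLDEN_IMAGE)

    def run(self, task):
        task(self.state)  # the task can only mutate its private copy
        # introspection: diff against the golden image to surface tampering
        return {k: v for k, v in self.state.items() if v != GOLDEN_IMAGE[k]}

def malicious_document(state):
    """A hypothetical malicious task attempting persistence."""
    state["registry_run_key"] = "persistence implant"

vm = MicroVM()
print("attempted changes:", vm.run(malicious_document))  # feed to other tools
print("host image intact:", GOLDEN_IMAGE)                # unchanged by the task
```

The diff of the container's state against the pristine image is the toy analogue of the real-time threat intelligence described above: the attack is observed in full, yet never touches the host.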

Finally, endpoint protection must maintain end-user productivity. It cannot negatively impact performance or the user experience, or else users will work to circumvent its protection. Ideally, the best solutions are invisible and autonomous to end users: they are non-intrusive, they do not disrupt workflows and they avoid frequent updates.


