This week’s lectures focus on web security and were presented by Cedric Cochin of McAfee Labs.
Overview of web security
Ninety-five percent of malware is delivered via the web. On the one hand, this seems like a lot; on the other, it raises the question of where the remaining five percent comes from. Assuming someone isn't picking up USB sticks off the sidewalk and plugging them into their machine, what accounts for the rest?
Programmatically, there are a variety of injection sites for malware, for example:
- The HTML DOM
- Raw HTML
- HTTP at the network layer
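To make the raw-HTML injection site concrete, here is a minimal sketch (not from the lecture) of how splicing untrusted input into a page lets attacker-supplied markup execute, and how escaping neutralizes it:

```python
import html

# Untrusted input destined for a page, e.g. from a comment form.
payload = "<script>alert('pwned')</script>"

# Naive templating: splicing raw input into HTML lets the
# payload survive as live markup in the victim's browser.
unsafe = f"<div class='comment'>{payload}</div>"

# Escaping the input first renders the same text as inert content.
safe = f"<div class='comment'>{html.escape(payload)}</div>"

print(unsafe)
print(safe)
```

The same idea applies at the DOM level: any point where untrusted data crosses into markup without encoding is a potential injection site.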
As with our other lectures, social engineering, or manipulating someone’s behavior as part of an attack, is an important part of web (in)security. The presenter says that users are the weak link in terms of web security. They can be exploited in a variety of ways:
- SEO poisoning
- Fake anti-virus
- Social media link insertion
- Forum link insertion
I was previously unfamiliar with the last term. It involves using ad-networks as a delivery mechanism for malware. This is effective because users are primed to trust the content that appears on popular, high-profile sites, like NYTimes.com.
Defending against social engineering based attacks is difficult. The presenter notes that some progress has been made in educating users, but that this education is always a step behind the attackers.
Browser-level attacks exploit vulnerabilities in the browser itself. There are typically social-engineering aspects to these attacks as well, since they may involve a multi-step process that begins by luring a user to a particular site.
Other types of browser-level attacks include:
- Man-in-the-middle attacks
- Man-in-the-browser attacks
- DNS spoofing
- SQL injection
- Same-origin policy attacks
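Of the attacks above, SQL injection is the easiest to demonstrate in a few lines. The sketch below (my own illustration, using an in-memory SQLite database) shows why string concatenation is dangerous and how a parameterized query closes the hole:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

# A classic injection payload: closes the string literal,
# then appends a condition that is always true.
attacker_input = "nobody' OR '1'='1"

# Vulnerable: concatenation lets the input rewrite the query.
vulnerable = conn.execute(
    "SELECT secret FROM users WHERE name = '" + attacker_input + "'"
).fetchall()

# Safe: a parameterized query treats the input as a literal value.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()

print(vulnerable)  # leaks every row
print(safe)        # matches nothing
```

The fix generalizes: never build queries by splicing strings; let the database driver bind values.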
Browsers are becoming more secure, but the transition to HTML5 is expanding the attack surface.
The presenter covered a variety of tools that may be helpful in investigating and improving web security.
- Alexa: Useful for determining general site popularity and prevalence
- Archive.org: Shows how a site has changed over time
- IPVOID: Compare an IP address against a variety of blacklists
- CheckShortURL: Expand a shortened URL to see where it points
- Site Dossier: Provides general information about given websites
- Webutation: URL reputation clearinghouse
- VirusTotal: Online scanning service that checks files and URLs against many anti-virus engines
- Linux dig: DNS resolver utility
- Burp Suite: Intercept and modify traffic to and from a remote site
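Of these, `dig` is the easiest to approximate in code. The sketch below (my own, not from the lecture) mimics `dig +short name A` by asking the operating system's resolver for a host's IPv4 addresses, rather than speaking the DNS protocol directly:

```python
import socket

def resolve(hostname: str) -> list[str]:
    """Return the IPv4 addresses a hostname resolves to,
    roughly what `dig +short hostname A` would print."""
    infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    seen, addrs = set(), []
    for *_, sockaddr in infos:
        ip = sockaddr[0]
        if ip not in seen:   # deduplicate while preserving order
            seen.add(ip)
            addrs.append(ip)
    return addrs

print(resolve("localhost"))
```

Unlike `dig`, this goes through the OS resolver (including `/etc/hosts` and any caches), so it shows what applications on the machine would actually see.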
URL classification provides clues about the content or nature of a site based on its URL. It’s broken up into several types:
- Manual classification: Use your brain and tools to classify a URL. Very slow.
- Static classification: Uses automated methods. Looks at content but does not execute. Very fast.
- Low-interaction classification: Render and execute code on a site and note behavior. Fast.
- High-interaction classification: Render and execute code in a sandboxed environment and allow it to modify the operating system. Changes in system state are noted to classify the site. Useful for zero-day attacks. Slow.
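Static classification, in particular, lends itself to a compact illustration. The toy scorer below looks only at the URL's text, never fetching or executing anything; the suspicious-TLD and keyword lists are illustrative assumptions, and a real static classifier would use far more signals (lexical features, registration age, reputation feeds):

```python
from urllib.parse import urlparse

# Hypothetical feature lists, chosen only for illustration.
SUSPICIOUS_TLDS = {"zip", "xyz", "top"}
SUSPICIOUS_WORDS = {"login", "verify", "update", "secure"}

def static_score(url: str) -> int:
    """Toy static classifier: score a URL from its text alone."""
    host = urlparse(url).hostname or ""
    score = 0
    if host.replace(".", "").isdigit():   # raw IP instead of a domain
        score += 2
    if host.count(".") >= 3:              # deeply nested subdomains
        score += 1
    if host.rsplit(".", 1)[-1] in SUSPICIOUS_TLDS:
        score += 1
    score += sum(word in url.lower() for word in SUSPICIOUS_WORDS)
    return score

print(static_score("https://example.com/about"))
print(static_score("http://198.51.100.7/secure-login/verify"))
```

The speed/accuracy trade-off in the list above falls out naturally: a scorer like this runs in microseconds per URL, while high-interaction analysis must boot and observe a whole sandboxed system.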