Defense Against the Dark Arts: Week 6

This week’s lectures focused on network security, threats, and defenses. The presenters were Geoffrey Cooper and Ram Venugopalan, both researchers at McAfee.


  • Robustness principle: Machines connected to a network should follow standards whenever sending, but should accept malformed messages as long as the meaning is clear. This lends itself to robustness because it encourages senders to only send proper messages, while allowing receivers to work with less-than-perfect messages.
  • Zero-day vulnerability: A vulnerability that is unknown to those who would want to prevent it, but may be known to someone who would want to exploit it. In concrete terms, it’s like accidentally (and unknowingly) leaving your house unlocked.
  • Honey net: Like a honey pot, but in the form of a network. Meant to attract nefarious actors and malware.
  • Quarantine: Isolating suspicious software, connections, or users.

Returning to the robustness principle, this definition by Jon Postel provides a good summary.

“TCP implementations should follow a general principle of robustness: be conservative in what you do, be liberal in what you accept from others.” - Jon Postel

The presenters point out that the robustness principle made the internet what it is today, but it also presents vulnerabilities since hosts on a network are encouraged to accept a lot of things, even if they don’t perfectly conform to standards.
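Postel’s principle can be sketched as two halves of a tiny, invented header protocol: a receiver that tolerates messy-but-unambiguous input, and a sender that only ever emits the canonical form. This is purely illustrative; the functions and format are not from the lecture.

```python
def parse_header(line):
    """Liberal receiver: accept messy header lines (odd casing, stray
    whitespace around the colon) as long as the meaning is clear."""
    name, sep, value = line.partition(":")
    if not sep or not name.strip():
        raise ValueError("meaning unclear: %r" % line)
    return name.strip().lower(), value.strip()

def emit_header(name, value):
    """Conservative sender: always emit the one canonical form."""
    return "%s: %s" % (name.strip().lower(), value.strip())

print(parse_header("  Content-LENGTH :  42 "))  # ('content-length', '42')
print(emit_header("Content-Length", "42"))      # content-length: 42
```

The asymmetry is the point: the receiver’s tolerance keeps interoperability working, but it is also exactly what an attacker probes with deliberately malformed input.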

Defining expected and unexpected behavior

If you know what your network is supposed to do, it’s possible to identify unexpected (and potentially malicious) behavior. A firewall can be a means of establishing expected behavior and blocking unexpected behavior.

I’ve found that much of the design and explanation of computer security is related to military theory, and network security is no different. The presenters described a network security strategy centered on zones, defense in depth, and taking advantage of a demilitarized zone.

  • Network security zones: Separating a network into sections and applying different security rules to each. For example, a network might be more suspicious of email that originates from outside the organization.
  • Defense in depth: Setting up a variety of defenses in layers such that if a malicious actor makes it past one defense they will be caught by another. Interestingly, the presenter from a previous week (one, I believe) casually mentioned that some organizations are moving away from this strategy. I’m curious to know what they’re moving toward.
  • Demilitarized zone: The part of a network which faces the outside world. Part of a defense in depth strategy.
  • Firewall: Identifies known good traffic and rejects everything else.
  • Intrusion detection system: The opposite of a firewall. Instead of allowing known good traffic and blocking everything else, it identifies known bad traffic and blocks it.
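The firewall/IDS contrast above amounts to two opposite filtering policies: an allow list and a deny list. Here is a minimal sketch of that idea; the packet format, ports, and addresses are invented for illustration and bear no resemblance to a real packet filter.

```python
ALLOWED_PORTS = {80, 443}             # firewall policy: known-good only
KNOWN_BAD_SOURCES = {"203.0.113.9"}   # IDS-style policy: known-bad only

def firewall_allows(packet):
    """Allow-list policy: accept only traffic to known-good ports;
    everything else is rejected by default."""
    return packet["dst_port"] in ALLOWED_PORTS

def ids_allows(packet):
    """Deny-list policy: reject only traffic matching a known-bad
    signature; everything else passes by default."""
    return packet["src"] not in KNOWN_BAD_SOURCES

pkt = {"src": "198.51.100.4", "dst_port": 8080}
print(firewall_allows(pkt))  # False: port 8080 isn't on the allow list
print(ids_allows(pkt))       # True: the source isn't on the deny list
```

The same packet gets opposite verdicts, which is why the two are complementary layers in a defense-in-depth setup rather than substitutes for one another.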

Types of attacks

  • Man in the middle: Wherein a malicious actor intercepts traffic between two parties, either to capture it for use elsewhere or to modify it and resend it.
  • Denial of service: Prevents a server or host from serving legitimate requests by flooding it with superfluous requests. These attacks have been expanded into distributed denial of service attacks in which many machines are used to constantly make requests of a particular server or host.

Network reconnaissance

The presenters covered the two types of network reconnaissance, active and passive.


Active reconnaissance involves doing things like looking for vulnerable ports on a machine or employing a tool like NMap. This means actively engaging with the machine or network, sending packets, etc.

One strategy the presenters spoke about was using a so-called “slow scan,” in which the probes are spread out randomly over a long period of time to make the scan more difficult to track and defend against.
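A slow scan like the one described can be sketched with Python’s standard library: probe TCP ports in random order, pausing a random interval between attempts so the probes don’t show up as one obvious burst in traffic logs. The host, port range, and delays below are placeholders; only ever scan machines you are authorized to scan.

```python
import random
import socket
import time

def slow_scan(host, ports, max_delay=5.0):
    """Probe ports in random order with random pauses between attempts,
    making the scan harder to correlate than a rapid sequential sweep."""
    open_ports = []
    shuffled = list(ports)
    random.shuffle(shuffled)                      # randomize the order...
    for port in shuffled:
        time.sleep(random.uniform(0, max_delay))  # ...and the timing
        try:
            with socket.create_connection((host, port), timeout=1):
                open_ports.append(port)           # TCP handshake succeeded
        except OSError:
            pass                                  # closed, filtered, or timed out
    return open_ports

# Example (against a host you own):
# print(slow_scan("127.0.0.1", range(8000, 8010), max_delay=0.1))
```

Nmap offers the same idea natively through its slower timing templates; this sketch just makes the mechanism explicit.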


Passive reconnaissance involves observing the packets that are in transit across a network without creating or sending packets. This style of reconnaissance would be more difficult to track and defend against.


The presenters covered several network analysis tools, including:

  • NMap: Maps a network by sending packets and observing responses.
  • Wireshark: A packet analyzer or sniffer. Allows the user to observe the packets that are traveling on a network or machine.
  • Ping: Allows a user to test the reachability of a host on a network.


The presenters didn’t go into too much detail regarding protecting against network-based attacks, but they did mention a few things that I thought were interesting.

Public key cryptography

I first became interested in public key cryptography after watching this video from Computerphile and a corresponding video from Numberphile. I won’t attempt an explanation here. Suffice it to say that this is a hugely important part of the modern internet.

Response rates

If a host is constantly making requests on a server in a way that may hinder serving requests from other users, one strategy is to slow down the response rate in an exponential fashion. In other words, the first request might be served immediately, the second request after two seconds, the third after four seconds, and so on.
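The exponential slowdown described above can be sketched as a per-request delay that starts at zero and doubles from there. The base delay and function shape are my own illustration of the idea, not something specified in the lecture.

```python
def response_delay(request_count, base=2.0):
    """Seconds to wait before serving a client's nth request:
    0 for the first, then base, 2*base, 4*base, ... doubling each time,
    so a flood of requests from one host gets progressively slower."""
    if request_count <= 1:
        return 0.0
    return base * 2 ** (request_count - 2)

for n in range(1, 5):
    print(n, response_delay(n))  # 1 0.0 / 2 2.0 / 3 4.0 / 4 8.0
```

Legitimate users making occasional requests barely notice the delay, while a host hammering the server quickly ends up waiting minutes between responses.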

False data

The basic idea here is to knowingly provide inaccurate data to a potential malicious threat in order to mitigate the threat.