Thursday, March 28, 2013

Plans Wrecked by Internet Drama

My plans for the day were wrecked by Internet Drama. A DDoS attack on Spamhaus made it to the New York Times. Various providers jumped into the discussion with Words of Marketing, etc. This is all fairly typical, with one proviso.

Toward the end of last year, I gave a series of brown-bag lunch talks in Portland. These people didn't have a huge budget, but they were great to work with. They paid drive time (I'm pretty busy, and not willing to write off the opportunity cost), gave me a whiteboard, and didn't limit the discussion to the announced topic.

I got to yack, and the discussions covered a lot of ground. Here are two points, both related to that client, who will remain nameless, for what should be obvious reasons.

You keep saying complexity is the enemy of security. Why?

Because I screwed up. I didn't really say that correctly. Complex is not the same thing as complicated. This is the third time this topic has come up. Think of this in terms of reliability engineering. If you have two components, both of which are 90% reliable, what is the reliability of the composite system?

0.9 x 0.9 = 0.81

And we are on our way to FAIL. This isn't limited to software. Read up on the Challenger disaster, and the systemic failure of internal NASA mechanisms to provide even remotely accurate risk analysis.
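
To make the compounding concrete, here is a quick Python sketch of the same arithmetic. The two-component line is the example above; the ten-component line is my own extrapolation, just to show how fast "pretty good" parts add up to a shaky system.

    # Series reliability: the chain only works if every component works,
    # so the individual reliabilities multiply -- and shrink fast.
    def series_reliability(reliabilities):
        total = 1.0
        for r in reliabilities:
            total *= r
        return total

    print(series_reliability([0.9, 0.9]))   # 0.81, the two-component case above
    print(series_reliability([0.9] * 10))   # ~0.35, ten 90% components in series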

What does this have to do with DNS?


One of the branches that those discussions took was about DNS, in the context of things I immediately look for when doing an audit or pen-test. I've done work for a couple of orgs that stuck a DNS server on the DMZ, and pointed everything, including internal desktops, at it.

This is not the best of all possible plans.
  • A publicly-accessible server controls your entire infrastructure.
  • You surrender the ability to mitigate a large percentage of targeted email attacks.
  • You surrender the ability to do important real-time threat analysis.
  • You enable distributed attacks against anyone.
  • You are probably a long way from being able to roll out DNSSEC, should that be in your plans.
If you have an Internet-facing DNS server, it should only provide authoritative resolution. If it isn't your domain, don't answer queries.
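
When I'm auditing, the quick way to spot the problem from the outside is to ask the server to recurse for a zone it does not own. Here is a minimal sketch using dnspython; the target address and test name are placeholders, and you should only point this at infrastructure you are authorized to test.

    # Rough open-resolver check: send a recursion-desired query for a name the
    # server is not authoritative for, and see whether it answers anyway.
    # Requires dnspython (pip install dnspython).
    import dns.flags
    import dns.message
    import dns.query

    def looks_like_open_resolver(server_ip, test_name="www.example.com."):
        query = dns.message.make_query(test_name, "A")
        query.flags |= dns.flags.RD                 # ask the server to recurse
        try:
            response = dns.query.udp(query, server_ip, timeout=3)
        except Exception:
            return False                            # no usable answer at all
        # Recursion-available flag plus an answer section means it resolved
        # someone else's name on our behalf.
        return bool(response.flags & dns.flags.RA) and len(response.answer) > 0

    print(looks_like_open_resolver("192.0.2.53"))   # placeholder; use the server under test

If that comes back True from outside your network, you are running an open resolver, and everything in the list above applies.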

It turns out that while this was covered in one of the brown-bag lunches, it was never fixed. It was going to go into the Q1 budget, but that didn't happen. Here we are at the end of Q1, and it bit them. An innocent mistake. Happens all the time.

That doesn't mean that there is no cost involved with what was essentially an unforced error. They have a capex they had forgotten about, and an opex that they had mostly paid for (my brown-bag talks) but didn't use. Now they will be paying me a bit more to set everything up, write a couple of scripts, document everything, train their people, and create a reporting system so that managers have some assurance that there is no recurrence.

This is probably going to triple their outlay, not including hardware costs. Another loss, which is hard to evaluate, is the opportunity cost of not having their own people create the solution.

It is obviously useful to have a third party provide a sanity check of your security posture; that is much of the value of an audit. But the training value of building a competent in-house security team is large, and it costs little to capture it.
