Why Site Filtering By DNS Fails

Filtering by DNS seems like a good idea when you first consider it. OpenDNS has a very nice setup for doing just this, and it is often recommended as a business tool for content filtering.

The concept is simple: turn a benign form of DNS “hijacking” against malicious sites – and other undesirable web sites (adult, gaming, sports, and the like). To use the DNS server this way, the client identifies itself to the server (pairing its IP address to a server-based account), and the server then answers the client’s DNS requests according to the filtering configured for that account.

For example, once the client has authenticated to the DNS server, it makes DNS requests as usual. When the server receives a request, it consults the filtering in place for the account and either returns the actual IP address or the IP address of a page announcing that the requested site is blocked.
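In code terms, the server-side decision is little more than a table lookup. The sketch below is a minimal illustration of the idea, not OpenDNS’s actual implementation: the account name, categories, hostnames, and addresses are all hypothetical placeholders.

```python
# A minimal sketch of the server-side decision, assuming a simple
# lookup table per account. Account names, categories, hostnames,
# and addresses are hypothetical; 203.0.113.1 is an RFC 5737
# documentation address standing in for the block page.
BLOCK_PAGE_IP = "203.0.113.1"

ACCOUNT_POLICY = {"acme-corp": {"adult", "gaming", "sports"}}
SITE_CATEGORY = {"casino.example": "gaming", "news.example": "news"}
REAL_ADDRESS = {"casino.example": "198.51.100.7",
                "news.example": "198.51.100.8"}

def filtered_answer(account: str, hostname: str) -> str:
    """Return the real address, or the block page's address if the
    account's policy filters this site's category."""
    category = SITE_CATEGORY.get(hostname)
    if category in ACCOUNT_POLICY.get(account, set()):
        return BLOCK_PAGE_IP
    return REAL_ADDRESS.get(hostname, "0.0.0.0")

print(filtered_answer("acme-corp", "casino.example"))  # -> block page IP
print(filtered_answer("acme-corp", "news.example"))    # -> real address
```

Note that the whole scheme hinges on the server knowing *which account* is asking – which is exactly where it falls apart below.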

Unfortunately, the problem is not in the implementation at the DNS server; the problem is in actually getting to the DNS server at all. Any DNS cache between the client and the filtering server subverts the filtering: when the cache makes its own upstream requests, the association with the account is broken, and the actual IP address is cached.

This means you cannot run a DNS cache on your local host to speed up your Internet access. The problem goes deeper than that, however: if your Internet provider uses a DNS cache – which it might, without your ever knowing – then the DNS filtering breaks.
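One way to see which resolver is actually answering is to query the filtering server and another resolver directly and compare the results. Here is a minimal sketch using the dnspython library; the blocked hostname is a hypothetical placeholder, while 208.67.222.222 (OpenDNS) and 8.8.8.8 (Google Public DNS) are real resolver addresses:

```python
import dns.resolver

def lookup(hostname: str, nameserver: str) -> list[str]:
    """Ask one specific nameserver for hostname's A records."""
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [nameserver]
    return [rr.address for rr in resolver.resolve(hostname, "A")]

# 208.67.222.222 is OpenDNS; 8.8.8.8 (Google Public DNS) stands in
# for any cache or resolver that is not tied to your account.
# "blocked.example" is a hypothetical filtered hostname.
filtered = lookup("blocked.example", "208.67.222.222")
unfiltered = lookup("blocked.example", "8.8.8.8")

# If the two answers differ, the second path has sidestepped the filter.
print(filtered, unfiltered)
```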

Another problem has to do with raw IP addresses. If the user can learn a site’s actual IP address and browse straight to it, the DNS server is never consulted and the filtering again breaks down.
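A minimal sketch of the bypass: connect to the address directly and send the HTTP request by hand. The IP here is an RFC 5737 documentation address standing in for the blocked site’s real one, and the hostname is a hypothetical placeholder.

```python
import socket

# Connecting straight to an IP address involves no DNS lookup, so a
# DNS-based filter never sees the request. 203.0.113.10 is an RFC 5737
# documentation address standing in for the blocked site's real IP;
# "blocked.example" is a hypothetical hostname.
sock = socket.create_connection(("203.0.113.10", 80), timeout=5)
sock.sendall(b"GET / HTTP/1.1\r\n"
             b"Host: blocked.example\r\n"
             b"Connection: close\r\n\r\n")
print(sock.recv(4096).decode("ascii", errors="replace"))
sock.close()
```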

There is also the problem of proxies. A proxy receives the full URL – hostname included – and makes the DNS request on its own, bypassing any DNS-based content filtering that may be in place on the client.
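A short sketch of why, using the Python requests library; the proxy address and blocked hostname are hypothetical placeholders:

```python
import requests

# The client hands the complete URL to the proxy; it is the proxy, not
# the client, that resolves the hostname -- and it uses its own
# resolver, not the filtered one configured on this machine.
# Both the proxy address and the hostname are hypothetical.
proxies = {"http": "http://open-proxy.example.net:3128"}
resp = requests.get("http://blocked.example/", proxies=proxies, timeout=10)
print(resp.status_code)
```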

And then there is the Google cache. If a person selects the cached version of a page in Google’s results (rather than the direct link), the page is served from Google’s servers and can be seen rather than blocked.

The only reasonable way to perform content filtering is to run your own local proxy – such as Privoxy, or Squid with SquidGuard – though even this will not stop the Google cache and perhaps other methods. It will at least be immune to most of the problems listed here. Privoxy is a good fit for personal use; Squid is better suited to enterprise implementations.
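As a rough idea of the enterprise approach, here is a minimal squid.conf fragment that denies a list of domains. The ACL name and file path are hypothetical, and a real deployment would typically add SquidGuard on top for maintained category lists:

```
# Hypothetical squid.conf fragment: deny every domain listed in the
# file (one domain per line, e.g. ".casino.example") and allow the
# rest of the local network through. "localnet" is the ACL defined
# in Squid's stock configuration.
acl blocked_domains dstdomain "/etc/squid/blocked_domains.txt"
http_access deny blocked_domains
http_access allow localnet
http_access deny all
```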

Using a local proxy is more resource-intensive (in both processing power and administration) but this may be necessary to keep reasonable order in the workplace.

