I work for a school and we need to filter web traffic and block offending sites.
Blocking URLs and filtering unencrypted HTTP traffic doesn’t do the job anymore since more and more sites are using HTTPS – which is a good thing I strongly support and encourage.
Example? Google uses HTTPS, YouTube uses HTTPS. We can’t block YouTube completely since there is content that might prove useful for learning purposes. So YouTube needs to pass. But we need to be able to filter porn, racism and the like in the video descriptions.
So now we have a problem: the government says we need to protect our children from inappropriate content, and I don’t want them to waste my bandwidth on porn/music either. But depending on how you look at it, decrypting encrypted traffic is also a bit… meh.
So, ethics aside, we need to filter HTTPS traffic by performing a man-in-the-middle attack.
I’m using e2guardian for this, a fork of the now-defunct DansGuardian, which I used previously.
e2guardian, or E2G for short, brings SSLMITM out of the box. The problem: It doesn’t work. At least on Debian.
2016.10.5 9:56:32 - 10.4.6.87 https://www.google.de:443 *DENIED* Certificate supplied by server was not valid: unable to get local issuer certificate CONNECT 0
e2guardian uses openssl directly to grab the certificate presented by the target host. By default openssl trusts no one and therefore throws an error because it can’t verify the integrity of the target host.
openssl s_client -connect google.com:443
CONNECTED(00000003)
depth=2 C = US, O = GeoTrust Inc., CN = GeoTrust Global CA
verify error:num=20:unable to get local issuer certificate
verify return:0
...
    Start Time: 1407377002
    Timeout   : 300 (sec)
    Verify return code: 20 (unable to get local issuer certificate)
So we need to create a trusted CA bundle like the one shipped with Firefox.
Or we grab a readily extracted bundle instead.
Using this, things look a little bit better:
openssl s_client -connect google.com:443 -CAfile cacert.pem
depth=2 C = US, O = GeoTrust Inc., CN = GeoTrust Global CA
verify return:1
depth=1 C = US, O = Google Inc, CN = Google Internet Authority G2
verify return:1
depth=0 C = US, ST = California, L = Mountain View, O = Google Inc, CN = *.google.com
verify return:1
...
    Verify return code: 0 (ok)
Looking at e2guardian, things still look the same. Of course: E2G doesn’t know about our cacert.pem, and neither does the openssl it invokes.
Issuing the above command without specifying -CAfile leads to the same results as before.
The problem is that there is no option to specify the CA file in openssl.cnf. There are some options that look kinda promising, but they are only for signing certs.
There is an option in e2guardian.conf:
#sslcertificatepath = '/etc/e2guardian/trustedCAs/'
but I couldn’t get this to work.
So back to work on openssl. An strace shows that openssl s_client without the -CAfile option doesn’t care about openssl.cnf at all. But it checks a hardcoded folder:
open("/usr/lib/ssl/cert.pem", O_RDONLY) = -1 ENOENT (No such file or directory)
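The trace can be reproduced with something along these lines (assuming strace is installed; whether open or openat shows up depends on the libc):

```shell
# Trace file-open syscalls while s_client starts up and see which
# certificate locations get probed.
strace -f -e trace=open,openat openssl s_client -connect google.com:443 \
    </dev/null 2>&1 | grep -i cert
```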
Placing (and renaming) our cacert.pem at this location not only does the job for openssl: even E2G is working now.
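That hardcoded folder is just OPENSSLDIR, the directory baked into the openssl build at compile time; the build’s default can be checked directly:

```shell
# Print the compile-time directory openssl falls back to for
# cert.pem and the certs/ hash directory.
openssl version -d
```

On Debian this prints OPENSSLDIR: "/usr/lib/ssl", which is exactly why s_client probes /usr/lib/ssl/cert.pem when no -CAfile is given. As an alternative to dropping files there, openssl also honors the SSL_CERT_FILE and SSL_CERT_DIR environment variables.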
The next thing is to figure out how we’re going to deploy our own CA to the clients. For MS IE this is no prob thanks to GPOs, but we may want Firefox working too…
And I need to figure out whether I really want to be able to potentially read our users’ banking details, or whether I should implement some idiot-proof switch the teacher has to press (and must not forget to press!) to enable SSLMITM when a computer room is used by little children instead of semi-adults.
Edit: There was a ticket for openssl in 2014 regarding the issue.