XSLT Export Filter for Laurent Rodemerk Vokabeltrainer

Since documentation for creating OO-Calc XML export filters is sparse, I'd better list the following snippet here as well. It's an export filter for Laurent Rodemerk's Vokabeltrainer.
You create a spreadsheet with three columns (source language, description/grammar and target language), add the filter, put some info in the first row (see the XSLT script) and then you're able to export to the XML format used by the app.

This is far from perfect but does the job and might be the base for further improvements. Since I don't see much use in diving deeper into the depths of OO XML filters, I'll leave it as is for now.

Concept art

multiple problems here
  1. We can't keep truck and trailer as separate sprites. As one can see on the topmost truck, the trailer bleeds into the truck sprite. There is no way to properly sort this out with iso sprites, so we would need to render a sprite for each truck-and-trailer combo in each direction – too much work, but doable.
  2. The usual z-order issue, as you can see on the truck below. The truck is a 3×1 sprite; z-order says render it last, but obviously it's the two boxes that should be rendered last – need to figure this out.
  3. We're doing all this for some fancy graphics, for some action on the screen, while the actual game content is more about using spreadsheets. So the trucks need to drive and take turns – and adding a pivot point between truck and trailer is nearly impossible without rendering each animation frame separately when a truck takes a turn.
  4. I can't do pixel art, so I'm modeling in 3D, UV-wrapping, rendering to an iso sprite and fighting with all the issues I get from using an iso engine. Maybe it would be better to go full 3D instead, just load the models and slam a proper iso cam onto the scene? Cars could turn, no z-ordering, attachable trailers…
  5. Of course there's always the option to make the graphics super tiny, so that we can fit a whole truck and trailer in one single tile, or to somehow avoid displaying the trucks at all…


web filtering with SSL Man-In-The-Middle

I work for a school and we need to filter web traffic and block offending sites.

Blocking URLs and filtering unencrypted http traffic doesn't do the job anymore since more and more sites are using https – which is a good thing I strongly support and encourage.
Example? Google uses https, YouTube uses https. We can't block YouTube completely since there is content which might prove useful for learning purposes. So YouTube needs to pass. But we need to be able to filter porn, racism and the like in the video descriptions.

So now we have a problem: the government says we need to protect our children from inappropriate content, and I don't want them to waste my bandwidth with porn/music either. But depending on how you look at it, decrypting encrypted traffic is also a bit… meh.

So, ethics aside, we need to filter https traffic by utilizing a man-in-the-middle attack.
I'm using e2guardian for this, which is a fork of the now defunct dansguardian that I used previously.

e2guardian, or E2G for short, brings SSLMITM out of the box. The problem: It doesn’t work. At least on Debian.

2016.10.5 9:56:32 - https://www.google.de:443 *DENIED* Certificate supplied by server was not valid: unable to get local issuer certificate CONNECT 0

e2guardian uses openssl directly to grab the certificate presented by the target host. By default openssl trusts no one and therefore throws an error that it can't verify the integrity of the target host.

 openssl s_client -connect google.com:443
 CONNECTED(00000003)
 depth=2 C = US, O = GeoTrust Inc., CN = GeoTrust Global CA
 verify error:num=20:unable to get local issuer certificate
 verify return:0
 Start Time: 1407377002
 Timeout : 300 (sec)
 Verify return code: 20 (unable to get local issuer certificate)

So we need to create a trusted CA bundle like the one shipped with Firefox.
Or we grab a readily extracted bundle.

Using this, things look a little bit better:

openssl s_client -connect google.com:443 -CAfile cacert.pem | grep Verify
depth=2 C = US, O = GeoTrust Inc., CN = GeoTrust Global CA
verify return:1
depth=1 C = US, O = Google Inc, CN = Google Internet Authority G2
verify return:1
depth=0 C = US, ST = California, L = Mountain View, O = Google Inc, CN = *.google.com
verify return:1
 Verify return code: 0 (ok)

Looking at e2guardian, things still look the same. Of course, because E2G doesn't know about our cacert.pem, and neither does openssl.
Issuing the above command without specifying -CAfile leads to the same results as before.
The problem is that there is no option to specify the CAfile in openssl.cnf. There are some options that look kind of promising, but they are only for signing certs.
There is an option in e2guardian.conf:

#sslcertificatepath = '/etc/e2guardian/trustedCAs/'

but I couldn’t get this to work.

So back to work on openssl. An strace shows that openssl s_client without the -CAfile option doesn't care about openssl.cnf at all. But it checks a hardcoded path:

open("/usr/lib/ssl/cert.pem", O_RDONLY) = -1 ENOENT (No such file or directory)

Placing (and renaming) our cacert.pem at this location not only does the job for openssl; even E2G is working now.
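Instead of strace, openssl can also tell you directly where your build expects that file; a sketch (the cp needs root, and assumes the bundle sits in the current directory):

```shell
# Print the compiled-in OPENSSLDIR; s_client falls back to
# $OPENSSLDIR/cert.pem when no -CAfile is given.
ssl_dir=$(openssl version -d | sed -n 's/^OPENSSLDIR: "\(.*\)"$/\1/p')
echo "$ssl_dir"
# install the bundle there (needs write access, i.e. root)
if [ -f cacert.pem ]; then
    cp cacert.pem "$ssl_dir/cert.pem"
fi
```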

The next thing is to figure out how we're going to deploy our own CA to the clients. In MS IE this is no problem thanks to GPOs, but we want Firefox working as well…
And I need to figure out whether I really want to be able to potentially read our users' banking details, or whether I implement some idiot-proof switch the teacher has to press (and must not forget to press!) to enable SSLMITM when using a computer room with lil' children instead of some semi-adults.

Edit: There was a ticket for openssl in 2014 regarding the issue.

Back in action!

well… kinda.
I dug up an old backup from 2014 and just dumped it onto the server. Updated all necessary stuff and here we are.
I’ll see what I’ll do with this site. Some recent activities included 3D printing, arduino and some android development.

Some posts are a little borked and stuff went missing over the years, but I don't have time to fix this yet.

Replacing strings inside a specific filetype

Recently I made a spelling mistake (due to lack of knowledge) in a PHP project. By the time I realized it, I had around 40 files with the typo in various places like table and function names, comments and so on.

Searching for the affected files is pretty easy:

find . -type f -name "*.php" | xargs -l10 grep "tyrpo"

To replace the typo, sed comes to the rescue as usual:

find . -type f -name "*.php" -exec sed -i 's/tyrpo/typo/g' {} \;
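A slightly safer variant keeps a backup of every touched file; a self-contained demo on a scratch directory:

```shell
# Demo on a throwaway tree: -i.bak leaves a .bak copy of each edited
# file, and the /g flag replaces every occurrence on a line.
dir=$(mktemp -d)
printf '<?php // tyrpo here and another tyrpo\n' > "$dir/a.php"
find "$dir" -type f -name "*.php" -exec sed -i.bak 's/tyrpo/typo/g' {} \;
cat "$dir/a.php"        # both occurrences fixed
cat "$dir/a.php.bak"    # untouched original kept as backup
```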