19,498 yrs later...
2016/09/23
2016/09/10
MzE1NjE5NTI0ODYyfjIxMTAxNjI4NzMxNn5VMkZzZEdWa1gxOHorQVFYL2NQRFkwUVgyRm82anNJRGJHOW0vV1ZjdVEvVUR0cjZRZ0t3Y21WYWdCalo0dEZW
The title of this post is encrypted.
This page is also encrypted in transit via TLS (aka the new name for SSL).
Anyone sniffing traffic on the wire must first decrypt the TLS traffic and then decrypt the content to work out what the message says.
But why bother with two layers of encryption?
Ok, so forgive the fact that this page is publicly accessible and TLS is decrypted before your eyes. It's possibly a poor example and in any case I'd like to talk about the server side of this traffic.
In many organisations, TLS is considered sufficient to secure data in transit. The problem is that TLS typically terminates on a load-balancer or web-server, and the traffic is forwarded from there to other downstream servers. Once this initial decryption takes place, data often flows over the organisation's internal network in plain text. Many organisations consider this acceptable practice since the internal network is locked down with firewalls, intrusion-detection devices and so on. Some even think it's good practice because it lets them monitor internal traffic more easily.
However, there is an obvious concern over insider attacks, with system admins or disgruntled employees well placed to skim off the data easily (and clean up any traces after themselves). Additionally, requests are often logged (think access logs and other server logs), and these can record some of the data submitted. Such data-exhaust is often available in volume to internal employees.
It's possible to re-wrap traffic between each node to avoid network sniffing, but this doesn't help with data-exhaust, and the constant unwrap-and-re-wrap becomes increasingly expensive, if not in CPU and I/O then in the effort to manage all the necessary certificates. Still, if you're concerned then do this, or terminate TLS on the application server.
But we can add another layer of encryption to programmatically protect sensitive data we're sending over the wire, in addition to TLS. Application components will need to decrypt this to use it, and when that happens the data will be in plain text in memory, but right now that's about as good as we can get.
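To make that concrete, here's a rough sketch (mine, not from the original post) of what this extra application-level layer might look like in Python using the cryptography library's Fernet recipe. The field names and the inline key are placeholder assumptions, and key management is glossed over entirely:

```python
# Sketch: encrypt a sensitive attribute before it goes on the wire, in addition to TLS.
# Assumes the 'cryptography' package; in reality the key would come from a key-management
# service, not be generated inline like this.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()
f = Fernet(key)

payload = {
    "user": "alice",
    # Only the sensitive field is encrypted; the rest stays readable for routing and logging.
    "card_number": f.encrypt(b"4111111111111111").decode(),
}
wire_message = json.dumps(payload)

# A downstream component decrypts only when it actually needs the value,
# so intermediate hops and access logs only ever see ciphertext.
received = json.loads(wire_message)
card_number = f.decrypt(received["card_number"].encode())
```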
The same applies to data at rest - in fact this is arguably far worse. You can't rely on full database encryption or file-system encryption. Once the machine is up and running, anyone with access to the database or server can easily get at the raw data in all its glory. These sorts of practices only really protect against devices being lifted out of your data-centre - in which case you've got bigger problems...
The safest thing here is to encrypt the attributes you're concerned about before you store them and decrypt them on retrieval. This practice causes all sorts of problems for searching, but then should you really be searching passwords or credit card details? PII (names, addresses and so on) is the main issue here, and careful thought about what really needs to be searched for, plus some constructive data-modelling, may be needed to make this workable. Trivial it is not, and compromises abound.
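One common workaround for the searching problem - my sketch, not something this post prescribes - is to store the ciphertext alongside a keyed hash (a "blind index") so exact-match lookups still work without ever storing the plain value. The trade-off is that a deterministic index leaks which rows share the same value, which is exactly the kind of compromise alluded to above:

```python
# Sketch: field-level encryption at rest plus an HMAC "blind index" for exact-match search.
# Keys are shown inline for brevity; real systems would pull them from a KMS/HSM.
import hmac
import hashlib
from cryptography.fernet import Fernet

enc_key = Fernet.generate_key()
index_key = b"a-separate-secret-used-only-for-indexing"

def protect(email: str) -> dict:
    """Return the columns to store: ciphertext plus a deterministic lookup token."""
    return {
        "email_cipher": Fernet(enc_key).encrypt(email.encode()).decode(),
        "email_index": hmac.new(index_key, email.lower().encode(), hashlib.sha256).hexdigest(),
    }

def lookup_token(email: str) -> str:
    """Compute the same token at query time: ... WHERE email_index = :token."""
    return hmac.new(index_key, email.lower().encode(), hashlib.sha256).hexdigest()

row = protect("alice@example.com")
assert row["email_index"] == lookup_token("alice@example.com")
```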
All this encryption creates headaches around certificate and key management, but such is life and this is just another issue we need to deal with. Be paranoid!
p.s. If you really want to know what the title says you can try the password over here.
2016/09/02
Channel 4 in France
Slight obsession, some would say, but I enjoy F1... not so much that I'm prepared to pay Sky whatever extortionate fee they've come up with today though, so I tend to watch only the highlights on C4. Nice coverage btw guys - shame to lose you next year.
Anyway, I have a VPN (OpenVPN) running off a Synology DiskStation to allow me to tunnel through home when I'm abroad. Works a treat... normally. Channel 4 does not.
Initially I thought it was DNS leakage giving away that name resolution was happening on French servers. You can see this by visiting www.dnsleaktest.com and running the "standard test". Even though I'm reported as being in the UK, all my DNS servers are in France... Hmm, I smell a fish...
Am I in the UK or France?
To work around this I set up a proxy server on the DiskStation, and the same test now reports UK DNS servers as everything goes through the proxy.
Definitely looks like I'm in the UK... But still no luck on C4...
Finally, I set the timezone I was in to the UK rather than France, and this seemed to do the trick. Note that you need to change the timezone on the laptop, not the time itself, or you'll have all sorts of trouble connecting securely to websites, including C4.
In the end, the proxy doesn't seem to be necessary, so they don't appear to be picking up on DNS resolution yet, though it's the sort of thing they could look at adding (that, and device geolocation using the HTML5 geolocation API, though there are numerous browser plugins that report fake locations).
Incidentally, BBC iPlayer works fine and does so without fiddling with timezone.
The net wasn't really designed to expose your physical location, and IP-to-location lookups such as MaxMind's are more of a workaround than a true identification of where you are. Using Tor as a more elaborate tunnel makes you appear to be all over the place as your IP address jumps around, and corporate proxies, especially for large organisations, can make you appear to be in all sorts of weird places. Makes you wonder... All these attempts to limit access based on an IP address to prop up digital rights management just don't work. It's all too easy to work around.
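For what it's worth, here's roughly what an IP-to-location check looks like with MaxMind's GeoIP2 data (my sketch; the database path and IP address are placeholders), which also shows why a VPN endpoint in the UK is enough to defeat it:

```python
# Sketch: how a broadcaster might map a client IP to a country/timezone with GeoIP2.
# Requires the 'geoip2' package and a GeoLite2/GeoIP2 City database file.
import geoip2.database

with geoip2.database.Reader("/path/to/GeoLite2-City.mmdb") as reader:
    resp = reader.city("203.0.113.7")   # placeholder/test address, not a real viewer
    print(resp.country.iso_code)        # e.g. "GB" or "FR" - all they really see is the
                                        # country of the VPN exit, not where you're sitting
    print(resp.location.time_zone)      # e.g. "Europe/London" - could be cross-checked
                                        # against the timezone the browser reports
```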
p.s. Turns out that whilst France doesn't have free-to-air F1 coverage, most places have some form of satellite TV via CanalSat or TNT which includes the German RTL channel. It'll do nothing to improve my French but at least I get to watch the race on the big screen...