I was recently given the bridge torch problem to solve – so I made this script to solve it in the general case (for small numbers of people).

Short description – 4 people in a cave, on one side of a narrow bridge, can only cross to the other side in groups of at most 2, and they need the torch to cross. There is only one torch. A group crosses at the speed of its slowest member. What’s the fastest they can all get out (one person can go back with the torch and pick up another)?

My algo could probably be optimised to O(N^2) – I’ll get round to it one of these days.
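For anyone who wants to tinker, here’s a minimal sketch of the brute-force approach in Python – a memoised search over which people are still on the starting side and where the torch is. It’s a simplified stand-in rather than my original script; the function name is arbitrary and the example uses the classic 1, 2, 5, 10 minute instance (answer: 17).

```python
from itertools import combinations
from functools import lru_cache

def fastest_crossing(times):
    """Minimum total time for the bridge-and-torch puzzle.

    times: tuple of individual crossing times, e.g. (1, 2, 5, 10).
    A state is (people still on the start side, whether the torch is there too).
    """
    everyone = frozenset(range(len(times)))

    @lru_cache(maxsize=None)
    def solve(start_side, torch_on_start):
        if not start_side:                  # everyone is across
            return 0
        best = float("inf")
        if torch_on_start:
            # Send a pair across (a lone forward trip is never needed for an
            # optimal schedule while two or more people remain).
            size = min(2, len(start_side))
            for group in combinations(start_side, size):
                cost = max(times[p] for p in group)
                best = min(best, cost + solve(start_side - frozenset(group), False))
        else:
            # Someone on the far side must bring the torch back.
            for p in everyone - start_side:
                best = min(best, times[p] + solve(start_side | {p}, True))
        return best

    return solve(everyone, True)

if __name__ == "__main__":
    print(fastest_crossing((1, 2, 5, 10)))  # classic instance – prints 17
```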

What garden weeds can teach us about system resilience

I’ve been wondering how garden weeds can be so hard to get rid of, considering they only grow at a rate of 1 cm a day. I realised that they apply lots of principles which are pretty useful in making a resilient system of any kind (including computer systems) – here are some cross-links:

1. When you pick the biggest and most obvious ones, the tiny ones you hadn’t noticed suddenly grow big in no time at all. I.e. they have a load-balanced cluster model with auto-scaling built in. When one machine goes down, the others jump in and take up the slack.
2. They stash away resources so that if you only pick the top and don’t uproot them, they can grow again, and again, and again. I.e. they are smartly over-provisioned so that in high-demand conditions they can recover. This might also tell us something about market economics – they must have a pretty tried-and-tested model of how many resources to stash away (invest) depending on the risk climate.
3. They metastasize – i.e. you can cut them up into little pieces but this will just make more of them grow. I.e. they are built in loosely coupled components which are able to operate and recover when the others stop working.
4. Even if you kill them all off, there are still the seeds – i.e. they are resilient to reboot and relaunch.
5. They dump seeds everywhere: i.e. they are opportunistic – ready to boot wherever there is an opportunity – e.g. in a crack in the concrete.

On SSL – why Crypto can’t fix trust for end-users

Cryptography is widely claimed to be useful in establishing trust in electronic assertions (such as an assertion that the holder of a certain public key is also the registered owner of a certain domain). The argument goes like this: a certain operation E on an assertion A, producing an artefact A' (a signature), is mathematically intractable (i.e. practically impossible) without knowledge of a certain secret S. Another piece of information, P, which is not secret but publicly available, can be used to check whether A' was produced from A using S, via the inverse operation D. A given P vouches for one secret and one secret only. Therefore, if you know P and you know the binding between P and the entity that owns the corresponding secret (call that binding B), then you can be sure that whoever produced A' knows S. In order to be sure of this binding, you can rely on a second binding which states that B is valid. By chaining these bindings, as long as you have one binding you are sure of (for example because someone you trust gave it to you), you can be sure of B and ultimately of the source of A.
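To make the E/D story concrete, here is a minimal sketch in Python using the third-party cryptography package (Ed25519 chosen purely as an example scheme – nothing in the argument depends on the particular algorithm):

```python
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# S: the secret; P: the public counterpart that anyone may hold
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# A: the assertion being vouched for
assertion = b"the holder of this key is the registered owner of example.com"

# E: producing the artefact A' is infeasible without knowing S
artefact = private_key.sign(assertion)

# D: anyone holding P can check that A' really was made with S
try:
    public_key.verify(artefact, assertion)
    print("verified: the artefact was produced with the secret behind P")
except InvalidSignature:
    print("verification failed")
```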

This is all fine and dandy, but the problem is with the dumb user. Ultimately the way this works is that your computer performs D and tells you whether the artefact really was produced with the secret behind P or not. No user is capable of checking whether D was performed correctly, or whether the computer just said “yeah, looks good to me” without actually doing D at all, so the user has to trust the computer and its software. Users, most of whom wouldn’t know a public key from a public toilet, are thus tasked with figuring out whether the computer was telling porkies or not. How can this be done? The options are limited to one: trust that whoever gave you the device neither cooked the machine deliberately nor exposed it to anyone else who could have cooked it instead. That is almost as hard as establishing, by means other than crypto, whether a document was really sent by a specific individual – for example by phoning them and listening to their voice telling you that they sent it.

Cyberwar

There’s a lot of stuff out there about cyberwar being unlikely and I’ve always thought the whole thing was basically scareware. I agree with Ian Brown that trying to use cyber to do typical war things like killing people and setting things on fire etc… seems unlikely. But I was at a conference on cyberwar last week and it got me thinking along a different track…

Killing and maiming usually aren’t the ultimate objective of a war; they are a means to gain political and economic control and to steal the assets of a country. You want their resources, you want their people to follow your ideology or crackpot religion, you want more room for your people to live, more food for them to eat, more fuel to run toys like cars and planes. Oh, and you want your people to have more babies than them.

You can do all those things without tanks, planes and flame throwers and it gets easier the more a country moves its assets and control structures into cyberspace. Just consider the following possibilities:

  1. Steal intellectual property from corporations and universities (we’re a knowledge economy, right?).
  2. Compromise border control systems (electronic passports, identity cards etc…)
  3. Falsify digital evidence (cover up your crimes)
  4. Steal money wholesale (e.g. from credit card readers)
  5. Devalue a currency
  6. Compromise a nation’s banking system (steal money, stop money transfers, inflate the currency)
  7. Compromise the integrity of a nation’s stock exchange so trading has to stop, or worse, people can’t even prove which shares they own.
  8. Shut down the power supply, telephones, internet and water system without lifting a finger.
  9. Organise flash political revolts and bring down the government of a country.
  10. Control media to gain grass-roots support for your own political control structure.

Some of these are less likely than others. Some of them are already happening. But they’re all possible with enough resources – e.g. by putting systematic backdoors in hardware (chips etc…) sold to another nation state, by planting insiders in software corporations, and so on. It would require a lot of resources, but here’s the key point – when does it become easier to achieve your ultimate objectives using cyber-methods than by going in with robot droids, Patriot missiles etc…? Traditional warfare is vastly expensive in terms of money, human lives and energy.

How to remove a friend in Facebook

Before you start, the question most people want answered is: when you unfriend someone on Facebook, do they get notified? The answer is no – you disappear from your former friend’s friend list, but they are not otherwise informed. You can still view their Facebook profile if it is public, but you have no other access.

1. Find their profile – by clicking on your list of all friends and then clicking the person’s name.

2. Right down at the bottom on the left hand side is a link – “Remove from Friends” – click it!

3. Do what you have to do to get rid of the “are you sure” thingy.

You’re done.

Extreme Tech Misconceptions

How many people do you know who (I have come across all these, some quite often):

1. Believe Google Earth/Street View could be used as a replacement for CCTV – e.g. to catch thieves.

2. Are completely unable to understand the idea of a location in a file system, despite repeated attempts to explain it – e.g. they cannot find a saved file unless it is in the default directory of a given application.

3. Never empty their camera cards or back up their contents.

4. Do not have the slightest idea what is meant by memory sizes – e.g. would not know the difference between a 4GB SD card and a 64MB SD card.

5. Think day and night is caused by the sun going round the earth.

6. Think that clutches wear while the pedal is completely pressed down.

Know any other extreme tech misconceptions? Please post them below.