suicidal and transforming numbers

People keep telling me that information wants to be free.  I get the point, but I get some of the problems with the concept too.  Hereʼs another one that just occurred to me:

Alan Turing pointed out that any computing machine can be described by a number which, when entered into an appropriate processing device (his universal machine), will be executed – and, since the number sits on the same tape the machine writes to, can rewrite itself.  This (the number, rather than the device) is a Turing Machine.

Amongst the consequences of this is that there exists a class of numbers which will not merely rewrite themselves slightly, but completely erase themselves (again, if entered into the appropriate device).
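
To make the idea concrete, here is a minimal sketch of such a device – a toy stored-program machine invented for this post, not Turing's actual formalism.  The tape holds both the program and its data, so a number can rewrite (or wipe) its own encoding:

```python
# Toy stored-program machine: the tape holds the program and its data,
# so a program can overwrite its own encoding.  (Illustrative only; the
# WIPE opcode compresses a zeroing loop into one step for brevity.)

def run(tape, steps=1000):
    pc = 0
    for _ in range(steps):
        op = tape[pc]
        if op == 0:                        # HALT
            break
        elif op == 1:                      # WRITE a, v: set tape[a] = v
            a, v = tape[pc + 1], tape[pc + 2]
            tape[a] = v
            pc += 3
        elif op == 2:                      # WIPE: zero the whole tape, itself included
            for i in range(len(tape)):
                tape[i] = 0
            pc = 0                         # tape[0] is now 0, so the next step halts
    return tape

# A self-rewriting number: the first WRITE patches the value field of
# the second WRITE, so the program executes code it wrote itself.
transformer = [1, 5, 42,    # WRITE tape[5] = 42  (rewrites the instruction below)
               1, 8, 0,     # runs as WRITE tape[8] = 42 after the patch
               0,           # HALT
               0, 0]        # data cells
print(run(transformer))     # the tape no longer matches the number entered

# A suicidal number: entered into the same device, it erases itself.
suicidal = [2, 7, 7, 7]
print(run(suicidal))        # [0, 0, 0, 0]
```

Which device counts as "appropriate" is doing all the work here, of course – the same lists of integers do nothing of the sort on any other interpreter.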

Not merely erase themselves in the usual sense – we normally understand erasure as writing an arbitrarily long sequence of zeroes – but overwrite themselves pseudo-randomly (assuming no external input of genuinely random numbers is used as part of the device), until no retrieval technique can realistically recover the original Turing Machine from the storage medium.  We could of course argue about the implications of incomplete erasure or incomplete entropy (entropy being a more real form of erasure than the arguably meaningful long run of zeroes), but itʼs not what Iʼm getting at.
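
The stronger kind of erasure can be sketched in a few lines – a toy illustration, with the function name, buffer contents, and seed all invented for the example, and Pythonʼs (non-cryptographic) random module standing in for whatever pseudo-random source the device might use:

```python
import random

def pseudo_random_wipe(storage: bytearray, seed: int) -> None:
    """Overwrite every cell of the medium with deterministic
    pseudo-random bytes (no genuinely random input, as in the text)."""
    rng = random.Random(seed)
    for i in range(len(storage)):
        storage[i] = rng.randrange(256)

# The medium initially holds some Turing Machine's description number.
medium = bytearray(b"description number of some machine")
pseudo_random_wipe(medium, seed=1234)
# Reading the medium now yields only the pseudo-random stream; no
# technique that examines the storage alone can recover the original.
```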

I propose that there exist numbers – or other types of information representable as numbers – which far from wishing to be free, wish to cease to exist.  (For any common value of “wish”.)  Suicidal numbers, you might say.

In between these extremes, there are also numbers for which the result will be massive self-modification without actual erasure.  They wish to change out of all recognition: for some the outcome is growth in size, for others reduction, algorithmic irreversibility, or other conceptually distinct outcomes.  And there will be degrees of alteration; some information only wants to be a little bit free, other information only wants to be slightly transformed.

Of course, there is still the issue of the appropriate device.  Code can be run on different devices to different ends, some of them intended, most of them (across the range of possible devices) not.  In most cases, processing will simply fail to proceed.  This means that a number which is suicidal on a group of “appropriate” devices is not suicidal on an infinitely greater number of possible devices.  It has no other function on most of them either.  This does not, I think, invalidate the point.  Information which actually wanted to be free on a range of appropriate devices would equally fail to free itself on the inappropriate ones.

There may actually be fewer circumstances in which this freedom could be achieved than those in which a number could erase itself.  A minimal understanding of freedom would be that information must pass from an initial device to another on a network.  A number can erase itself on a single device, but it takes at least two devices to form a network; so there must necessarily be at least twice as many possible devices on which a number could erase itself as there are pairs of devices between which a number could make itself free.
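
The counting can be made concrete in a toy model – the helper names, and the assumption that the freedom-granting links are disjoint, are mine rather than anything established above:

```python
# Toy count: every single device is a site of possible self-erasure,
# while "freedom" needs a link between two distinct devices, so disjoint
# links can never outnumber half the devices.

def erasure_sites(devices):
    return len(devices)            # one erasure opportunity per device

def max_disjoint_links(devices):
    return len(devices) // 2       # each link consumes two devices

for n in [2, 5, 100]:
    devices = list(range(n))
    assert erasure_sites(devices) >= 2 * max_disjoint_links(devices)
```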

Hence in summary: Information is more than twice as likely to be suicidal as to want to be free.

(Actually it may not be quite as sad as that, as the number of possible network connections could affect the argument – but then, each network connection also allows for new ways for numbers to erase themselves remotely, and even to go around erasing or randomly transforming each other.  Which is the main issue with network security.  The ratio also assumes that, per device, there are as many ways for information to constitute self-erasing code as to constitute freedom-seeking code; this may be indeterminable.)