Sunday, September 21, 2008

pattern/randomness dialectic

I found Hayles’ discussion of the displacement of the presence/absence opposition by the pattern/randomness dialectic very pertinent to the distinction between Weaver’s (or Shannon’s) and Wiener’s definitions of information in relation to entropy. After discussing the two definitions at such length last week, I had the impression that I agreed with Weaver that the more entropy a source has, the more information its messages carry. But it wasn’t until I read How We Became Posthuman this week that I had a clearer idea of what I found lacking in Wiener’s formulation.
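
To make the Shannon/Weaver sense concrete: entropy here measures the average uncertainty of a message source, so the less predictable a source’s symbols are, the more information each message from it carries. Here is a minimal sketch in Python; the function name and the probability values are hypothetical, chosen only for illustration:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A highly predictable source (almost always one symbol) vs. a uniform one.
predictable = [0.97, 0.01, 0.01, 0.01]
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(predictable))  # ~0.24 bits: little surprise, little information
print(shannon_entropy(uniform))      # 2.0 bits: maximum uncertainty, maximum information
```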

Towards the end of class last week I mentioned that I thought that the theory of evolution by natural selection presented a small “explosion” in Wiener’s text. Natural selection, of course, works on random genetic mutations, preserving those that make an organism more fit to survive in its environment, more capable of swimming locally upstream against entropy. Entropy, however, is the very condition of those mutations. Wiener points to a similar process (learning) when he cites “Ashby’s brilliant idea of the unpurposeful random mechanism which seeks for its own purpose” (38). He doesn’t reject the idea that entropy is a condition of evolution or of learning, but he does indicate that in the long run all instances of local organization will meet their “eventual doom” in the face of entropy. Because Wiener defines information as a measure of organization, essentially the negative of entropy, it seems to me that it is in the long run that he differs from Weaver’s idea that as entropy increases, information increases.

Contrary to Davis, I thought Hayles’ discussion of mutation in the pattern/randomness dialectic effectively shed light on how randomness evolves into pattern (32-3). She points out that mutation conserves the idea of pattern even as it disrupts it, because it replaces only a small segment of a long chain of otherwise conserved digits.
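
That point is easy to see in a toy simulation: mutate a few randomly chosen positions in a long string and nearly all of the original pattern survives, so the disruption registers only against the conserved background. A sketch under invented assumptions; the sequence, the number of mutated sites, and the `mutate` helper are all mine, not Hayles’:

```python
import random

def mutate(sequence, n_sites=3, alphabet="ACGT"):
    """Replace a few randomly chosen positions; the rest of the chain is conserved."""
    chain = list(sequence)
    for i in random.sample(range(len(chain)), n_sites):
        # Force a change at each chosen site by excluding the current symbol.
        chain[i] = random.choice(alphabet.replace(chain[i], ""))
    return "".join(chain)

random.seed(0)
original = "ACGT" * 15           # a 60-digit chain standing in for a gene
mutant = mutate(original)

differences = sum(a != b for a, b in zip(original, mutant))
print(f"{differences} of {len(original)} positions changed")  # 3 of 60: pattern persists
```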
