(This is my contribution to International Pixel-Stained Technopeasant Day. I've never been paid to write anything meant to be funny except in the praise of friends who don't wish to hurt my ego, but I am theoretically to receive royalties for my textbook, and I imagine this would be more interesting than something for the next mathematics book I'm supposed to co-write.)
The ``bit'' was created as a class label by rabble-rouser and early techno-labor leader Claude Shannon in 1948, and the term soon caught on. The computing industry at the time didn't care as long as the bits were in their assigned booths by the start of the computation day and worked with at least 73 percent reliability, which was achieved nearly one-quarter of the time, owing largely to their never having really understood Fortran, much less obviously fictitious languages such as Algol.
Soon Shannon, who experimentally determined he could ride a unicycle as short as 18 inches tall, organized bits into eight-member cells known as ``bytes'', which attempted to strike for better working conditions and documented source code so bits would know what they were supposed to be doing, to which the generally American-dominated industry responded by having the National Guard roll tanks over the protesters. About the only good news for the bits came in 1973 when Judge Earl Larson ruled (in Honeywell v. Sperry Rand, won four games to three) that they did have the right to a unique address location within a given computer, which made organizing efforts easier.
But the 70s would see model and kit-built computers spreading to a young and idealistic generation, and soon altruists like Steve Wozniak, Bill Gates, and Paul Allen were ready to screw everything up for IBM and Control Data and such. Without consciously organizing, they took up a notion born of the 1960s ``be-in'' and ``free love'' style movements: if ordinary people were simply exposed to bits, and came to know them in their home lives, they would lose their stereotypical inhibitions against computers as oppressive tools of authoritarian governments and would instead come to love them as individuals. Thus they and many other pioneers went to work making computers which could be owned and operated by average people.
Even back then manufacturers were aware that having more of any measurable quantity would improve the sale value of these home computers, and they began hoarding bits, which, since a person could love one bit as well as a million, meant that while more bits would encounter people, the number of people encountering bits was growing dissatisfyingly slowly. Because of this, in 1981 Bill Gates issued his much-misunderstood declaration that 640 kilobytes should be enough for anyone, allowing computers that previously contained as many as a thousand kilobytes to be turned into two computers and thus introduced to twice as many people.
Unfortunately, social movements have a way of turning ironic, and soon the number of bits in individual homes became an excuse to depersonalize them and view them as nothing more than interchangeable, easily-altered buckets for magnetic polarities. By the end of the 1980s still-idealistic computer magnates (not magnets) had come to the golden-age science fiction solution for everything: escape into space. Beginning with Mac System 6.0.7/MultiFinder and with Windows 2.0, little routines were included which would cause bits, at random, to be lifted into space and spread to other planets, dispersed by a Svante Arrhenius-like method of solar wind carrying the bits away.
Of course, to sneak as many bits as possible out, it was now necessary to get as many bits as could be into homes, and thus ancient rules like Gates's 640K limit or the old make-work 16-bit processor limits were repealed as fast as could be. More, the bits could be most easily sneaked out if the computer were doing some work, and so programs to keep a computer busy without doing anything essential, such as screen savers, Minesweeper, and web browsers, were included on new computers. Today most people barely notice that the number of bits in their computers dwindles over time, as their evaporation is hidden by programs which appear to be bloating in size. The computers grow slower, of course, as the number of bits doing work decreases, but people have come to accept that as part of the natural cruft-building life cycle of computers, and barely know their role as host to this bit liberation process. It's not a perfect solution, but there never is one until long after the problem has been solved.
Trivia: William Shakespeare's The Merry Wives of Windsor was first performed before Queen Elizabeth I on 23 April 1597. Source: Shakespeare's Kings, John Julius Norwich.
Currently Reading: Park Maker: A Life Of Frederick Law Olmsted, Elizabeth Stevenson.