Friday, January 28, 2011

The Origins of Artificial Intelligence in English Romantic Poetry

by Len Hart, The Existentialist Cowboy

Ada Byron, daughter of the English Romantic poet Lord Byron, is – arguably – the world's first computer programmer. She is otherwise known to history as Lady Lovelace. The case on her behalf is so strong that the Department of Defense developed a programming language in her honor: Ada. Later revisions of Ada were among the first standardized "Object-Oriented Languages," a family now dominated by C++, Java, and their kin. But that gets ahead of the story by more than a hundred years.

Ada Byron was born in 1815. Separated from Lord Byron, Lady Byron brought up her daughter fearful that Ada might suffer the Romanticist, "poetical" tendencies of her father. Her education, therefore, consisted of mathematics and science. Predictably, Ada's understanding of mathematics – though profound – was still shaped by simile, analogy, and metaphor.

It is not surprising that Ada was fascinated and intrigued when, in 1833, she encountered Charles Babbage's idea for a "calculating machine," about which Babbage offered a daring conjecture: a machine acting upon foresight.
Much of what we know of Babbage's Analytical Engine we have learned from Ada Byron. Inspired by the "universality" of Babbage's ideas, Ada proceeded to write notes on the engine that ran far longer than anything Babbage himself published about it. Her fascination with Babbage's "engine" is noteworthy for at least two outstanding developments.

It was Ada, inspired by Babbage's calculating machine, who first articulated, if not invented, the very concept of "software" –a set of instructions to be carried out by a "universal" machine – a machine capable of acting meaningfully upon those instructions. The obvious progeny of this concept is the multitude of software packages that now drive everything from desktops to mainframes.

Of even greater interest to physicists and cosmologists is that Ada's ideas and hopes for the 'computing machine' led inexorably to Claude Shannon's concept of information as the inverse of entropy – a western version of the yin and yang. Entropy is associated with the Second Law of Thermodynamics, a general principle constraining the 'direction' of heat transfer; in the vernacular: things run down. Chaos increases. Organization becomes disorganization and disorder. Hot things cool down in the absence of new infusions of energy. Eventually all movement ceases entirely. Some have called this the 'heat death' of the Universe – a final and eternal 'thermodynamic state' in which there no longer exists sufficient 'free energy' to sustain motion or life.
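Shannon's link between information and uncertainty can be made concrete. For a set of outcome probabilities, his entropy measures the average information, in bits, that one outcome conveys. A minimal sketch in modern Python (the function name is mine, not Shannon's):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon's H = sum of -p * log2(p): average information, in bits,
    carried by one outcome drawn from the given probability distribution."""
    return sum(-p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: one full bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A coin that always lands heads tells us nothing new: zero bits.
print(shannon_entropy([1.0, 0.0]))   # 0.0
```

The "things run down" of thermodynamics and the "surprise" of a message are two faces of the same quantity, which is why Shannon borrowed the word entropy.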

Shannon then spent 31 years at Bell Labs, starting in 1941. Among the many things Shannon worked on there, one great conceptual leap stands out. In 1948, Shannon published "A Mathematical Theory of Communication" in the Bell System Technical Journal; it was republished the following year as a book with an introduction by Warren Weaver. This surprisingly readable (for a technical paper) document is the basis for what we now call information theory--a field that has made all modern electronic communications possible, and could lead to astounding insights about the physical world and ourselves.

Names like Einstein, Heisenberg, or even Kurt Goedel are better known among the people who have led twentieth-century science into the limits of knowledge, but Shannon's information theory makes him at least as important as they were, if not as famous. Yet when he started out, his simple goal was just to find a way to clear up noisy telephone connections.

--Heroes of Cyberspace; Claude Shannon
Shannon wrote A Mathematical Theory of Communication for Bell Labs in 1948 – more than a hundred years after Ada wrote what is considered to be the world's first computer program: a plan, shared with Babbage, suggesting how his machine might calculate Bernoulli numbers.
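The computation Ada laid out can be sketched in a few lines today. This uses the standard recurrence for the Bernoulli numbers, not Ada's exact tabulation scheme, and the function name is mine:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n as exact fractions, using the recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-acc / (m + 1))  # solve the recurrence for B_m
    return B

# B_0 = 1, B_1 = -1/2, B_2 = 1/6, B_3 = 0, B_4 = -1/30, ...
```

Exact fractions matter here: the Bernoulli numbers are rationals, and rounding them to floats would quickly corrupt the recurrence.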

The second development began the debate about Artificial Intelligence. In his 1950 paper Computing Machinery and Intelligence, Alan Turing devoted several paragraphs to "Lady Lovelace's Objection" to the very concept of A.I. – a concept which Ada had discounted in her notes:
The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to [do]. It can follow analysis; but it has no power of anticipating any...truths. Its province is to assist us in making available what we are already acquainted with.
Some hundred years later, Turing ascribed to her a cautious "logical positivism":
It will be noticed that he [ D. R. Hartree ] does not assert that the machines in question had not got the property [ A.I.], but rather that the evidence available to Lady Lovelace did not encourage her to believe that they had it.
The point is not whether Lady Lovelace or Turing was correct with regard to A.I., but rather that it was Ada who foresaw the playing field and terrain, the scope of the debate. It was Turing, however, who defined 'artificial intelligence'. Turing said that we may consider a machine intelligent if, in a blind test, we cannot differentiate the computer's responses from those of a living, breathing person. By that standard, computers are already intelligent, perhaps conscious. IBM's 'Deep Blue' defeated chess champion Garry Kasparov, who charged that human beings had directed the machine. I often share that attitude toward my own computer's chess program; it actually seems to learn from its mistakes.

Ada understood that the meaning of a machine is what it does. Her contribution is the insight that this meaning may be shaped by what we now call 'programmers', who literally instruct the machine, providing it with a well-planned list of discrete tasks, a program – in other words: software.

Computer technologists speak of "state". Each state is particular, analogous in some way to a particular task that may be accomplished while in that "state." Before computers, machines may have processed information, but only in a crude way: the instructions were built in. Consider the lever – a simple machine that nevertheless may be said to have two "states", up or down. The meaning of either state is to be found in the work done by (or in) that state. As a general principle, the meaning of a given state is the utility it creates, the work that it does.
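The lever can be sketched as the simplest possible state machine. The state names and "work" labels below are illustrative, not drawn from any real device:

```python
# Each state maps to the work available while in it (illustrative labels).
WORK = {
    "down": "raise the load",
    "up": "lower the load",
}

class Lever:
    """A two-state machine: the meaning of a state is the work done in it."""
    def __init__(self):
        self.state = "down"

    def pull(self):
        work = WORK[self.state]  # the 'meaning' of the current state
        # Pulling toggles the lever to its only other state.
        self.state = "up" if self.state == "down" else "down"
        return work
```

The instructions here are "built in," as the paragraph above puts it: nothing about this machine can be reprogrammed, only exercised.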

In Ada's wake, early computers were mere assemblages of electrically controlled "levers" called "relays." The origin of binary languages may be found in relays, which are either 'on' or 'off'. Mechanical relays would be replaced by vacuum tubes and, later, by transistors. Simple circuits built from these switches were called "and-gates" and "or-gates"; gates cross-coupled into 'flip-flop' circuits could hold a state. Even now, the largest supercomputers, reduced to their smallest components, are capable of processing just two states: 0 and 1. But upon this basic alphabet, patterns of increasing complexity have grown exponentially. The computer has become an electronic loom in which each pattern represents a "state."
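How a two-letter alphabet grows into complex patterns can be shown by composing gates into an adder. This is a sketch in modern Python, not a description of any historical machine:

```python
# Primitive two-state circuits: every input and output is 0 or 1.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# XOR composed entirely from the primitives above.
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, carry_in):
    """Add three bits, returning (sum_bit, carry_out)."""
    s1 = XOR(a, b)
    return XOR(s1, carry_in), OR(AND(a, b), AND(s1, carry_in))

def add_bits(x, y):
    """Add two equal-length little-endian bit lists, carries rippling through."""
    out, carry = [], 0
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 3 (bits [1, 1, 0]) + 5 (bits [1, 0, 1]) -> 8 (bits [0, 0, 0, 1])
```

Everything here reduces to switches that are on or off; the arithmetic is a pattern woven from those two states, much as the essay's "electronic loom" suggests.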

That Ada glimpsed this future more than a century and a half ago is remarkable.
