FIRST EDITION OF RALPH HARTLEY'S FOUNDATIONAL PAPER ON INFORMATION THEORY -- work that was "the single most important prerequisite" for Shannon's theory of information (Wikipedia). Shannon acknowledged his debt to Hartley in the first paragraph of his landmark 1948 paper "A Mathematical Theory of Communication" -- the paper in which Shannon introduced a qualitative and quantitative model of communication that solved the problem of reproducing at any given point a message originating at another point. Hartley had a way of thinking philosophically about the transmission of information, a habit that led to his unconventional method of formulating the problem of communication. Hartley "regarded the sender of a message as equipped with a set of symbols (the letters of the alphabet for instance) from which he mentally selects symbol after symbol, thus generating a sequence of symbols. He observed that a chance event, such as the rolling of balls into pockets, might equally well generate such a sequence" (Pierce, An Introduction to Information Theory, 39). "Hartley distinguished between psychological and physical considerations -- that is, between meaning and information. The latter he defined as the number of possible messages independent of whether they are meaningful. He used this definition of information to give a logarithmic law for the transmission of information in discrete messages: H = K log sⁿ, where H is the amount of information, K is a constant, n is the number of symbols in the message, s is the size of the set of symbols, and therefore sⁿ is the number of possible symbolic sequences of the specified length n. Hartley had arrived at many of the most important ideas of the mathematical theory of communication: the difference between information and meaning, information as a physical quantity, the logarithmic rule for transmission of information, and the concept of noise as an impediment in the transmission of information" (Origins of Cyberspace 316).
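Hartley's logarithmic law is simple to apply: since H = K log sⁿ = K · n · log s, the amount of information grows linearly with message length. A minimal sketch follows; the function name and the choice of base 2 (i.e., K chosen so that H comes out in bits) are ours, as Hartley left the constant K and the base of the logarithm arbitrary.

```python
import math

def hartley_information(s: int, n: int, base: float = 2.0) -> float:
    """Hartley's measure H = K log(s**n) = n * log(s).

    s    -- size of the symbol set (e.g., 26 for the alphabet)
    n    -- number of symbols in the message
    base -- logarithm base; base 2 gives H in bits (K = 1)
    """
    return n * math.log(s, base)

# A 10-letter message drawn from the 26-letter alphabet:
h = hartley_information(26, 10)  # about 47 bits
```

With s = 2 the measure reduces to n, one bit per binary symbol, which is why base 2 became the natural convention in Shannon's later work.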
Hartley's research led him to formulate the law upon which Shannon built, "that the total amount of information that can be transmitted is proportional to the frequency range transmitted and the time of the transmission" (Wikipedia). Together with Shannon's work, the law became known as the Shannon-Hartley theorem. CONDITION & DETAILS: New York: American Telephone and Telegraph Company. Bell System Technical Journal 7, 1928, pp. 535-563. Illustrations throughout. The Hartley paper includes seven illustrations: two photographs and five figures. Full volume. Quarto. (9 x 6.25 inches; 225 x 156mm). Solidly and tightly bound in dark blue cloth; gilt-lettered at the spine. "P. Caporale" appears in small gilt in the lower right corner of the front board. Very slight scuffing around the edge tips; clean and bright inside and out. Very good condition.
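The bandwidth-time law quoted above became, with Shannon's treatment of noise, the Shannon-Hartley capacity formula C = B log₂(1 + S/N), bits per second as a function of bandwidth and signal-to-noise ratio. A rough illustrative sketch; the function name and the example channel figures are our own, not drawn from either paper.

```python
import math

def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity C = B * log2(1 + S/N), in bits per second.

    bandwidth_hz -- frequency range B of the channel, in hertz
    snr_linear   -- signal-to-noise ratio S/N as a plain ratio (not dB)
    """
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# A 3 kHz telephone-grade channel at 30 dB signal-to-noise (ratio 1000):
c = shannon_hartley_capacity(3000.0, 1000.0)  # roughly 30,000 bits per second
```

Doubling either the bandwidth or the transmission time doubles the total information conveyed, which is exactly the proportionality Hartley identified.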
1929. First edition. Fine. Relationship of Information to the Physical World Szilard, Leo (1898-1964). Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen. In Zeitschrift für Physik 53 (1929): 840-856. Whole volume. vii, 889pp. Text illustrations. 223 x 152 mm. Library buckram. Fine. Embossed library stamp of the Carnegie Institution of Washington, Mount Wilson Laboratory on the front free endpaper, library call number on spine. Boxed. First Edition of the founding document of information theory. In "Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen" [On the reduction of entropy in a thermodynamic system by the intervention of intelligent beings], Szilard described a theoretical model that served both as a heat engine and an information engine, establishing the relationship between information (manipulation and transmission of bits) and thermodynamics (manipulation and transfer of energy and entropy). He was one of the first to show that "Nature seems to talk in terms of information" (Seife, Decoding the Universe, p. 77). In his paper Szilard addressed the problem of "Maxwell's demon," a thought experiment posed by James Clerk Maxwell in his Theory of Heat (1871) as a challenge to the second law of thermodynamics. This law states that the entropy of an isolated system not in equilibrium will tend to increase over time, reaching its maximum level at equilibrium. Maxwell speculated that "if we conceive of a being whose faculties are so sharpened that he can follow every molecule in its course, such a being, whose attributes are as essentially finite as our own, would be able to do what is impossible to us. For we have seen that molecules in a vessel full of air at uniform temperature are moving with velocities by no means uniform, though the mean velocity of any great number of them, arbitrarily selected, is almost exactly uniform.
Now let us suppose that such a vessel is divided into two portions, A and B, by a division in which there is a small hole, and that a being, who can see the individual molecules, opens and closes this hole, so as to allow only the swifter molecules to pass from A to B, and only the slower molecules to pass from B to A. He will thus, without expenditure of work, raise the temperature of B and lower that of A, in contradiction to the second law of thermodynamics." Maxwell's demon exploits the random, statistical nature of matter in order to decrease entropy in a closed system without any expenditure of energy -- a state of affairs that is physically impossible. Recognizing the flaw in Maxwell's concept, Szilard countered the earlier physicist's challenge as follows: "Szilard realized that the act of measuring the position of the atom (or in the Maxwell case, the speed of an incoming atom) must, in some way, increase the entropy of the universe, counteracting the demon's reduction of the universe's entropy. When a demon performs a measurement, he is getting an answer to a question: Is the atom on the right side of the box or the left side of the box? Is the atom hot or cold? Should I open a shutter or not? So a measurement is an extraction of information from the particle. That information does not come for free. Something about that information -- either extracting it or processing it -- would increase the entropy of the universe. In fact, Szilard calculated that the "cost" of that information was a certain amount of useful energy -- more precisely, kT log 2 joules for every bit of information, where T is the temperature of the room that the demon is in and k is the same constant that Boltzmann used in his entropy equation" (Seife, Decoding the Universe [2007], pp. 78-79).
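In Seife's formula the logarithm is the natural one, so the cost of a bit is k · T · ln 2 joules, about 2.87 × 10⁻²¹ joules at room temperature (T ≈ 300 K). A quick check, using the modern exact SI value of Boltzmann's constant; the function name is our own.

```python
import math

BOLTZMANN_K = 1.380649e-23  # Boltzmann's constant, joules per kelvin (exact, 2019 SI)

def szilard_cost_per_bit(temperature_kelvin: float) -> float:
    """Minimum energy cost k * T * ln(2), in joules, of one bit of information."""
    return BOLTZMANN_K * temperature_kelvin * math.log(2)

# At room temperature (about 300 K), each bit costs roughly 2.87e-21 joules:
cost = szilard_cost_per_bit(300.0)
```

Tiny as this quantity is, it is strictly positive at any nonzero temperature, which is what rescues the second law from the demon.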
One of the most brilliant thinkers of the twentieth century, Szilard is best known for his work in nuclear physics: he conceived the idea of a nuclear chain reaction in 1933, filed a patent for a simple nuclear reactor in 1934, and collaborated with Fermi in the first demonstration of a chain reaction in 1942. In 1939 Szilard wrote a confidential letter to President Roosevelt outlining the possibility of nuclear weapons; this letter, co-signed by Einstein, led directly to the foundation of the Manhattan Project. Szilard worked on the Manhattan Project during the Second World War, but opposed the use of the atomic bomb as a weapon of destruction, instead advocating for a demonstration of the bomb's power in the hope that the mere threat of such a weapon would force Germany and Japan to surrender. Horrified by the devastation of Hiroshima and Nagasaki, Szilard turned from nuclear physics to biology after the war, and became an outspoken opponent of nuclear proliferation.