Urbana, [Illinois]: The University of Illinois Press, 1949. First Edition. 117 pages. 8vo. Original plum cloth with silver spine lettering. Lacks the original dust jacket; a touch of fading to the spine panel. Corner tips lightly bumped; previous owner's name (R. J. Solomonoff, March 1950) on front flyleaf. Inked notations throughout by Solomonoff. Near Fine. Cloth.
The first hardcover edition of Shannon's most important work. It reprints "A Mathematical Theory of Communication" by Claude Shannon, first published in the Bell System Technical Journal in 1948, with minor corrections and additional references. Also includes "Recent Contributions to the Mathematical Theory of Communication" by Warren Weaver of the Rockefeller Foundation, first published in this form (a condensed version appeared in Scientific American, July 1949). (Origins of Cyberspace 881)
"At a stroke [Shannon] transformed the understanding of the process of electronic communication, by providing it with a mathematics, a general set of theorems called 'information theory.' With lucid brilliance, Shannon wrote out the basic principles of the signaling of information. It was like Newton writing out the laws of motion for mechanics." (Shannon Collected Papers, p. xix)
With a nice association to R. J. Solomonoff (Ray Solomonoff). "Ray Solomonoff (July 25, 1926 – December 7, 2009) was the inventor of algorithmic probability, his General Theory of Inductive Inference (also known as Universal Inductive Inference), and was a founder of algorithmic information theory. He was an originator of the branch of artificial intelligence based on machine learning, prediction and probability. He circulated the first report on non-semantic machine learning in 1956....Solomonoff founded the theory of universal inductive inference, which is based on solid philosophical foundations and has its root in Kolmogorov complexity and algorithmic information theory. The theory uses algorithmic probability in a Bayesian framework. The universal prior is taken over the class of all computable measures; no hypothesis will have a zero probability. This enables Bayes' rule (of causation) to be used to predict the most likely next event in a series of events, and how likely it will be. Although he is best known for algorithmic probability and his general theory of inductive inference, he made many other important discoveries throughout his life, most of them directed toward his goal in artificial intelligence: to develop a machine that could solve hard problems using probabilistic methods." (Wikipedia)
See also Shannon, Collected Papers, #38.