Claude E. Shannon: Founder of Information Theory
12.04.2021 Graham P. Collins
Mathematics
Claude Shannon

Quantum information science is a young field, its underpinnings still being laid by a large number of researchers [see "Rules for a Complex Quantum World," by Michael A. Nielsen; Scientific American, November 2002]. Classical information science, by contrast, sprang forth about 50 years ago, from the work of one remarkable man: Claude E. Shannon. In a landmark paper written at Bell Labs in 1948, Shannon defined in mathematical terms what information is and how it can be transmitted in the face of noise. What had been viewed as quite distinct modes of communication (the telegraph, telephone, radio and television) were unified in a single framework.
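The central quantity of Shannon's 1948 paper is entropy, which measures the average information content of a source in bits. As an illustration (this sketch is not from the article itself), the standard formula H = -Σ p·log₂(p) can be computed directly:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits.

    Outcomes with zero probability contribute nothing, so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit of information.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))   # about 0.469 bits
```

The less predictable the source, the higher its entropy; this single number sets the limit on how far a message can be compressed, regardless of whether it travels by telegraph, telephone, or radio.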

Shannon was born in 1916 in Petoskey, Michigan, the son of a judge and a teacher. Among other inventive endeavors, as a youth he built a telegraph from his house to a friend's out of fencing wire. He graduated from the University of Michigan with degrees in electrical engineering and mathematics in 1936 and went to M.I.T., where he worked under computer pioneer Vannevar Bush on an analog computer called the differential analyzer.

Shannon's M.I.T. master's thesis in electrical engineering has been called the most important of the 20th century: in it the 22-year-old Shannon showed how the logical algebra of 19th-century mathematician George Boole could be implemented using electronic circuits of relays and switches. This most fundamental feature of digital computer design, the representation of "true" and "false" (or "1" and "0") as closed or open switches, and the use of electronic logic gates to make decisions and to carry out arithmetic, can be traced back to the insights in Shannon's thesis.
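The idea that logic gates can "carry out arithmetic" can be sketched in a few lines. The following example (an illustration, not taken from Shannon's thesis) builds a one-bit adder entirely from the Boolean operations AND, OR and XOR, just as relay circuits compose simple switches into arithmetic:

```python
# Elementary logic gates on one-bit values (0 or 1).
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two one-bit inputs: XOR gives the sum bit, AND gives the carry."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    """Chain two half adders to add two bits plus an incoming carry."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

# 1 + 1 = binary 10: sum bit 0, carry bit 1.
print(half_adder(1, 1))      # (0, 1)
# 1 + 1 + carry 1 = binary 11: sum bit 1, carry bit 1.
print(full_adder(1, 1, 1))   # (1, 1)
```

Chaining full adders, one per bit position, yields a circuit that adds numbers of any width; every digital computer's arithmetic unit rests on this correspondence between Boolean algebra and switching circuits.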

scientificamerican.com