Last week, George Dyson spoke at the library about his book “Turing’s Cathedral.” It gives a unique perspective on the development of electronic computers that took place just down the street at the Institute for Advanced Study in the 1940s and 1950s. At the time, a “computer” was a person who calculated complex equations and made charts and tables of the answers for people to look up. By the 1940s, the occupation was typically a female one. With massive government investment due to World War II, many businesses and universities around the country were looking for ways to assist the war effort with machines that could automatically process data and solve equations that no human could. The photo below shows a room of “computers” at work at NACA, the predecessor to NASA, in 1949.
Today it’s taken for granted that computers have a standard set of parts – a core processor to manipulate binary data, memory to store the data, a screen to interact with the user, and the ability to run a variety of software through coded instructions. Looking back to the 1940s, though, at all of the questions surrounding whether machines could replicate humans’ mental work, like math, you start to get a sense of awe for how hard this problem was to overcome. Faulty electronics, differing philosophies and beliefs, political machinations, personality clashes; drama was by no means in short supply! In reality, the theoretical foundations for digital computing were laid even earlier, in the early 1920s. Over the course of three decades, some of America’s and the world’s most brilliant minds struggled, debated, and ultimately resolved the questions which enabled the digital world we live in today.
The library has at least two other books I know of on the same topic. “The Dream Machine,” which I am currently reading, starts the story in the early 1920s, and is a fascinating read for anyone interested in the history of technology. “Dark Hero of the Information Age,” which I have not read yet, focuses on the life of Norbert Wiener, who certainly was involved in the intellectual underpinnings of digital computing, but who also appears to have been politically sidelined from the historical record until recently.