Friday, March 18, 2016

>> Free Ebook Turing's Cathedral: The Origins of the Digital Universe, by George Dyson

Start gathering the book Turing's Cathedral: The Origins of the Digital Universe, by George Dyson now. The modern way is to collect the soft file of the book, which can be saved on your computer or laptop, so it becomes more than just a book you own. The easiest approach is to keep the soft file on whatever device suits you, so that you find yourself reading it in your spare moments instead of chatting or gossiping. Far from being a bad habit, this will lead you to the better practice of reading.

How would it be if your day began with reading a book on your device? Nearly everyone reaches for a device on waking and throughout the morning, which is why we encourage you to read a book on it as well. If you are still unsure how to get the book onto your gadget, you can follow the steps here: we provide Turing's Cathedral: The Origins of the Digital Universe, by George Dyson on this site.

To get around the problem of carrying a thick printed volume, we now offer you the option to download the book instead. Reading Turing's Cathedral: The Origins of the Digital Universe, by George Dyson online, or downloading the soft file to read later, is one way to do it. You may not feel that reading an e-book will be useful to you, yet in many cases the successful are those with a reading habit, and a book like this belongs in it.

With the soft file of the book to read, you do not have to carry thick printed pages everywhere you go. Whenever you feel like reading Turing's Cathedral: The Origins of the Digital Universe, by George Dyson, you can open your device and read the e-book in soft-file form: simple and quick. The soft file also gives you an easy way to read, and it can be faster, because you can read the book wherever you want. This online edition can be a reference book that you enjoy at leisure.

Because the e-book Turing's Cathedral: The Origins of the Digital Universe, by George Dyson has such worthwhile benefits, many people are now developing the reading habit. Supported by modern technology, the book is not hard to obtain; even if it is not yet available in stores, you can look for it on this site. It will be easy for you to be among the first to read this book and reap its rewards.

Turing's Cathedral: The Origins of the Digital Universe, by George Dyson

“It is possible to invent a single machine which can be used to compute any computable sequence,” twenty-four-year-old Alan Turing announced in 1936. In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same.
 
Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars.
 
Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.
 
How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.

  • Sales Rank: #479613 in Books
  • Published on: 2012-03-06
  • Released on: 2012-03-06
  • Format: Deckle Edge
  • Original language: English
  • Number of items: 1
  • Dimensions: 9.50" h x 1.52" w x 6.63" l, 1.80 pounds
  • Binding: Hardcover
  • 432 pages

Review
“An expansive narrative . . . The book brims with unexpected detail. Maybe the bomb (or the specter of the machines) affected everyone. Gödel believed his food was poisoned and starved himself to death. Turing, persecuted for his homosexuality, actually did die of poisoning, perhaps by biting a cyanide-laced apple. Less well known is the tragic end of Klári von Neumann, a depressive Jewish socialite who became one of the world’s first machine-language programmers and enacted the grandest suicide of the lot, downing cocktails before walking into the Pacific surf in a black dress with fur cuffs. Dyson’s well made sentences are worthy of these operatic contradictions . . . A groundbreaking history of the Princeton computer.”
—William Poundstone, The New York Times Book Review

“Dyson combines his prodigious skills as a historian and writer with his privileged position within the [Institute for Advanced Study’s] history to present a vivid account of the digital computer project . . .  A powerful story of the ethical dimension of scientific research, a story whose lessons apply as much today in an era of expanded military R&D as they did in the ENIAC and MANIAC era . . . Dyson closes the book with three absolutely, hair-on-neck-standing-up inspiring chapters on the present and future, a bracing reminder of the distance we have come on some of the paths envisioned by von Neumann, Turing, et al.”
—Cory Doctorow, Boing Boing
 
“A fascinating combination of the technical and human stories behind the computing breakthroughs of the 1940s and ’50s . . . It demonstrates that the power of human thought often precedes determination and creativity in the birth of world-changing technology . . . An important work.”
—Richard DiDio, Philadelphia Inquirer
 
“Dyson’s book is not only learned, but brilliantly and surprisingly idiosyncratic and strange.”
—Josh Rothman, Brainiac blog, Boston Globe
 
“Beyond the importance of this book as a contribution to the history of science, as a generalist I was struck by Dyson’s eye and ear for the delightfully entertaining detail . . . Turing’s Cathedral is suffused . . . with moments of insight, quirk and hilarity rendering it more than just a great book about science. It’s a great book, period.”
—Douglas Bell, The Globe and Mail
 
“The greatest strength of Turing’s Cathedral lies in its luscious wealth of anecdotal details about von Neumann and his band of scientific geniuses at IAS.  Dyson himself is the son of Freeman Dyson, one of America’s greatest twentieth-century physicists and an IAS member from 1948 onward, and so Turing’s Cathedral is, in part, Dyson’s attempt to make both moral and intellectual sense of his father’s glittering and yet severely compromised scientific generation.”
—Andrew Keen, B&N Review

“A mesmerizing tale brilliantly told . . . . The use of wonderful quotes and pithy sketches of the brilliant cast of characters further enriches the text . . . . Meticulously researched and packed with not just technological details, but sociopolitical and cultural details as well—the definitive history of the computer.”
—Kirkus (starred review)
 
“The most powerful technology of the last century was not the atomic bomb, but software—and both were invented by the same folks. Even as they were inventing it, the original geniuses imagined almost everything software has become since. At long last, George Dyson delivers the untold story of software’s creation. It is an amazing tale brilliantly deciphered.”
—Kevin Kelly, cofounder of WIRED magazine, author of What Technology Wants
 
“It is a joy to read George Dyson’s revelation of the very human story of the invention of the electronic computer, which he tells with wit, authority, and insight. Read Turing’s Cathedral as both the origin story of our digital universe and as a perceptive glimpse into its future.”
—W. Daniel Hillis, inventor of The Connection Machine, author of The Pattern on the Stone

About the Author

George Dyson is a historian of technology whose interests include the development (and redevelopment) of the Aleut kayak (Baidarka), the evolution of digital computing and telecommunications (Darwin Among the Machines), and the exploration of space (Project Orion).

Excerpt. © Reprinted by permission. All rights reserved.

Preface
 
POINT SOURCE SOLUTION
 
I am thinking about something much more important than bombs. I am thinking about computers.
—John von Neumann, 1946
 
 
There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.
 
In late 1945, at the Institute for Advanced Study in Princeton, New Jersey, Hungarian American mathematician John von Neumann gathered a small group of engineers to begin designing, building, and programming an electronic digital computer, with five kilobytes of storage, whose attention could be switched in 24 microseconds from one memory location to the next. The entire digital universe can be traced directly to this 32-by-32-by-40-bit nucleus: less memory than is allocated to displaying a single icon on a computer screen today.
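Dyson's figures are easy to verify with a line of arithmetic. A minimal sketch (assuming the conventional 1 kilobyte = 1,024 bytes):

```python
# The storage matrix described above: 32 x 32 x 40 bits,
# i.e. 1,024 words of 40 bits each.
bits = 32 * 32 * 40          # total bits in the memory
words = 32 * 32              # 1,024 forty-bit words
kilobytes = bits / 8 / 1024  # bits -> bytes -> kilobytes

print(bits, words, kilobytes)  # 40960 1024 5.0
```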
 
Von Neumann’s project was the physical realization of Alan Turing’s Universal Machine, a theoretical construct invented in 1936. It was not the first computer. It was not even the second or third computer. It was, however, among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced. The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same.
 
Working outside the bounds of industry, breaking the rules of academia, and relying largely on the U.S. government for support, a dozen engineers in their twenties and thirties designed and built von Neumann’s computer for less than $1 million in under five years. “He was in the right place at the right time with the right connections with the right idea,” remembers Willis Ware, fourth to be hired to join the engineering team, “setting aside the hassle that will probably never be resolved as to whose ideas they really were.”
 
As World War II drew to a close, the scientists who had built the atomic bomb at Los Alamos wondered, “What’s next?” Some, including Richard Feynman, vowed never to have anything to do with nuclear weapons or military secrecy again. Others, including Edward Teller and John von Neumann, were eager to develop more advanced nuclear weapons, especially the “Super,” or hydrogen bomb. Just before dawn on the morning of July 16, 1945, the New Mexico desert was illuminated by an explosion “brighter than a thousand suns.” Eight and a half years later, an explosion one thousand times more powerful illuminated the skies over Bikini Atoll. The race to build the hydrogen bomb was accelerated by von Neumann’s desire to build a computer, and the push to build von Neumann’s computer was accelerated by the race to build a hydrogen bomb.
 
Computers were essential to the initiation of nuclear explosions, and to understanding what happens next. In “Point Source Solution,” a 1947 Los Alamos report on the shock waves produced by nuclear explosions, von Neumann explained that “for very violent explosions . . . it may be justified to treat the original, central, high pressure area as a point.” This approximated the physical reality of a nuclear explosion closely enough to enable some of the first useful predictions of weapons effects.
 
Numerical simulation of chain reactions within computers initiated a chain reaction among computers, with machines and codes proliferating as explosively as the phenomena they were designed to help us understand. It is no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. Only the collective intelligence of computers could save us from the destructive powers of the weapons they had allowed us to invent.
 
Turing’s model of universal computation was one-dimensional: a string of symbols encoded on a tape. Von Neumann’s implementation of Turing’s model was two-dimensional: the address matrix underlying all computers in use today. The landscape is now three-dimensional, yet the entire Internet can still be viewed as a common tape shared by a multitude of Turing’s Universal Machines.
 
Where does time fit in? Time in the digital universe and time in our universe are governed by entirely different clocks. In our universe, time is a continuum. In a digital universe, time (T) is a countable number of discrete, sequential steps. A digital universe is bounded at the beginning, when T = 0, and at the end, if T comes to a stop. Even in a perfectly deterministic universe, there is no consistent method to predict the ending in advance. To an observer in our universe, the digital universe appears to be speeding up. To an observer in the digital universe, our universe appears to be slowing down.
 
Universal codes and universal machines, introduced by Alan Turing in his “On Computable Numbers, with an Application to the Entscheidungsproblem” of 1936, have prospered to such an extent that Turing’s underlying interest in the “decision problem” is easily overlooked. In answering the Entscheidungsproblem, Turing proved that there is no systematic way to tell, by looking at a code, what that code will do. That’s what makes the digital universe so interesting, and that’s what brings us here.
 
It is impossible to predict where the digital universe is going, but it is possible to understand how it began. The origin of the first fully electronic random-access storage matrix, and the propagation of the codes that it engendered, is as close to a point source as any approximation can get.

Most helpful customer reviews

137 of 149 people found the following review helpful.
How it came from bit
By Ash Jogalekar
The physicist John Wheeler who was famous for his neologisms once remarked that the essence of the universe could be boiled down to the phrase "it from bit", signifying the creation of matter from information. This description encompasses the digital universe which now so completely pervades our existence. Many moments in history could lay claim as the creators of this universe, but as George Dyson marvelously documents in "Turing's Cathedral", the period between 1945 and 1957 at the Institute for Advanced Study (IAS) in Princeton is as good a candidate as any.

Dyson's book focuses on the pioneering development of computing during the decade after World War II and essentially centers on one man: John von Neumann. Von Neumann is one of the very few people in history to whom the label "genius" can authentically be applied. The sheer diversity of fields to which he made important contributions beggars belief: Wikipedia lists at least twenty, ranging from quantum mechanics to game theory to biology. Von Neumann's mind ranged across a staggeringly wide expanse of thought, from the purest of mathematics to the most applied nuclear weapons physics. The book recounts his and his team's pathbreaking efforts to build a novel computer at the IAS in the late 1940s. Today, when we are immersed in a sea of computer-generated information, it is easy to take the essential idea of a computer for granted. That idea was not the transistor or the integrated circuit or even the programming language, but the groundbreaking notion that you could have a machine where both data AND the instructions for manipulating that data could be stored in the same place, encoded in a common binary language. That was von Neumann's great insight, building on Alan Turing's abstract idea of a computing machine. The resulting concept of a stored program is at the foundation of every single computer in the world. The IAS computer practically validated this concept and breathed life into our modern digital universe. By present standards its computing power was vanishingly small, but the technological future it unleashed has been limitless.
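The stored-program insight described above can be illustrated with a toy machine. This is a hypothetical sketch, not the IAS instruction set: the point is simply that opcodes and data occupy the same memory array.

```python
# A toy stored-program machine: instructions and data share one memory list.
# Each instruction occupies two cells: (opcode, operand address).
def run(memory):
    acc, pc = 0, 0                  # accumulator and program counter
    while True:
        op, arg = memory[pc], memory[pc + 1]
        pc += 2
        if op == 0:                 # HALT: return the accumulator
            return acc
        elif op == 1:               # LOAD: acc = memory[arg]
            acc = memory[arg]
        elif op == 2:               # ADD: acc += memory[arg]
            acc += memory[arg]
        elif op == 3:               # STORE: memory[arg] = acc
            memory[arg] = acc

# Program: LOAD cell 8, ADD cell 9, STORE cell 10, HALT.
# Cells 8-10 hold data, in the very same array as the code.
mem = [1, 8, 2, 9, 3, 10, 0, 0, 20, 22, 0]
print(run(mem))  # 42, which is also written back into mem[10]
```

Because code and data share one address space, a program here could just as easily LOAD or STORE into its own instruction cells and rewrite itself, which is exactly what made the stored-program design so powerful.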

Dyson's book excels mainly in three ways. Firstly, it presents a lively history of the IAS, the brilliant minds who worked there and the culture of pure thought that often looked down on von Neumann's practical computational tinkering. Secondly, it discusses the provenance of von Neumann's ideas which partly arose from his need to perform complex calculations of the events occurring in a thermonuclear explosion. These top-secret calculations were quietly run at night on the IAS computer and in turn were used to tweak the computer's workings; as Dyson pithily puts it, "computers built bombs, and bombs built computers". Von Neumann also significantly contributed to the ENIAC computer project at the University of Pennsylvania. Thirdly, Dyson brings us evocative profiles of a variety of colorful and brilliant characters clustered around von Neumann who contributed to the intersection of computing with a constellation of key scientific fields that are now at the cutting edge.

There was the fascinating Stan Ulam, who came up with a novel method for calculating complex processes, the Monte Carlo technique, which is used in everything from economic analysis to biology. Ulam, one of the inventors of thermonuclear weapons, originally used the technique to calculate the multiplication of neutrons in a hydrogen bomb. Then there was Jule Charney, who set up some of the first weather pattern calculations, early forerunners of modern climate models. Charney was trying to implement von Neumann's grand dream of controlling the weather, but neither he nor von Neumann could anticipate chaos and the fundamental sensitivity of weather to tiny fluctuations. Dyson's book also pays due homage to an under-appreciated character, Nils Barricelli, who used the IAS computer to embark on a remarkable set of early experiments that sought to duplicate evolution and artificial life. In the process Barricelli discovered fascinating properties of code, including replication and parasitism, that mirrored some of the great discoveries taking place in molecular biology at the time. As Dyson tells us, there were clear parallels between biology and computing: both depended on sequences of code, although biology thrived on error-prone duplication (leading to variation) while computing actively sought to avoid it. Working on computing and thinking about biology, von Neumann anticipated the genesis of self-reproducing machines, which have fueled the imagination of both science fiction fans and leading researchers in nanotechnology.
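The Monte Carlo technique the reviewer mentions is easy to demonstrate with the textbook example of estimating pi by random sampling (a minimal sketch; Ulam's neutron-multiplication calculations were vastly more elaborate):

```python
import random

def estimate_pi(n_samples, seed=0):
    """Estimate pi from the fraction of random points in the unit square
    that land inside the quarter circle of radius 1."""
    rng = random.Random(seed)  # seeded so the estimate is reproducible
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # close to 3.14159; error shrinks like 1/sqrt(n)
```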

Finally, Dyson introduces us to the remarkable engineers who were at the heart of the computing projects. Foremost among them was Julian Bigelow, a versatile man who could both understand code and fix a car. Bigelow's indispensable role in building the IAS computer brings up an important point; while von Neumann may have represented the very pinnacle of abstract thought, his computer wouldn't have gotten off the ground had Bigelow and his group of bright engineers not gotten their hands dirty. Great credit also goes to the two lead engineers on the ENIAC project, J. Presper Eckert and John Mauchly, who were rather unfairly relegated to the shadows and sidetracked by history. Dyson rightly places as much emphasis on discussing the nitty-gritty of the engineering hurdles behind the IAS computer as he does on its lofty mathematical underpinnings. He makes it clear that the ascendancy of a revolutionary technology requires both novel theoretical ideas as well as fine craftsmanship. Unfortunately in this case, the craftsmanship was ultimately trampled by the institute's mathematicians and humanists, which only added to its reputation as a refuge for ivory tower intellectuals who considered themselves above pedestrian concerns like engineering. At the end of the computing project the institute passed a resolution which forbade any kind of experimentation from ever taking place; perhaps keeping in line with his son's future interest in the topic, Freeman Dyson (who once worked on a nuclear spaceship and genuinely appreciates engineering details) was one of the few dissenting voices. But this was not before the IAS project spawned a variety of similar machines which partly underlie today's computing technology.

All these accounts are supplemented with gripping stories about weather prediction, the US thermonuclear program, evolutionary biology, and the emigration of European intellectuals like Kurt Gödel and von Neumann to the United States. The book does have its flaws, though. For one thing, it focuses too heavily on von Neumann and the IAS. Dyson says relatively little about Turing himself, about pioneering computing efforts at Manchester and Cambridge (the first stored-program computer in fact was the Manchester "Baby" machine), and about the equally seminal development of information theory by Claude Shannon. James Gleick's "The Information" and Andrew Hodges's "Alan Turing: The Enigma" might be useful complements to Dyson's volume. In addition, Dyson often meanders into one too many digressions that break the flow of the narrative; for instance, do we really need to know so much about Kurt Gödel's difficulties in obtaining a visa? And do we need to get bogged down in minutiae such as the starting dates and salaries of every member of the project and the list of items on the cafeteria menu? Details like these might put casual readers off.

Notwithstanding these gripes, the book is beautifully written and exhaustively researched with copious quotes from the main characters. It's certainly the most detailed account of the IAS computer project that I have seen. If you want to know about the basic underpinnings of our digital universe, this is a great place to start even with its omissions. All the implications, pitfalls and possibilities of multiple scientific revolutions can be connected in one way or another to that little machine running quietly in a basement in Princeton.

385 of 434 people found the following review helpful.
Misleading
By Jeremy E. May
The focus of George Dyson's well-written, fascinating but essentially misleading book, 'Turing's Cathedral', is curiously not on celebrated mathematician, code-breaker and computer theorist Alan Turing but on his equally gifted and innovative contemporary John von Neumann. Von Neumann, whose extraordinarily varied scientific activities included inter alia significant contributions to game theory, thermodynamics and nuclear physics, is especially associated with the early development of the electronic digital computer (i.e. the 'EDC'), an interest apparently sparked by reading Turing's seminal 1936 paper 'On Computable Numbers', which attempted to systematize and express in mathematical terminology the principles underlying a purely mechanical process of computation. Implicit in this article, but at a very theoretical level, was a recognition of the relevance of stored-program processing (whereby a machine's instructions and data reside in the same memory), a concept emanating from the work of mid-Victorian computer pioneer Charles Babbage but which demanded a much later electronic environment for effective realization.

What Mr Dyson insufficiently emphasizes is that, despite a widespread and ever-growing influence on the mathematical community, Turing's paper was largely ignored by contemporary electronic engineers and had negligible overall impact on the early development of the EDC. Additionally, he omits to point out adequately that von Neumann's foray into the new science of electronic computers involved a virtually total dependence on the prior work, input and ongoing support of his engineering colleagues. Invited in August 1944 to join the Moore School, University of Pennsylvania, team responsible for ENIAC, the world's first general-purpose computer being built for the US Army, von Neumann was quickly brought up to speed courtesy of the machine's lead engineers, J. Presper Eckert and John Mauchly. As early as the fall of 1943, Eckert and Mauchly had become seriously frustrated by the severe processing limitations imposed by ENIAC's design and were giving serious consideration to implementing major modifications, in particular the adoption of Eckert's own mercury delay line technology to boost the machine's minuscule memory capacity and enable a primitive stored-program capability. These proposals were subsequently vetoed by the School's authorities on the quite understandable grounds that they would seriously delay ENIAC's delivery date; instead it was decided to simultaneously begin research on a more advanced machine (i.e. EDVAC) to incorporate the latest developments. As a new member of the group, von Neumann speedily grasped the essentials of the new science and contributed valuable theoretical feedback, but an almost total lack of hands-on electronic expertise on his part prevented any serious contribution to the nuts and bolts of the project. Relations with Eckert and Mauchly rapidly deteriorated when an elegantly written, but very high-level, document of his entitled 'First Draft of a Report on the EDVAC' was circulated among the scientific community.
Not only had this document not been previewed, let alone pre-approved, by Eckert and Mauchly, but it bore no acknowledgment whatsoever of their overwhelming responsibility for much of the content. By default, and in view too of his already very considerable international reputation, the content was therefore attributed exclusively to von Neumann, an impression he made no attempt thereafter to correct, the term 'Von Neumann Architecture' being subsequently bestowed on the stored program setup described in the document.

The public distribution of von Neumann's 'Draft' denied Eckert and Mauchly the opportunity to patent their technology. Worse still, despite academic precedents to the contrary, they were refused permission by the Moore School to proceed with EDVAC's development on a commercial basis. In spite of his own links to big business (he represented IBM as a consultant), von Neumann likewise opposed their efforts to do so. All this resulted in a major rift, von Neumann thereafter being shunned by Eckert and Mauchly and forced to rely on lesser mortals to help implement various stored-program projects, notably the IAS computer at Princeton. The following year (1946) Eckert and Mauchly left the School to focus on developing machines for the business market. Before doing so, they jointly delivered a series of state-of-the-art lectures on ENIAC and EDVAC to an invited audience at the School. Among the attendees was British electronics engineer Maurice Wilkes, a fellow academic of Turing's from Cambridge University, but with relatively little interest in the latter's ongoing activity (by this time Turing, a great visionary, had also turned his attention to designing stored-program computers). Blown away by Eckert and Mauchly's presentation, Wilkes returned to England to forge ahead with a new machine called EDSAC, which was completed in May 1949 and represented the first truly viable example of a stored-program computer (an experimental prototype christened 'Baby' had already been developed at Manchester University the year before). Back in the US, Eckert and Mauchly continued their efforts, but persistent problems with funding and also Eckert's own staunch refusal to compromise on quality delayed progress, their partnership finally culminating in the development of the UNIVAC 1, the world's first overtly business-oriented computer, delivered initially to the Census Bureau in March 1951.

Mr Dyson is quite right of course (and he does this well) to trace the beginnings of the modern computer to the stored program concept, but his obsessive focus on von Neumann's role obscures the impact of Eckert and Mauchly's vastly more significant contribution to its development. The triumph of the EDC depended almost wholly on the efforts and expertise of utterly dedicated and outstanding electronics specialists like them, not on mathematicians, logicians and generalists like von Neumann or even Turing. Never one to deny credit where it was due, Wilkes (who later spearheaded advances in software, became the doyen of Britain's electronic community and ended his long and distinguished career as professor emeritus of computer science at Cambridge) unceasingly acknowledged his major debt to Eckert and Mauchly. Hopefully, Mr Dyson, a writer of considerable talent, might one day decide to tell in full their story and set the record straight.

138 of 160 people found the following review helpful.
Digital History that Reads Like Code
By Book Shark
Turing's Cathedral: The Origins of the Digital Universe by George Dyson

"Turing's Cathedral" is the uninspiring and rather dry book about the origins of the digital universe. With a title like, "Turing's Cathedral" I was expecting a riveting account about the heroic acts of Alan Turing the father of modern computer science and whose work was instrumental in breaking the wartime Enigma codes. Instead, I get a solid albeit "research-feeling" book about John von Neumann's project to construct Turing's vision of a Universal Machine. The book covers the "explosion" of the digital universe and those applications that propelled them in the aftermath of World War II. Historian of technology, George Dyson does a commendable job of research and provide some interesting stories involving the birth and development of the digital age and the great minds behind it. This 432-page book is composed of the following eighteen chapters: 1.1953, 2. Olden Farm, 3. Veblen's Circle, 4. Neumann Janos, 5. MANIAC, 6. Fuld 219, 7. 6J6, 8. V-40, 9. Cyclogenesis, 10. Monte Carlo, 11. Ulam's Demons, 12. Barricelli's Universe, 13. Turing's Cathedral, 14. Engineer's Dreams, 15. Theory of Self-Reproducing Automota, 16. Mach 9, 17. The Tale of the Big Computer, and 18. The Thirty-ninth Step.

Positives:
1. A well-researched book. The author faced a daunting research task but pulls it together.
2. The fascinating topic of the birth of the digital universe.
3. A who's who of science and engineering icons in what would eventually become computer science. The list of principal characters was very welcome.
4. For computer lovers who want to learn the history of the pioneers of digital computing, this book is for you.
5. Some facts will "blow" you away: "In March 1953 there were 53 kilobytes of high-speed random-access memory on planet Earth."
6. Some goals are counterintuitive. "The new computer was assigned two problems: how to destroy life as we know it, and how to create life of unknown forms".
7. There are some interesting philosophical considerations.
8. As an engineer, I enjoy the engineering challenges involved with some of their projects.
9. Amazing how the Nazi threat gave America access to some of the greatest minds. The author does a good job of describing these stories.
10. The fascinating life of the main character of this book, John von Neumann.
11. So much history interspersed throughout this book.
12. The ENIAC..." a very personal computer". A large portion of this book is dedicated to the original computer concepts, challenges, parts, testing, and so on.
13. The fundamental importance of Turing's paper of 1936. It's the inspiration behind the history of the digital universe.
14. Some amusing tidbits here and there, including Einstein's diet.
15. The influence of Godel. How he set the stage for the digital revolution.
16. Blown away by Leibniz. In 1679 (yes, 1679) he had already imagined a digital computer with binary numbers.
17. So many great stories of how these great minds attacked engineering challenges. Computer scientists will get plenty of chuckles with some of these stories involving the types of parts used in the genesis of computing. Vacuum tubes as an example.
18. There are many engineering principles devised early on that remain intact today. Many examples, Bigelow provides plenty of axioms.
19. I enjoyed the stories involving how computers improved the art of forecasting the weather.
20. "Filter out the noise". A recurring theme and engineering practice that makes its presence felt in this book.
21. Computers and nuclear weapons.
22. The Monte Carlo method, a new and key domain in mathematical physics, and its invaluable contribution to the digital age.
23. The fascinating story of the summer of 1943 at Los Alamos.
24. The Teller-Ulam invention.
25. How the digital universe and the hydrogen bomb were brought into existence simultaneously.
26. Barricelli and an interesting perspective on biological evolution.
27. The amazing life of Alan Mathison Turing and his heroic contributions.
28. A fascinating look at the philosophy of artificial intelligence and its future.
29. The collision between the digital universe and two existing stores of information: genetic codes and information stored in brains.
30. The basis for the power of computers.
31. The five distinct sets of problems running on the MANIAC by mid-1953. All in JUST 5 kilobytes.
32. A look at global digital expansion and where we are today.
33. The unique perspective of Hannes Alfven on cosmology.
34. The future of computer science.
35. Great quotes, "What if the price of machines that think is people who don't?"
36. The author does a great job of providing a "where are they now" narration of all the main characters of the book.
37. Links worked great.
38. Some great illustrations in the appendix of the book. It's always great to put a face on people involved in this story.
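
The Monte Carlo method singled out in point 22 is simple enough to illustrate in a few lines. The sketch below is not from the book; it is a generic, minimal example of the idea (estimate a quantity by repeated random sampling), here approximating pi from the fraction of random points that land inside a quarter circle. The function name, seed, and sample count are illustrative choices:

```python
import random

def estimate_pi(samples: int, seed: int = 0) -> float:
    """Monte Carlo estimate of pi: sample points uniformly in the unit
    square and count the fraction that land inside the quarter circle."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    inside = sum(
        1 for _ in range(samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    # area(quarter circle) / area(square) = pi/4, so scale by 4
    return 4.0 * inside / samples

if __name__ == "__main__":
    print(estimate_pi(1_000_000))  # drifts toward 3.14159... as samples grow
```

The same sample-and-count pattern, scaled up to neutron transport on the MANIAC, is what the book's Monte Carlo chapter describes.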

Negatives:
1. It wasn't an enjoyable read. Plain and simple, this book was tedious to read. The author lacked panache.
2. The title is misleading. It is a metaphor drawn from a visit to Google's headquarters in California: the author, given a glimpse inside that organization, sensed Turing's vision of a gathering of all available answers and possible questions mapped out in one awe-inspiring facility. My disappointment is that this book, despite being inspired by Alan Turing's vision, in fact has only one chapter dedicated to him. The main driver behind this book was really John von Neumann.
3. A timeline chart would have added value. With so many stories going back and forth, it would help the reader anchor each one in the period when it occurred.
4. Some of the stories really took the scenic route to get to the point.
5. The photos should have been included within the body of the book instead of in a separate section of their own.
6. The book was probably a hundred pages too long.

In summary, I didn't enjoy reading this book. The topic was of interest to me, but between the misleading title and the very dry prose, the book became tedious and not intellectually satisfying. It felt more like a research paper than a book intended for a general audience. For the record, I am an engineer, and many of the topics covered in this book are near and dear to my heart, but the author was never able to connect with me. The book is well researched and includes some fascinating stories about icons of science and the engineering behind the digital origins, but I felt like I was reading code instead of a story. It will have a limited audience; if you are an engineer, a scientist, or in the computer field, this book may be of interest, but be forewarned: it is a monotonous and uninspiring read.

Recommendations: "Steve Jobs" by Walter Isaacson, "The Quantum Universe: (And Why Anything That Can Happen, Does)" by Brian Cox, "Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100" by Michio Kaku, "Warnings: The True Story of How Science Tamed the Weather" by Mike Smith, and "Spycraft: The Secret History of the CIA's Spytechs, from Communism to Al-Qaeda" by Robert Wallace and H. Keith Melton.

