November 28, 2016

Gottfried Wilhelm Leibniz: The Philosopher Who Helped Create the Information Age

When did the information age begin? 

One might point to the winter of 1943, when British engineers started using a room-sized machine dubbed “Colossus,” the world’s first electronic digital programmable computer, to break Nazi codes during World War II.

Or perhaps it was February 1946, when the U.S. Army unveiled the faster, more flexible Electronic Numerical Integrator and Computer (aka ENIAC) at the University of Pennsylvania. History buffs may push it back further still, to key 19th-century figures like Charles Babbage and Ada Lovelace, who pioneered programmable calculating machines in Victorian England.

But we should look back even earlier, to the work of a towering but often overlooked intellect—to Gottfried Wilhelm Leibniz, the German philosopher and polymath who died 300 years ago on Nov. 14, 1716. Though you may not have heard of him, he was a man who envisioned the systems and machines that would define the digital revolution.

Something of a prodigy, Leibniz was just 8 when he started reading the books in his father’s library. (His father was a professor of moral philosophy at Leipzig University.) He quickly learned the classics, once boasting that he could recite Virgil’s Aeneid by heart. 

At school, he excelled in logic; by 17 he had defended his master’s thesis, and three years later he had qualified for his doctorate. Leibniz would go on to work as a historian, librarian, legal adviser, and diplomat. He wrote on biology, medicine, geology, theology, psychology, linguistics, and of course philosophy. The king of Prussia, Frederick the Great, described Leibniz as “a whole academy in himself.”

Famously, Leibniz clashed with Isaac Newton over the invention of calculus. Historians now believe that the two men discovered calculus independently, though it’s Leibniz’s elegant and compact notation system, not Newton’s clunkier version, that we use today.
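
The contrast is easy to show on the page. Here is a small LaTeX snippet, purely illustrative, that typesets the derivative and the integral in Leibniz's notation alongside Newton's dot ("fluxion") notation:

```latex
% Leibniz's notation vs. Newton's: the same calculus, two scripts.
\documentclass{article}
\begin{document}
Leibniz wrote the derivative as $\frac{dy}{dx}$ and the integral as
$\int y \, dx$; both symbols survive in every modern textbook.
Newton wrote derivatives as fluxions, $\dot{y}$ and $\ddot{y}$,
a notation now confined mostly to physics.
\end{document}
```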

Of course, there was no such thing as “computer science” in Leibniz’s day. But by developing the binary number system, a way of representing numerical information using only zeros and ones, he became the father of all computer coding. (Computers don’t have to run by manipulating zeros and ones, but it’s a lot easier if they do.)
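
To make the idea concrete, here is a minimal Python sketch (modern code, of course, not anything Leibniz wrote) that rewrites an ordinary decimal number using only his two symbols:

```python
def to_binary(n: int) -> str:
    """Represent a non-negative integer using only the digits 0 and 1."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # the remainder is the next binary digit
        n //= 2                  # move on to the next power of two
    return "".join(reversed(bits))

# Decimal 13 becomes 1101: one 8, one 4, no 2, one 1.
print(to_binary(13))  # -> 1101
```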

Leibniz believed that machines, not people, should be crunching numbers, and he worked on a prototype for a device that could add, subtract, multiply, and divide. He tweaked and improved the design over many years; one of these contraptions looked like a primitive pinball game, with numbers represented by tiny spheres rolling along grooves and passing through gates that opened and closed. In London, his fellow scientists were so impressed with the device that they elected him to the Royal Society. He designed another machine that could do certain kinds of algebra, and yet another for cracking codes and ciphers.
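
Historically, the machine reduced the two harder operations to repetition: multiplication was performed as repeated addition, division as repeated subtraction. A rough Python sketch of that principle (the arithmetic idea only, not the stepped-drum mechanism):

```python
def multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers by repeated addition."""
    total = 0
    for _ in range(b):  # add a to the accumulator, b times over
        total += a
    return total

def divide(a: int, b: int) -> tuple[int, int]:
    """Divide by repeated subtraction; returns (quotient, remainder)."""
    quotient = 0
    while a >= b:       # keep subtracting until less than the divisor
        a -= b
        quotient += 1
    return quotient, a

print(multiply(6, 7))  # -> 42
print(divide(45, 7))   # -> (6, 3), since 45 = 6 * 7 + 3
```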

Leibniz envisioned that these machines would be used in accounting, administration, surveying, astronomy, the production of mathematical tables, and more. Tedious work that had kept human beings awake far into the night, working by candlelight, could now be mechanized.


Unfortunately, the technology of the day didn’t allow for the precisely machined parts, such as uniform screws, that Leibniz’s devices required. (For instance, as historians later discovered, something as simple as “carrying the one” turned out to be maddeningly difficult to implement in hardware.) Despite 45 years of work and many prototypes, his calculating machine was never fully functional.
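
To get a feel for the difficulty, here is a digit-by-digit addition sketch in Python with the carry made explicit. Every column of Leibniz's machine had to perform this same step, but with gears rather than code, and a sum like 9999 + 1 forces the carry to ripple through every column at once:

```python
def add_digits(x: list[int], y: list[int]) -> list[int]:
    """Add two numbers stored as lists of decimal digits, least significant first."""
    result, carry = [], 0
    for i in range(max(len(x), len(y))):
        a = x[i] if i < len(x) else 0
        b = y[i] if i < len(y) else 0
        total = a + b + carry
        result.append(total % 10)  # the digit that stays in this column
        carry = total // 10        # the "one" that must be carried onward
    if carry:
        result.append(carry)
    return result

# 9999 + 1: a single carry ripples through every digit wheel.
print(add_digits([9, 9, 9, 9], [1]))  # -> [0, 0, 0, 0, 1], i.e. 10000
```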

But for Leibniz, computation was just the beginning: He believed that all kinds of problems could be reduced to the manipulation of symbols and tackled just as though they were mathematical problems. He imagined a kind of alphabet of human thought, whose symbols could be manipulated according to precise, mechanical rules, the work carried out by devices. He called these devices “reasoning machines,” and in them he anticipated the pursuit we know today as artificial intelligence.

Once the system was perfected, he believed, humanity would have “a new kind of tool, a tool that will increase the power of the mind much more than optical lenses helped our eyes, a tool that will be as far superior to microscopes or telescopes as reason is to vision.” We would weigh arguments, he said, “just as if we had a special kind of balance.” Linguistic barriers between nations would fall, and the new universal language would usher in an era of understanding, peace, and prosperity. (Leibniz was, needless to say, an optimist—he also had ambitions to reunify the Catholic and Protestant churches.)

Leibniz, however, was right in foreseeing the extent to which we would come to see our world in terms of numbers. (Even just a century ago, who would have imagined that creating and manipulating visual images, or recording a symphony, would boil down to processing certain arrangements of zeros and ones?) He even worried about “data overload,” anticipating that individuals and governments would struggle to process, store, and retrieve the vast amounts of data that would soon be generated.

Leibniz never became a household name, and many of his ideas, like the notion of a universal symbolic language, never bore fruit. Much of his work had to be rediscovered by later thinkers, such as the 19th-century English mathematician and philosopher George Boole, who more fully developed the idea of a logical system based on binary arithmetic. (You may have run across his name before: Boolean algebra, Boolean searches, Boolean logic.) But in imagining a world in which machines could be used to supplement or supplant human computation, Leibniz paved the way for the information age that blossomed 250 years after his death.
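
For a taste of what Boole built on that foundation, here is a minimal Python illustration of his logical arithmetic; the uppercase names are just for readability and correspond to nothing in either man's notation:

```python
# Boolean algebra: reasoning reduced to operations on just two values.
def AND(p, q): return p and q
def OR(p, q):  return p or q
def NOT(p):    return not p

# De Morgan's law holds for every combination of truth values:
# NOT (p AND q) is always the same as (NOT p) OR (NOT q).
for p in (True, False):
    for q in (True, False):
        assert NOT(AND(p, q)) == OR(NOT(p), NOT(q))
print("De Morgan's law verified for all truth values.")
```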


By Dan Falk

With many thanks to Slate


Top Ten Things Women Invented

A History Of Mick Jagger On Film

Alan Turing Manuscript Sells For $1 million 

Temple Grandin On The Autistic Brain

Benedict Cumberbatch And Eddie Redmayne: The Changing Face Of Hollywood 

Temple Grandin: Her Extraordinary Life And Work  
     
Hedy Lamarr's 101st Birthday Celebrated by Google    

 
Mathematical Minds Stir A Beauty Within

Modern Technology and Albert Einstein: Are We There Yet?

Arthur Benjamin: The Magic of Fibonacci Numbers

Claude Shannon Jr: The Greatest Genius No One Has Heard Of

John von Neumann: This Hungarian-American Mathematician May Have Been Smarter Than Einstein 

Great Minds: Filippo Brunelleschi

Great Minds: Leonardo da Vinci

The Genius of Nikola Tesla

Hedy Lamarr - Beauty And Brains in Abundance

The New Turing Test: Brainy Machines Need An Updated IQ Test, Experts Say

'Albert Einstein Font' Lets You Write Like Physics Genius

Do You See Albert Einstein Or Marilyn Monroe In This Photo? 

Albert Einstein: Inspiring Quotes on Nature and Life

Albert Einstein: 25 Quotes

Warp Speed Space Travel A Possibility Thanks To Einstein's Theory Of Relativity

Einstein's Famous Theory Has Aged Well 

Jason Padgett: Brain Injury Turns Man Into Math Genius

Mad Geniuses: 10 Odd Tales About Famous Scientists

Benjamin Franklin: 11 Surprising Facts

Last Piece of Einstein’s Theory Of Relativity In Line For Final ‘Proof’

10 Mathematical Equations That Changed The World

11 Female Inventors Who Helped Power The Information Age 

Biopics Now Focus On Key Moments Rather Than A Whole Life 

Some Biopic Actors And Their Real-Life Counterparts

German WWII Coding Machine Found On eBay For $20

Creators Of Modern Cryptography Win Turing Award

High School Student Ryan Chester Wins $350,000 For Film Explaining Einstein's Theory Of Relativity

A Mathematician Has Built A Machine That Can Beat The Odds In Roulette

Quantum Enigma Machines Are Becoming A Reality