**************************************************************
*                                                            *
*                        CYBERSPACE                          *
*        A biweekly column on net culture appearing          *
*               in the Toronto Sunday Sun                    *
*                                                            *
*                Copyright 2000 Karl Mamer                   *
*             Free for online distribution                   *
*                 All Rights Reserved                        *
*          Direct comments and questions to:                 *
*                                                            *
**************************************************************

1000 Years of Computer History
Part I: The Dark Ages, 1000 AD - 1964 AD

Well, that was a nifty millennium. Okay, technically the millennium is not over until Dec 31, 2000, but like whatever. Being one year shy of a full 2000 years makes you 99.95% right. I make it a practice of flipping the calendar to March around February 20. I hate February. So wrapping up 2000 years of history a year early, I don't have a problem with.

It's tradition at the end of every year to do a little year-in-review bit. At the end of every decade, we take a longer squint. I imagine the press covered with some fondness the passing of the 19th century, a century marked by an unusually long run of world peace and prosperity (well, peaceful for anyone not resisting colonization and prosperous for anyone educated at Oxford or Cambridge). So I figure I may as well take this single opportunity to review some of the landmark achievements in computer history over the last thousand years.

Discovery of Zero by the West
12th century, more than likely a Monday

We've lived with post-Christmas bank balances of zero for so long it's inconceivable there was a time when people got along, in fact prospered, with no knowledge of nothing. It wasn't until the 5th century that zero was invented in India. Like so many ideas born in the east, from khaki pants to Hello Kitty, it took a while for zero to catch on in the west. It wasn't until the 12th century that western mathematicians saw something in nothing. Without zero, the computer's binary system of ones and zeros would not exist.

Punch Cards
Early 1800s

Joseph-Marie Jacquard owned a weaving factory.
To expedite the process of changing weaving patterns, he invented a weaving system that had new patterns encoded on big, wooden punch cards. Near the end of the 19th century, Herman Hollerith adapted Jacquard's punch card system to tabulate the 1890 American census. Hollerith's company went on to become IBM. IBM gave rise to the computer revolution and Microsoft. Oh, what a tangled web Jacquard weaved.

Charles Babbage's Analytical Engine
1830s

Babbage is credited with designing the first computer, his clock-work Analytical Engine. Babbage was unfortunately an incessant tinkerer and never got around to building his engine, despite the aid of Ada Lovelace, a brilliant mathematician and daughter of the poet Lord Byron. Ada Lovelace is credited with being the world's first programmer, having written a program for the Analytical Engine. The computer industry, unfortunately, would not hire another woman programmer until 1992. Pigs.

Claude Shannon Publishes His Master's Thesis
1937

People in the know call Claude Shannon the father of information science. Outside of MIT and Bell Labs, he's pretty well ignored. Shannon was both an electrical engineer and a mathematician. For his MIT Master's thesis, he published a theory on how electrical circuits could be used to model logical statements. Shannon resurrected what was then a very obscure branch of mathematics called Boolean algebra. Anyone who has ever used a search engine is familiar with Boolean AND/OR operators today. Shannon's theoretical work allowed the creation of the modern computer. Some call Shannon's thesis the most important Master's thesis ever written. Sure beats my Honours thesis on tea spoon collecting.

William Shockley Invents the Transistor and Silicon Valley
1948-1955

In 1948, Bell Labs unveiled the transistor. William Shockley, a physicist at Bell Labs, co-developed it and shared a Nobel Prize for its invention.
Shockley left Bell Labs, returned to his home town of Silicon Valley (née Palo Alto, California) and started his own company to manufacture transistors. Shockley's abrasive personality drove off his best engineers, who started all sorts of interesting high tech companies in the area, including the legendary Fairchild Semiconductor. Fairchild invented not only the silicon chip but the casual dress code that has been a perk of the computer biz for nearly half a century.

Creation of BASIC
1964

BASIC was written as a small teaching aid for Dartmouth College. People who never ever intended to be programmers, or never should have become programmers, liked its simple English-like commands and demanded the microcomputers of the early '80s be capable of running it. Microsoft was founded when Bill Gates and Paul Allen developed a BASIC interpreter for the world's first microcomputer, the Altair.