1000 Years of Computer History

 

Part I: The Dark Ages, 1000 AD - 1964 AD

 

Well, that was a nifty millennium. Okay, technically the millennium is not over until Dec 31, 2000, but, like, whatever. Being one year shy of a full 2000 years makes you 99.95% right. I make it a practice of flipping the calendar to March around February 20. I hate February. So wrapping up 2000 years of history a year early? I don't have a problem with that.

 

It's tradition at the end of every year to do a little year-in-review bit. At the end of every decade, we take a longer squint. I imagine the press covered with some fondness the passing of the 19th century, a century marked by an unusually long run of world peace and prosperity (well, peaceful for anyone not resisting colonization and prosperous for anyone educated at Oxford or Cambridge). So I figure I may as well take this opportunity to review some of the landmark achievements in computer history over the last thousand years.

 


Discovery of Zero by the West (12th century, more than likely a Monday)

 

We've lived with post-Christmas bank balances of zero for so long that it's inconceivable there was a time when people got along, in fact prospered, with no knowledge of nothing. It wasn't until the 5th century that zero was invented in India. Like so many ideas born in the East, from khaki pants to Hello Kitty, it took a while for zero to catch on in the West. It wasn't until the 12th century that Western mathematicians saw something in nothing. Without zero, the computer's binary system of ones and zeros would not exist.
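
A quick aside for the terminally curious: here's a minimal sketch (Python, chosen purely for convenience) of what zero actually buys you. In positional notation, binary included, zero is the placeholder that gives every other digit its meaning.

    # Minimal sketch: zero as the placeholder that makes positional (and
    # hence binary) notation work. Python is used here purely to illustrate.
    for n in (5, 12, 1000):
        print(n, "in binary is", format(n, "b"))
    # 5    -> 101
    # 12   -> 1100        (the zeros mark empty places)
    # 1000 -> 1111101000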

 

Punch Cards (Early 1800s)

 

Joseph-Marie Jacquard owned a weaving factory. To expedite the process of changing weaving patterns, he invented a weaving system that had new patterns encoded on big, wooden punch cards. At the end of the 19th century, Herman Hollerith adapted Jacquard's punch card system to tabulate the 1890 American census. Hollerith went on to found the company that became IBM. IBM gave rise to the computer revolution and Microsoft. Oh, what a tangled web Jacquard weaved.

 

Charles Babbage's Analytical Engine (1830s)

 

Babbage is credited with designing the first computer, his clockwork Analytical Engine. Babbage, unfortunately, was an incessant tinkerer and never got around to building his engine, despite the aid of Ada Lovelace, a brilliant mathematician and daughter of the poet Lord Byron. Ada Lovelace is credited with being the world's first programmer, having written a program for the Analytical Engine. The computer industry, sadly, would not hire another woman programmer until 1992. Pigs.

 

Claude Shannon Publishes His Master's Thesis (1937)

 

People in the know call Claude Shannon the father of information theory. Outside of MIT and Bell Labs, he's pretty well ignored. Shannon was both an electrical engineer and a mathematician. For his MIT master's thesis, he published a theory on how electrical circuits could be used to model logical statements. Shannon resurrected what was then a very obscure branch of mathematics called Boolean algebra. Anyone who has ever used a search engine is familiar with Boolean AND/OR operators today. Shannon's theoretical work made the modern computer possible. Some call Shannon's thesis the most important master's thesis ever written. Sure beats my honours thesis on teaspoon collecting.
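
For the curious, here is a rough, modern-day sketch (Python, purely for illustration; Shannon worked with relays, not laptops) of the idea: the same Boolean AND/OR does double duty as a model of a circuit and as a search filter.

    # Rough sketch of Shannon's insight, in Python purely for illustration:
    # logical statements and switching circuits share the same mathematics.
    def and_gate(a, b):   # two switches wired in series
        return a and b

    def or_gate(a, b):    # two switches wired in parallel
        return a or b

    # The same Boolean AND/OR turns up in a search-engine-style query:
    # find pages matching boolean AND (shannon OR search).
    pages = ["shannon boolean circuits", "teaspoon collecting", "boolean search tips"]
    hits = [p for p in pages
            if and_gate("boolean" in p, or_gate("shannon" in p, "search" in p))]
    print(hits)  # ['shannon boolean circuits', 'boolean search tips']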

 

William Shockley Invents the Transistor and Silicon Valley (1948-1955)

 

In 1948, Bell Labs unveiled the transistor. William Shockley, a physicist at Bell Labs, co-developed it and shared a Nobel Prize for its invention. Shockley left Bell Labs, returned to his hometown of Silicon Valley (née Palo Alto, California), and started his own company to manufacture transistors. Shockley's abrasive personality drove off his best engineers, who started all sorts of interesting high-tech companies in Palo Alto, including the legendary Fairchild Semiconductor. Fairchild invented not only the silicon chip but the casual dress code that has been a perk of the computer biz for nearly half a century.

 

Creation of BASIC (1964)

 

BASIC was written as a small teaching aid for Dartmouth College. People who never ever intended to be programmers, or never should have become programmers, liked its simple English-like commands and demanded the microcomputers of the late '70s and early '80s be capable of running it. Microsoft was founded when Bill Gates and Paul Allen developed a BASIC interpreter for the world's first microcomputer, the Altair.

 

Part II: The Darker Ages, from Windows to Windows 3.1

 

Alan Kay Invents Everything Not Yet Invented (1972)

 

Alan Kay took a job at Xerox's Palo Alto Research Center and set about designing the world's first notebook computer, the Dynabook. His idea for a computer you could carry around under your arm, instead of a computer that needed to be wheeled in pieces into a room, was at least a decade ahead of the technology curve. While waiting for the technology to catch up, he and his PARC colleagues developed object-oriented programming, the graphical user interface, laser printing, and Ethernet. He eventually got hired by Apple, whose Macintosh owed a great deal to his work at PARC.

 

Apple II Released (1977)

 

Most people are probably familiar with the story of how Steve Jobs and Steve Wozniak (no relation despite the same first name) started Apple in their garage. The Apple was not their first creation. Jobs and Wozniak got the tinkering bug back in a high school science course. Hewlett-Packard (HP), then an obscure manufacturer of scientific instruments, used to donate surplus junk to California high schools teaching electronics. Jobs and Wozniak found their muse building things out of HP's surplus hardware. This minuscule investment kick-started the minds that started Apple, which started the desktop publishing revolution, which started HP earning tidy profits as the leading manufacturer of laser printers.

 

A Harvard Business Student Creates the VisiCalc Spreadsheet (1978)

 

For a couple of years, microcomputers like the Apple II, the PET, and the TRS-80 were really just expensive calculators that played some lame games. In the spring of 1978, when most young men's minds were turning to love, Dan Bricklin dreamed up the idea of an electronic spreadsheet while sitting in a Harvard business class. He borrowed a friend's Apple II and wrote the first version in BASIC. Business people rapidly saw VisiCalc's utility for creating expense reports, time sheets, and other things that needed numbers jiggled. They bought Apple IIs in large numbers simply to run the spreadsheet software. When IBM salespeople came knocking, trying to sell companies $20,000 systems, they found $2,000 worth of Apple hardware and VisiCalc running on people's desks. IBM quickly realized it needed a lean, mean, VisiCalc-running machine of its own.

 

IBM Releases the PC (1981)

 

When Big Blue's big salesmen in big blue suits began reporting big losses in sales to the little beige Apple, Big Blue's biggest boys knew they had to act fast big time. But they realized that being big in the world of microcomputers wasn't so grand. The time it took the company's various levels of bureaucracy to turn out a new product meant the computer would be obsolete the moment it hit the store shelves. The solution was to create a computer from off-the-shelf components and license an operating system. Legend has it IBM tried to set up a meeting with Gary Kildall, the creator of the CP/M operating system. CP/M was at that time the world's most popular microcomputer OS. Kildall arrogantly thumbed his nose at IBM, preferring to enjoy a day of perfect flying weather in his private plane. IBM then looked up Bill Gates, who sold them what became MS-DOS.

 

Internet Opened to Commercial Traffic (1991)

 

The National Science Foundation, which oversaw the Internet's principal backbone, had banned commercial use since the backbone's inception. In 1991, it lifted this restriction, opening up the net to a wide range of non-educational uses. This single move is probably more important than the development of browser and web technology (both also in their infancy in 1991). With the restriction gone, the public at large now had access to all those make-money-fast emails and pictures of Gillian Anderson's head stuck on naked centerfold bodies.

 

Windows 3.1 Is Released (1992)

 

Egads.

 

* * *

 

Copyright 2002 Karl Mamer

Free for online distribution as long as

"Copyright 2002 Karl Mamer (kamamer@yahoo.com)"

appears on the article.

 

Direct comments and questions to kamamer@yahoo.com