Today CNN posted a Timeline for Computing Power. Being a librarian and a blogger on information issues, I collect timelines related to all aspects of information history. Last week I attended a meeting where someone claimed that the Internet doubles in size every 11 months. What have we wrought with these computers?

The best thing about the CNN timeline is the photo of the original Apple Computer, which sold for $666.66.

The Original Apple Computer
A “Computer” Was Originally a Job Title

But I do have a few quibbles with the timeline. It dates the history of computing only to 1946. In fact, the first use of the word “computer” dates to the 1640s and referred to a human being who performed calculations. These human-generated calculations would be compiled into navigation tables or interest-rate tables.

How Did They Overlook Charles Babbage?

My big beef is with the omission of Charles Babbage, who designed the Difference Engine in 1822. It mechanized the tabulation of polynomial functions and could produce a hard copy of the results. In 1837 Babbage proposed the Analytical Engine, a hand-cranked, mechanical digital computer that anticipated virtually every aspect of present-day computers. It was almost 100 years before another general-purpose computer was conceived. According to James Gleick’s book, The Information, Babbage was inspired by the Jacquard loom, invented in 1804, which used punched cards to control its fabric patterns.
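
Babbage’s machine rested on a simple piece of arithmetic: the method of finite differences, which lets you tabulate any polynomial using nothing but repeated addition. Here is a minimal sketch of that arithmetic in Python. It is my own illustration, not Babbage’s design; the function tabulate and its arguments are hypothetical names for this example.

```python
# A minimal sketch of the method of finite differences, the arithmetic
# the Difference Engine mechanized. My own illustration, not Babbage's
# design: "tabulate" and its arguments are hypothetical names.

def tabulate(initial_differences, steps):
    """Tabulate a polynomial from its value and finite differences at x = 0.

    initial_differences: [f(0), first difference, second difference, ...].
    For a degree-n polynomial the (n+1)-th difference is zero, so the list
    is finite and the whole table falls out of repeated addition.
    """
    diffs = list(initial_differences)
    table = []
    for _ in range(steps):
        table.append(diffs[0])
        # Each column absorbs the column to its right: pure addition,
        # which is all a train of gears needs to perform.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# f(x) = x*x has f(0) = 0, first difference 1, constant second difference 2.
print(tabulate([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25]
```

Run with the starting differences for f(x) = x², it reproduces the table 0, 1, 4, 9, 16, 25 by addition alone, which is exactly the kind of work the engine’s gears performed.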

A Demonstration of the Difference Engine

Although Babbage never completed the Difference Engine in his lifetime, a team of engineers working from his design notes built an engine faithful to his design, completed in March 2008. It is on display at the Computer History Museum in Mountain View, California.

Reference Librarians Vs The Machine

The CNN timeline ends in 2011 with IBM’s Watson becoming the first non-human winner of Jeopardy. Reference librarians, take heart: according to Google’s CTO Craig Silverstein, it will be about 300 years before Google can understand emotions and non-factual information and replace human reference librarians! So we have some job security, at least for a few more centuries. But coming full circle: if the word “computer” was carried over from the humans who once performed calculations, will the future artificial intelligence machine be referred to as a “researcher”? And will some future generation be amazed that there were once people who performed that task using their own brains?