Syncopated Systems
Seriously Sound Science

My Brief History of Computers

The term “computer” dates back to at least 1613, when it was used to describe human computers: people engaged in calculating and computing numbers, often occupationally. This usage continued until after World War II ended in 1945 and well into the Space Race of the 1950s and 1960s.

The 2016 film Hidden Figures nicely portrays occupational human computers—at least some of them also expert mathematicians—within the organization that became NASA in 1958, and the shift away from manual computation that came with the installation of an electronic (transistorized) IBM 7090 mainframe computer.

Computation versus Storage-and-Recall

Manual computation can be tedious work, and it is prone to errors that are often impractical to weed out through constant checking and supervision, especially as tasks become more complex and each layer of checking itself becomes a task that requires checking.

Manual Lookup

So, to reduce tedium and errors, specialists began centuries ago to calculate tables of common numerical sequences and publish them in books; printed books of these numbers generally provided economies of scale to those who bought and used them.

Perhaps the most familiar of these texts today is the CRC Standard Mathematical Tables and Formulas, which is apparently now in its 33rd edition (published in 2018, about 90 years after its first edition). (I should note that I tend to reserve my own use of the word “standard” for standards that have been formally agreed upon by organizations of experts in particular fields, preferring otherwise to use terms such as “common”, “generally accepted”, or “customary”—the last of which is used within the CRC text.)

In recent years, I have also learned to appreciate online resources such as Wolfram MathWorld (which evolved from work with earlier connections to CRC Press) and the On-Line Encyclopedia of Integer Sequences (OEIS).

In Modern Computing

Developers of modern computers and their software generally balance how quickly they can compute against how much they can store.

An example of how this applies to software includes “memoization”, a term introduced in 1968 that essentially describes increasing computing speed by storing results of calculations and recalling them later where applicable. (I’ve used this to increase the speed of prime factorization over routines that use only Pollard’s rho algorithm, which I believe is otherwise state-of-the-art. For more, see my page about prime numbers.)
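Memoization is easy to sketch in a few lines. The following is a minimal illustration using a naively recursive Fibonacci function (a standard textbook example, not the factorization routine described above): without the cache, the recursion recomputes the same subproblems exponentially many times; with it, each result is computed once and thereafter recalled.

```python
from functools import lru_cache

# Memoization: cache the results of a pure function so that repeated
# calls with the same argument are answered by lookup rather than
# recomputation -- trading storage for speed.
@lru_cache(maxsize=None)
def fib(n: int) -> int:
    """Naively recursive Fibonacci; exponential time without the cache."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(90))  # completes almost instantly thanks to cached subproblems
```

Here the cache turns an exponential-time recursion into a linear-time one, at the cost of storing one result per distinct argument.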

In hardware, we can see how the relative widths of microprocessors’ data and address buses have changed as faster ones have been developed: those from the 1970s generally had the wider address bus; in the 1980s and 1990s the two buses were mostly equal in width; and since the introduction of 64-bit microprocessors in 2003 (with the Newisys 2100, a machine I helped develop), the data bus is generally wider than the address bus.

Mechanical Automation

With applications in many fields including finance and engineering, early publications of computed values included logarithmic tables. However, due to the complexity of accurately calculating and typesetting logarithmic values, the tables in these books were often prone to errors.

Circa 1820, errors appearing in these tables (and likely the dangers posed by those errors) motivated Charles Babbage to design two massive mechanical calculators called difference engines, at least the latter of which would also create plates for printing.

Although neither design was completed in his lifetime, two examples of his second (and simpler) design have been built by London’s Science Museum; in 2011, I was lucky enough to see one in operation at the Computer History Museum (in Mountain View, California) and the other on display at London’s Science Museum.

Though mechanical computers before these were rare, they have a long history; smaller mechanical calculating machines can be traced back at least as far as the Antikythera mechanism, which is believed to have been created between the years 205 BCE and 60 BCE.

See Also

My Brief History of Computers

On Microprocessors

The 6502 Microprocessor