SamR's Course Webs

ANT154 2000S: Evolution of Technology: Computing

Held: Monday, April 24, 2000 and Wednesday, April 26, 2000.

Approximate Schedule



If we are going to study the history of computing and related issues, we'll need to consider some basic terms. In particular, we'll need to talk about computers, algorithms, digital computers, computing, computer science, networking, hypertext, and the Internet. We'll be covering hypertext and the Internet, in part, because they've become very important topics in computing.


At the root, a computer is something that automatically performs some computation. What is a computation? It can include a variety of different tasks, from mathematical to textual.

Currently, many people think of a computer as an electronic device used to perform calculations and computations. These computations are typically based on some sort of symbolic data. Those symbols are often numbers, but may also be letters, glyphs, bits on the screen, and many other things. In the recent past, computers were built from a mix of electronic and mechanical parts, or even from mechanical parts alone.

As recently as the 1940s, ``computer'' was a profession rather than a device. That is, a computer was someone whose profession was computing values (typically based on numbers or symbolic values). As you might guess, that use has fallen into disfavor, but it was used for over one hundred years.

In general, we divide computational devices into two kinds: special-purpose computers automate some selected computation, but are usually able to perform only that computation or class of computations. Special purpose computers have been developed to support simple mathematics, the construction of ballistics tables, encoding, and many other processes. On the other hand, general-purpose computers are designed in such a way that they can be configured or programmed to perform any reasonable process. These days, we rely on both kinds of computers.

In the standard model of computers, a computer has five parts: an input unit, an output unit, a memory, an arithmetic unit, and a control unit.

Here's one picture of how they all relate (other people may draw things differently).

  +-------+        +--------+        +--------+       +--------+
  | Input | -----> | Memory | <----> | Arith. | ----> | Output |
  +-------+   ^    +--------+   ^    +--------+   ^   +--------+
              |     |     ^     |         ^       |
              |     |     |     |         |       |
              |     |     |     |         |       |
              |     |    +---------+      |       |
              |     +--> | Control |------+-------+
              +--------> |  Unit   |

As the picture suggests, the control unit controls ``everything''. It reads instructions from the memory and, according to those instructions, tells memory to load information from input; the arithmetic unit to read information from memory, perform computations, and store or output the results; and the memory unit to provide further instructions.
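The fetch-and-execute cycle described above can be sketched as a small simulation. The instruction set here (LOAD, ADD, STORE, OUT, HALT) is invented for illustration and is much simpler than any real machine's; the point is that the control unit repeatedly reads an instruction from memory and directs the other units accordingly.

```python
# A minimal sketch (hypothetical instruction set) of the fetch-execute
# cycle the control unit performs. Memory holds both instructions and data.

def run(memory):
    """Execute instructions starting at address 0 until HALT."""
    acc = 0          # a single arithmetic register (an "accumulator")
    pc = 0           # program counter: address of the next instruction
    output = []
    while True:
        op, arg = memory[pc]      # control unit: fetch an instruction
        pc += 1
        if op == "LOAD":          # read a value from memory
            acc = memory[arg]
        elif op == "ADD":         # arithmetic unit: add a memory value
            acc += memory[arg]
        elif op == "STORE":       # write the result back to memory
            memory[arg] = acc
        elif op == "OUT":         # send the result to output
            output.append(acc)
        elif op == "HALT":
            return output

# Program: output the sum of the values at addresses 5 and 6.
memory = {0: ("LOAD", 5), 1: ("ADD", 6), 2: ("OUT", None),
          3: ("HALT", None), 5: 2, 6: 3}
print(run(memory))   # [5]
```

Note that instructions and data live in the same memory, which is exactly what lets a general-purpose computer be reprogrammed: change the contents of memory and you change what the machine does.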


How does a computer know how to compute? Often, a set of instructions tells it how to do the steps. We call these sets of steps algorithms. You probably know some algorithms. For example, you've probably learned techniques for doing multiplication and long division. Note that some algorithms are little more than ``look it up in a table''. For example, consider how you multiply two one-digit numbers.
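The grade-school multiplication technique mentioned above is itself an algorithm, and we can write its steps out precisely. This is a sketch, not how computers actually multiply, but it shows that a familiar pencil-and-paper procedure is a sequence of simple, mechanical steps:

```python
def multiply(a, b):
    """Grade-school multiplication: multiply a by each digit of b,
    shifting by one decimal place each time, then add the partial
    products."""
    total, shift = 0, 0
    while b > 0:
        digit = b % 10                    # the rightmost digit of b
        partial = a * digit               # one single-digit multiplication
        total += partial * (10 ** shift)  # shift left and accumulate
        b //= 10                          # move to the next digit
        shift += 1
    return total

print(multiply(123, 45))   # 5535
```

Each step (take a digit, multiply, shift, add) is simple enough to ``look up in a table'', which is precisely what makes the whole procedure mechanical.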

Digital, Binary, and Analog

Modern computers are often referred to as digital computers. This is to indicate that they work with discrete digits rather than continuous values. This is in contrast to analog devices, which work on a more continuous spectrum. For example, an LP record produces a continuous waveform, while a CD stores a set of discrete values which are used to approximate that waveform.

Most modern computers are also binary. That is, they are based on a system in which there are only two values, typically called zero and one. It turns out that it's easy to represent two values in electronic circuits. For example, 1 might be ``on'' and 0 might be ``off''.

Computer Science

Although there are many definitions of computer science, it is often useful to think of computer science as ``the study of computers, computation, and computability''. Often, the emphasis of computer science is the development, analysis, and implementation of algorithms. In addition, a select group of computer scientists attempt to determine what is computable, because it turns out that there are some things we could never compute, even given an arbitrary amount of time, processing power, and memory. (Example: the halting problem, that is, deciding whether an arbitrary program will ever finish running. The traveling salesperson problem, discussed below, is computable in principle but impractically slow.)

Some computer scientists think of computer science as having three important intellectual precursors: mathematics, science, and engineering. The study of computing draws upon all three areas. Mathematics provides a formal basis for understanding what is computable, and how efficient algorithms are. Science provides a basis for analyzing particular computing systems. Engineering provides support for understanding how to build large systems (more on this later).

A number of computer scientists are also pushing for the field to consider psychology an important precursor for modern computing. Today, interaction between human and computer is particularly important, and psychology provides an important framework for understanding such issues. (Ergonomics, a subfield of engineering, also provides significant contributions.)

You might wonder how we apply the scientific/experimental method to computers, given that computers are created objects. Surprisingly, some aspects of computers (e.g., the Internet) are so large or designed in such a way that we can only understand them by experimentation.

What is Computable?

Let's consider a few issues relating to what is computable. Suppose someone gave us a computer that they said could correctly compute the ratio of any two ``computerized integers'' (which have a particular limit to their ranges). There are 4,611,686,014,132,420,609 possible pairs of integers. Even if we could do a billion tests each second, it would take us approximately 4,611,686,014 seconds to do all the testing. That's about 76,861,433 minutes, 1,281,023 hours, or roughly 146 years.
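The arithmetic above is easy to check. Assuming 32-bit signed integers (so 2^31 - 1 = 2,147,483,647 positive values, a common limit for ``computerized integers''), the count of pairs and the testing time work out as follows:

```python
# Checking the exhaustive-testing arithmetic, assuming 32-bit signed
# integers: there are 2**31 - 1 = 2,147,483,647 positive values.
pairs = (2**31 - 1) ** 2
print(pairs)                       # 4611686014132420609

seconds = pairs // 10**9           # at a billion tests per second
print(seconds)                     # 4611686014
print(seconds // 60)               # minutes: 76861433
print(seconds // 3600)             # hours:   1281023
print(seconds // (86400 * 365))    # years:   146
```

Exhaustively testing even this one simple operation is hopeless, which is why computer scientists rely on analysis and proof as well as testing.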

Another interesting problem is that of the traveling salesperson. The traveling salesperson has N cities to visit, which are connected by roads that may take different amounts of time to traverse in different directions. Suppose (s)he wants to find the shortest route between the cities. One possibility is to list every possible path through the cities. How many such paths are there? There are N choices for the first city. There are N-1 choices for the second city. There are N-2 choices for the third city. And so on and so forth. Basically, there are N! choices. N! grows surprisingly fast. 10! is 3,628,800. 20! is 2,432,902,008,176,640,000. 30! is 265,252,859,812,191,058,636,308,480,000,000. Surprisingly, no efficient (polynomial-time) algorithm for this problem is known.
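The brute-force approach described above, listing every possible path and keeping the cheapest, can be sketched directly. The three cities and road times here are made up for illustration; note that, as in the problem statement, a road may take different amounts of time in different directions:

```python
from itertools import permutations

# Hypothetical one-way travel times between three cities.
times = {("A", "B"): 2, ("B", "A"): 3,
         ("A", "C"): 4, ("C", "A"): 1,
         ("B", "C"): 1, ("C", "B"): 5}

def tour_time(route):
    """Total time to visit the cities in the given order."""
    return sum(times[(route[i], route[i + 1])] for i in range(len(route) - 1))

# Try all N! orderings and keep the cheapest.
best = min(permutations(["A", "B", "C"]), key=tour_time)
print(best, tour_time(best))   # ('B', 'C', 'A') 2
```

With three cities there are only 3! = 6 routes to try; with thirty cities, the same loop would need to examine all 30! routes, which is why this approach collapses so quickly.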


In modern computing, networking is often as important as computation. In general terms, networking is the sharing of information between a number of computers.

One of the most important networks is the Internet. The Internet is a particular network of networks that grew out of two important government-sponsored networks: ARPANET and NSFNET.


As you may know, one of the most powerful transformations of computing was wrought by the World-Wide Web. More than almost any other computing application, the Web has created an enormous growth in the number of computers, the number of networked computers, and the number of networked users. Ten years ago, there were fewer than three million networked users. Now there are over three hundred million networked users (numbers are approximate). The Web is responsible for much of that growth.

The Web is a hypertext system that takes advantage of computer technology and computer networking. What is hypertext? Hypertext is a mechanism for organizing information in which the information is segmented into small nodes or pages which are then connected by links. Unlike traditional texts, which are often intended to be read linearly (and are certainly represented that way), hypertexts are expected to support multiple sequences of reading (and, perhaps, to challenge the notion of ``sequence'').
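The node-and-link structure described above can be modeled as a simple graph. The page names and contents here are invented for illustration; the point is that a reader's path through a hypertext is one of many possible sequences, chosen link by link:

```python
# A minimal sketch of hypertext: pages (nodes) connected by links.
pages = {
    "home":     {"text": "Welcome.",           "links": ["history", "schedule"]},
    "history":  {"text": "Computing history.", "links": ["home", "schedule"]},
    "schedule": {"text": "Class meetings.",    "links": ["home"]},
}

def follow(start, choices):
    """One of many possible reading sequences: from each page,
    follow the link at the given index."""
    path = [start]
    for choice in choices:
        path.append(pages[path[-1]]["links"][choice])
    return path

print(follow("home", [0, 1]))   # ['home', 'history', 'schedule']
```

A different list of choices yields a different reading order, which is exactly the sense in which hypertext challenges the notion of a single ``sequence''.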

The Growth of Computing

Changing Notions of Computing

As computers have evolved and people have found new things that computers can do, there has been a changing notion of what computers are. Originally, computers were considered devices that did simple computations (but lots and lots of them). However, if you ask a child what a computer is, (s)he might say that computers are ``things that help me draw, communicate with my friends, and do my homework''. Similarly, much of the work that most of us do on the computer seems to have little to do with computation (e.g., writing this handout). Nonetheless, at the heart of everything we do on the computer is some form of symbolic manipulation, which is what many consider to be the heart of any type of automated computation.

One sense of the evolution (some of which happened in parallel) is:

An Abbreviated and Inaccurate Timeline

Some Social Implications

Selected Web Issues


My notes on the history of computing are based on a variety of sources and experiences, not all of which are things I've seen, used, or heard recently. Recent and remembered sources include:

My notes on social issues in computing are based on a variety of sources and experiences, not all of which are things I've seen, used, or heard recently. Recent and remembered sources include:


Source text written by Samuel A. Rebelsky.

Source text last modified Wed Apr 26 07:52:30 2000.

This page generated on Wed Apr 26 07:54:16 2000 by Siteweaver.
