Fundamentals of Computer Science II (CSC-152 99F)

Class 55: A History of Computer Science

Back to Comparing Lists. On to Wrapup: What is Computer Science? Revisited.

Held Thursday, December 9, 1999









If we are going to study the history of computing and related issues, we'll need to consider some basic terms. In particular, we'll need to talk about computers, digital computers, computing, computer science, networking, hypertext, and the Internet. We'll be covering hypertext and the Internet, in part, because they've become very important topics in computing.


Currently, we may think of a computer as an electronic device used to perform calculations and computations. These computations typically operate on some sort of symbolic data. The symbols are often numbers, but may also be letters, glyphs, bits on a screen, and many other things. In the not-so-distant past, computers were also built from a mix of electronic and mechanical parts, and even from purely mechanical parts.

As recently as the 1940s, ``computer'' was a profession rather than a device. That is, a computer was someone whose profession was computing values (again, numeric or symbolic). As you might guess, that use has fallen into disfavor, but it persisted for over one hundred years.

In general, we divide computational devices into two kinds. Special-purpose computers automate some selected computation, but are usually able to perform only that computation; such computers have been developed to support simple mathematics, the construction of ballistics tables, encoding, and many other processes. General-purpose computers, on the other hand, are designed so that they can be configured or programmed to perform any reasonable process.

In the standard model of computers, a computer has five parts: an input device, an output device, a memory, an arithmetic unit, and a control unit.

Here's one picture of how they all relate (other people may draw things differently).

  +-------+        +--------+        +--------+       +--------+
  | Input | -----> | Memory | <----> | Arith. | ----> | Output |
  +-------+   ^    +--------+   ^    +--------+   ^   +--------+
              |           ^     |         ^       |
              |           |     |         |       |
              |           v     |         |       |
              |          +---------+      |       |
              +----------| Control |------+-------+
                         |  Unit   |
                         +---------+

As the picture suggests, the control unit controls ``everything''. It reads instructions from memory and, according to those instructions, tells the memory to load information from input; tells the arithmetic unit to read information from memory, perform computations, and store or output the results; and tells the memory to provide further instructions.
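To make this cycle concrete, here is a minimal sketch in Python of a hypothetical machine with these five parts. The instruction set is invented for illustration and does not correspond to any real architecture:

```python
# A toy version of the five-part model (hypothetical instruction set).
# Memory holds both instructions and data; the control unit repeatedly
# fetches an instruction and dispatches it.

def run(memory, inputs):
    acc = 0          # the arithmetic unit's single register
    outputs = []     # the output device
    pc = 0           # program counter, kept by the control unit
    while True:
        op, arg = memory[pc]        # fetch the next instruction from memory
        pc += 1
        if op == "READ":            # input -> memory
            memory[arg] = inputs.pop(0)
        elif op == "LOAD":          # memory -> arithmetic unit
            acc = memory[arg]
        elif op == "ADD":           # perform a computation
            acc += memory[arg]
        elif op == "STORE":         # arithmetic unit -> memory
            memory[arg] = acc
        elif op == "WRITE":         # memory -> output
            outputs.append(memory[arg])
        elif op == "HALT":
            return outputs

# A program that reads two inputs, adds them, and outputs the sum.
program = [("READ", 10), ("READ", 11), ("LOAD", 10),
           ("ADD", 11), ("STORE", 12), ("WRITE", 12), ("HALT", 0)]
memory = program + [0] * 6   # instructions followed by data cells
print(run(memory, [2, 3]))   # [5]
```

Note how the same memory holds both the program and its data; storing instructions in memory is exactly what makes a computer general-purpose rather than special-purpose.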

Digital, Binary, and Analog

Modern computers are often referred to as digital computers, indicating that they work with discrete digits rather than continuous values. This is in contrast to analog devices, which work on a continuous spectrum. For example, an LP record stores a continuous waveform, while a CD stores a sequence of discrete values used to approximate that waveform.

Most modern computers are also binary. That is, they are based on a system in which there are only two values, typically called zero and one.
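As a small illustration, here is one way to find the binary digits of a number, by repeatedly dividing by two and collecting the remainders (a sketch, assuming nothing beyond grade-school arithmetic):

```python
# Convert a nonnegative integer to its binary digits by repeated
# division by two; each remainder (0 or 1) is the next lower-order bit.
def to_binary(n):
    if n == 0:
        return "0"
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits   # prepend the next bit
        n //= 2
    return bits

print(to_binary(13))   # 1101, that is, 8 + 4 + 0 + 1
```

Inside the machine, every kind of data--numbers, letters, pictures--is ultimately encoded as such sequences of zeros and ones.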

Computer Science

We've talked about what ``computer science'' is, but it may be helpful to reconsider the question.

Although there are many definitions of computer science, it is often useful to think of computer science as ``the study of computers, computation, and computability''. Often, the emphasis of computer science is the development, analysis, and implementation of algorithms, processes by which computation is done. In addition, a select group of computer scientists attempt to determine what is computable, because it turns out that there are some things we could never compute, even given an arbitrary amount of time, processing power, and memory (example: the halting problem, deciding whether an arbitrary program will ever finish). Other problems are computable in principle but appear hopelessly expensive in practice (example: the traveling salesperson problem).

Some computer scientists think of computer science as having three important intellectual precursors: mathematics, science, and engineering. The study of computing draws upon all three areas. Mathematics provides a formal basis for understanding what is computable, and how efficient algorithms are. Science provides a basis for analyzing particular computing systems. Engineering provides support for understanding how to build large systems (more on this later).

A number of computer scientists are also pushing for the field to consider psychology an important precursor for modern computing. Today, interaction between human and computer is particularly important, and psychology provides an important framework for understanding such issues. (Ergonomics, a subfield of engineering, also provides significant contributions.)


In modern computing, networking is often as important as computation. In general terms, networking is the sharing of information among a number of computers.

One of the most important networks is the Internet. The Internet is a particular network of networks that grew out of two important government-sponsored networks: ARPANET and NSFNET.


As you may know, one of the most powerful transformations of computing was wrought by the World-Wide Web. More than almost any other computing application, the Web has created an enormous growth in the number of computers, the number of networked computers, and the number of networked users. Ten years ago, there were fewer than three million networked users. Now there are over three hundred million (both numbers are approximate). The Web is responsible for much of that growth.

The Web is a hypertext system that takes advantage of computer technology and computer networking. What is hypertext? Hypertext is a mechanism for organizing information in which the information is segmented into small nodes or pages which are then connected by links. Unlike traditional texts, which are often intended to be read linearly (and are certainly represented that way), hypertexts are expected to support multiple sequences of reading (and, perhaps, to challenge the notion of ``sequence'').
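A toy sketch of this structure: pages as nodes, and links as a mapping from each page to the pages it references (the page names are invented for illustration):

```python
# A toy hypertext: each page (node) maps to the pages it links to.
pages = {
    "home":     ["history", "glossary"],
    "history":  ["glossary", "home"],
    "glossary": ["home"],
}

def reachable(start):
    """All pages a reader could reach from `start` by following links."""
    seen, todo = set(), [start]
    while todo:
        page = todo.pop()
        if page not in seen:
            seen.add(page)
            todo.extend(pages.get(page, []))
    return seen

print(sorted(reachable("glossary")))   # ['glossary', 'history', 'home']
```

Because the links form a graph rather than a line, there is no single ``correct'' order in which to read the pages, which is precisely what distinguishes hypertext from traditional linear text.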

The Growth of Computing

Computing has grown faster than almost any technology to date. For example, the ENIAC cost over $1,000,000 in 1950 (approximately). These days, you can buy as much computing power for about $10 (again, approximately). In some sense, that's equivalent to being able to buy a jet airplane for the price of a child's tricycle (even more approximately).

Few of the original predictions of the impact of computing accommodated the implications of this enormous growth. It's not clear that many of us can really conceive of the growth, or the possibilities such growth implies. A quick scan of the many early predictions about computing and related issues shows claims that we now consider gross misconceptions, such as: ``there will never be a need for more than ten computers nationwide''; ``no one ever needs more than two copies of a document''; ``computers can only process numbers''; ``only specialists need computers''; ``only specialists can use computers''.

The growth has also led many computer programmers to be wasteful. In 1984, a Macintosh had 128 kilobytes of memory and used 400K disks. The ``smallest'' Macintosh you can buy today has about 16 megabytes of memory (128 times as much) and comes with a four-gigabyte hard drive (about 10,000 times as much as a 400K disk). Yet a lot was squeezed into that small space: a 400K disk could hold the System, the Finder, a drawing or writing program, and even a few user files. Have we gained correspondingly more functionality? One hundred times as much?
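The ratios quoted above are easy to check (a quick sketch, using 1 megabyte = 1024 kilobytes and 1 gigabyte = 1024 megabytes):

```python
# Check the growth ratios quoted above, with everything in kilobytes.
memory_1984 = 128                  # 128 kilobytes
memory_now  = 16 * 1024            # 16 megabytes
disk_1984   = 400                  # a 400K disk
disk_now    = 4 * 1024 * 1024      # 4 gigabytes

print(memory_now // memory_1984)   # 128 times as much memory
print(disk_now / disk_1984)        # roughly 10,000 times as much disk
```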

Changing Notions of Computing

As computers have evolved and people have found new things that computers can do, there has been a changing notion of what computers are. Originally, computers were considered devices that did simple computations (but lots and lots of them). However, if you ask a child what a computer is, (s)he might say that computers are ``things that help me draw, communicate with my friends, and do my homework''. Similarly, much of the work that most of us do on the computer seems to have little to do with computation (e.g., writing this handout). Nonetheless, at the heart of everything we do on the computer is some form of symbolic manipulation, which many consider the heart of any type of automated computation.

One sense of the evolution (some of which happened in parallel) is:

An Abbreviated and Inaccurate Timeline

Some Social Implications

Selected Web Issues



My notes on the history of computing are based on a variety of sources and experiences, not all of which are things I've seen, used, or heard recently. Recent and remembered sources include:

My notes on social issues in computing are based on a variety of sources and experiences, not all of which are things I've seen, used, or heard recently. Recent and remembered sources include:


Tuesday, 10 August 1999

Wednesday, 8 December 1999


Disclaimer Often, these pages were created "on the fly" with little, if any, proofreading. Any or all of the information on the pages may be incorrect. Please contact me if you notice errors.


Source text last modified Wed Dec 8 16:41:46 1999.

This page generated on Wed Dec 8 16:50:44 1999 by Siteweaver.
