Held: Friday, April 17, 1998
If we are going to study the history of computing and related issues, we'll need to consider some basic terms. In particular, we'll need to talk about computers, computer science, networking, and hypertext. We'll be covering hypertext, in part, because it's become a very important topic in computing.
Today, we may think of a computer as an electronic device used to perform calculations and computations. These computations are typically based on some sort of symbolic data. Those symbols are often numbers, but may also be letters, glyphs, bits on the screen, and many other things. In the not-so-distant past, computers were also built from a mix of electronic and mechanical parts, and even from purely mechanical parts.
As recently as the 1940s, "computer" was a profession rather than a device. That is, a computer was someone whose profession was computing values (again, based on numbers or symbolic values). As you might guess, that use has fallen into disfavor, but it persisted for over one hundred years.
In general, we divide computational devices into two kinds: special-purpose computers automate some selected computation, but are usually able to perform only that computation. Special purpose computers have been developed to support simple mathematics, the construction of ballistics tables, encoding, and many other processes. On the other hand, general-purpose computers are designed in such a way that they can be configured or programmed to perform any reasonable process.
In the standard model of computers, a computer has five parts:

  input, which brings information into the machine;
  memory, which stores both data and instructions;
  an arithmetic unit, which performs computations;
  output, which reports results; and
  a control unit, which directs the other four.
Here's one picture of how they all relate (other people may draw things differently).
    +-------+        +--------+        +--------+        +--------+
    | Input | -----> | Memory | <----> | Arith. | -----> | Output |
    +-------+        +--------+        +--------+        +--------+
        ^                ^                 ^                 ^
        |                |                 |                 |
        |                v                 |                 |
        |           +---------+           |                 |
        +-----------| Control |-----------+-----------------+
                    |  Unit   |
                    +---------+
As the picture suggests, the control unit controls "everything". It reads instructions from the memory and, according to those instructions, tells memory to load information from input; the arithmetic unit to read information from memory, perform computations, and store or output the results; and the memory unit to provide further instructions.
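The fetch-and-execute cycle described above can be sketched as a short program. This is only a toy illustration: the instruction set (LOAD, ADD, OUT, HALT) is invented for this sketch and does not correspond to any real machine.

```python
# A toy sketch of the control unit's cycle: fetch an instruction,
# then direct the input, memory, arithmetic, and output units.
# The instruction set here is hypothetical, invented for illustration.

def run(program, inputs):
    memory = {}    # named memory cells
    output = []
    for op, *args in program:        # control unit fetches each instruction
        if op == "LOAD":             # move a value from input into memory
            memory[args[0]] = inputs.pop(0)
        elif op == "ADD":            # arithmetic unit combines two cells
            a, b, dest = args
            memory[dest] = memory[a] + memory[b]
        elif op == "OUT":            # send a memory cell to output
            output.append(memory[args[0]])
        elif op == "HALT":           # stop the cycle
            break
    return output

program = [("LOAD", "x"), ("LOAD", "y"),
           ("ADD", "x", "y", "z"),
           ("OUT", "z"), ("HALT",)]
print(run(program, [2, 3]))  # [5]
```

Note that the program itself sits in a list, just as real programs sit in memory alongside the data they manipulate.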
Modern computers are often referred to as digital computers. This is to indicate that they work with discrete digits rather than continuous values. This is in contrast to analog devices, which work on a more continuous spectrum. For example, an LP record produces a continuous waveform, while a CD stores a sequence of discrete values used to approximate that waveform.
Most modern computers are also binary. That is, they are based on a system in which there are only two values, typically called zero and one.
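As a small illustration of that two-valued system (in Python, chosen here only for brevity), the number thirteen is stored as the binary digits 1101, and even text ultimately reduces to such zeros and ones:

```python
# Decimal 13 in base two: 8 + 4 + 0 + 1, written 1101.
print(bin(13))                 # 0b1101

# Text is stored as numbers, and hence as bits:
# the character "A" is the number 65, or 1000001 in binary.
print(ord("A"), bin(ord("A")))  # 65 0b1000001
```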
Although there are many definitions of computer science, it is often useful to think of computer science as the study of computers, computation, and computability. Often, the emphasis of computer science is the development, analysis, and implementation of algorithms, processes by which computation is done. In addition, a select group of computer scientists attempt to determine what is computable (it turns out that there are some things we could never compute, even given an arbitrary amount of time, processing power, and memory).
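To make "algorithm" concrete, here is one of the oldest known examples, Euclid's method for the greatest common divisor, sketched in Python (any language would do):

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    until the remainder is zero; the surviving value is the GCD."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 36))  # 12
```

Notice that the description is a precise, step-by-step process; that precision is what lets a machine (or a human "computer") carry it out.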
In modern computing, networking is often as important as computation. In general terms, networking is the sharing of information between a number of computers.
As you may know, one of the most powerful transformations of computing was wrought by the World-Wide Web. More than almost any other computing application, the web has driven enormous growth in the number of computers, the number of networked computers, and the number of networked users. Ten years ago, there were fewer than three million networked users. Now there are over three hundred million networked users (numbers are approximate). The web is responsible for much of that growth.
The web is a hypertext system that takes advantage of computer technology and computer networking. What is hypertext? Hypertext is a mechanism for organizing information in which the information is segmented into small nodes or pages which are then connected by links. Unlike traditional texts, which are often intended to be read linearly (and are certainly represented that way), hypertexts are expected to support multiple sequences of reading (and, perhaps, to challenge the notion of "sequence").
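That node-and-link structure can be sketched in a few lines of Python. The page names below are invented for illustration; the point is only that a hypertext is a set of nodes plus a mapping from each node to the nodes it links to, with no single built-in reading order:

```python
# A tiny hypertext: each node (page) maps to the pages it links to.
# Page names are hypothetical.
links = {
    "home":     ["history", "glossary"],
    "history":  ["glossary", "home"],
    "glossary": ["home"],
}

def reachable(start):
    """All pages a reader could reach by following links from `start`."""
    seen, frontier = set(), [start]
    while frontier:
        page = frontier.pop()
        if page not in seen:
            seen.add(page)
            frontier.extend(links.get(page, []))
    return seen

print(sorted(reachable("home")))  # ['glossary', 'history', 'home']
```

A reader might visit these pages in any of several orders; that freedom is exactly what distinguishes hypertext from a linear text.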
As computers have evolved and people have found new things that computers can do, the notion of what computers are has changed. Originally, computers were considered devices that did simple computations. However, if you ask a child what a computer is, (s)he might say that computers are "things that help me draw, communicate with my friends, and do my homework." Similarly, much of the work that most of us do on the computer seems to have little to do with computation (e.g., writing this handout). Nonetheless, at the heart of everything we do on the computer is some form of symbolic manipulation, which is what many consider to be the heart of any type of automated computation.
One sense of the evolution (some of which happened in parallel) is:
My notes on the history of computing are based on a variety of sources and experiences, not all of which are things I've seen, used, or heard recently. Recent and remembered sources include:
Disclaimer Often, these pages were created "on the fly" with little, if any, proofreading. Any or all of the information on the pages may be incorrect. Please contact me if you notice errors.
Source text last modified Tue Jan 12 11:52:29 1999.
This page generated on Mon Jan 25 09:49:52 1999 by SiteWeaver.
Contact our webmaster at email@example.com