[Written by Sam Rebelsky.]

As you may know, values in the computer are represented as a sequence of binary (0/1) digits. Unfortunately, when values are transmitted, some digits may be flipped randomly. How do computers detect and correct these errors? There are a number of ways. In general, we assume that at most one error is made in a string (that is, at most one digit is flipped from 0 to 1 or from 1 to 0).
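One common *detection* scheme (an illustrative example; the text above does not commit to a particular method) is a parity bit: an extra digit appended so that the total number of 1's is even. A single flip changes the parity and is therefore noticed, though not corrected. A minimal sketch:

```python
def add_parity(bits: str) -> str:
    """Append a bit so the total number of 1's is even."""
    return bits + str(bits.count("1") % 2)

def parity_ok(bits: str) -> bool:
    """True when the string (including its parity bit) has even parity."""
    return bits.count("1") % 2 == 0

word = add_parity("1011")      # "1011" has three 1's, so we append a 1
assert parity_ok(word)         # no errors: parity checks out
flipped = "00111"              # first bit flipped in transmission
assert not parity_ok(flipped)  # a single flip is detected (not corrected)
```

Note that parity only detects an odd number of flips; it cannot say *which* bit changed, which is why the exercises below ask for stronger sets of strings.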

Develop a set of two length-three strings of 0's and 1's such that, if zero or one changes are made to either string, you can still tell which string it was.
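One well-known pair that satisfies this exercise (an assumption on our part, not an answer given in the text) is {000, 111}: the two strings differ in all three positions, so a single flip leaves the received string strictly closer to its original. A nearest-codeword decoding sketch:

```python
CODEWORDS = ["000", "111"]  # differ in 3 positions: one flip can't cross over

def distance(a: str, b: str) -> int:
    """Number of positions where the two strings differ."""
    return sum(x != y for x, y in zip(a, b))

def decode(received: str) -> str:
    """Return the codeword nearest to the received string."""
    return min(CODEWORDS, key=lambda c: distance(c, received))

assert decode("000") == "000"  # zero errors
assert decode("010") == "000"  # one flip, corrected back
assert decode("110") == "111"  # one flip, corrected back
```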

To develop a coding in which we can not only detect but also
*correct* errors, we will want to create a set of N length-M
strings such that if zero or one changes are made to the string, then
you can still tell which string was changed. If we use the term
*distance* to indicate the number of bit flips necessary to
convert one string to another, what is the minimum distance between any
two elements in such a set?
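The distance notion can be made concrete with a small helper that computes the smallest pairwise distance in a candidate set (a sketch to check your own answers; the sets used below are only illustrations, not solutions to the exercises):

```python
from itertools import combinations

def hamming(a: str, b: str) -> int:
    """Bit flips needed to convert string a into string b."""
    return sum(x != y for x, y in zip(a, b))

def min_distance(codewords) -> int:
    """Smallest pairwise Hamming distance in the set."""
    return min(hamming(a, b) for a, b in combinations(codewords, 2))

# With minimum distance d, any pattern of fewer than d/2 flips is
# still uniquely closest to the original codeword.
assert min_distance(["000", "111"]) == 3
assert min_distance(["00", "01", "11"]) == 1
```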

Develop a set of eight strings of 0's and 1's such that if zero or one changes are made to any of the strings, you can still determine what the original string was.

Develop a set of eight strings of 0's and 1's such that if zero, one, or two changes are made to any of the strings, you can still determine what the original string was.
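A brute-force checker can test whether a candidate set answers either of these exercises. The rule it uses is the standard one: a set corrects up to e errors exactly when every pair of codewords is at least 2e + 1 apart. The repetition constructions below are only illustrations (the exercises likely admit shorter answers):

```python
from itertools import combinations

def hamming(a, b):
    """Bit flips needed to convert string a into string b."""
    return sum(x != y for x, y in zip(a, b))

def corrects(codewords, e):
    """True when any pattern of up to e flips still leaves the
    received string nearest to its original codeword, i.e. when the
    minimum pairwise distance is at least 2*e + 1."""
    return all(hamming(a, b) >= 2 * e + 1
               for a, b in combinations(codewords, 2))

base = ["000", "001", "010", "011", "100", "101", "110", "111"]

# Repeating every bit three times stretches the minimum distance
# from 1 to 3, enough to correct a single error but not two.
triples = ["".join(c * 3 for c in w) for w in base]
assert corrects(triples, 1) and not corrects(triples, 2)

# Repeating five times gives minimum distance 5, enough for the
# two-error version of the exercise.
quints = ["".join(c * 5 for c in w) for w in base]
assert corrects(quints, 2)
```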

Source text last modified Tue May 5 08:49:55 1998.

