Power and Pedagogy: Transforming Education through
Information Technology
Chapter Two - The Computer as a System
Computers are like wheeled vehicles: they come in many shapes
and sizes, each serving a different purpose. Moreover, the computer
has yet to mature. It is an emerging technology. Hence, to determine
the potential of computers in education, we need to understand what
the computer is. To start, consider two distinctions, one between
transitional and mature technology and the other between artifacts
and systems.
Complicated technologies take a long time to develop their potentialities.
They also take capital. Developers cannot perfect their technology
in endless years of laboratory work and then deliver it, refined
and complete, to a grateful public. To underwrite the costs of perfecting
a technology, developers must bring it to market long before it
is mature. Profits from transitional implementations sustain the
development work, providing resources and disclosing unexpected
opportunities for use. Computers have exemplified this drawn-out
development: computers have evolved through several distinct, quite
profitable incarnations, yet neither the time-sharing mainframe nor the stand-alone micro indicates fully what the computer will be when the technology matures.
In common speech, we generally do not distinguish between typical
technological products and the technical systems that make them
usable. For instance, "television" can refer to the TV set, that
ubiquitous appliance, or to the whole industry -- the networks,
their broadcasting installations, the news teams and production
studios, advertisers, and all. Likewise, "automobile" can refer
to the car in my driveway or to the vast infrastructure -- the manufacturers
here and abroad, with their suppliers, advertisers, and dealers;
all the roads and bridges and the builders constructing and maintaining
them; the service stations and oil producers, refiners, and marketers;
and the myriad of designers, workers, police, and service people
who make the system go. The car is both a separate artifact and
a complex system.
Currently, "computer" usually calls to mind the artifact, the
stand-alone personal computer, like the one on which I am now writing.
Most of us do not think much about the complex system of which my
PC is a transitory part. Computers as a system are important, however.
The significance of computers for education will not be well understood
by thinking simply of a lot of separate machines sprinkled through
existing schools and colleges. Computers are an emergent infrastructure,
a system, fully as complicated as that of the car. We need to think
about what that system is and how that infrastructure will work.
Computers as a system can be a powerful agent of change in education.
To grasp the computer as a system, particularly as it matures,
let us concentrate on neither hardware nor software, but on an
underlying process, the digitization of information. The computer,
as a system, introduces a new way of representing information in
our culture, a new way of encoding ideas. When complete, it will
constitute a deep transition in our history, one equal in importance
to the introduction of printing, quite possibly to the development
of writing itself. Essentially, the computer as a system will envelop
all previous modes of representing information, preserving and empowering
them by integrating once separate domains of communication into
a unified, "multimedia" system.
Information in Matter and Energy
Think of the ways we commonly represent information -- a scribbled
note, a neatly printed page, a reflective sign, a painted picture,
a ruler uniformly marked, a measuring cup, or the symbolic forms
of church or court. With these, people have encoded ideas and information
in material objects, in the ink upon the page or the shape of the
sculpted stone. Put most generally, through traditional ways of
encoding ideas, people expend energy to transform matter in ways
that they will find meaningful, making enduring marks and forms
in which ideas inhere. People locate the information in the material
object. When they do this according to a defined convention and
art, the tangible, palpable results are our major forms of traditional
communication -- documents, sculptures, pictures, monuments.
Starting with the telegraph and developing through the telephone,
radio, television, and computer, people have begun to put their
information into controlled pulses of energy itself. The material
object, say the telephone, becomes a kind of transparent medium
for an infinity of possible conversations encoded in different electrical
waves that the phone will generate, transmit, and receive. Increasingly
people are representing information in controlled states of energy,
not in matter, as they did traditionally. The new practice requires
various material tools, with which people apprehend on their human
scale the information located in energy, but the information is
not in the material, but in the energy. Thus the TV translates the information-bearing energy into a material form that I can watch.
The picture hanging on my wall is what it is because the information
that it contains is in the material that makes it up. My TV, in
contrast, can receive an infinity of images because the information
it displays is not in the material of the set, but in the electromagnetic
waves that it picks up and decodes for me.
This practice of locating information in energy states is not
entirely new in our culture. One can take sound to be a form of
energy, not a state of matter, and hold that through speech and
song people have long encoded information in energy, using the ear
as the naturally developed, material receiving apparatus. Other
senses, too, especially sight, kinesthesia, and the ability to feel
hot and cold, derive much information from energy states and forms
of force. Some traditional tools of communication and control also
provided readings of the information in energy states. The clock
measures time by controlling the release of energy in uniform units.
The compass provides a most informative reading of the orientation
at any location of the earth's magnetic field. The governor on a
steam engine directly translates a change in its energy state into
a controlling action. Like the TV -- but unlike the painting on
the wall -- clocks, compasses, and governors all inform their users
through their changing readings, not through their static states.
More strictly speaking, these instruments display information that
is fortuitously located in states of energy, rather than encoding
it in those states. Traditionally, only the voice and musical instruments
went beyond display to encode.
Up until very recently, information encoded in energy has been,
however useful and dynamic, troublesomely transient. Speech is the
paradigmatic instance. It is powerful and nuanced, yet fleeting
and unstable. For a time memory preserves its residue, and writing
fixes a stiff representation of it in stable matter. But much is
lost. This transience also characterizes many modern media that
encode information in wave forms, substituting electricity for sound
as the energy medium. Thus telephone, radio, and television have
enabled people to encode sound and gesture in electromagnetic waves,
amplifying these vastly, without making them much more enduring.
Recording signals on tape and other media makes such material reproducible,
and thus enduring. Yet this has been a recent, ancillary development.
So far, the power of electromagnetic media has resulted from the
breadth of their transient reach, not from the ease with which productions
can be reproduced.
This transience of electromagnetically encoded information fundamentally
affected the usefulness of broadcast media for education. Entertainment
results from encountering cultural experiences for their immediate,
present value -- they amuse, inspire, absorb, purge, distract, or
release us now. Education involves us with cultural works of enduring
importance -- we acquire skills, ideas, beliefs, knowledge, information
that will empower us over time in the conduct of life. The things
at stake in education are the elements of the culture that are on-going,
lasting resources. Consequently, the educationally important media
are the ones that represent and make such enduring ideas and skills
available to people. For the most part, these have been the media
that locate information in material objects, particularly in printed
texts and pictures.
Commentators complain that educators have done little with the
major communications developments of the twentieth century. Despite
high hopes, radio and television have not become important educational
resources, and some therefore infer that education is resistant to
technological change. This inference is wrong. The photograph, which
extends the pictorial capacity to locate information on film and
paper, has been seamlessly incorporated into education. It improves
the capacity to work with lasting ideas and information, and educators
have quickly adopted photographs in the processes of research and
instruction. As conservative a field as art history took without
hesitation to 35mm color slides because they served the intellectual
needs of the subject. So too, recorded music has become a natural
part of music education, far more so than have broadcast performances,
for the recordings are stable, enduring resources that different
students at different times can study, each with unique purposes
in mind. Recordings suit the needs of education because they are
stable, easily stored and retrieved, while broadcasts suit the needs
of entertainment, absorbing us in their immediate presence.
Educators cannot resist new technologies, provided those technologies
have characteristics suitable to educational purposes, foremost among them permanence in time. Stop for a moment to consider
film, which encodes information in stable, material form yet has
not come into robust use in education. Is it an exception to the
rule here propounded? No. With respect to dissemination and retrieval,
film is not as stable as it might seem. Film is bulky, hard to store,
costly to project, and easily damaged. It can be best disseminated
in a quasi-broadcast fashion with prints distributed to numerous
theaters more or less at the same time, with the production playing
as long as it can command a full audience and then disappearing
into an archive, from which films are not easy to retrieve. These
distribution constraints have made movies, until very recently,
far more effective as media of entertainment than of education.
Computers as a system will change that, and much more. Broadly
speaking, the communication innovations since the mid- nineteenth
century have created a family of technologies for encoding diverse
forms of information in energy. The computer is the most recent
in this series of innovations, and it is likely, historically, to
incorporate all those leading up to it into itself. What seem to
us to be separate industries with separate technologies will become
branches of a single comprehensive industry and technology, the
computer as a system.
One can now see large corporations jockeying to capitalize on
this consolidation of technologies. For instance, the major Japanese
electronics firms seem to be calculating that they can best shape
this process by combining business communication with the entertainment
industries, buying up major entertainment conglomerates while designing
ever-more computing power into home entertainment devices. The emerging
system, however, may in fact be far more robust if built on a combination
of telecommunications and education. Digital technologies enhance
the staying power of information in time, expanding its educative
power relative to its currency as entertainment. We will be developing
the thesis that the computer is rapidly incorporating the modern
media in one comprehensive system, a system of knowledge and education.
The Analog and the Digital
We distinguished between technologies that locate information in
matter, for instance sculpting and printing, and those that locate
it in states of energy, for instance radio and computers. Among
the latter, we need to make important further distinctions, which
have to do with the techniques people use to encode information
in energy. To grasp the cultural import of the computer as a system
incorporating all the media of communication, to appreciate its
potential power, we need to reflect on the way that it encodes information
in energy, seeing how that differs from other techniques.
Analog coding serves effectively for some specialized computational
purposes, but almost all computers, from tiny palmtops to huge
supercomputers, work with information stored in digital code. Such
digital code differs profoundly from the analog codes used typically
in radio and television. In the paragraphs that follow, we will
reflect on how digital code differs from analog and then consider
five matters that determine the value of information for human activity
-- production and reproduction, storage, transmission, selective
retrieval, and intelligent processing. Through these considerations,
we will form a sense of why the computer, as it matures, will be
a very significant step in our history.
Note at the outset that we could apply this distinction between
analog and digital coding to the media that use matter to carry
information. For instance, painting and sculpture are highly analog
media, whereas alphabetic writing is interestingly ambiguous. It
is analog insofar as it is phonetic and digital insofar as it is
a prescriptive set of legible conventions. But it would take us
afield to pursue these distinctions with respect to media that locate
information in matter, for our concerns here are primarily with
the media that carry information in energy. How does the digital
coding of information in energy differ from the analog?
Analog systems encode information in energy by using the properties
of continuous waves so that each successive change in the amplitude
of the wave will be analogous to a change in sound or appearance
in the human world. Let's construct an example. Take a dishtowel. Holding a corner of one end in each hand, flap it rhythmically
in front of you, making it undulate up and down. It is not hard
to control the beat of the flapping, making each flap identical
in duration, perhaps slow and long or quick and short. That beat
is like the frequency of an analog signal. Usually it does not carry
the information, but when we are surrounded by many different signals,
each with a different frequency, it allows us to find the one signal
we want. Observe the flapping towel, however. From beat to beat,
it will have all sorts of variations, curving this way then that,
depending on subtle changes in the orientation of your hands to
each other and the tension they put on the cloth. If you could control
the flapping skillfully enough, you could make each change in the
way the towel undulated match some other, analogous change in a
completely different wave, say the ever changing sounds of a symphony
or rock concert. At that point, you would have encoded the concert
in the flapping towel, rather like the way radio encodes a concert in the amplitude or frequency of an electromagnetic wave.
Like the sound itself, the flapping is transient. Analog encoding
depends on making significant changes in the energy state of the
wave, a most unstable phenomenon. Digital encoding is much more stable.
Put down the towel and flip the light switch on the wall. The switch
has gone from "on" to "off;" it was stable in its former state and
is stable in its latter. The light switch is a digital device, although
one that does not accomplish much in the way of communication and
control. To see simple signals controlling more complicated processes
occurring around you, look at another digital switch, the stoplight
at the corner. It has two basic states, red or green -- amber is
not really a state, but a cue that a change of state is about to
happen. There are two unambiguous states, green-go, red-stop. These
are easily standardized, stable, and remarkably effective in controlling
complex flows of matter and energy. The stoplight is very much like
the small charge in a transistor in that one state allows traffic
to move and the other calls it to a halt.
Our basic red-green stoplight is a binary digital system -- binary
because there are two alternatives and digital because those consist
of discrete, unambiguously different states. The typical electric
stove, with options on each burner running from warm to high, has
a quinary digital control on its coils -- quinary because there
are five alternatives and digital because each of these is distinct
from the others. Thus, digital systems can in principle have different
numbers of basic alternatives, but computers almost always use a
binary system, building many subtle variations from a multiplicity
of either-ors.
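To make this concrete, here is a minimal sketch in Python; the stove and its five settings are purely illustrative, but they show how a handful of either-ors can stand in for a richer set of alternatives: three binary digits suffice for five states, and in general n bits distinguish 2^n alternatives.

```python
# A minimal sketch: composing binary either-ors to represent more alternatives.
# The stove settings are illustrative; any finite set of states would do.

STOVE_SETTINGS = ["off", "warm", "low", "medium", "high"]  # five (quinary) states

def bits_needed(n_states: int) -> int:
    """Smallest number of binary digits that can distinguish n_states alternatives."""
    bits = 0
    while 2 ** bits < n_states:
        bits += 1
    return bits

def encode(setting: str) -> str:
    """Represent one quinary state as a string of either-ors (bits)."""
    index = STOVE_SETTINGS.index(setting)
    return format(index, f"0{bits_needed(len(STOVE_SETTINGS))}b")

def decode(bits: str) -> str:
    return STOVE_SETTINGS[int(bits, 2)]

print(encode("medium"))   # '011' -- three either-ors suffice for five states
print(decode("100"))      # 'high'
```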
A digital state is what it is, discrete, unambiguous, disjunctive.
Digital code does not capture changes similar to other changes; it presents a set of values that are what they are. Digital coding
follows a principle akin to encrypting -- there is only one message,
which, when encrypted, is put in a way that makes it look indecipherable.
With the appropriate key, however, the cryptographer finds the message,
not something like the original, but the original itself. For instance,
the apparatus for recording music digitally measures the sound wave at successive instants and records the numeric value of its amplitude. These are samples of the actual sound, not likenesses to it. Digital
coding samples a phenomenon, registers the sample, and then reproduces
the phenomenon from the sample. If the sampling technique and the
technique of reproducing from the sample are very good, it can be
extremely hard to distinguish the original from the reproduction.
What is coded is an exact value, precisely what it is and nothing
else.
What is encoded digitally, therefore, is actually very different
from what is encoded in an analog system. The digital system encodes
a sample of the thing whereas the analog system encodes an analogy
to it. Again, let us construct an example. Consider a full wheel
of cheddar cheese. Describing the cheese by analogy can be difficult.
I might say it is about the size and shape of an old-time hatbox
and that it is heavy, as if the hatbox were filled with water. Its
color is like custard and it tastes -- this is the important, difficult
part -- somewhat like grapefruit, although its texture in the mouth
is very different, a bit like a firm fudge that crumbles and then
softens into a paste as one chews it. Describing the cheddar by
a sample of it is much simpler. I cut you a little piece, perhaps
several from different places in the wheel. The sample is the cheese
and you can sniff it or taste it directly from the sample.
When we digitally code the sample, we register what the sample
is on an appropriate scale and we code that value, not some approximate
likeness to it. Consider recording a singer's voice digitally. At numerous instants the recording samples the exact amplitude of the voice's sound wave, registering in a sequence of precise values what, at each sampling instant, that amplitude was. The digital recording carries no information about the voice during the intervals between the sampling instants, but it carries its exact value at those instants. If the sampling rate is sufficiently high,
the sound of the reproduced voice will be essentially identical
to the original. Digital code allows the playback to reconstruct
the voice. Thus, digital coding registers sampled values, not approximate
similarities. That is its first point of difference with analog
coding.
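A minimal sketch may make the sampling idea concrete. In the Python below, an arbitrary 440 Hz tone stands in for the voice and the sampling rate is an illustrative choice, not any particular recording standard; the sketch registers the wave's exact value at each sampling instant and then estimates values between the instants at playback.

```python
import math

# A minimal sketch of the sampling idea described above: at regular instants
# we register the exact value of a wave, and playback reconstructs the sound
# from those registered values alone. The tone and the rate are illustrative.

SAMPLE_RATE = 8000      # samples per second
TONE_FREQUENCY = 440.0  # an "A" note, standing in for the singer's voice

def sample_tone(duration_seconds: float) -> list[float]:
    """Record the exact amplitude of the wave at each sampling instant."""
    n_samples = int(duration_seconds * SAMPLE_RATE)
    return [math.sin(2 * math.pi * TONE_FREQUENCY * (i / SAMPLE_RATE))
            for i in range(n_samples)]

def value_between_samples(samples: list[float], t: float) -> float:
    """Playback: estimate the wave between sampling instants by interpolation."""
    position = t * SAMPLE_RATE
    i = int(position)
    fraction = position - i
    return samples[i] + fraction * (samples[i + 1] - samples[i])

recording = sample_tone(0.01)                 # a hundredth of a second of "voice"
print(len(recording), "exact values registered")
print(round(value_between_samples(recording, 0.00125), 3))
```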
Second, digital code differs from analog because it resists degradation
far more effectively. Electrical systems, like everything else,
are subject to entropy. Every circuit has in it random fluctuations.
Computers are not wondrously free of such static. Minor fluxes are
a big problem in analog coding because the locus of information
is in tiny incremental differences in the amplitude of waves, which
the random fluxes in circuits can easily affect. In absolute terms, digital systems are equally subject to noise, but the locus of information is in the basic energy state, not in small changes of that state. When the significant point is simply whether a circuit is on or off, there is a huge threshold before an intrusive fluctuation becomes significant, making a circuit that is "on" appear to be "off" or vice versa.
To construct an example, consider a binary test for whether or
not it is raining: looking out my apartment window to see if the
sidewalk is wet or dry. This test is subject to noise -- perhaps
in this case we should call it "splash." During the summer, window air-conditioners in the adjacent building condense water on hot, humid days, splotching the sidewalk. Also, on the road across the way there is a low spot where water collects from a leaky hydrant, and passing cars occasionally splash it onto the sidewalk. Like
the noise in the electrical system, extraneous wetness sometimes
partially covers a dry pavement. This rarely confuses my binary
test, however, because I establish a threshold -- it is raining
if the sidewalk is fully, uniformly wet and it is not raining if
the sidewalk is dry, or partially splotched from random sources
of water. Given the substantial threshold possible in a binary system,
very, very rarely will electrical noise cause the misreading of
a bit of information.
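The same threshold logic can be sketched in a few lines of Python. The voltage levels and the amount of noise here are invented for illustration, but they show why a disturbance must be very large before an "on" reads as an "off."

```python
import random

# A minimal sketch of the thresholds discussed above, with made-up numbers:
# a bit is sent as 0 volts ("off") or 5 volts ("on"), noise jiggles the level,
# and the receiver reads anything above the midpoint as "on". Like the
# splotched sidewalk, small random disturbances rarely cross the threshold.

ON_LEVEL, OFF_LEVEL = 5.0, 0.0
THRESHOLD = 2.5                      # halfway between the two stable states
NOISE_SPREAD = 0.5                   # illustrative amount of random fluctuation

def transmit(bit: int) -> float:
    """Send a bit as an energy level, disturbed by random noise."""
    level = ON_LEVEL if bit else OFF_LEVEL
    return level + random.gauss(0, NOISE_SPREAD)

def receive(level: float) -> int:
    """Read the bit back: only noise larger than the threshold causes an error."""
    return 1 if level > THRESHOLD else 0

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = [receive(transmit(bit)) for bit in message]
print(received == message)           # almost always True with this much headroom
```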
In sum, in comparison to analog coding, digital code registers
values that are attributes of the thing being coded, not likenesses
to it, and those values, once coded, will be remarkably resistant
to error or degradation. These characteristics make digital code
immensely useful in processes of communication and control.
Digitization and Communication
Digital code records samples of phenomena, not analogies to them,
and it does so by techniques that are remarkably stable and accurate.
By themselves, these characteristics may not seem so extraordinary.
But put in context, the context of human use, they have very significant
effects on the computer as a communication system. Whatever the
medium, in order to communicate people need to be able to produce
and reproduce information, to store it, to transmit it, to select
among it, and to process it intelligently in the course of action.
These five areas determine the relative historic value of different
communication techniques. Reproduction, storage, transmission, selection,
intelligent action: communication techniques that perform these
functions well serve human needs well. Because digital coding registers
samples of things and because it resists error and degradation,
it has interesting effects in each of these five areas. These effects
will determine how the computer as a system can contribute to our
unfolding cultural history.
We begin with the problem of producing and reproducing information.
What sort of information can one produce with a typical analog medium,
audio tape, for instance? The answer defines a wide range of matters
-- anything that can be recorded through an electromagnetic analog
to sound within certain frequency ranges -- an aria but not a painting,
a speech but not a balance sheet. The analog techniques used in
the audio system must be closely coupled to the phenomena they record
so that the way they modulate electromagnetic waves is precisely
analogous to the particular wave patterns they are recording. To
use the audio system to record images or the financial transactions
of a bank, complex and careful adjustments need to be made in it,
radical adjustments that convert the audio system into something
quite different. Here the constraints of the analog medium limit
the sort of information the system can record. With the digital
system, we can produce a much more flexible range of information.
As a result, digital coding can absorb both the analog media for
carrying information in energy and many of the more traditional
media that carry information in matter. For instance, the most familiar
digital application now is word processing, enabling people to manipulate
electronically the material system of writing with far greater flexibility,
precision, and ease than traditional means have afforded. In due
course, anything that we can represent with a symbolically coded
sample, we can record in a digital system.
It is not a trivial task to implement this potentiality. But it
is inexorably happening. The first wave of computer uses involved
diverse numerical applications. The microcomputer extended these
and added extensive textual applications. Recently software designers
have incorporated two-dimensional graphics into many programs for
general use and three-dimensional imaging for special needs. Supercomputers
have begun to record vast samplings of extremely complex phenomena
that were simply beyond the ken of analog media -- climate change
and molecular structures, for instance. With compact discs, the
audio industries have developed and marketed the digital recording
of sound, which is fast being incorporated into computing systems.
The television and computing industries together are rapidly generating
digital systems for producing and recording moving images. Techniques
for sampling nearly all the forms of information and capturing them
in digital code are quickly developing. In its basic sense, the
concept of "multimedia" is this practice of integrating in one system
all forms of producible information. When we speak of the computer
enveloping other media and incorporating them into itself, we mean
the capacity, unique to digital coding, to produce and reproduce
many different forms of recordable information. Multimedia implements
this capacity.
The difficulties in implementing multimedia are not primarily
"technical," in the layman's sense of the term. Ordinarily we think
that the technical problem lies in designing an apparatus to accomplish
a novel purpose. In many areas, making the apparatus is relatively
simple, and it can be done in numerous different ways. What is difficult
is setting a controlling standard that will establish agreement
on which one of the possible ways to design the apparatus will be
the one put into common use. This is in part a question of technical
standards -- for instance, what sampling rates will be standard
for digitally encoded sound or what screen resolution will be standard
for digital high-definition television (HDTV)? But the problems
of controlling standards go far beyond the domain of technical
standards -- long established branches of law and language are at
stake as well.
Thus, the production and reproduction of information is not simply
a technical process. It is a process controlled by law and driven
by incentives. Digital coding of information will affect these domains
as well. For instance, copyright makes sense in a system in which
people locate information in material objects -- copying consists
in expending the energy to implant the information in matter, preeminently
by putting ink on a page. Copying information that is located in matter is a laborious, error-prone process, and one subject to legal control.
Recording and reproducing information that is located in energy
has very different characteristics. It becomes extremely inexpensive,
with the result that it can be done ad hoc by anyone who possesses
easily available, inexpensive tools. Already, spontaneous reproduction
through analog means, such as photocopying and audio and video tape,
has put considerable stress on laws pertaining to the right to copy.
The broadcast industries have had to develop novel ways to realize
economic benefit from cultural works, ways that turn less on the
right to copy and more on the right to use a work.
With digital coding the reproduction of material becomes even faster, cheaper, and vastly more accurate than with analog
electronic media. Once something has been sampled and captured in
digital code, the idea of a copy of that sample ceases to make much
sense. The copy is not really a copy, but a second instance of the
original. The computer radically changes the conditions bearing
on the reproduction of information and ideas. Once the infrastructure
is in place, the reproduction of materials has a negligible cost
with respect to materials, work, or quality. In principle, in a
digitally encoded culture, anyone can have instances of anything
they wish without added cost to the system. It will require an elaborate
process of technical, social, and legal development to actualize such potentialities.
Digital coding will also transform the problem of storing information.
Librarians concerned with the preservation of materials traditionally
attend closely to the durability of paper and its possible substitutes.
The key question they ask is: "How long will it last?" This makes
a lot of sense as long as the information is located in matter.
If the paper will quickly degrade, the cultural community will soon
need to reprint its materials or reproduce them on some alternative
material such as microfiche. The shelf-life of all this is important
as each cycle of reproduction is very costly, as well as an occasion
for material to be lost and errors in reproduction to creep in.
With digitally coded materials, shelf-life remains limited, but
the costs of reproduction and the likelihood of errors arising from reproduction decline drastically. Hence, the keepers of the heritage
need to rethink the standard principles of storage and preservation.
Continuous reproduction can make the quest for durability unnecessary.
Since reproduction is very cheap and very accurate, the problem
is not one of finding the most enduring materials and keeping them
as stable as possible. Rather the problem becomes one of regularly
refreshing the energy-states in which the information is located
and making sure that it is scattered in enough separate instances
that a catastrophic failure in one instance would not obliterate
the heritage.
Other, more novel problems of storage also arise. With respect
to information located in material objects, we naturally store materials
in institutions adapted to the attributes of the objects. Thus we
use libraries for books and museums for paintings and artifacts.
Much intellectual specialization arises because people need specific
skills to work effectively in these different collections of material
resources. Insofar as we can record all these resources in digital
code, we will store them in one, comprehensive system and we will
thereby diminish in power many objective goads to intellectual specialization.
As digital coding makes information easier to store with much
diminished threat of loss, so too it improves our ability to transmit
information. Transportation costs and limitations have long been
a significant determinant of communication capacities. Through the
twentieth century, techniques of coding information in energy have
greatly reduced the costs and limits on its transmission. With the
substitution of digital for analog coding, these developments are
extending far further as we enter the twenty-first century. Analog
systems using energy as the medium have developed two major principles:
point-to-point circuit switching, as through the telephone, and
the use of wide information channels in broadband transmissions,
as through radio and television broadcasting. Digital systems are
combining and unifying these two principles, allowing the links
between point-to-point switched circuits to be wide information
channels, creating a single transmission net of extraordinary flexibility
and power.
We are already everyday users of the basic principles essential
to these changes. My mother is eighty-eight and legally blind, but
she can use a push-button phone with confidence and has a good head
for phone numbers and thus she keeps up familial and social connections
all over, in Mexico, in Canada, and around the United States. Each
time she dials someone's number, she instructs the phone system
to establish connections within its circuits to link her phone with
that of the person she is calling. Phones code and decode voices
from a very simple electrical signal that can be easily transmitted
through complex switching systems and has a narrow band for coding
information, one just sufficient for the low-fidelity reproduction
of ordinary speech. How much traffic the phone system can bear depends
on how many separate circuits it can switch together at any time
and on how many separate transmissions its trunk lines can aggregate
together in simultaneous calls. You'll get a busy signal if the
system runs out of switches or transmission room.
Radio and television use much wider bandwidths, and they code
them more intensely, with the result that their signals can be much
more complex than those of the telephone. Thus radio can reproduce sound with much greater quality than the telephone, and the amount
of information transmitted via television far exceeds that used
in a phone conversation. The wider bandwidth, however, makes point-to-point
switching in such transmissions more complicated to do without introducing
noise into the signal, and without overwhelming the capacity of
connecting circuits when many parallel transmissions are traveling
on them simultaneously. Various properties of digital coding facilitate
the combination of circuit switching with the information intensive
transmissions that characterize broadband systems.
Both analog and digital systems make use of what we will call
micro-time, the actuality of incredibly brief instants. For instance,
radio waves fluctuate several million times per second and each
fluctuation produces some of the information we hear. The higher
the frequency, the more information the signal can contain, provided
we can keep the receiver tuned to the proper spot upon the spectrum
and provided we can minimize interference between signals and other
sources of noise. Because the information-bearing medium is a continuous wave, however, we find it much easier to propagate the information onto the medium at the rate at which it occurs and at which it is to be received. In contrast, when the information has been captured
in digital code, it becomes much easier to make use of micro-time
in more flexible ways: capture, transmission, and delivery can be
separated. The pace of capture depends on the pace of the phenomenon,
what we call "real time." Transmission of the binary units, the
bits encoding the phenomenon, can take place in different time --
it can squeeze into each tenth of a second, or less, the information
needed for one second of conversation, giving the circuit to other
conversations for the remaining nine-tenths, or more, of each second.
By this technique, and others like code compression and error correction,
the capacity of a circuit carrying digital data can be greatly expanded.
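Here is a minimal sketch of that time-sharing of a circuit, with three invented "conversations" taking turns on one line; the frame layout and chunk sizes are illustrative and correspond to no actual telephone standard.

```python
# A minimal sketch of the time-sharing idea described above: several digitized
# conversations take turns on one circuit, each getting a slice of every
# second, and the far end deals the slices back out to their own streams.

def interleave(conversations: list[list[bytes]]) -> list[bytes]:
    """Build the shared circuit's traffic: one chunk from each talker per round."""
    frames = []
    for round_chunks in zip(*conversations):
        frames.extend(round_chunks)          # each talker uses a fraction of the time
    return frames

def separate(frames: list[bytes], n_talkers: int) -> list[list[bytes]]:
    """At the far end, deal the chunks back out to their own conversations."""
    return [frames[i::n_talkers] for i in range(n_talkers)]

alice = [b"A1", b"A2", b"A3"]
bob   = [b"B1", b"B2", b"B3"]
carol = [b"C1", b"C2", b"C3"]

circuit = interleave([alice, bob, carol])
print(circuit)                                # A1 B1 C1 A2 B2 C2 A3 B3 C3
print(separate(circuit, 3) == [alice, bob, carol])
```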
Further, the transmission of analog data depends very closely
on the particular characteristics of the transmitting medium. With
the transmission of digital data, it does not matter what the transmitting
medium is, provided that medium has been adapted to transmit digital
code. Thus all the different electromagnetic transmission media
in common use now easily transmit digital data. More importantly,
new media, useless for transmitting analog information, for instance,
laser light in fiber optic cable, increasingly transmit digitized
information with significant gains in speed and volume, at lowered
cost, and with increased dependability. The frequencies of light waves are much higher than those of radio waves. Hence, we can pack information far more densely per unit of time into light for transmission over fiber-optic cables than we can with electricity over wires or radio signals in space. The usable bandwidth
is much, much wider. The higher density allows much more intense
timesharing of the circuit and the greater bandwidth means that
in each instant a much larger load of information will be charging
through the circuit. As a result, a system is emerging in which
all forms of information -- text, numerics, graphics, audio, video
-- can be transmitted, switched from point-to-point, as easily as
we can with the phone.
Digital coding, thus, is making possible the use of one system
to produce all forms of information, to reproduce anything in the
system with low cost and little loss, to provide for its indefinite
storage through this process of continuous reproduction, and to
transmit any element of it to any user fast and cheaply. By themselves,
these developments make oodles of good information easily accessible,
threatening to overwhelm the user in a vast babel of bits. These
three characteristics are of a piece with each other, setting limits
on what intellectual resources a culture can provide its members.
But they do not, alone, make for a well developed system of communication.
Selective retrieval, enabling people to get precisely the information
they want and when they need it, has always been a key problem of
culture and communication. How can you get from the culture the
ideas and information that you want and need? And even more perplexing,
how can the culture intimate to you and everyone else what possibilities
of interest it does and does not offer in the infinity of circumstances
surrounding us? Retrieval is a fundamental problem of all cultures,
and it is becoming an even more pressing problem with digitally
coded information. It is the fourth determinant of communication
effectiveness in history and the widespread digitization of information
is transforming it as well.
Throughout history, major communication advances have brought
with them new ways to retrieve information. The practice of citing
books and articles by title and author, edition and page, rose to
full significance in the era of print. The printed book, which could
be distributed in many locations in identical versions, needed some
logically effective technique of reference and recall, one that
would work in many different places and many different times. Prior
to that people referred far more vaguely to an author and an argument
or thesis, and to retrieve the actual text a scholar needed to know
where a specific instance was physically located, with diverse works
bound together for convenience. Today, people often handle their personal libraries in this pre-print fashion, jumbling certain books together, say by size, or just shelving them as they come, able to find any particular one not by a sense of logical order but by a feel for where it sits, a sense of spatial juxtaposition.
That works for small libraries, but it spells chaos for large collections
of printed books. For those, people needed to develop far more systematic
techniques of reference and recall.
With digitally coded information, the situation is much the same:
people need to master new, more powerful retrieval routines to manage
the cornucopia of information. These techniques relate to two different
problems in the use of information -- exchanging information and
applying ideas. In both exchanging ideas and applying them to problems,
people need to retrieve information selectively. Exchanging materials is somewhat similar to the phenomenon of point-to-point switched circuits, while applying them is akin to finding a station or channel in broadcast communication. Exchange requires the precise identification of start and end points; application requires substantive sifting through extensive materials to select the precise components pertinent to the problem at hand. Since the
problems and prospects in each domain are rather different, let
us consider each briefly in turn.
Our means for managing the exchange of information have already
been heavily influenced by characteristics of digital coding, at
least insofar as digital coding involves discrete units, as distinct
from continuous waves. For instance, integer numbers are a system
of digital entities: each number is discrete, autonomous, separate
from any other. So too is the alphabet, which is a more restricted set of discrete elements -- most simply twenty-six, or 256 if we take extended ASCII code as the norm. Long before computers,
people became adept at using numbers and letters to assign precise
locators to all sorts of objects, persons, phones, buildings, accounts,
parts, and so on. Implementation of these coding principles in
digital computers enhances our capacity to manage them greatly,
extending the scope, precision, and speed of the process. In substance,
the problem of addressing things so that information about them
can be exchanged from point-to-point is less technical than socio-
political: the problem of privacy, of censorship, of deciding what
limits, if any, to place on the reach of possible exchange. Whenever
the power to exchange information increases significantly, it brings
such problems with it. The abuse of privacy thus seems to be a structural
issue, occurring at the margins where new ways to manage exchange
are developing. Historically, people seem to opt to accept the benefits of new systems of information exchange, after instituting
measures to ensure that they will not be used to subvert personal
security and integrity. Unfortunately, this trade-off has not always
been benign as the tragic abuses of totalitarian regimes of right
and left repeatedly demonstrate. As computers make it possible to
exchange information that was formerly "private," easily kept to
oneself, we will need to face up to difficult issues of defining
limits and controlling abuses.
Retrieval that involves sifting, selecting, and applying ideas
presents different problems and opportunities. Our existing techniques
for doing this involve time-consuming secondary processing of materials
-- indexing books, abstracting articles, cataloguing things under
key words and subject headings, adding captions to pictures and
tables, annotating works with cross-references and footnotes. Digital coding makes these practices more effective in three significant ways. First, it facilitates the processing by creating tools to help people index, abstract, caption, and catalogue their culture. This yields incremental gains. Second, it makes many traditional references, which had been unidirectional from one work to another, usefully bi-directional. Only where very special indexes have been
laboriously developed can I go into a library and ask for a list
of works that cite a passage that specially interests me. In a digital
environment, the electronic reference that implements a note will
point both ways, something that will make traditional references
useful in powerful new ways. Third, traditional references implemented
digitally will save users much time and energy, for following out
a reference will be nearly instantaneous. Currently it is often hard to maintain a train of thought while following a reference, as one needs to go off to the library or bookstore, perhaps having
to wait weeks for a work to arrive from a distance. Digitally coded
links will be fast and transparent. Together, these three changes
will significantly enhance traditional resources for the reflective
retrieval of ideas and the application of them to our controlling
purposes.
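A minimal sketch suggests how little machinery bi-directional references require once citations are held in digital form; the titles and the data structure below are invented for illustration.

```python
# A minimal sketch of the bi-directional references described above: when a
# link from one work to another is stored digitally, the same record can be
# read in either direction, so "what does this work cite?" and "what cites
# this work?" become equally easy questions. Titles are invented examples.

from collections import defaultdict

citations = [
    ("Dewey, Democracy and Education", "Plato, Republic"),
    ("McLuhan, Understanding Media",   "Plato, Republic"),
    ("Dewey, Democracy and Education", "Rousseau, Emile"),
]

cites = defaultdict(list)       # forward direction: work -> works it refers to
cited_by = defaultdict(list)    # reverse direction, built from the same records

for source, target in citations:
    cites[source].append(target)
    cited_by[target].append(source)

print(cites["Dewey, Democracy and Education"])
print(cited_by["Plato, Republic"])   # the list a library index rarely provides
```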
In addition, new retrieval resources are under development. These
require no intelligent pre-processing of materials aside from the
capture of them in digital code. Instead, the end-user of the material
specifies criteria of interest, and the system matches materials
in it against these criteria, showing the resultant possibilities
and allowing the user to further winnow the results, should that
be necessary. These principles have been most fully developed with
respect to the retrieval of textual materials. Their novelty still
engenders some confusion, and many people, among them even professional
librarians, misuse the concept of "full-text retrieval." Thus some
think it simply means retrieving for an inquirer the full text of
a document, rather than an abstract of it. More properly it means
conducting the search for matches to an inquirer's criteria of interest against the full text of everything in a collection, rather than against a list of keywords. Techniques for such full-text retrieval are becoming both sophisticated and fast, and users can apply them both to the flow of current information generated through correspondence, calls, and news and to libraries of accumulated information.
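The contrast can be sketched in a few lines of Python; the tiny "collection" is invented, but it shows the difference between matching an inquiry against assigned keywords and matching it against the full text of everything held.

```python
# A minimal sketch contrasting keyword-list retrieval with full-text retrieval
# as described above: the first matches only against assigned keywords, the
# second searches every word of every document. The collection is invented.

collection = {
    "doc1": {"keywords": ["education", "television"],
             "text": "Broadcast media suit entertainment more than education."},
    "doc2": {"keywords": ["music"],
             "text": "Recorded music became a stable resource for education."},
}

def keyword_search(term: str) -> list[str]:
    """Match the inquirer's term only against the assigned keyword lists."""
    return [name for name, doc in collection.items()
            if term.lower() in (k.lower() for k in doc["keywords"])]

def full_text_search(term: str) -> list[str]:
    """Match the term against the full text of everything in the collection."""
    return [name for name, doc in collection.items()
            if term.lower() in doc["text"].lower()]

print(keyword_search("education"))    # ['doc1'] -- doc2's text is invisible here
print(full_text_search("education"))  # ['doc1', 'doc2']
```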
Techniques of search and retrieval have historically developed
far more fully with respect to text than with other forms of information.
Up to now, we have used text to catalogue most other forms -- maps, pictures, numeric tables, films, recordings, and so on. Yet text processing is not the only form of intelligent recall and retrieval that we can do. We can often find our way to places with a visual-spatial memory that is much more effective than verbally forming a set of directions for ourselves. We associate both moods and ideas with
various sounds and melodies and even colors and places. All this
suggests that beyond full-text retrieval, there lies the domain
of "non-text retrieval." In non-text retrieval we might point to
a geometric relationship and request the computer to search a graphic
database for other instances of the similar relation or play a chord
and have the system call up musical compositions in which it occurs.
Non-text retrieval should in principle be possible with digitally
coded information, but for the most part it is a possibility that
awaits development.
One area in which non-text retrieval has been underway for some
time, however, gives an idea of its potential power -- statistical
processing. Statistics can be thought of as a numeric system for selecting and retrieving information, one that allows judgments of significance and relevance that are very hard to make by textual means alone. Also, the ability to zoom in and out to different levels of
detail on graphical materials such as maps, diagrams, and photos
provides substantial non-text retrieval capacities. In general,
digital code enables us to capture and link different kinds of information
pertinent to complex phenomena and to represent their interactions
in ways that we can see or hear, using those senses to select directly
between combinations. All sorts of complex controls work this way,
especially in simulation systems and innumerable computer games.
These variations on non-text retrieval really carry us into consideration
of the fifth area in which digital coding is deeply influencing
our culture -- the intelligent processing of information. For the
most part, up to the twentieth century, communication tools used
external artifacts to extend the memory, while leaving the intelligent
processing of ideas to take place almost exclusively inside the
human body and brain. Through cultural history, people have accumulated
vast stores of memory projected outside themselves into man-made
objects. Despite all that externalization of memory, the possible
agents for the key verbs describing intelligent operations on information
and ideas are still almost exclusively human persons -- perceiving,
sensing, thinking, correlating, inferring, deducing, concluding,
and so on. With the computer, man-made objects are becoming useful
in performing these intelligent operations.
Memory, to be meaningful, must ultimately return to a sentient
human mind -- a library unread is not a culture preserved. In externalizing memory into material objects, we have not alienated memory from ourselves, but enhanced our capacity to remember by transferring parts of the task to objects that we make and manage. So too, in
externalizing intellectual activity, we do not entirely alienate
it from ourselves. Instead we compensate for limitations, strengthen
capacities for demanding operations, and enhance attention, precision,
finesse, or speed.
To understand how the computer is accelerating the transfer of
intelligence to external tools, it is important to realize that
this is not a sudden novelty in our culture. We perceive the world
with our senses and prepare it for thought: through most of history,
people did this without the aid of instruments. That began to change
some centuries ago. We can interpret the rise of modern science
as the intellectual fruits of externalizing capacities for perception
into instruments of observation. Clocks and chronometers permitted
people to perceive time with ever greater precision. The telescope
and microscope enhanced the human capacity to see distances and
details. The thermometer lent accuracy to our capacity to perceive
differences of hot and cold. Exact scales and rules and other measures,
tuning forks, prisms, filters, balances, samples, gauges, a wondrous
panoply of instruments, allowed inquiring minds to develop the empirical
base of observation upon which they built our stock of scientific
understanding.
By working with digitally coded information, instrument designers
are extending the power of perception greatly. The unmanned space-probes
reporting on the solar system have perhaps been the most dramatic
of these extensions, with wondrous photographs and other readings
radioed back as masses of digital code. Not since the invention
of the telescope has our ability to perceive the universe around
us so leaped forward. But digital read-outs are all around us, with the computer creeping into all sorts of mundane tools, enhancing our capacity to track and control their use. For many decades car instrumentation, for instance, was very stable, consisting of a few analog gauges that indicated the car's speed and possibly the engine's RPMs, while additionally giving key hints about the state of the car's fuel, coolant, engine oil, and electrical system. That is fast changing now, with digital sensors in new and old places giving a much more exact picture of the car's operating condition,
with an onboard computer relating readings to one another -- "it's
getting pretty close to empty" gives way to "range remaining fifteen
miles." The computer will greatly extend the reach and accuracy
of instrumentation as people apply it with increasing effects to
small matters and large.
With the computer, people can externalize into their instruments
more than their powers of perception. When Edison claimed that "genius
is one percent inspiration and ninety-nine percent perspiration,"
he probably thought that the human capacity for both inspiration and perspiration was basically fixed, and by perspiration he had
in mind the laborious calculations needed to test speculative insight,
separating good from bad. Digital systems do not do away with the
need for perspiration, but they extend what we can accomplish with
a given amount of it. Most forms of calculation, correlation, combination,
and connection that people can make, computers can help them make
better. They can expand our abilities to sort, order, rank, and
select. Even this process of externalizing powers of calculation
is not entirely new historically, as one who has worked with a slide
rule will realize, but it is being vastly increased. The consequences
are likely to be very great.
Many people think that numeric calculation is the peculiar domain of computers, but their reach goes far beyond numbers. The computer
can operate on anything that in some meaningful way can be represented
in digital code through an organized data structure. And any operation
that can be accurately described within the compass of binary logic
-- AND, OR, NOT -- the computer can perform. Let us leave as moot
whether people can, or should, or ever will, externalize into tools
that one percent of their genius -- inspiration. They are externalizing
in all sorts of ways that other ninety-nine percent, amplifying
greatly their powers to calculate and control objects of their attention.
Even if artificial intelligence, in the sense of the computer being
an autonomous rational agent, is not soon coming to pass, if ever,
AI, in the sense of amplified intelligence, is rapidly emerging
all about us. We need to come to terms with its implications.
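To suggest how far binary logic reaches, here is a minimal sketch in which ordinary addition is built from nothing but AND, OR, and NOT; the construction is a standard textbook one, offered only as an illustration of calculation reduced to either-ors.

```python
# A minimal sketch of the point above: once something is in binary code, even
# arithmetic reduces to compositions of AND, OR, and NOT. Here a full adder,
# and from it whole-number addition, is built from nothing but those three.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def xor(a, b):
    """Exclusive-or composed from the three primitives."""
    return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, carry_in):
    """Add three bits, returning (sum_bit, carry_out)."""
    s = xor(xor(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, xor(a, b)))
    return s, carry_out

def add(x_bits, y_bits):
    """Add two equal-length binary numbers given as lists of bits, lowest bit last."""
    carry, result = 0, []
    for a, b in zip(reversed(x_bits), reversed(y_bits)):
        s, carry = full_adder(a, b, carry)
        result.insert(0, s)
    return [carry] + result

print(add([1, 0, 1], [0, 1, 1]))   # 5 + 3 -> [1, 0, 0, 0], i.e. 8
```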
This, then, is the computer. It is the representation of our culture
in digital code and the development of all the cultural possibilities
that result. The computer makes cultural work easier to produce
and reproduce, to preserve, to transmit, potentially accelerating
intellectual attainment and opening cultural access in unprecedented
ways. The computer greatly augments human powers of selection, memory,
perception, and calculation, potentially amplifying the intelligence
that each and all can bring to bear upon the panoply of questions
that life puts to them. We turn to the implications of this computer
for the activity of education.