The Best of Creative Computing Volume 2 (published 1977)


Computing Power to the People - A Conservative Ten-Year Projection (global equipment upgrading, lowering the user-machine barriers)


Man-machine interaction, combining the best of the twin worlds of computer
precision and human perception, should become the fastest growth area of
computer usage during the next decade.


Machine intelligence during the next decade will be available at very low cost,
lower, in many cases, than the
cost of the other devices within the same piece of equipment.

This implies a global upgrading of current "dumb" equipment into "intelligent"
equipment, at a small incremental cost. There could also be a corresponding
downward price change, but never below the dumb equipment minimum.

Calculators with advanced features (trigonometric functions, memory, and
programmability) will only be slightly more expensive than, and will therefore
replace, the minimum four-function variety; this trend is quite visible today.
Much more importantly, intelligent terminals, housing small computers as
subsets, may not cost much more than those with a bare keyboard. This upgrading
will be more striking in optical display terminals, where an added intelligent
buffer will greatly enhance their ability to handle colors and complex
picture-processing algorithms.

Micro computers should approach, and in some respects exceed, current
minicomputers in sophistication and performance. Whether this will significantly
lower the price of small machines will depend on the attached equipment.

For larger machines, peripheral equipment has long been the hardware
cost-determining factor. Upgrading
here should lift the system into a new performance category. We can expect the
electronic memory to grow in size by more than one order of magnitude; working hand-in-hand with a very fast cache memory, the combined effect is a superlarge, superfast memory, capable of extremely complex management chores, not the least of which is the effective handling of electronic disks and other storage devices to form powerful virtual storage systems.

[image: Micon MCM data terminal measures about 9" square and runs on rechargeable ...]
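The combined effect described above can be sketched with a toy model (not from the article; the slot count and access times are hypothetical): a small, fast cache in front of a large, slow memory gives an effective access time close to the cache's whenever the workload re-uses addresses.

```python
from collections import OrderedDict

# Hypothetical parameters for illustration only.
CACHE_SLOTS = 4
CACHE_NS, MEMORY_NS = 10, 1000  # assumed access times

def average_access_ns(addresses):
    cache = OrderedDict()
    total_ns = 0
    for addr in addresses:
        if addr in cache:
            cache.move_to_end(addr)            # hit: refresh recency
            total_ns += CACHE_NS
        else:
            total_ns += CACHE_NS + MEMORY_NS   # miss: fetch from main memory
            cache[addr] = True
            if len(cache) > CACHE_SLOTS:
                cache.popitem(last=False)      # evict least recently used
    return total_ns / len(addresses)

# A looping workload re-uses addresses, so nearly all accesses hit the cache.
print(average_access_ns([0, 1, 2, 3] * 25))  # → 50.0, far closer to 10 than 1000
```

With 4 cold misses and 96 hits, the average lands near the cache's speed rather than the memory's, which is the "superlarge, superfast memory" effect in miniature.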

There will be upgrading of the I/O and communication interface. Large machines
will communicate freely through
networks, satellite, or packet radio. When a terminal deals with a "central
machine," the latter may actually be a collection of computers reacting
correctly under a uniform communication protocol.

It is assumed that most of the outstanding software problems and bottlenecks
will diminish through the added
LSI computing power, and new systematic programming practices. New software will
pay particular attention to data management algorithms and the human interface.


It has been estimated that the ratio of the cost to program and debug a line of code to the cost to execute that line has now reached the astronomical value of 100 million (3). Clearly, in a typical installation the most expensive component is the human cost, which should now be minimized at the expense of machine time. Indeed, human convenience should be maximized whenever possible.

The relationship between the programmer and the machine has seen ups and downs. In the early days of computing, users had physical contact with the machine in order to push the appropriate buttons, but had to state their needs through the unwieldy machine code. The advent of FORTRAN and other procedural languages permitted programming on human terms, but the user was soon ejected from the machine room and had to communicate through a batch-centered job-control language.
The advent of terminals and time-sharing has helped the user to reassert himself, under the desirable illusion of direct machine involvement. But there still remain complex sign-on procedures, difficult control statements varying from layer to layer, incomprehensible error messages, unexplained delays, and unexpected system crashes that destroy the work of innocent users.

The intelligent terminal, provided with powerful monitoring programs, can go far toward serving as a go-between, much as a resourceful receptionist mediates between an executive and a visitor. The work includes expanding simple sign-on codes into the proper format, explaining unusual happenings, catching and fixing simple errors, keeping statistics, and recording and storing data locally for safekeeping, security, and economy. Small jobs can certainly be handled locally, from start to finish.
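The go-between role can be sketched as a small monitor program. Everything here is hypothetical (the sign-on format, the account table, the command names); it only illustrates the idea of expanding a terse sign-on locally and catching a simple error before it ever reaches the central machine.

```python
# Hypothetical host sign-on format and local account table, for illustration.
SIGN_ON_TEMPLATE = "//LOGON USER={user},ACCT={acct},TIME={time}"
KNOWN_ACCOUNTS = {"A100", "B200"}

def expand_sign_on(user, acct, time=5):
    # The terminal catches and explains a simple error locally,
    # instead of forwarding a doomed request to the host.
    if acct not in KNOWN_ACCOUNTS:
        return f"ERROR: unknown account '{acct}' (known: {sorted(KNOWN_ACCOUNTS)})"
    # Otherwise it expands the short sign-on into the host's full format.
    return SIGN_ON_TEMPLATE.format(user=user, acct=acct, time=time)

print(expand_sign_on("SMITH", "A100"))
# → //LOGON USER=SMITH,ACCT=A100,TIME=5
print(expand_sign_on("JONES", "C300"))
# → ERROR: unknown account 'C300' (known: ['A100', 'B200'])
```

The same front-end could keep statistics and queue small jobs entirely locally, touching the central machine only when necessary.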

With sharply lowered machine cost, interpretive computing on terminals will
become common for small problems, especially for students. The conventional
compiling process introduces an extra layer of problem transformation into the
job, and is a source of misunderstanding. On the other hand, it is easy to learn
the use of interpreters. Further, on a terminal every interpretive step can be
monitored in terms directly meaningful to the programmer. Compiling and batch
processing can be reserved for time-consuming programs, as an economic measure.
Optimum interpretation, involving the real-time balancing between interpreting
and compiling, should become a reality.
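The real-time balance between interpreting and compiling can be sketched in miniature (an assumption-laden illustration, not the article's design): interpret an expression while it is cold, and compile it once it proves hot, so the compilation cost is paid only where it will be repaid. The threshold and the use of Python's own compile() are choices made for the sketch.

```python
HOT_THRESHOLD = 3        # hypothetical cutoff between "cold" and "hot"
counts, compiled = {}, {}

def evaluate(expr):
    counts[expr] = counts.get(expr, 0) + 1
    if expr in compiled:                        # already compiled: fast path
        return eval(compiled[expr])
    if counts[expr] >= HOT_THRESHOLD:           # hot: pay to compile, once
        compiled[expr] = compile(expr, "<jit>", "eval")
        return eval(compiled[expr])
    return eval(expr)                           # cold: plain interpretation

for _ in range(5):
    evaluate("2 + 3 * 4")
print(len(compiled))           # → 1 (the hot expression was compiled exactly once)
print(evaluate("2 + 3 * 4"))   # → 14
```

Rarely-run expressions never incur compilation, while frequently-run ones settle into the cheaper compiled path, which is the economic argument the paragraph makes.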

The computer, far from freeing the average citizen from drudgery, actually
generates some resentment in him,
because he has no direct use of the computer, yet is often the recipient of its
less-desirable by-products, such as wrong bills and junk mail.
