The Best of Creative Computing Volume 1 (published 1976)

Page 163

Input/Output


CAI -- A failure?

Dear Editor:

I propose that CAI has failed because computer experts have not yet found a way
to code human fallibility. The best teaching of human beings is done by sensitive
but fallible teachers. Why, in CAI, is it always assumed that the student is the
one who will make the mistakes?

J. D. Tinsley
Inspector of Schools
Birmingham Education Department
Council House, Birmingham, England

How about some comment or opinion from other readers on the failure of CAI? -DHA
***
Correction.

Dear Editor:

Thanks again for a superb issue of Creative Computing. I'm sure that you can't
keep up the pace of improvement because there just isn't that much room left.
This is the best journal for my purposes that I have ever seen.

There is one thing that I'd like to point out in relation to that historical
reprint from IBM [Digital Calculators - Then and Now, Jan-Feb 1975]. In an
article called "Will the Inventor of the First Digital Computer Please Stand
Up?" W. David Gardner reports on the work of Dr. John Vincent Atanasoff for
Datamation (Feb., 1974, pp. 84-90). The article gives the decision of Federal
District Court Judge Earl R. Larson, which "defrocked Dr. J. Presper Eckert and
Dr. John W. Mauchly as the high priests of electronic digital computer
invention." It goes on to explain how the decision arose in a case between
Sperry Rand and Honeywell over the patent on ENIAC. After carefully considering
the evidence, Judge Larson decided that the patent was invalid because the basic
ideas were taken from a machine which Atanasoff developed between 1935 and 1942
at Iowa State College. Atanasoff has gone without proper credit long enough
(and besides, too many people have the idea that nothing important but
agriculture happens out here on the plains).

Paul J. Emmerich
Dana College
***
Standard BASIC?

Dear Editor:

I think you should stick to "standard" BASIC in programs that are included in
CREATIVE COMPUTING. In volume 1, number 2, there were programs on pages 12, 13,
and 19 that use the backslash for multiple statements on a line. The one on
page 19 also has some constructions that look like Fortran implied do loops in
a print line, and if-then-else with statements allowed as arguments. It is
honestly not BASIC and will probably only run on the machine that originated
it. The use of an output string in an input statement (e.g., INPUT "YOUR
MESSAGE PLEASE" A$) is also nonstandard. Sorry to push the point so hard,
particularly on one of your own programs, but I think that programming style is
pretty important, especially in publications that lots of people are going to
see. The language you choose is an important part of style, and encouraging
weird extensions that don't conform to the spirit of a language is poor style.

Christopher G. Hoogendyk
Dartmouth College

I agree with you in spirit; however, when a significant or interesting program
is submitted to us (for example, SUPER STAR TREK in this issue), should we not
publish it because it is not in standard BASIC? Should we require the submitter
to convert it to "standard" BASIC (to which request most contributors would
reply, "Why should I bother?")? Should we convert it to standard BASIC
ourselves (at which request most of our volunteer editors would find other
things to do)? Or should we publish it as is and leave the conversion as an
exercise for readers?

READERS: What do you think?

Parting note: to my knowledge, the BASIC Standards Committee has not yet defined
"standard" BASIC. - DHA.

***
Some words from the giant . . .

Dear Editor:

Thank you for your letter of June 24 asking me to participate in Creative
Computing magazine's November-December issue on the computer and society.

Many of us within IBM are intensively seeking answers to a number of the
problems touched upon in your questionnaire. For example, those questions
dealing with privacy and data security are addressed in the enclosed statement
by Dr. Lewis Branscomb, IBM Vice President and Chief Scientist, in testimony
given before a subcommittee of the House of Representatives.*

The role of the computer in society is, of course, only part of a broader area
dealing with the role of technology in general. We recognize that the computer,
like any instrument of technology, can be a force for good or harm depending
upon the use to which it is put.

IBM's past experience and future outlook reassure me that the computer, in
virtually every instance, will be used for good, not harm, and that this
technological tool will continue to fulfill its great promise.

Frank T. Cary
Chairman of the Board, IBM

*See a summary of this testimony on page 46. - DHA.

***
Computers save congressmen time in voting.

(Does that mean more talk or more vacations?)
Dear Editor:

I appreciate your invitation to present my views on the role of computers in
society. Computer support to the House of Representatives began in the late
1960's when the Clerk of the House introduced data processing equipment as a
means of administering several clerical tasks. In 1971, the Committee on House
Administration established the House Information Systems staff to provide a
professional base for computer activities. This staff continues to act under
the guidance and leadership of the Committee on House Administration, currently
chaired by the Honorable Wayne L. Hays of Ohio.

The members of the House are constantly aware of the utility and the importance
of computers in our society: the electronic voting system, for example, was used
for nearly
1500 rollcalls during the 93rd Congress and saved approximately 500 hours of
legislative time that would otherwise have been needed to answer rollcalls under
the manual method. The House computer has also been applied to many other useful
tasks including a bill status system, committee calendar system, data analysis
services and administrative support systems. The Committee on Science and
Technology has been a frequent user of these systems.

I look forward to the appearance of your November-December 1975 issue as it
sounds extremely interesting.

Olin E. Teague
Chairman, Committee on Science and Technology
U.S. House of Representatives
***
The Last Number
Dear Editor:

I noted with some interest your article on multiple precision arithmetic. For
some time, I have had a personal mania for computing huge factorials exactly,
and it is gratifying to see that I am not alone in my proclivities. I have
calculated 10,000 factorial exactly (I think – how could you ever check it?)
and think that this may well be the largest one yet computed exactly. In any
event, I also learned that there is nothing quite so dull as 7 pages of digits.
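[The parenthetical question, how could you ever check it, does have partial
answers. A modern language with exact arbitrary-precision integers (not
available to letter writers in 1975) can recompute the value and also verify
one independent property: by Legendre's formula, the number of trailing
decimal zeros of n! must equal the number of factors of 5 in n!. A minimal
sketch in Python, offered purely as a present-day illustration:

```python
import math

def legendre_trailing_zeros(n):
    # Trailing decimal zeros of n! equal the number of factors of 5
    # in n! (Legendre's formula), since factors of 2 are plentiful.
    count, p = 0, 5
    while p <= n:
        count += n // p
        p *= 5
    return count

n = 10_000
f = math.factorial(n)          # exact arbitrary-precision integer
s = str(f)
zeros = len(s) - len(s.rstrip("0"))
assert zeros == legendre_trailing_zeros(n)   # 2499 trailing zeros
print(len(s))                  # 10,000! has 35,660 digits
```

At roughly 5,000 digits to a printed page, 35,660 digits is indeed about
7 pages. - Ed.]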

A very good reference on multiple precision arithmetic is Knuth's
Seminumerical Algorithms. He says all that you would normally need to know.
Although it might be tough going for computing neophytes, the book is well
worth the effort. Especially interesting is the part on modular arithmetic, in
which it is revealed how to do multiple precision arithmetic without having to
do any carries.

Keep up the good work.

John Levine, Student
Yale University
