                                    NETFUTURE
    
                Technology and Human Responsibility for the Future
    
    --------------------------------------------------------------------------
    Issue #12      Copyright 1996 O'Reilly & Associates         March 26, 1996
    --------------------------------------------------------------------------
                Editor:  Stephen L. Talbott (stevet@netfuture.org)
    
                         On the Web: http://netfuture.org
         You may redistribute this newsletter for noncommercial purposes.
    
    ##########################################################################
    ####  Don't forget the $5000 SPIDER OR FLY? deadline: April 30, 1996  ####
    ##########################################################################
    
    CONTENTS:
    *** Editor's note
          NETFUTURE now has two indexes
    *** The tool's threat lies in our unawareness (Stephen L. Talbott)
          Give ambiguity and complexity their due
    *** Every tool is an obstacle (Stephen L. Talbott)
          Computer critics need not worship books
    *** About this newsletter
    

    *** Editor's note

    If you've been to the NETFUTURE Web site in the past week or so, you've discovered that we now have an up-to-date index of all our issues. Actually, two indexes: one by issue number and one by author.

    The last issue was mostly given over to critics of certain remarks of mine; I have commandeered the current issue in order to offer some response. I hope that the outcome will be constructive.

    My general sense, however, is that back-and-forth argument between people who have few assumptions in common is only rarely constructive--and almost never when conducted over the Net. Every publication of note--electronic or otherwise--fosters vigorous discussion upon some sort of common ground, so that the participants can talk to each other. This ground is what lends the publication its distinctive character. I'm not yet quite prepared to suggest how the "givens" of NETFUTURE might be framed--apart from the general concern with "computerized technology and human responsibility for the future." But I will certainly try to steer the newsletter away from unproductive sorts of debate in the future.



    *** The tool's threat lies in our unawareness
    Follow-up to "A (partial) defense of Peter Quince" (NF-9)

    From Stephen L. Talbott (stevet@netfuture.org)

    The Risk Inherent in Tools

    In NF-11 Alain Henon, Peter Faller, Mike Fischbein, and David Petraitis all seemed to choke on my remark that "every tool is a threat." Henon wondered whether "this is not one long awful joke at the end of which you will tell us you were really kidding."

    It seems to me remarkable that in our sophisticated, tool-dominated age the point at issue should even be debated. Remarkable, and also threatening, since the one and only thing that makes the tool a threat is our blithe unawareness of the threat--our willingness to take the threat for a joke.

    Rather than say more, I include in this issue the essay ("Every Tool Is an Obstacle") that I referred to in my correspondence with Peter Quince. I forwarded the essay to Quince in order to convince him that no tool can be viewed as inherently evil. I reprint it here in order to convince his antagonists that every tool is indeed a threat.

    (If you're still unconvinced after reading it, ask me about the risks inherent in the spade. Or, if you lived through the Dust Bowl, perhaps you can share with us something about the consequences of the particular reconfiguration of the spade known as the plow. Henon asks also about the stirrup, and if memory serves me right, an entire book has been devoted to the influence of the stirrup, or some closely related contrivance, in changing the course of civilization by rendering mounted warfare much more destructive.)

    Mr. Henon, on the way to labeling our discussions vacuous and intellectually undisciplined, found himself distressed by the lack of definition in my use of such terms as "the natural world," from which I had noted that we tended to be disconnected. Few are the discussions that wouldn't benefit from a more explicitly qualified use of words. But then, too, few are the discussions that wouldn't benefit from a minimal effort by the hearer to produce reasonable meanings in context. In the current case, nineteen out of twenty people would surely have checked off the "correct" meaning if given a multiple-choice quiz.

    (Yes, that meaning has something to do with the flora and fauna that our forebears knew on the whole much more intimately than we; with the cycles of season and weather that, for all the availability of ready forecasts, we experience more as incidental and accidental than as the ritual heart of our existence; with the heavens whose rhythms once influenced human life far more directly than now....) And before you infer a mess of value judgments or condemnations from this simple parenthesis, please read on.

    Does the Net Limit Our Options?

    Another cause for stumbling was my comment that, while avoiding computers limits one's options, so does going online for a couple of hours a day. I went on to say that, one way or another, "we must limit our options if we would declare ourselves for anything worthwhile in life."

    Not imagining the point to be very controversial, I clearly underestimated the power of blind optimism. Faller responded with, "Limit our options? No thanks." Going him one better, Fischbein noted that "with every social option that is lost, three new ones appear." Applying this optimism to education, he claimed that "TV and Net-based learning were and will be only as successful as they improve education. If they don't, they will be abandoned as pedagogical tools, or more likely, relegated to a supplemental role that they do fit."

    Well, there's not much to say here, for the objections simply ignore the obvious. My point, after all, was not a subtle one. Time spent doing one thing is time not spent doing another. Devote your life to helping refugees in Somalia and you will not be able to devote your life to helping refugees in Bosnia.

    One doesn't have to argue for the superiority of an earlier era to note that several decades of city-, automobile-, and road-building (among many other things) have changed the options available, say, to the inner-city dweller. At the individual level, someone who spends a few hours a day cultivating valuable online friendships restricts by that amount the available time for cultivating other friendships--just as cultivating those other friendships restricts the opportunity for online engagements. Over time, these patterns of behavior shape social institutions, and therefore help to determine what sorts of activity are reasonable or possible.

    Any hope of our using technology wisely rests upon the routine recognition of at least some of the consequences of the choices we make. This is ground already covered well in an extensive literature of technology assessment, and a good starting point for anyone who wants to look into it is the Confronting Technology resource center associated with NETFUTURE.

    Emotion and Human Contact

    I wrote (NF-5): "The error most people make here is to assume that strong emotion is a sign that people are making deep, human contact. The usual reality is nearly the opposite of this." Carl Wittnebert, in a not unreasonable post, asked, "Then what is the criterion of human contact? Why is `real' friendship limited to face-to-face interactions?"

    I am always disturbed at how quickly certain critical or cautionary notes regarding the Net are translated into one of a set of "standard" arguments that critics are supposed to have. (On another front, David Petraitis immediately takes my mention of our disconnection from nature as proof that I hold a `utopian view of the natural.') It's one thing to point out, as I did, that much of the most emotive behavior demonstrated on the Net is less a sign of contact than of not making contact; it's quite another thing to imply, as I did not, that friendship is necessarily face-to-face.

    Every means of communication can be used for expression of friendship. Some of the American POWs in Vietnam apparently formed deep connections on the basis of not much more than occasional tapping on walls.

    But while everything is possible, not everything is equally easy. If you restrict the internal communications of a corporation to wall-tapping, competitive failure will be much more certain than the formation of profound friendships. In general, the narrower the communication channel (and I'm not referring to the "bits" tracked by information theory, but to the ease of expressive richness), the harder we have to work to communicate meaningfully.

    One doesn't have to believe that the Net is useless for personal exchange in order to hold some such opinion as the following: in a society already uncertain of its commitment to community, a medium that makes exchange even more indirect and automated does not show much promise of improving things overall. (As to what I mean by "even more indirect": if you want to avoid coming to terms with someone, will you typically find it easier to meet him face-to-face, to call him on the phone, or to send him an email message?)

    Essential Ambiguity

    Without a certain flexibility in our stance toward these issues, all argument ricochets back and forth forever off the same, fixed pins. Say that we are free and responsible for making whatever we want of technology, and someone replies that it's all too big for any individual to affect. Say that we're trapped by technology, and most individuals quickly testify to their own freedom. There is truth in both views.

    But both views are also incomplete. If we are free, one of the things we are free to do with our freedom is to shape ourselves into increasingly unaware agents of the technologies we set to running in the world. That is, we are free to abandon our freedom. Contrariwise, wherever we recognize that we are not free, that recognition itself plays a freeing role. To realize that something has prejudiced one's judgment is to gain a possibility of working against the determining influence. Freedom and determination work together in a complex weave.

    So--and I argued this strongly in The Future Does Not Compute--we have to consider things at two levels. As far as the individual and his choices are concerned, freedom--or potential freedom--is the decisive consideration. We all must work in our own ways to redeem, not only the high technologies, but every handful of earth that is entrusted to us--to raise it to the highest possible level of human expression.

    But, on the other hand, in matters of social policy we have to be willing to look at the pattern of choices we, as a society, have actually been making. We need to recognize dangers--possibly insuperable dangers--wherever they may lie. Your judgments may differ from my own rather pessimistic ones, which is fine. But please do not assume, just because I am pessimistic about the uses to which we are in fact putting our machines, that I believe (as was dishonestly attributed to me by Steven Levy in Newsweek) that "computers themselves are evil." Quite to the contrary: the fact that we have been failing the test makes it all the more urgent for those who sense the failure to work toward the constructive mastery of computers.

    I suppose that, as long as I continue receiving mail divided about equally between "you assume too much freedom and responsibility for the individual facing technology" and "you assume technology is an evil we no longer have the power to escape," I must be somewhere near the right path.

    Well, I guess I did work up a little passion after all, despite my resolve. May it prove, as Mr. Wittnebert hopes, a sign of genuine contact! Actually, not feeling so good about it myself (who the hell am I arguing with anyway, and to what purpose, and is it for anything other than purely selfish ends?), I'm not so sanguine. But, before you leap to your guns, the implied judgment in that remark is upon me, not the technology.

    Steve



    *** Every tool is an obstacle

    From Stephen L. Talbott (stevet@netfuture.org)

    Computer Critics Need Not Worship Books

    It inevitably happens: warn people about the risks of our growing reliance upon computers, and most of them immediately assume that you find books refreshingly risk-free and wholesome. The supposed contradiction is pointed out either directly ("So why do you read books? They're products of technology, too") or by mockingly ascribing to the critic a consistency he is assumed to lack ("You must be the kind of person who would decry hammers for alienating carpenters from their nails. `Damn it, man, in the days of higher thinking we used to pound nails in with our foreheads'").

    What is needed here is a little historical awareness. Gutenberg built his press in the middle of the fifteenth century, which was also when the discovery of linear perspective was taking hold. Western man was detaching himself from entanglement in the world. He was learning to look out at it through cameralike eyes, objectifying it and correspondingly "subjectifying" his own interior.

    The physical world, increasingly felt as solid, real, and external, fascinated the Renaissance artisans, even as their thinking processes were beginning a retreat of several centuries into misty vagueness. Caught between these two movements, philosophers like Descartes discovered the mind-body problem: how does the weak, qualitatively disappearing mind accomplish anything against the resistance of a world growing ever more alien, independent, and material?

    Books fit in well with this double process. As the mind faded toward an airy chaos that seemed more fit for psychological novels than for scientific investigations, the book comfortingly objectified our words, which were now routinely beheld as external things detached from any human speaker. (Eventually, in the twentieth century, the computational manipulation of words would stand in for a scientific investigation of mind.)

    Books tempted us to hoard knowledge, as if it could be captured between covers and stored on shelves. We took pride in the size of our libraries and the number of books we had read, as if these testified to our grasp of truth. Ever so subtly, we were encouraged to exchange wisdom -- which assimilates information to a living understanding -- for the objectified bits of information themselves.

    Of course, the books lining the walls of my study do not help me through the tight spots in conversation. I am always at risk of being exposed. But the computer is changing that. It not only furthers the word's detachment from its human source, but sets those detached words in coherent motion. In the absence of a speaker, words simply converse with words. My study, you might say, goes head-to-head with your study.

    Computer programs freely exchange information with no human intervention. Increasingly, so do we. The psychological distance inserted between us by electronic media, our habits of superficial skimming, the computer's readiness to supply us with its own choice of words, and our casual inattention while typing at the keyboard all lead toward speech without a consciously present speaker. The computer helps to make the word-as-entity ever more objective, independent, and active, even as the human thinking behind the word sinks down toward mere logic, half-conscious association, and mechanism. What remains of thinking is only the hazy ghost many are dead set upon exorcising from what is, for this very reason, becoming a machine.

    So, far from embodying a danger altogether foreign to books, the computer can be seen as the natural extension -- and the nearly perfected climax -- of a long, historical development in which books have played a central role. If that development has become dangerously one-sided, obscuring the human interior and deadening the now cut-off word, it is only natural to seek a restored balance wherever it may be found. But this does not imply that earlier technologies are risk-free.

    The fact is that the primary -- and in the deepest sense the only -- gift of every tool is its resistance to the human good. In overcoming this resistance, we advance as human beings. The objector cited above had it quite wrong: we do hammer nails with our heads -- and with our hearts -- at least, we do when we hammer correctly. And the immature hammer-wielder who abandons himself to his new toy, taking everything in sight for a nail, is alienated from a true relation to nails.

    But the painful results of our indiscipline invite inner growth, which is the only enduring gift of the tool. After all, which is of more lasting value: the cabinet I build with nails and eventually leave behind, or the inner mastery I gain through struggling with myself, hammer in hand -- a mastery I will carry as healing capacity wherever I go in an overwrought world?

    Every tool promises inner rewards -- but only so long as we recognize it as an obstacle and accept its challenge. Otherwise, the tool is an instrument of ruin.

    The computer, of course, is a vastly more potent tool than the hammer. We will never, for example, mistake a hammer for a thinking device. Computers are highly adaptive, universal machines, and when we bring to them our willingness to mistake the tool for its gift, we risk sacrificing, not just one particular capacity, but the entire field of human achievement.

    It is all too easy to accept the intelligent machine's external reflection of our faculties as if the reflection were the real thing -- as if the tool were what really counted. It is not so. A computer hooked up to a hammer can be instructed to hammer correctly, but its effort is not one of self-mastery, and its success does not become a healing power in the world.

    In refusing to understand the computer as an obstacle, what we sacrifice is the highest capacity of all -- the capacity to make of every tool an occasion for advancement. By claiming to be master of all tools, the computer dares us to contend for our own mastery.

    At stake is no longer whether we will learn by overcoming the resistance of this or that tool, but whether we will continue growing at all. The computer is our hope if we can accept it as our enemy; as our friend, it will destroy us.

    ======
    Stephen L. Talbott is author of The Future Does Not Compute--Transcending the Machines in Our Midst. The foregoing reflection is part of a developing collection called Daily Meditations for the Computer-entranced.



    *** About this newsletter

    Copyright 1996 by The Nature Institute. You may redistribute this newsletter for noncommercial purposes. You may also redistribute individual articles in their entirety, provided the NetFuture URL and this paragraph are attached.

    NetFuture is supported by freely given reader contributions, and could not survive without them. For details and special offers, see http://netfuture.org/support.html .

    Current and past issues of NetFuture are available on the Web:

    http://netfuture.org/

    To subscribe or unsubscribe to NetFuture:

    http://netfuture.org/subscribe.html
