Gerhard Fischer
Center for LifeLong Learning & Design (L3D)
Department of Computer Science and Institute of Cognitive Science
University of Colorado, Boulder

do not allow users to act as contributors and designers. In this paper I will: (1) differentiate between consumer and designer perspectives; (2) discuss media support or limitations for these roles; (3) envision a future for HCI from this perspective; (4) illustrate some of our own work to address these issues; and (5) provide some evidence for the ubiquity of this framework for our society.
The fundamental challenge for human-computer interaction (HCI) is to invent and design a culture in which humans can express themselves and engage in personally meaningful activities. Cultures are substantially defined by their media and tools for thinking, working, learning, and collaborating. New media change (1) the structure and contents of our interests, (2) the nature of our cognitive and physical tools, and (3) the social environment in which thoughts originate and evolve, and mindsets develop. Unfortunately, a large number of new media are designed from a perspective of seeing and treating humans primarily as consumers. The possibility for humans to be and to act as designers (in cases in which they desire to do so) should be accessible not only to a small group of "high-tech scribes," but rather to all interested individuals and groups.
2. Images of Humans
2.1 A Consumer Perspective
The Director of Research for Time Warner Entertainment, in his closing plenary address at CHI '95, articulated the design of a remote control to browse and efficiently select 500 or more TV channels as the basic challenge for the CHI community. Without a doubt, solving this problem is of great commercial interest to industries that regard humans as the ultimate consumers, but is it, or should it be, a focal issue for HCI? In the early days of computing, humans were considered the "servants" of computers. As computers became cheaper, the basic economic criteria started to change, and considerations of how to use computational power to augment and empower human beings were pioneered by some early visionaries [16,31]. These new ideas were neither known nor embraced by the community at large. The Artificial Intelligence (AI) community developed expert systems (such as MYCIN), which were built as backward-chaining inference mechanisms. Although these systems could be built with reasonable effort, they were behaviorally unacceptable computational environments because they restricted knowledgeable and skilled human professionals such as doctors to answering yes or no to questions generated by the system. Other disciplines such as human factors often considered humans as system components with specific characteristics such as limited attention span, faulty memory, and easy distractibility, along with other undesirable characteristics. Early research in human-computer interaction (HCI) focused on interaction issues (idiot-proof systems, novices, naive users) and on how walk-up-and-use systems could support their needs. Little consideration in the first decade of HCI research was given to the following perspectives:
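The interaction style criticized above can be made concrete with a minimal sketch of a backward-chaining consultation. This is not MYCIN's actual implementation; the rule base, goal names, and dialogue format are invented for illustration. The point it demonstrates is structural: the system selects the goals, generates the questions in its own order, and the professional's expertise is reduced to supplying yes/no answers.

```python
# Minimal backward-chaining sketch (illustrative only; rules and goal
# names are invented, not MYCIN's). The system drives the dialogue:
# it decomposes goals into subgoals and asks the user yes/no questions
# for every leaf fact it needs.

RULES = {
    # goal: list of subgoals that must all hold for the goal to hold
    "prescribe_antibiotic": ["infection_is_bacterial", "no_known_allergy"],
}

def prove(goal, answers, asked):
    """Try to establish `goal` by backward chaining.

    Goals with rules are decomposed into their subgoals; leaf goals are
    resolved by asking the user. `answers` simulates the user's yes/no
    replies, and `asked` records the system-generated questions.
    """
    if goal in RULES:
        return all(prove(sub, answers, asked) for sub in RULES[goal])
    # Leaf fact: the only input the system accepts is yes or no.
    asked.append(f"Is it true that {goal.replace('_', ' ')}? (yes/no)")
    return answers.get(goal, False)

# Simulated consultation: the doctor can only answer, never redirect.
asked = []
result = prove(
    "prescribe_antibiotic",
    {"infection_is_bacterial": True, "no_known_allergy": True},
    asked,
)
print(asked)   # the two questions, generated in the system's order
print(result)  # True
```

Note that nothing in this loop lets the user volunteer information, question a rule, or reshape the consultation; those limitations are properties of the dialogue structure itself, which is what made such systems behaviorally unacceptable to skilled professionals.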
Cultures are substantially defined by their media and their tools for thinking, working, learning, and collaborating. A large number of the new media are designed to see humans only as consumers. Television is the most obvious medium that promotes this mindset and behavior and contributes to the degeneration of humans into "couch potatoes," for whom a remote control is the most important instrument of their cognitive activities. (A "couch potato" is a colloquial expression for a person who spends a lot of time on a couch consuming food and information in a passive fashion and who does not often engage in intellectual or physical activities.) Unfortunately, a consumer mindset does not remain limited to...