By Luis M. Rocha and Santiago Schnell
If it's green, it's biology; if it stinks, it's chemistry; if it has numbers, it's math; if it doesn't work, it's technology. (Unknown)
It has become appallingly obvious that our technology has exceeded our humanity. (Albert Einstein)
Any sufficiently advanced technology is indistinguishable from magic. (Arthur C. Clarke)
Humans and Tools
Humans are unique in their ability to make, use, and incorporate tools into every aspect of life. Other animals are known to use tools, and even transmit knowledge of using tools across generations, but humans use tools to transform their environments in unprecedented ways. According to the Oxford English Dictionary, a tool is a thing (concrete or abstract) with which some operation is performed; a means of effecting something; an instrument. Technology is then the collection of tools plus the knowledge of how to develop and apply them in our environment.
To look at the history of humanity is to look at the history of tool-making and invention. The invention of new tools often brings about fundamental transformations in society and even in the way we think. For instance, the invention of language (first spoken, then written) changed the way humans think and are able to solve problems – so much so that it is practically impossible to understand what it would be like to think without access to written words! An example of profound social transformation was the introduction of the firearm to Japan by the Portuguese in the 16th century, which largely resulted in the end of a 250-year civil war, leading to the unification of Japan.
Other famous tools that led to major transformations include stone tools (pre-Homo sapiens), paints, boats, pulleys, screws, fabrics, metals, writing systems, the wheel, levers, coinage, the astrolabe, the telescope, the printing press, the steam engine, adding and computing machines, the airplane, the telephone, the typewriter, antibiotics, digital computers, the internet, etc. We are so intertwined with these tools that we can say humans are characterized by living in symbiosis with their technology.
Technology and Humanity
"A man without technology,..., is not a man". (José Ortega y Gasset)
The Spanish philosopher José Ortega y Gasset, in his essay "Thoughts on Technology", defines technology in the context of humanity and nature. According to Ortega y Gasset, humans' real needs are independent of nature and are instead based on an individual's will and desires. These acts of will translate into humanity's manipulation of nature in an effort to address those needs. When compared to natural needs, such as food and sleep, personal needs are superfluous. The end result is that humanity creates a new nature, a super-nature, which is separate from real nature. This super-nature can come to dominate real nature, leading to numerous ethical issues. From this point of view, technology is the means whereby humanity separates itself from nature, and the mechanism used to adapt the natural environment to the individual. This is why the Spanish philosopher said that a man without technology is not a man.
According to José Ortega y Gasset, "Everything becomes clear...when we realize that there are two purposes [of technology]: one, to sustain organic life, mere being in nature, by adapting the individual to the environment; the other, to promote the good life, well-being, by adapting the environment to the individual." Thus Ortega y Gasset distinguishes technology that is for survival from technology that is the result of will and desire. Technology must therefore be recognized as going beyond minimal existence. In doing so, technology becomes integral to using our environment for what we see as good; values generate technology.
Technology and Problem Solving
Humans create technology to adapt their environment to themselves, but technology also changes the way humans live, think, multiply, and die. In this sense, we say that humanity and technology live in symbiosis, as to a large extent one creates the other. Indeed, while it is useful to think about technology as a means to solve problems, sometimes the introduction of the tool precedes the problem to be solved! Did the introduction of the firearm solve the civil war problem in Japan? The invention of the telephone also did not solve a particular problem, as humans were communicating in other ways before its invention. But it did enable faster communication.
Seeing technology as a means to solve problems, while correct, is not the whole story. Tools enable us both to cope with and to change our environment, but as the environment changes so do our needs and we ourselves, leading us to use the same and new tools in unforeseen ways, in an endless loop of social-technological interaction. Every new tool changes the problem space. This permanent evolution of the problem space is emphasized by our highly creative use of tools. As the German philosopher Martin Heidegger pointed out, we do not usually deliberately think about how to use our (best) tools. The tools available in our environment simply show up as solutions to present problems (Prem, 1998). And these may be problems that the introduction of the tool itself created!
"For a successful technology, reality must take precedence over public relations, for Nature cannot be fooled". (Richard P. Feynman)
Naturally, not all technology affords the same degree of ease of interaction. We can even speak of transparent and opaque technology. Andy Clark (from the book in our course materials) defines a transparent technology as a technology that is so well fitted to, and integrated with, our own lives, biological capacities, and projects as to become almost invisible in use. An opaque technology, by contrast, is one that keeps tripping the user up, requires skills and capacities that do not come naturally to the biological organism, and thus remains the focus of attention even during routine problem-solving activity. When using opaque technologies, such as our personal computers, we are constantly aware of the tool and ourselves as we use it. Transparent technologies, in contrast, are not noticeable. We are capable of using them, unconsciously, as extensions of our own bodies to solve problems in our environment.
Examples of transparent technologies are our wristwatches, pen and paper, proficiently driving a car, using sports or music equipment. Often, such integration and ease of use require training and practice. We are not born in command of the skills required. Nonetheless, some technologies may demand only skills that already suit our biological profiles, while others may demand skills that require extended training programs designed to bend the biological organism into shape. The processes by which a technology can become transparent thus include both natural fit (it requires only modest training to learn to use a hammer, for example) and the systematic effects of training. The line between opaque and transparent technologies is thus not always clear-cut; the user contributes as much as the tool. (ibid)
As Clark well describes, a very good example of transparent technology is our wristwatch which we have incorporated into our sense of self. Indeed, most of us, when asked if we know the time, will answer yes before we actually read the time from the watch. This means that we conceive the ability to get the time from a reliable and portable tool as our own knowledge. We access the external watch in the same way as we access memories in our brains. The tool has become a part of ourselves, and we have expanded not only our capacities but also our bodies, minds, and social organization.
As we build much more sophisticated portable knowledge tools, such as cell phones which allow us to query the web in real time (e.g. the Google SMS service, or via browsing with a third-generation mobile), we may find that our sense of personal knowledge will expand much further. At which point will our ability to easily and reliably obtain a piece of knowledge on the web, via our portable devices, become as second nature as checking the time? When will we incorporate the Web's knowledge into our sense of knowing?
Transparent Knowledge Technology
"It's impossible to move, to live, to operate at any level without leaving traces, bits, seemingly meaningless fragments of personal information". (William Gibson)
We are already immersed in knowledge technology that is used to track and respond to our behavior by recommending possible choices for us. The most basic level of such technology is based on data analysis or data mining of human behavior, for instance, consumer behavior. A famous example of data mining is the database analysis of transactions in a Midwest supermarket chain, which found that on Thursdays and Saturdays males who buy diapers also buy beer. This type of information is systematically used to relocate merchandise to more strategic places in stores.
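The kind of pattern behind the diapers-and-beer story can be found with simple association-rule counting over shopping baskets: the *support* of an itemset is how often it appears, and the *confidence* of a rule "diapers → beer" is how often beer appears given diapers. A minimal sketch (the transaction data here is hypothetical and purely illustrative, not the actual supermarket records):

```python
# Hypothetical shopping baskets (illustrative only)
transactions = [
    {"diapers", "beer", "milk"},
    {"diapers", "beer", "chips"},
    {"bread", "milk"},
    {"diapers", "beer"},
    {"bread", "chips"},
]

def support(itemset, transactions):
    """Fraction of baskets that contain every item in itemset."""
    hits = sum(1 for basket in transactions if itemset <= basket)
    return hits / len(transactions)

def confidence(antecedent, consequent, transactions):
    """Estimated P(consequent in basket | antecedent in basket)."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

print(support({"diapers", "beer"}, transactions))       # 0.6 (3 of 5 baskets)
print(confidence({"diapers"}, {"beer"}, transactions))  # 1.0 (every diaper basket has beer)
```

Real data-mining systems (e.g. the Apriori family of algorithms) scan millions of transactions and prune the search over itemsets, but the support/confidence measures they report are exactly these ratios.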
Expanding on the capabilities of data mining are tools which are designed to push information. That is, proactive tools to recommend rather than wait to be queried. An example is the recommendation (or recommender) system at amazon.com which suggests merchandise based on the choices of other consumers who have bought similar products.
But even these proactive techniques are being expanded with the capacity to adapt to the habits of specific users, enabling individualized and specific responses. They can for instance use a brain-like Hebbian Learning mechanism to have the recommendation system adapt to a particular user: the more certain items (e.g. web pages) get selected simultaneously, the stronger they get in the network of items that defines each user/consumer.
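The Hebbian idea described above can be read as: keep a weight between every pair of items and strengthen it each time the items are selected together, then recommend the items most strongly linked to what the user is currently viewing. A minimal sketch under that assumption (the class name, item names, and learning rate are all hypothetical, not any particular deployed system):

```python
from collections import defaultdict
from itertools import combinations

class HebbianRecommender:
    """Toy per-user item network: co-selected items reinforce each other."""

    def __init__(self, rate=1.0):
        self.rate = rate
        self.weight = defaultdict(float)  # (item_a, item_b) -> link strength

    def observe(self, selected):
        """Strengthen links between all items selected in one session."""
        for a, b in combinations(sorted(selected), 2):
            self.weight[(a, b)] += self.rate

    def recommend(self, item, top_n=3):
        """Items most strongly linked to `item`, strongest first."""
        scores = {}
        for (a, b), w in self.weight.items():
            if a == item:
                scores[b] = scores.get(b, 0.0) + w
            elif b == item:
                scores[a] = scores.get(a, 0.0) + w
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

rec = HebbianRecommender()
rec.observe({"page_news", "page_sports"})
rec.observe({"page_news", "page_sports"})
rec.observe({"page_news", "page_weather"})
print(rec.recommend("page_news"))  # ['page_sports', 'page_weather']
```

Because each user gets their own weight network, the same tool ends up giving different, individualized recommendations to different users — which is exactly the adaptive, proactive behavior the paragraph above describes.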
The placement of such tools in our environments is establishing a growing reality of ubiquitous computing, smart rooms, etc. In other words, our environments are getting filled with everyday transparent tools which respond, adapt, and evolve to and with our situations, without our being aware. As the portability, adaptability, and proactive nature of our tools increase, it is essential that we understand how such tools operate, so that we can take advantage of their capabilities while also defending ourselves from their capacity to control our behavior.
Due to the tremendous importance of Information Technology (IT), and especially its potential for transforming society at large, the National Science Foundation - the biggest government institution supporting research inside and outside of universities and fostering young scientific talent - has established the field as one of its priority areas (1).
The number of information systems, computing devices, data archives and other IT resources that are interconnected in complex, distributed systems is exploding. The resulting systems have the potential to transform both science and engineering research (e.g., with environmental and geological systems, remote observing systems, or embedded sensor systems for research on materials) and expectations about how we live, learn and work (e.g., with transportation and telecommunications networks, power generation and distribution systems, or distributed lifelong learning systems). As a nation, harnessing the capabilities and sophistication of these resources will enable the USA to engage in endeavors that were never before possible. At the same time, when complex interactions and interdependencies within and among disparate systems result in failure, such as last summer's electric power grid outage, the many research challenges still confronting the Nation become more urgent. Understanding and predicting the possible behaviors of such systems, and developing better design strategies for these systems (e.g., based on a better understanding of complex systems), are critical to achieve long-term national goals that depend on reliable, high-confidence, distributed systems. A better understanding of how failures cascade, how scalability and interoperability among heterogeneous systems can be ensured, how inherent complexity can be managed, and how people and society interact with these systems is necessary (2).
Today, networks link people, software, hardware, computational resources and data archives, and they enable unprecedented communications, coordination and collaboration among them. Powerful distributed applications enable new forms of science by collecting, disseminating, and analyzing observational or experimental data, or data from models or simulations. Other powerful applications include the networked services essential to our daily lives, such as cell phones, email, banking systems, transportation systems, critical infrastructures, distributed inventory control systems, and modern environmental observing systems. New knowledge is needed to improve the design, use, behavior, and stability of these widely distributed systems. A better understanding of this historical shift towards increasing connections and interdependencies among heterogeneous systems and how to harness their potential in service to society is necessary. (3)
Some Milestones of Information Technology
- The Abacus. A counting aid that may have been invented in Babylonia in the fourth century B.C. Not an automatic device, but rather a memory aid for intermediate calculations. Widely used in China and Japan.
- In 1623 Wilhelm Schickard (1592-1635) built the first mechanical calculator. It was capable of working with six digits and of carrying digits across columns, but did not make it beyond the prototype stage.
- In 1642 Blaise Pascal (1623-1662) built a mechanical calculator with the capacity for eight digits. It had trouble carrying its computations as its gears tended to jam.
- In 1670 Gottfried von Leibniz (1646-1716) built a mechanical calculator capable of multiplication and division.
- Charles Babbage (1791-1871) conceived the Difference Engine, a special-purpose digital computing machine for the automatic production of mathematical tables. It was steam-driven and consisted entirely of mechanical components, with numbers represented in the decimal system. It was never completed, although several fragments were produced. In 1991 a full Difference Engine was built from Babbage's designs; it is on display at the London Science Museum. Babbage, working with Ada Lovelace (daughter of Lord Byron), also designed the Analytical Engine, which was to have been a general-purpose mechanical digital computer with a memory store and a central processing unit (or 'mill'). It would have been able to select from among alternative actions consequent upon the outcome of its previous actions. The ability to select from alternatives is known as conditional branching, and it implies the ability to deal with choice and information. It was designed to be programmed with instructions contained on punched cards.
- Herman Hollerith (1860-1929) devised a system of encoding data on cards through a series of punched holes. Hollerith's machine, used in the 1890 U.S. census, "read" the cards by passing them through electrical contacts. Closed circuits, which indicated hole positions, could then be selected and counted. His Tabulating Machine Company (1896) was a predecessor to the International Business Machines Corporation (IBM). His system reduced reading errors and increased workflow; more importantly, stacks of punched cards could be used as an accessible memory store of almost unlimited capacity.
- In 1935 Alan Turing (1912-1954), invented the principle of the modern computer: the Universal Turing Machine. It is an abstract digital computing machine consisting of a limitless memory and a scanner that moves back and forth through the memory, symbol by symbol, reading what it finds and writing further symbols. The actions of the scanner are dictated by a program of instructions that is stored in the memory in the form of symbols.
- John von Neumann (1903-1957) developed the concept of stored-program, general-purpose electronic computing, including the possibility of allowing the machine to modify its own program in useful ways while running.
- Semiconductor transistors were invented by John Bardeen, Walter Houser Brattain, and William Bradford Shockley at Bell Laboratories in December 1947. They were awarded the Nobel Prize in Physics in 1956. Transistors function as switches: devices for making or breaking an electric circuit, that is, for choosing between two states, on and off, 1 or 0. This allows the construction of logic gates.
- Integrated circuits were first conceived by Geoffrey W.A. Dummer of the Royal Radar Establishment of the British Ministry of Defense in 1952. They were first manufactured independently by two scientists: Jack Kilby of Texas Instruments in 1958 and Robert Noyce of Fairchild Semiconductor (using silicon), whose patent was granted on April 25, 1961. They consist of at least two interconnected semiconductor transistors, as well as passive components like resistors, assembled on a thin chip. They are used to build microprocessors, which implement the Central Processing Unit (CPU): the part of a computer that interprets and carries out the instructions contained in the software.
Note: See the lecture slides for additional milestones and more detailed descriptions.
(1) Advances in Information Technology (IT) have dramatically transformed the way in which people live, work, learn, communicate and conduct business. Please visit the Information Technology Research Overview at NSF.
Ortega y Gasset, José. "Thoughts on Technology." Trans. Helene Weyl. In Philosophy and Technology: Readings in the Philosophical Problems of Technology, ed. Carl Mitcham and Robert Mackey. New York: Free Press, 1972, pp. 290-313. For comments on this book, see George Milkowski's piece in the Technology and Education Seminar, Brown University, Spring 1998.
Clark, A. (2003). Natural-Born Cyborgs: Minds, Technologies, and the Future of Human Intelligence. Oxford University Press.
Heidegger, Martin. Being and Time: A Translation of Sein und Zeit. SUNY Series in Contemporary Continental Philosophy.
Prem, E. (1998). "Semiosis in embodied autonomous agents." In Proc. of the IEEE International Symposium on Intelligent Control. Piscataway, NJ: IEEE, pp. 724-729.