Lanier, J. (2011). You are not a gadget: A manifesto. Random House Digital, Inc. $15.00 (paperback), 240 pp. (ISBN: 978-0-307-38997-8).
Nowadays, most people strongly believe that the internet, and technology more broadly, has transformed our lives for the better. A single click can search billions of web pages, reach any human being connected to the net, access an aggregation of individual knowledge (e.g., Wikipedia), retrieve a virtual or physical copy of almost every book humanity has written, or offer a snapshot of each other’s lives through multimedia shared on a social network. Surely, this technological revolution is having a tremendous impact on our cognitive skills and on the way we organize and develop knowledge; however, as Jaron Lanier argues in You Are Not a Gadget: A Manifesto, most people are so dazzled by the potential benefits that they forget to consider how it may threaten our intellectual growth. This is precisely the purpose of his book: to open a philosophical discussion on how technological progress is shaping and constraining the human mind.
As a simple example, Lanier discusses the notion of files. In the early days of computing, many computer scientists believed the concept of files was not such a great idea. Alternatives were proposed, without success. Soon files became the standard, and every computer now uses this metaphor to represent information. So why does that matter? Or, as Lanier frames it, “what do files mean to the future of human expression?” (p. 13). The same question could be asked about human languages: how do words and cultures shape the way we think? Ultimately, files and languages are means by which we express ourselves. But limiting our array of expression means constraining the richness of our cognition. In other words, by limiting ourselves to one information format, we have foreclosed other possible futures in which a different concept of data structure would have existed (and potentially led to more efficient ways of organizing data and knowledge).
The kind of design decision that happened with computer files is what Lanier calls lock-in: one idea becomes so entrenched that it can no longer be changed. Plenty of examples can be found in Lanier’s book (e.g., web anonymity as a legacy of 1960s paranoia). As Lanier phrases it, “Lock-in makes us forget the lost freedoms we had in the digital past. That can make it harder to see the freedoms we have in the digital present” (p. 14).
An even bigger subject of concern for Lanier is the growth of the internet cloud. He argues that current designs rest on the faith that the “internet as a whole is coming alive and turning into a superhuman creature” (p. 14). One manifestation of such an entity is Wikipedia: by suppressing human authorship, it gives the text a superhuman validity. According to Lanier, “traditional holy books work in precisely the same way and present many of the same problems,” such as a blind adoration of those entities (p. 32). As a consequence, people “degrade themselves in order to make machines seem smart all the time,” because they believe in this supernatural hive mind (p. 32). Furthermore, they are more likely to blame themselves when technology doesn’t work, instead of recognizing its limitations and defects. According to Lanier, “the ‘wisdom of crowds’ should be thought of as a tool,” nothing more (p. 59).
Another side effect of cloud computing is the dehumanization of data, because the digital hive grows at the expense of individuality. Services such as Wikipedia completely erase points of view, while Facebook organizes people into “multiple-choice identities” (p. 48). Furthermore, “What computerized analysis of all the country’s school tests has done to education is exactly what Facebook has done to friendships”—life is degraded and turned into a database (p. 69). Can we adequately judge a child’s intelligence based on standardized test scores? Can we say we know anything about someone just by looking at his or her Facebook page?
With the development of ever more sophisticated algorithms, Lanier believes that creativity will become the most valuable resource among human beings, since all other tasks can be performed more quickly and accurately by technology. Unfortunately, creativity is not left unscathed in Lanier’s view. He takes music as an example and declares that “pop culture has entered into a nostalgic malaise,” because “online culture is dominated by trivial mashups of the culture that existed before the onset of mashups, and by fandom responding to the dwindling outposts of centralized mass media” (p. 20). For Lanier, music hasn’t produced anything original since the late ’90s; everything is retro or a remix of existing styles. In his own words, Generation X is “exceptionally bland” and inert because of its dependence on the cloud, which provides all kinds of material for free.
Finally, Lanier mentions a plethora of other cases where technology, and more specifically Web 2.0, may harm us (e.g., the role of money: how advertising has become central and sacrosanct on the web and how it is corrupting us; which alternatives to wikis once existed; what alternative economic models exist for music; how the “Lords of the Clouds” are more evil than they pretend to be).
Even though this manifesto generally stays at a philosophical level, it has the merit of opening up questions about, and offering an alternative framing of, how technology influences human cognition. Lanier does not provide any empirical evidence or proof to support his claims; he merely asks us to imagine what would have been different if the internet had been created in a different time and place.
In conclusion, it should be noted that this book has several interesting implications for education in general. It asks us to consider how technology may constrain our cognitive abilities (e.g., how many children, and adults, take Wikipedia as their only source of information?), and to what extent teaching is in a locked-in situation because of previous educational decisions.
Bertrand Schneider (email@example.com) is a PhD student in the School of Education at Stanford University.