
In many ways, Web 2.0 innovations like “refspoof” are embodiments of the theories Selber describes in his chapter on critical literacy. Because content owners have chosen to take a more hands-off approach to controlling access to their data, hackers/writers are seizing opportunities to re-write content and produce any number of counter artifacts. Web 2.0 is the realization that we are no longer passive roamers of the Web, refining our search parameters and clicking links in the hope that each path yields the desired result. Web 2.0’s contribution to read-write culture has been, in part, to set a precedent for utilizing unfettered online content on a much larger scale. The right to view and use this ever-growing mass of information has been the fuel for Web 2.0. The practice of keeping content open is critical to read-write culture because, as Michael Salvo (2005) points out, “Access to the database determines who gets to speak, as well as who has the authority of the data behind the words. Design of the database determines who gets access” (p. 65). Fortunately, the design model for Web 2.0 databases has been geared predominantly toward openness.

Tim O’Reilly’s “What is Web 2.0?” (2005) is perhaps the most widely recognized account of Web 2.0’s emergence. Shortly after the dot-com bubble burst in 2001, O’Reilly noticed that the ethic of sharing data, along with the practice of involving users in development that early hackers like Stallman and Raymond had encouraged, was being embraced to a much greater degree. The landscape of the internet was changing. Instead of designing applications that blocked users from valuable data, Web 2.0 technologies promoted read-write culture by soliciting user participation. Consequently, Web 2.0 can “harness collective intelligence” from all over the internet and treat users as co-developers (O’Reilly, 2005, p. 2).

Marshalling the brain-power of many to attack a common problem is not new to hacker communities. In August 1991, Linus Torvalds, the developer of the popular open source operating system Linux, posted the first version of his project online as an invitation to what would become thousands of co-developers (Goetz, 2003, p. 164). Since that time, Torvalds has made a habit of distributing new versions of Linux’s code with unprecedented frequency and transparency. Raymond became involved with Linux in 1993 and by 1997 debuted the first version of his manifesto “The Cathedral and the Bazaar” at a Linux conference. In very practical terms, Raymond writes in one section of “The Cathedral and the Bazaar”: “Treating your users as co-developers is your least-hassle route to rapid code improvement and effective debugging” (The importance of having users). Prior to observing Torvalds manage the development of Linux, Raymond “believed that the most important software needed to be built like cathedrals, carefully crafted by individual wizards or small bands of mages working in splendid isolation, with no beta released before its time” (Cathedral and the Bazaar, 2008). In contrast, Linux blossomed thanks to a flurry of innovative ideas contributed to an ever-growing collective knowledge base, all centered on advancing Linux. Raymond remarks:

 

Linus Torvalds’ style of development – release early and release often, delegate everything you can, be open to the point of promiscuity – came as a surprise. No quiet, reverent cathedral building here – rather, the Linux community seemed to resemble a great babbling bazaar of differing agendas and approaches out of which a coherent and stable system could seemingly emerge only by a succession of miracles. (Cathedral and the Bazaar, 2008)  

 

Like the unfettered content of Web 2.0, the source code for Linux still circulates freely for anyone to download, use, or contribute to as a co-developer. Normal practice on the part of the cathedral builders would entail locking down the source code with intellectual property law, but Linux uses the law instead to promote openness and access to code. This approach to content, and the subsequent growth of a project like Linux, coincides with what O’Reilly describes as one of the primary tenets of Web 2.0: any service offered by a Web 2.0 site “automatically gets better the more people use it” (p. 2). In short, collaboration begets success.

Icons of the Web 2.0 phenomenon like Google have made their fortunes on collecting metadata from millions of web sites and serving it up in neatly refined search results. According to O’Reilly, Google is a success because it recognizes that business models, particularly approaches to content and collaboration, have evolved differently online: “Google isn’t just a collection of software tools, it’s a specialized database. Without the data, the tools are useless; without the software, the data is unmanageable” (p. 1). Google now offers a suite of Web 2.0 services to make use of that data, including its AdSense program. When a web site owner enrolls in the AdSense program, Google “serves,” or includes, advertisements on the owner’s pages; the ads are dynamically generated based on a number of variables, including the page’s content and the visitor’s geographic location. Google pays participants based on the number of clicks each advertisement generates. A recent article on nichegeek.com profiles several individuals making hundreds and even thousands of dollars a month with this Web 2.0 service. By comparison, a Web 1.0 model would see Google advertising only on its own search pages. AdSense means Google’s rich database puts more information in front of more users, including those who did not begin their browsing session at Google.com. In short, Google gives up some of its control and shares profit in order to increase its advertising reach.
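
To make the mechanics of such a service concrete, the following sketch imitates the general logic of contextual advertising in TypeScript: match an ad to keywords found on the page and to the visitor’s region, then prefer the ad that pays the most per click. Every name, data structure, and value here is a hypothetical illustration; this is not Google’s actual AdSense code or API.

// Illustrative sketch only: a toy "ad picker" that mimics the idea behind
// contextual advertising. All names and data are hypothetical; this is not
// Google's AdSense API.
interface Ad {
  text: string;
  keywords: string[];   // topics the advertiser wants to match
  regions: string[];    // geographic regions the campaign targets
  costPerClick: number; // what the advertiser pays per click, in dollars
}

const inventory: Ad[] = [
  { text: "Napa Valley wine tours", keywords: ["wine", "travel"], regions: ["US"], costPerClick: 0.45 },
  { text: "Linux hosting plans", keywords: ["linux", "hosting"], regions: ["US", "EU"], costPerClick: 0.30 },
];

// Pick the highest-paying ad whose keywords appear in the page text and
// whose campaign targets the visitor's region.
function serveAd(pageText: string, visitorRegion: string): Ad | undefined {
  const text = pageText.toLowerCase();
  return inventory
    .filter(ad => ad.regions.includes(visitorRegion))
    .filter(ad => ad.keywords.some(keyword => text.includes(keyword)))
    .sort((a, b) => b.costPerClick - a.costPerClick)[0];
}

console.log(serveAd("A weekend guide to wine country travel", "US"));
// -> the "Napa Valley wine tours" ad

The point of the sketch is simply that the publisher supplies the page and the audience while a database of advertisers supplies the content, which is the division of labor that lets Google share both control and profit.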

In addition to Google, sites like Flickr, Flock, and BitTorrent, as well as services by Yahoo!, invite users to remix, hack, or otherwise repurpose “their” collected data. Yahoo! Pipes, for example, is billed as “a powerful composition tool to aggregate, manipulate, and mashup content from around the web.” After a user creates or “composes” a Pipe, the software allows them to save it and even share it with others. My last visit to the Yahoo! Pipes page featured a Pipe for remixing Yahoo! Search results for Napa Valley wineries with photos of the same wineries posted to Flickr. This Pipe, like other Web 2.0 remixes, becomes more robust the more people use Flickr and contribute their images to it.
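
A rough sense of what such a Pipe does under the hood can be sketched in a few lines of TypeScript: take items from one feed, take items from another, and join them on a shared key. The wineries, URLs, and field names below are hard-coded, hypothetical stand-ins; an actual Pipe would pull live data from Yahoo! Search and Flickr.

// A minimal mashup sketch: join search results with photos on a shared title.
// All data here is hypothetical; a real Pipe would fetch it at run time.
interface SearchResult { title: string; url: string; }
interface Photo { title: string; imageUrl: string; }

const searchResults: SearchResult[] = [
  { title: "Stag's Leap Wine Cellars", url: "http://example.com/stags-leap" },
  { title: "Robert Mondavi Winery", url: "http://example.com/mondavi" },
];

const flickrPhotos: Photo[] = [
  { title: "Robert Mondavi Winery", imageUrl: "http://example.com/mondavi.jpg" },
];

// The "pipe": attach any matching photo to each search result.
const mashup = searchResults.map(result => ({
  ...result,
  photo: flickrPhotos.find(photo => photo.title === result.title)?.imageUrl ?? null,
}));

console.log(mashup);
// The second result carries a photo; the first gains one only when
// someone contributes a matching image.

As more users tag photos of the remaining wineries, more of the joined records fill in, which is the sense in which the remix “gets better the more people use it.”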

How are these hacks possible? At its core, Web 2.0 relies on the concept that the Web itself is our universal computing platform. That is, we are not writing and developing for Windows, Leopard, or even one of the various Linux distributions, but instead for a platform with open standards and protocols free from any one agenda. The idea is to leverage the Web as platform in order to reach more users without the typical practice of attempting to control that platform. This absence of control means that we as users or co-developers can follow O’Reilly’s advice to “design for ‘hackability’ and remixability” so that the “barriers to re-use are extremely low” (p. 4).

An excellent explanation of leveraging the Web as a platform is found in the often-cited YouTube sensation piece, “The Machine is Us/ing Us,” by anthropology professor Michael Wesch. With Web 1.0, developers used HTML to “mark up” content as a means to format and display data. Using HTML resulted in fixed content that was bound up with its form. As Wesch explains, a newer method for tagging content using Extensible Markup Language (XML) gives users the flexibility to describe the content without prescribing its form. Separating form from content means that the content can, for example, be exported via RSS feeds and aggregated as a user wishes. Wesch also reminds us that users do not need to know “complicated code” in order to participate in and use Web 2.0 technologies. With Flickr, users can post and tag images (e.g., describing an image of a Napa Valley winery), and that same image may ultimately be remixed and shared in a variety of contexts, including search results on Yahoo!. In answering his own prompt of “Who will organize all of this information?” Wesch types in the response that we all will. However, the close of his video offers a series of serious challenges to the users of Web 2.0. Wesch writes that “we will need to rethink a few things,” and his list of considerations includes “copyright,” “authorship,” and “ethics.”
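
Wesch’s distinction between describing content and prescribing its form can be illustrated with a short, admittedly simplified TypeScript sketch: the same tagged item is handed to two different renderers, one producing an HTML fragment for display and one producing an RSS-style XML entry a feed reader could aggregate. The item, its tags, and the link are invented for the example.

// Form versus content: the data below only describes an item; two separate
// renderers decide how it looks. All values are illustrative.
interface Item {
  title: string;
  tags: string[];
  link: string;
}

const items: Item[] = [
  { title: "Napa Valley winery at dusk", tags: ["napa", "winery", "travel"], link: "http://example.com/photo/1" },
];

// One possible form: an HTML list entry for a web page.
const asHtml = (item: Item): string =>
  `<li><a href="${item.link}">${item.title}</a> (${item.tags.join(", ")})</li>`;

// Another form: an RSS-style XML entry that an aggregator could consume.
const asRssItem = (item: Item): string =>
  `<item><title>${item.title}</title><link>${item.link}</link>` +
  item.tags.map(tag => `<category>${tag}</category>`).join("") +
  `</item>`;

console.log(items.map(asHtml).join("\n"));
console.log(items.map(asRssItem).join("\n"));

Because the tags travel with the content rather than with any one page layout, the same image description can surface in a Flickr gallery, a feed reader, or a Yahoo! search result without being rewritten.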
Next: Hacker Ethics