» Anecdotal evidence » Introduction

So far I've mainly discussed tools for generating statistical assessments of writing program web sites. But writing instructors, I think it's safe to say, value not just writing but written assessment of it. After all, I'd like to think that's why I spend so much time commenting on student work. Thus, one useful (if informal) kind of assessment for a writing program web site is anecdotal evidence and feedback.

Tutor Comments on the Web Site
We've received a small number of such anecdotal assessments since launching the redesigned Rutgers Writing Program web site in fall 2000, from a variety of populations: students, teachers, university administrators, tutors in our writing centers, and professionals in the field. We've also gathered this kind of feedback somewhat more formally by incorporating the web site into our writing center internship course for undergraduate tutors (see sidebar) and asking the tutors to comment on the site in the class's online forum.

But as my descriptions perhaps already make clear, anecdotal evidence is at best a minor tool for assessment. Though it remains valuable for our internal audience and can strengthen requests for funding, it cannot paint a full picture of a web site's success.

For one thing, anecdotal evidence is, well, anecdotal: it records the reactions of individual visitors without indicating the site's overall success. Indeed, one might argue that a lack of anecdotal evidence is equally a sign of a program web site's success; users, after all, rarely write when things work well and most often write when there are problems. But the reverse is just as plausible: users who have unsatisfactory experiences at a web site often leave without saying anything and never return.
