Methods and Cautions

When I began distributing this survey in December 2020, I could not find any similar, already published studies. Given the timing of the pandemic, that is not surprising, and a cursory review of the scholarship as I prepare the final draft of this article in late spring 2022 reveals that many scholars interested in distance education at all levels were studying the shift to online delivery during the Covid pandemic at more or less the same time as I was. The best example of a study similar to mine that I have found to date is "Emergency Move to Remote Teaching: A Mixed-Method Approach to Understand Faculty Perceptions and Instructional Practices" by Jeonghyun Lee, Farahnaz Soleimani, and Stephen W. Harmon (2021). Their study is similar in that they too were investigating faculty perceptions of and adjustments to online delivery, though they focused on faculty at a single university. They concentrated on the early stages of the shift online, during what they describe as the "Emergency Remote Teaching Environment" of March 2020—essentially, the early-Covid time frame Zimmerman suggested we should research for self-selection bias—but like my study, their work included an anonymous survey of faculty with follow-up interviews. Broadly speaking, their "... findings suggest that the extent to which faculty perceived remote teaching as easy or satisfying is closely associated with their degree of adjustment, their level of comfort with remote teaching, and whether their course was suitable for online instruction" (p. 259).

My survey, which was approved by Eastern Michigan University's IRB office, consisted of 20 questions asking participants about their experiences teaching online during the 2020-21 school year. The original survey questions are available here as a PDF. To recruit participants, I used a simple snowball sampling method: I announced the survey on my own website and on social media platforms, and I encouraged early participants to share the survey link with colleagues at their institutions who were teaching online during that school year.

As part of the consent question for participating, all participants had to indicate that they were over 18 years old and were teaching at least one college course online in the U.S. during the 2020-21 school year. The next question on the survey defined an online course, for the purposes of this survey, as one where all aspects of the course were delivered over the internet, with no face-to-face/physical classroom interactions. My goal was to exclude various hybrid modes, and my presumption was—given the conditions of the "natural experiment" brought on by Covid—that the majority of faculty who taught online during the 2020-21 school year did so against their will. One hundred and eight respondents began the survey, but three indicated they were not teaching entirely online and were excluded from the rest of the survey.

One hundred and four respondents completed all the survey questions; 67 identified as women, 28 as men, and the remaining nine either preferred not to say or identified as genderqueer/non-binary. The majority of respondents—64—said they were tenure-track faculty or the equivalent; 20 described themselves as full-time but non-tenure-track, and 20 said they were either non-tenure-track, part-time instructors or graduate assistants. Also, the majority of respondents—again, 64—said they worked at institutions that granted doctoral or master's degrees, while the remaining 40 said they worked at institutions that granted only bachelor's or associate degrees.

The last question on the survey invited subjects to participate in a follow-up interview with me, and about 70 of the participants agreed. I began conducting those interviews in January 2022, and at the time of this writing in late spring 2022, I have completed over 30 of them. While I have not yet systematically analyzed the hundreds of pages of transcripts generated from these 45- to 90-minute interviews conducted via Zoom, the responses have already helped me better understand the results of this survey. I look forward to reporting on these interviews in more detail in future scholarship.

Obviously, this is a pilot study, and this research has limitations. I put the survey together quickly because I thought it was important to begin collecting responses in the midst of the 2020-21 school year; had I been able to take more time, I certainly would have revised and refined some of my questions. The small, non-representative sample and the design of the questions (particularly the open-ended ones) made a more statistically robust analysis impossible, and these results are not generalizable beyond the survey itself. I had hoped my snowball sampling approach would increase my chances of attracting participants from a variety of academic disciplines, but based on the interviews I've conducted, almost all of the respondents were from English, Writing Studies, and the humanities more generally. In hindsight, this is not surprising, since I solicited interest on social media, where the majority of my academic "friends" or "followers" are in academic disciplines like my own. And in a way, this overall limitation is actually a strength for this venue, since Computers and Composition Online is a journal focused on the teaching of writing, both within and outside English departments, in writing programs, and in other writing-intensive venues.

So did I make some mistakes, and would I do things differently if I did this again? Absolutely. But even with these shortcomings, I believe these results are interesting and relevant, and I hope this work can serve as a starting point for similar research.