surveys for students: using google forms to get to know your classroom

Like any good humanist, I want as much information as I can get in any situation. (I imagine a scientist, or anyone, really, has similar desires.) And when I'm teaching, information about my students becomes invaluable as the class grows and shifts from planning to execution. 

One of the most powerful tools for gathering this information is the online, anonymous survey. Below, I'll show three examples of how I've used Google Forms to create and administer these surveys at different points in a course. But first, why use them at all?

Surveys give students a private channel of communication

Throughout my teaching career, I have often wanted to know how my students were doing individually, but also as a collective group. And as anyone who has asked a class of undergraduates "how the reading went" knows, one does not always get accurate, or honest, information when students are asked to be vulnerable in front of one another. I started using online surveys as a way to collect the information I was interested in, and was pleasantly surprised to find that students were much more open and honest when they could answer privately, sometimes anonymously, and on their own time.

Get to know your students before the course begins

This is my pre-course survey. I send it out (slightly modified for the type of course) before each class I teach, either as an assistant or instructor of record, as soon as is reasonable. This survey (as part of a "welcome to the class" on-boarding email) is the first signal I send to students that I value their voices and diversity in the class, and that I aim to be aware and inclusive of their strengths and backgrounds as I teach. I did configure this form to collect an email address with each response, and I appreciate that I can see both individual responses and the answers in aggregate.

For example, the responses I get to the "what should I know about you as a student?" question range from preferences about class activities to concerns about preparation.

After getting responses like these, I made sure to include extra resources on film materials on the course website, as well as to provide many ways for students to participate.

Most of the questions are pretty standard, but I am consistently surprised by the technology answers my students provide. I am guilty, as many of my digital humanities/studies colleagues are, of assuming access to technology. But these particular responses gave me pause, as I realized that a significant group of my students did not have smartphones, and thus my assumption that students would see an email blast with an announcement before class was in error. I made a special effort that semester to communicate information on the course site and over email with as much lead time as possible.
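If you want to do this kind of tallying yourself, Google Forms can send responses to a linked Google Sheet, which you can download as a CSV and summarize in a few lines of pandas. Here's a minimal sketch; the file name, question wording, and column labels are placeholders rather than my actual form, so adjust them to match your own survey.

```python
import pandas as pd

# Responses exported from the form's linked Google Sheet (downloaded as CSV).
# The file name and column labels are placeholders -- use the question text
# exactly as it appears in your own form.
responses = pd.read_csv("precourse_survey.csv")

# Aggregate view: how many students selected each technology option.
# Checkbox answers arrive as one comma-separated string per student,
# so split them apart before counting.
tech_counts = (
    responses["Which of these devices do you have regular access to?"]
    .str.split(", ")
    .explode()
    .value_counts()
)
print(tech_counts)

# Individual view: skim the open-ended answers one student at a time.
for answer in responses["What should I know about you as a student?"].dropna():
    print("-", answer)
```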


The most unexpected benefit of the pre-course survey is reduced nerves! I know quite a bit about my students before I ever step in front of them, which helps me feel more confident and prepared, and also gets me excited to work with them over the course of the semester. Usually, this is just the boost I need to get pumped about being in front of students again. 

Muddiest Point gives you class-by-class data

'Muddiest Point' is a technique I was first introduced to in a STEM pedagogy seminar. It's a quick feedback tool: at the end of each class meeting, students are required to submit, anonymously, what they felt was the most unclear point of the lecture or course session. This is incredibly helpful in large lecture courses where it can be hard to gauge understanding, but it gives students in classes of all sizes a way to express comprehension challenges without needing to do so publicly. If more than a handful of students express discomfort with a certain idea, that's a great sign that you might want to add an extra problem solution to the course website, or quickly review the idea in the next lecture.

In my own courses, I often handed out index cards at the end of each class meeting, and students got in the habit of filling one out before leaving. But putting a durable link on the course website can be even more effective (and you don't have to carry around or buy paper!). Here's an example of how that form would look.
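Because the form feeds a running response spreadsheet, it's also easy to check, session by session, whether more than a handful of students flagged the same idea. Here's a rough sketch, again assuming the responses have been exported to CSV from the linked Sheet; exact text matching is crude for free-text answers, so it only catches obvious repeats, but that's usually signal enough to know what to revisit.

```python
import pandas as pd

# Muddiest-point responses exported from the linked Sheet. Google Forms adds a
# "Timestamp" column automatically; the question column name below is a
# placeholder for whatever prompt you actually used.
muddy = pd.read_csv("muddiest_point.csv", parse_dates=["Timestamp"])
muddy["session"] = muddy["Timestamp"].dt.date

QUESTION = "What was the muddiest point of today's class?"
HANDFUL = 4  # worth revisiting a topic next session at this count

# Count identical (case-insensitive) answers within each class session and
# print the ones that cross the threshold.
counts = (
    muddy.assign(point=muddy[QUESTION].str.strip().str.lower())
         .groupby(["session", "point"])
         .size()
)
for (session, point), n in counts[counts >= HANDFUL].items():
    print(f"{session}: {n} students said '{point}' was unclear")
```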

I also found that it was a great, anonymous way for students to talk back to me about how the course was going. Because their names aren't attached to the feedback, students would express all kinds of ideas to me - things they liked, new examples they wanted to share but didn't get to, explanations for their low energy, or candid feedback about how they were enjoying the course. Unlike student intake surveys, I do take these with a grain of salt, but students consistently mention in my end-of-term feedback that they enjoyed this structured, private way of communicating with me. 

Check in with your students at the midpoint of the course

Midterm Student Feedback sessions were a service I performed for instructors on behalf of CRLT: I would interview instructors about how they felt a course was going, observe a section of their teaching, and facilitate an anonymous feedback session with their students. These were incredibly valuable conversations, allowing students to give feedback while there was still time for the instructor to act on it, and giving instructors a chance to open conversations and adapt their courses.

But even if you don't have access to a full, facilitated MSF session, you can still solicit feedback from your students at the midpoint of the course. Here's an example of a form I've distributed to students. And here are a few responses to an actual MSF form I distributed (this was after a facilitated session done on my behalf; these two students either chose to respond further via the form or had been absent - I didn't have attendance data for that course session, so I can't say for sure, which of course protects my students).

Many students come to my courses expecting to see 'important' 'artistic' films and instead watch Justin Bieber: Never Say Never, but I won't apologize for showing them that the same concepts that animate their high theory readings function in documentaries aimed at teen girls.

After receiving this feedback (along with the full report), I did continue doing the muddiest point exercise after every class (I hadn't been sure, with this group in particular, that it was helpful) but chose not to change my screenings. I also appreciated the answers that students gave about what they can do to improve their experience of the course. Students often know that their own investment in reading and attendance impacts their learning, but it was reassuring to know that they viewed themselves as co-creators of their learning experience. 

MSFs give me a chance to adjust my teaching - every course is different, every student is different, and making changes halfway through can really re-energize a class that's straying off course, or help to shore up effective structures even more.

But okay, what do I do with all this data?

My first, and most important, piece of advice is to honor what students share with you. Are many students saying in their muddiest point answers that they hate a certain assignment? Add a follow-up question to figure out why, and maybe address their concerns. Are your students missing background preparation for the course? Build regular units of background knowledge into your classes, or provide extra resources on the course site.

But, as with so many pedagogy tools, these surveys become much more effective once they earn students' trust. After each survey, I referenced the results in class and gave my students concrete actions I was taking to address the ideas and concerns they volunteered, including the points they flagged as muddiest. Keeping these channels of communication open makes me a more responsive instructor, and empowers my students to participate in creating the learning environment with me, and with their peers.
