Use Student Surveys to Transform Your Online Classroom

Cindy Black
7 min read · Mar 30, 2022

If you’ve found this article, you’re probably looking for ways to make your online classroom or course work better for both you and your students. As many teachers and families observed during remote learning, student engagement is not guaranteed in online course delivery. Student engagement, or a student’s focus on and interest in an activity, remains a perennial topic of interest in education because it’s one of the best predictors of student achievement and satisfaction.

Young school aged boy looking at a laptop computer disinterested in remote learning virtual school class during COVID-19 quarantine.
Photo by Thomas Park on Unsplash

So, robbed of our in-person tricks like noticing how our students react and listening to their responses and academic conversations, how do teachers — some of whom are creating online courses with little training — create engaging, user-friendly online classrooms?

Ask your users.

In an online classroom, students are the end users. They are also experts in their own experiences and, once a teacher has built rapport, are typically very willing to share their opinions and experiences using a course — and any other software you’re asking them to use.

One of my favorite ways to collect students’ opinions is through user surveys.

Sticky notes next to pens.
Photo by Ravi Palwe on Unsplash

Student user surveys don’t have to be intimidating.

I’ve used student and guardian surveys as long as I’ve been a teacher. Low-stakes surveys like providing a question and sticky notes are easy to implement in an in-person classroom, while tools like Zoom polls offer similar ease during live remote classes. But to gain the full benefit of student surveys, I recommend going the slightly more complex route and leveraging the built-in data analytics available in Google Forms surveys and the built-in survey tool in learning management systems like Instructure’s Canvas.

If sticky notes, Zoom polls, and surveys via Google Forms sound a lot like formative assessment to you, you’re right — I view student user surveys as an extension of the excellent habit of asking students what they already know or think about a topic. In the case of user surveys, the topic isn’t how well students understand my content, it’s how well my course design decisions meet my users’ needs.

So how do you get started?

I use the same basic approach for in-person and online course design surveys: keep the focus small, plan my questions, use the same questions as much as possible, plan how often I’ll survey my users, and analyze feedback right away.

Tip 1: Keep the Focus Small

Ask narrow questions to get the kind of specific feedback that can help you decide which parts of your course are working and which parts need minor revision (or major overhaul).

Open-ended questions are appealing because they seem to provide space for respondents to share whatever they like. And sometimes, the results are delightful: broad questions were how I learned which read-aloud books were the biggest hits, how much my students loved studying weather, and that students’ families wanted a heavier emphasis on math fact practice.

Unfortunately, it took me almost two years of quarterly surveys to have enough data to identify these trends.

Google form asking: What was your favorite thing we did this year? What is one thing you would change?
From my 2016–2017 end-of-year survey

When I started writing narrower questions, I got results I could use immediately to improve specific aspects of my course design: students were able to give me clear advice on everything from which supplemental material they found most helpful (video demonstrations of new assignment types) to the kinds of support and check-ins they found helpful and which just annoyed them (yes to Canvas Inbox reminders about assignments they hadn’t turned in, no to Zoom Breakout Rooms for in-class work accountability).

For example

Asking “What did you find most helpful this quarter?” resulted in answers that ranged from naming specific people to “Having a place to study.”

When I thought carefully about the results I was getting from my open-ended questions, I realized that I needed to ask questions about aspects of my course I could impact: navigation, pacing, methods of instruction, and volume and types of assigned work.

Don’t throw out open-ended questions entirely.

Open-ended questions have a place in your surveys, but I’ve had the most success including them as clarifying questions that come after the narrower checkboxes, ranked lists, or radio-button multiple choice questions that target specific aspects of my course design.

Tip 2: Plan Your Questions

When I design or revise a student user survey, I now consider three things: where do I think students are struggling or excelling, which parts of my course am I most interested in revising, and what aspects of my class do students and families complain about?

I gather data on students’ struggles and victories through anecdotal means (conversations, noticing trends in Inbox questions) and by examining quantitative data like students’ scores on writing assignments and quizzes.

To determine which parts of a course might need revision, I try to walk through a course in student-view mode and notice where any pain points show up for me as an expert user; if I struggle to access something or complete a task, that’s something my students — as novices — will likely find an unnecessary challenge.

The final piece — student and family complaints — is probably easiest to collect as your students work through a course. As you notice trends in unsolicited feedback, you can craft questions to help you understand how widespread a particular point of difficulty is.

Tip 3: Use the Same Questions

Using the same questions allows you to easily compare survey results over time and quickly identify whether changes you make to your course create positive or negative experiences for your student users. One of my favorite questions to ask is: Which tools did you use and find most helpful?

This allows me to collect direct user feedback on a variety of software programs I’ve used with my middle schoolers and use that feedback to determine which tools were worth continuing to use (Kami PDF mark-up) and which should be abandoned or used less frequently.
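If you download your responses as a CSV from Google Forms, a checkbox question typically arrives as one comma-separated string per respondent. As a minimal sketch, here’s how you might tally those strings with Python’s standard library to compare the same question across two quarters — the responses below are hypothetical sample data, not my actual survey results:

```python
from collections import Counter

def tally_checkbox(responses, sep=", "):
    """Count how often each option appears in a checkbox-style
    survey question, where each response is one delimited string
    (roughly the format a Google Forms CSV export uses)."""
    counts = Counter()
    for response in responses:
        for option in response.split(sep):
            option = option.strip()
            if option:
                counts[option] += 1
    return counts

# Hypothetical answers to "Which tools did you use and find most helpful?"
quarter_1 = ["Kami, Canvas Inbox", "Kami", "Zoom Breakout Rooms, Kami"]
quarter_2 = ["Canvas Inbox", "Kami, Canvas Inbox", "Canvas Inbox"]

print(tally_checkbox(quarter_1))  # Kami mentioned most in quarter 1
print(tally_checkbox(quarter_2))  # Canvas Inbox mentioned most in quarter 2
```

Because the question wording stays identical, the two tallies are directly comparable: a tool that drops out of the counts between quarters is a candidate for retirement.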

Two raised hands in a classroom.
Photo by Artem Maltsev on Unsplash

Tip 4: Plan Survey Frequency

User surveys that are conducted too frequently annoy students and add to the considerable workload many students are already shouldering, which can result in rushed or sarcastic answers.

When I taught elementary school, I surveyed students and their families:

  • one week after any big changes (like trying out math stations for the first time or introducing flexible seating to the classroom) as this allowed me to identify problems before they became pain points,
  • at the end of major units so I knew what to do differently the next time I taught that unit and to identify unit-to-unit trends so I could make adjustments to upcoming units of study,
  • and just before long breaks so I’d have time to reflect on what was working well and what needed changing.

When I moved to middle school in August 2020, my district was 100% remote and our recommended course design followed a one-unit-per-quarter model. We would also swap student cohorts each quarter. This meant that weekly surveys probably wouldn’t provide very interesting data, but waiting until the end of the unit meant I wouldn’t be able to meet the expressed needs of the humans who had given me the feedback.

For both remote and in-person middle school teaching, I’ve found that scheduling user surveys at mid-quarter and one week before the end of a quarter works well because I have opportunities to make improvements for both short- and long-term audiences.

A laptop screen shows pie, line, and bar graphs.
Photo by Campaign Creators on Unsplash

Tip 5: Analyze Feedback Right Away

Once the survey closes, I make three passes through the results: an initial skim of key questions within two work days, a read to identify overall trends within one week, and a more careful reading within two weeks.
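Because I reuse the same questions survey after survey, the trend pass mostly amounts to comparing tallies between two administrations. A minimal sketch of that comparison, with hypothetical counts standing in for real responses:

```python
from collections import Counter

def compare_surveys(before, after):
    """Show how often each answer option shifted between two survey
    administrations (positive = mentioned more often the second time)."""
    b, a = Counter(before), Counter(after)
    return {option: a[option] - b[option] for option in set(a) | set(b)}

# Hypothetical answers to "Where did you get stuck this quarter?"
midterm = ["homepage links"] * 6 + ["finding the agenda"] * 2
end_of_quarter = ["homepage links"] * 1 + ["finding the agenda"] * 3

print(compare_surveys(midterm, end_of_quarter))
```

In this made-up example, complaints about homepage links fall sharply after a mid-quarter redesign, while confusion about finding the agenda ticks up — exactly the kind of signal that tells you whether a change helped or hurt.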

Course homepage before (left) and after (right) student feedback

Results

Like many teachers, I had to learn a new LMS when my district switched from Google Classroom to Canvas. While I had a minimum viable product ready to go at the beginning of the school year, my Canvas course had not been user-tested before it was deployed.

As a result of student user survey feedback, I redesigned my course homepage to provide clearer, more visually-appealing links to our most commonly-used internal and external sites, moved the daily agenda slide off the home page, and added a very popular feature: the tell-me-a-joke Google Form (I collect Dad jokes and puns).

Let’s Connect!

I love connecting with other technologists, learners, and teachers! Add me on LinkedIn or join me on Twitter, where I’m currently learning JavaScript in public, and I’m having a blast!

Bonus points if you share your best student user survey question with me — I’m always looking for opportunities to learn from others.


Cindy Black

I nicknamed my cat “Potato,” address my students as Fellow Humans, and usually have a ridiculous number of tabs saved in OneTab.