Faculty surveys help determine staff promotions

This version corrects mistakes in paragraphs three and six relating to instructor status. 


Every day for two weeks, Tarah Burke checked to see if the results of the faculty course survey were available. The first-time fashion design and communication instructor was worried about receiving negative feedback that would go on her file.

Burke taught three classes last semester. In a class of almost 70, only about a dozen provided written feedback in addition to scores. The students who doled out the most negative scores tended to forgo a written response. One entire class gave no written comments at all.


A woman filling out a survey. (Courtesy Flickr)

The results of the surveys — among other measures — are used to evaluate professors for tenure positions. Whether the results change how instructors run their classes is usually up to them.

The participation rate for online faculty course surveys is consistently low. Only 21 per cent of all students completed surveys in the fall 2014 semester.

“Those assessing survey results take into account response rates and factor low response rates into the weight given to the results,” said Saeed Zolfaghari, vice-provost of faculty affairs, in an email this week.

He added that until the university can decide how to increase the low online participation rate, paper-based surveys handed out in classes will be used to evaluate pre-tenure professors.

Nick Giuffre taught management and accounting at Ryerson for 15 years until 2007. While most of his students took the process seriously and gave criticism, Giuffre didn’t think they made much of a difference.

“I don’t think anyone in authority looked at it or took it seriously to be quite frank,” he said.

He was sent the final scores and few of the comments, but never discussed them with anyone in the department.

“Beyond that, I just didn’t see anything come from the top,” Giuffre said.

Faculty chairs and directors are responsible for ensuring the survey is available for all courses and that results are sent back to instructors. But according to the Ryerson website, they’re not explicitly required to review them with staff.

Danielle Landry, who teaches in the school of disability studies, said that even with so few responses, the results can be helpful, especially to new instructors. Landry also personally requested critiques from her students near the end of the semester.

“I wanted them to not only rate my teaching but to have input in the shape of the course,” she said.

Landry modelled her approach on the graduate school environment, where regular conversation between students and faculty encourages feedback. The Yeates School of Graduate Studies consistently has a participation rate higher than the university average. It was 39 per cent last semester.

Though she sees the value in course evaluations, Landry is wary of treating teaching assessments like “consumer surveys,” where students grade a service rather than communicate with their professors about how they learn.

“It does mean something. It is meaningful,” she said. “Don’t make it personal, make it useful.”

By Shannon Clarke

This story also appeared in The Ryersonian, a weekly newspaper produced by the Ryerson School of Journalism, on Feb 4, 2015.
