Picking courses has been a pain in my ass since I started attending Ryerson four years ago.
When I’m not battling the mass of students just trying to get onto RAMSS, I’m fighting to get into a course I need.
On countless occasions, I have been forced to take another class because the one I want fills up too quickly. I’ve then been left to my own devices trying to find the one elective I need to graduate.
Every student wants to do well in school. Students are much more likely to get that A when a course has a great professor who makes the material easy to grasp.
But semester after semester, teachers pester students to complete faculty course surveys.
And we still have no official way of seeing how well each professor scored on those evaluations.
Ryerson doesn’t offer a way for students to see how well individual faculty members are doing their jobs.
We’re left to trust our instincts, ask our friends’ opinions or check a teacher’s score on RateMyProfessors.com.
And while those aren’t the most reliable ways, they’re all we have.
At the University of Toronto, course evaluation policy dictates that the data be shared with students unless individual instructors choose not to release it for their courses.
Ryerson, on the other hand, releases only aggregated results for an entire program.
Not only is this vague, but it also poses a problem for students trying to get more information on the quality of education available at Ryerson.
And considering that most students probably forget about doing the faculty surveys available online anyway, the results aren’t accurate.
Results from faculty course surveys are never posted because of a “contractual provision” between the university and the professors, according to vice-provost of faculty affairs John Isbister.
Tara Deschamps reports on the issue in a story on page four of today’s Ryersonian.
Isbister says he personally believes that the results from these surveys should be posted for students to see, and he agrees with U of T’s system.
But even if the results are posted, he says there’s always going to be a bias in the data collected.
“It’s extremely difficult to assess good teaching, but we do try,” he said, noting that faculty are also evaluated by colleagues who sit in while they teach. “The best we can do is use more than one way.”
Like faculty course surveys, RateMyProfessors carries its own biases on everything, from how professors engage their class right down to their looks.
But for the past four years, it’s been my go-to resource for choosing teachers.
The site is generally pretty accurate, but it would be nice to get some feedback from Ryerson instead of using a website that isn’t affiliated with the school.
I have taken some classes with poorly rated professors, and there were many times when I was left pleasantly surprised by how much I learned.

I personally fill out faculty course surveys, mainly because of the possibility of winning an iPad.
However, I also believe that I owe it to professors to show how effective they were in presenting their course material.
But if they’re allowed to see their results, why can’t we?
I think it’s unfair to be asked to do faculty course surveys without seeing the outcome.
The issue could be up for debate again soon, as the professors’ contract expires in June 2015, Isbister says.
But as I’m about to graduate, it’s too little too late. The school needs to be more transparent in providing results.
I hope students will fight for these changes, even if Isbister says the contract is not a top priority.
What we have now just isn’t enough, even if it is fun to see how hot the professor you’re about to spend the semester with is.
This story was first published in The Ryersonian, a weekly newspaper produced by the Ryerson School of Journalism, on March 19, 2014.