The SGA hopes to improve the University of Maryland’s course evaluation system so responses are more readily available and useful to students.

In the summer of 2014, this university implemented a commercial course evaluation system compatible with some mobile devices, which could increase response rates, said Renee Baird Snyder, coordinator of course evaluations for the Office of Institutional Research, Planning and Assessment.

Information collected through course evaluations under the new system is available to academic administrators, professors, department chairs and teaching assistants, but not to students.

Snyder said officials are building the capacity to provide the reports to students, which has been a longer process than anticipated. She said she hopes the reports will be available by next year.

Shabnam Ahmed, the Student Government Association’s vice president of academic affairs, said she believes the new system will be more efficient, but she wants to make sure students will be able to view the reports.

A student can view information on a course only if 70 percent of students in the previous class completed the evaluation. Ahmed said her committee is considering introducing legislation to lower that threshold to 50 percent.

“I really do think this new system would be really great, especially if we lower the percentage rate of people that need to post their evaluations,” said Ahmed, a junior in the College Park Scholars Global Public Health program. “But because we came from a past of it being shaky, it’s hard for people to see the potential of something new.”

In 2007, the university transitioned from a paper evaluation system to an online one. Snyder said the update was innovative then, but over time it became outdated and less efficient.

“The previous system was a strong one when we first started it,” she said. “However, to keep building the additional features our campus has wanted would have taken more resources than shifting to a commercial system.”

A survey of faculty and students found that instructors who make completing the evaluations a priority for their classes often see higher response rates, Snyder said.

Jeff Harring, a professor in the human development and quantitative methodology department, said he typically gives his students several reminders throughout the evaluation period.

He said his classes, made up mainly of graduate students, often get high response rates because he attempts to guide his students through the process.

“Encouraging faculty to take a little time out and have students do it in class … is as close to nirvana in terms of getting participation rates up,” Harring said.

Snyder and Ahmed said that regardless of the system’s efficiency, students must prioritize evaluations for them to serve the purpose of improving courses. Ahmed said the SGA and this university will have to market the system once it is fully operational.

“Course evaluations are important, and the only way they can really work is if students do it,” she said. “If we really want it to work this year, we have to do it and watch and see how it works.”