Bubble sheets used for course evaluations at the end of each semester will be things of the past starting this spring.
Karen Helm, director of the University Planning Office, announced that all course evaluations will be conducted online. Helm chaired the committee that developed the new system.
Helm said the decision came from the provost’s office and is intended to help all departments of the University begin using the same system to conduct electronic evaluations.
“We are trying to develop a single system where all evaluations for the University are administered centrally,” she said. “We are trying to centralize evaluations, whereas in the past, individual departments have used their own systems for evaluations. Putting it online is a much cheaper way of doing it.”
Nina Allen, faculty senate chair, said the decision was made to save time and money for the University.
According to Allen, the faculty senate discussed the matter and concluded it would be best to test the system further and research the results of using the new process more carefully.
“Personally, I only care that the evaluations are done fairly — I don’t really care how,” Allen, a professor of plant biology, said. “But from an administrative point of view, it’s more efficient to do the evaluations electronically. It will streamline the process.” Still, Allen said the new evaluations would create their own set of potential problems.
Seven departments have already experimented with electronic evaluations, serving as beta sites for the new system during the fall 2006 semester.
Allen said the results from these evaluations should be taken into consideration as the University attempts to improve the evaluation system.
She said those results should be compared with bubble sheet evaluations of the same courses taught by the same professors.
“Some professors in the College of Management found that fewer people completed the evaluation when it was conducted online,” Allen said. “One concern is that the bubble sheets were done in one of the last classes of the semester, so everyone does them. We don’t have any knowledge of how average ratings from an online survey are going to be different.”
Allen said the motives of the students who do complete the evaluation might affect evaluation results.
“One other thing we have to consider is whether it is only going to be the grouchy students who take the time to fill out the survey,” Allen said. “It could end up skewing the ratings if only the students who dislike their professors or only the students who love their professors complete the evaluations.”
Helm agreed that getting students to fill out the new evaluations would be one of the most challenging aspects of the new system.
“We are concerned that we could suffer a drop in response rate,” Helm said. “When you have students filling out bubble sheets in class, that’s a captive audience. In an online environment, we are asking students to go home and find the time during a very busy week to fill out the evaluations on their own time.”
Allen said several different options for getting students to fill out the electronic evaluations could be considered.
She said the University might choose to hold grades for students who don’t complete their evaluations, or reward those who do by holding drawings for prizes. However, Allen said she had concerns that forcing students to complete their evaluations would have a negative effect on the results.
“We are concerned that some students might just fill in the evaluations without thinking about them,” she said. “If a student just fills in random numbers, I’d almost rather they not fill them out at all.”
Helm said that while the committee had discussed both positive and negative incentives, initial data from the seven departments that beta tested the system showed there are other ways to influence how many students complete the evaluations.
“We found that the departments where there was a lot of communication between faculty and students had a much higher response rate,” Helm said. “Communication is very, very important, and we need to communicate with students to encourage them to fill out the surveys, as they supply very important information.”
Allen said the concern about accurate evaluation results stemmed from how important the evaluations are, and noted the system may continue to change in the future.
“Student evaluation is very valuable. Professors can learn a lot from students. They are used in the faculty review and tenure review processes, so faculty members do pay attention to them. Courses can be changed to fit students’ opinions,” Allen said. “We hope the system can be tweaked as we go to make it a more accurate tool.”