Students who fill out evaluations of instructors at the end of each term play an important role in the future of their teachers’ careers, but the figures tabulated from the results are not readily available across campus.
The University’s academic departments and schools are required by University policy to release four key questions from the questionnaire forms to students who seek the information.
But a survey of several popular departments on campus has shown that while most release the information, the format for scoring the figures varies dramatically among departments, making it difficult for students to compare their instructors’ scores.
Obtaining figures is difficult
Quantitative questionnaires are used at the end of each quarter to determine the effectiveness of instructors on campus in the eyes of students. After students fill out the forms, the results are tabulated to gauge each instructor’s performance.
Many on campus say a great deal of emphasis is placed on the results. After the figures are released to each department, faculty can be released from their positions in extreme cases.
But the availability of the figures to students is almost non-existent.
According to the 1985 Faculty Handbook, the University Assembly approved legislation May 1, 1985 that would require the results to be placed in three separate locations. The evaluations, called course reaction inventories, were to be made available in the reserve reading room of the Knight Library, in the Office of Academic Affairs, and in each individual academic department.
But in the 1993 handbook, the writing was changed to require only the academic departments to handle the figures of the evaluation and make the results available to interested students.
Lorraine Davis, vice provost for academic affairs at the University, said the results were taken out of Knight Library because of a change in budgeting and technology at the University.
“We’re anticipating trying to return to the mechanism of making them available,” she said.
Davis said University academic departments should make the figures readily available to students. But an Emerald survey of several departments around campus shows that this is not always the case.
Of the nine departments visited, seven released figures, though the formats for the figures were varied. Two departments — sociology and political science — did not release figures.
Staff members in the sociology department declined to release figures, saying they were grouped in a confusing manner.
Staff in the political science department office said they did not keep results for the four questions, and could not release the results from the rest of the evaluations, citing confidentiality concerns.
Department workers said they believed the figures were available in Knight Library, but library personnel said they have not stored the evaluation results there for several years.
Diane Bricker, an associate dean for the College of Education, said she is aware that the evaluation results for the department are not available at Knight Library, and would probably allow students to come and look at the results.
“If a student came and asked, I’m pretty sure we’d give [the results] to them,” she said.
Bricker said she believes there is not enough interest from students to look at the evaluations, and to place them in one central area may be too time-consuming. She said she believes the current system of keeping the figures in the individual departments is adequate.
Keith Richard, a University archivist and the University Senate secretary in 1993, said the figures were part of a system which included a more in-depth questioning system.
But, he said, “[the figures] were supposed to be much more systematically effective.”
Evaluations affect faculty promotions
Despite the lack of availability to students, the evaluations are important for faculty promotions.
“The strongest component [in promotion] is what the students have to say about the instructors,” said Van Kolpin, professor of economics and head of that department.
He said instructors who receive low marks on their evaluations can go through a process to help improve their teaching abilities. This can include a “face-to-face” discussion to talk about the problems an instructor may be having. But sometimes, he said, he doesn’t need the evaluation results to let him know of problems in a class.
Usually, “you catch wind of things before the class has been completed,” he said.
Davis said she believes the current process of evaluating instructors is enough.
“The mechanisms and procedures in place can provide us with sufficient information,” she said.
Raymond King, associate dean of the Charles H. Lundquist College of Business, said professors employed on a year-to-year contract may not have their contracts renewed if they receive low marks from students. Tenured professors are protected by due process, but King said tenure is not possible without high results on the evaluations.
“The student evaluations are not the only piece of evidence, but an important one,” he said.
According to the current faculty handbook, statistical data compiled from the student evaluations are placed in the permanent personnel file of the instructor being evaluated. They are then reviewed as supplementary materials in the promotion and tenure file.
Bricker said figures obtained in the evaluations may not be accurate because students may not always take the evaluations seriously. Also, she said, students have a tendency to give higher marks to professors who teach classes with less work. Therefore, professors who assign a heavier workload may receive more negative evaluations.
“Instructors sometimes suffer because of the content of their course,” she said.