[Paleopsych] CHE: Professor Develops Software to Help Grade Essays
Premise Checker
checker at panix.com
Thu Aug 4 22:39:25 UTC 2005
Professor Develops Software to Help Grade Essays
The Chronicle of Higher Education, 5.8.5
http://chronicle.com/weekly/v51/i48/48a02902.htm
By DAN CARNEVALE
Computers routinely grade multiple-choice tests, but can machines be
trusted to grade subjective exams, like a multipage essay?
Ed Brent, a professor of sociology at the University of Missouri at
Columbia, says yes, but that computers should not do it alone. He has
developed a computer program that not only grades his students' essays
but also gives feedback on how they can improve their work.
Mr. Brent assigns a paper on a specific topic -- say, a chapter on
group culture in the sociology textbook. Students then submit drafts
via a Web site. Within seconds the software corrects each paper,
assigning a score and telling the students which points they nailed
and which points need work.
The computer's grade is not final. Students are encouraged to revise
and resubmit their papers to the computer as many times as they wish.
Mr. Brent then grades the final copy the old-fashioned way. "The idea
is for them to have immediate feedback and helpful suggestions," he
says.
His computer program, called Qualrus, does not attempt to evaluate a
clever anecdote or to criticize an overuse of alliteration. In fact,
the software doesn't even bother with spelling, grammar, or
punctuation; for those, students can use the spelling and grammar
checkers built into their word-processing programs.
Mr. Brent's software looks for key words and terms to determine if the
assigned topic was covered adequately. It can evaluate the
relationship between the terms to look for logical flow and reasoned
arguments, he says.
The professor supplies a checklist of terms and concepts to the
computer program for each subject. The program simply runs through the
students' papers to see if those elements are thoroughly presented,
analyzing the semantics and assessing the writer's understanding of
the topic. If a student leaves something out or gets something wrong,
the program will flag those mistakes, to help the student improve the
next draft.
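In outline, the checklist approach described above can be sketched in a few lines of Python. This is a hypothetical illustration only -- Qualrus's internals are not public, and every concept name and keyword list below is invented:

```python
# Hypothetical sketch of checklist-based essay feedback.
# The concepts and their keyword lists are invented for illustration;
# they are not taken from Qualrus.

REQUIRED_CONCEPTS = {
    "norms": ["norm", "norms", "social rules"],
    "roles": ["role", "roles"],
    "group cohesion": ["cohesion", "solidarity"],
}

def check_essay(text):
    """Return feedback listing which required concepts the essay covers."""
    lowered = text.lower()
    feedback = []
    for concept, keywords in REQUIRED_CONCEPTS.items():
        if any(kw in lowered for kw in keywords):
            feedback.append(f"Covered: {concept}")
        else:
            feedback.append(f"Missing: {concept} -- address this in your next draft")
    return feedback

essay = "Group cohesion grows as members internalize shared norms."
for line in check_essay(essay):
    print(line)
```

A real system would go further -- the article says Qualrus also evaluates relationships between terms -- but even this simple pass can tell a student which required elements a draft never mentions.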
The program improves student learning while reducing the most tedious
aspects of grading papers, Mr. Brent says. Professors can focus on
evaluating the overall quality of each paper, he says, without having
to count concepts and terms.
Development of the Qualrus software was financed in part by a $100,000
grant from the National Science Foundation. Mr. Brent also used money
from his private company, Idea Works Inc.
A new version of the software, called SAGrader, will be ready in the
fall, he says. It will be more versatile, in that professors will be
able to plug in assignment guidelines on a wide range of subjects.
Qualrus is limited to sociology.
Mr. Brent says he hopes to sell SAGrader to other educators. He is in
talks with several institutions, book publishers, and individual
professors who have taken an interest in the program
(http://sagrader.com).
Hunting for Plagiarism
In addition to grading the content of a paper, the program can compare
similarities among papers to see whether one appears to be copied from
another. Mr. Brent says he could plug in the texts from assigned
readings as well, to make sure students do not copy word for word from
those texts.
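The article does not say which similarity measure the program uses. A common baseline for this kind of pairwise comparison is Jaccard overlap of word shingles, sketched below as a hypothetical illustration:

```python
# Hypothetical pairwise-similarity check, shown as a simple baseline.
# The article does not describe Qualrus's actual measure; Jaccard
# overlap of 5-word shingles is one common, easy-to-read approach.

def shingles(text, n=5):
    """Set of every run of n consecutive words in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=5):
    """Jaccard similarity of the two texts' n-word shingles (0.0-1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def flag_pairs(papers, threshold=0.3):
    """Return (i, j, score) for every pair of papers above the threshold."""
    flagged = []
    for i in range(len(papers)):
        for j in range(i + 1, len(papers)):
            score = similarity(papers[i], papers[j])
            if score >= threshold:
                flagged.append((i, j, score))
    return flagged
```

Checking assigned readings word for word, as Mr. Brent suggests, would just mean adding those texts to the comparison set.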
The computer program has its limitations, however. It looks only for
specific terms that the professor has programmed into it. If a student
uses different words to describe the same concepts, then the computer
could misgrade the assignment. When that happens, Mr. Brent says, the
students are usually pretty vocal about it.
"Sometimes we miss a particular synonym, and we put that in," he says.
"Most of the time, though, I agree with the program and not with
them."
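The fix-up loop he describes -- a student reports a missed synonym, the instructor adds it, and the grader recognizes it from then on -- amounts to maintaining a synonym map per concept. A hypothetical sketch, with invented names throughout:

```python
# Hypothetical sketch of the "missed synonym" workflow the article
# describes. The concept and keywords are illustrative, not Qualrus's.

concept_keywords = {"group cohesion": ["cohesion", "solidarity"]}

def covers(text, concept, keywords=concept_keywords):
    """True if the essay mentions any known keyword for the concept."""
    lowered = text.lower()
    return any(kw in lowered for kw in keywords[concept])

essay = "The team's esprit de corps deepened over the semester."

# First pass: the grader misses the student's synonym.
before = covers(essay, "group cohesion")   # False

# The instructor adds the reported synonym and regrades.
concept_keywords["group cohesion"].append("esprit de corps")
after = covers(essay, "group cohesion")    # True
```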
Students may try to get the computer program to do their work for them
-- say, by submitting a lousy paper at first just to see what they
need to do to get a passing grade. But even then, Mr. Brent says, the
students are learning the concepts. "Sure, you can play the system to
some extent," he says. "But you have to know enough to do that."
Mr. Brent had been using the computer program to grade essays in his
class for a year when he approached the university about using the
program in one of Missouri's writing-intensive courses. Those courses
require at least two major writing assignments, with students revising their
papers along the way. But first he needed the blessing of the
university's Campus Writing Program.
When Martha A. Townsend, director of the program, first heard about
Mr. Brent's ideas, she did not even return his phone calls. "My first
thoughts were skeptical -- I thought, Is this an educational
charlatan?" says Ms. Townsend, an associate professor of English. But
"Ed was very persistent," she says.
Eventually Mr. Brent persuaded her to allow him to demonstrate the
software to the program's faculty board. The board members, who come
from various departments of the university, were impressed enough to
give him the go-ahead. He has been using the computer program for a
writing-intensive course for a year now and is likely to get approval
for another year.
"It was in that demonstration process, when I finally let him in the
door, that I became convinced that he was not trying to avoid the hard
work," Ms. Townsend says. "Ed was using this computer program not as a
replacement for human feedback, but as a supplement to human
feedback."
Ms. Townsend sees the tool as a way to improve instructor-student
interaction in essay grading. "Students are getting very specific,
point-by-point responses to what they wrote," she says.
Most telling, Mr. Brent says, has been student reaction. When he
polled them, twice as many students said they preferred writing essays
for the computer as preferred taking multiple-choice tests.