
Robots score essays well


The William and Flora Hewlett Foundation sponsored a competition to see how well algorithms submitted by professional data scientists and amateur statistics wizards could predict the scores that human graders assign to student essays. The winners were announced last month -- and the predictive algorithms were eerily accurate.

The competition was hosted by Kaggle, a Web site that runs predictive-modeling contests for client organizations -- thus giving them the benefit of a global crowd of data scientists working on their behalf. The site says it "has never failed to outperform a pre-existing accuracy benchmark, and to do so resoundingly."

Kaggle's tagline is "We're making data science a sport." Some of its clients offer sizable prizes in exchange for the intellectual property used in the winning models. For example, the Heritage Health Prize ("Identify patients who will be admitted to a hospital within the next year, using historical claims data") will bestow $3 million on the team that develops the best algorithm.

The essay-scoring competition that just concluded offered a mere $60,000 as a first prize, but it drew 159 teams. At the same time, the Hewlett Foundation sponsored a study of automated essay-scoring engines now offered by commercial vendors. The researchers found that these produced scores effectively identical to those of human graders.

Barbara Chow, education program director at the Hewlett Foundation, says: "We had heard the claim that the machine algorithms are as good as human graders, but we wanted to create a neutral and fair platform to assess the various claims of the vendors. It turns out the claims are not hype."

-- Randall Stross.

"A few years back, almost all states evaluated writing at multiple grade levels, requiring students to actually write," says Mark D. Shermis, dean of the college of education at the University of Akron in Ohio. "But a few, citing cost considerations, have either switched back to a multiple-choice format or dropped writing evaluation altogether."

Tom Vander Ark, chief executive of OpenEd Solutions, a consulting firm that is working with the Hewlett Foundation, says the cost of commercial essay-grading software is now $10 to $20 a student per year. But as the technology improves and the costs drop, he expects that it will be incorporated into the word processing software that all students use.

"Providing students with instant feedback about grammar, punctuation, word choice and sentence structure will lead to more writing assignments," Mr. Vander Ark says, "and allow teachers to focus on higher-order skills."

Teachers would still judge the content of the essays. That's crucial, because it's been shown that students can game software by feeding in essays filled with factual nonsense that a human would notice instantly but software could not.

Jason Tigg, a London-based member of the team that won the essay-grading competition at Kaggle, is a professional stock trader who uses very large sets of price data. "Big data is what I do at work," Mr. Tigg says. But the essay-scoring software that he and his teammates developed uses relatively small data sets and ordinary PCs -- so the additional infrastructure cost for schools could be nil.
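The engines described here all work the same basic way: fit a statistical model on essays that humans have already graded, then use it to score new ones. A minimal sketch of that idea, with a single hypothetical "essay length" feature and made-up training pairs (real engines use many more features and far richer models), might look like:

```python
# Toy regression-style essay scorer: predict a human grade from a crude
# surface feature. The feature choice and training data are hypothetical.

def word_count(essay):
    """The only feature in this sketch: number of words in the essay."""
    return len(essay.split())

def fit_line(xs, ys):
    """Least-squares fit of y ~ a + b*x over (feature, grade) pairs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b  # intercept, slope

def predict(a, b, essay):
    """Score a new essay with the fitted line."""
    return a + b * word_count(essay)

# Hypothetical training set: essays paired with their human-assigned scores.
graded = [("word " * 10, 1.0), ("word " * 20, 2.0), ("word " * 30, 3.0)]
a, b = fit_line([word_count(e) for e, _ in graded],
                [s for _, s in graded])
```

Nothing here needs special hardware, which is the point of Mr. Tigg's remark: training and scoring at this scale runs comfortably on an ordinary PC.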
