Daniel Koretz, an expert on educational testing and achievement, has been appointed a full professor at the Graduate School of Education (GSE), the school announced this week.
“We’re very excited to have him. The testing area has been heating up a lot recently both regionally and nationally,” said GSE spokesperson Christine Sanni.
While his research addresses measurement of educational achievement broadly, Koretz focuses in particular on so-called “high-stakes” testing, like the Massachusetts MCAS exams or the New York Regents exam. The field has been the subject of widespread controversy in recent years.
“One theme that has characterized recent educational policy is test-based accountability. It’s becoming a hotter and hotter topic every day,” he says.
Recent efforts at the state level and, under President Bush, at the national level have sought to hold teachers accountable for their students’ performance through testing, and even to hold students back unless they can pass exams. Using tests to make such consequential decisions is often referred to as “high-stakes” testing.
Koretz’s research has shown that under such “high-stakes” circumstances, test scores rise quickly and there is often a “substantial inflation” of scores.
“There are rapid rises in test scores when people are told it matters more,” he says. “[Educators are] basically saying: ‘one way or the other get the scores up.’”
He explains that teachers begin to focus more on “teaching to the test,” and not on the broader areas that the test is supposed to represent.
Koretz has studied or worked as a consultant on the assessment systems of several states, including Vermont’s alternative “portfolio” assessment program, which includes no testing.
Koretz says one of the biggest problems facing states as they try to develop assessment programs is ensuring that the results are comparable across different school districts.
Many factors besides teaching, like nutrition and parental education, affect student performance, Koretz says.
“Parental education is a strong predictor of student performance,” he says.
As a result, when education administrators evaluate student achievement, test scores can present a misleading picture of the quality of teachers in a particular school district.
Koretz says officials must take outside influences into account when they evaluate test scores; otherwise, a district may fail to improve even with good teachers.
“What if we don’t attend to those other factors holding kids back?” he asks.
This interface between policy, testing and student achievement is one of the field’s main attractions, according to Koretz.
Although he has recently been working at the RAND Corporation and the Center for Research on Evaluation, Standards, and Student Testing, Koretz says he was drawn to GSE for its emphasis on teaching.
“I wanted to get back to teaching. The faculty [in GSE] are very serious about their teaching,” he says.
Although he has not officially received his teaching assignments, Koretz says he hopes to teach two larger survey courses and several smaller, more advanced seminars. The survey courses will focus on quantitative and non-quantitative measurement of student achievement, while the seminars will be more policy-oriented.
—Staff writer Garrett M. Graff can be reached at ggraff@fas.harvard.edu.