Op Eds

Framing the Student Voice

By Jack W. Deschler

Last Thursday, the Undergraduate Council released a survey to the student body in an effort to gauge student opinion ahead of the upcoming faculty vote on social group sanctions. However, the survey the UC has chosen to administer is so fundamentally flawed that it will do more harm than good in actually representing student opinion. In the best case, the release of this survey is grossly irresponsible; in the worst case, it is deliberately misleading.

My first concern is that the survey’s sampling method is non-probabilistic: it was sent to every undergraduate, each of whom must opt in to take it. Since no demographic information is collected besides gender and class year, there will be no way to take the sample the survey produces and generalize from it to the overall student population. This is what pollsters call “sample frame error,” and it alone would be enough to stop any good social scientist from trying to draw conclusions from the resulting data.

More worryingly, the UC survey is vulnerable to the phenomenon of “non-response bias,” which occurs when the respondents to a survey differ fundamentally from the non-respondents. Members of unrecognized single-gender social organizations, a group that includes the UC president, Yasmin Z. Sachee ’18, likely have significantly more motivation to fill out this survey than students who are not members. Moreover, because these groups are organized, they can encourage participation among like-minded members, which suggests that any results will be skewed. Non-response bias can be mitigated in two ways: with a representative, probability-based sample, or by achieving a response rate near 100 percent.

Since this survey does not employ a probabilistic sampling method and does not collect nearly enough demographic information to generalize its results accurately, we must turn to the latter. The two most recent analogous surveys, in terms of how they were administered, are UC elections and the fall 2016 UC referendum on USGSO sanctions. UC elections have seen chronically low participation, and the fall 2016 referendum drew less than 50 percent turnout. There is no evidence that this survey will be immune to the same non-response bias.

Each way of minimizing non-response bias points to its own remedy. The UC could conceivably sample randomly from the student population instead of relying on an opt-in method. Though a 100 percent response rate still would not be reached, the fundamental difference is that both the responses and the non-responses would be randomly distributed. The current survey, by contrast, lets organized groups like USGSOs amplify their voices relative to the rest of the student body in a way that would be impossible with a random sample. The other remedy is for the College to make a question like this part of spring check-in, making it mandatory for all students and forcing a far higher participation rate. With the faculty vote scheduled for November, that is currently untenable, but it could be an effective way to measure student opinion in the future.
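To see how sharply an opt-in design can distort a result, consider a small simulation in Python. Every number in it, including the population size, the membership share, the support rates, and the response rates, is invented purely for illustration and is not drawn from any Harvard data:

import random

random.seed(0)

# Hypothetical population of 6,700 undergraduates, 20 percent of whom
# belong to a USGSO (all figures invented for this sketch).
population = [{"member": i < 1340, "supports": False} for i in range(6700)]
for person in population:
    # Assume members support the groups far more often than non-members.
    p_support = 0.90 if person["member"] else 0.40
    person["supports"] = random.random() < p_support

true_support = sum(p["supports"] for p in population) / len(population)

# Opt-in survey: assume members respond at 80 percent, non-members at 20 percent.
opt_in = [p for p in population
          if random.random() < (0.80 if p["member"] else 0.20)]
opt_in_support = sum(p["supports"] for p in opt_in) / len(opt_in)

# Simple random sample of the same size: response is unrelated to membership.
srs = random.sample(population, len(opt_in))
srs_support = sum(p["supports"] for p in srs) / len(srs)

print(f"true support:       {true_support:.2f}")
print(f"opt-in estimate:    {opt_in_support:.2f}")  # inflated by non-response bias
print(f"random-sample est.: {srs_support:.2f}")     # close to the true value

Under these made-up assumptions, the opt-in design overstates support by roughly 15 percentage points, while a random sample of the same size lands near the true figure.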

Considering that Sachee is a member of a final club, and that both she and UC Vice President Cameron K. Khansarinia ’18 concentrate in Government, a department that requires its undergraduates to learn about good survey design (through either a Statistics course or a departmental equivalent), the average student could be forgiven for viewing the survey’s methodology as a tactic to bias the results in a particular direction. I still believe it is important to assume the best of intentions. Sachee and Khansarinia’s goal is indeed admirable: The student voice must be heard in this debate. My apprehension lies not with their intentions, but with their methods.

The UC survey is so fundamentally flawed that it cannot produce the “updated pulse on what students want with regard to social groups” that it sets out to provide. The faults are so blatant that Sachee and Khansarinia even acknowledged them in a footnote to the email releasing the survey. The results might be biased toward pro-USGSO sentiment, but in a way that cannot be measured. As such, the UC should cancel the survey and look into scientifically sound ways to measure student opinion. If none can be found, then no survey should be administered. Student voice is important, but only if it is accurately represented. If the data cannot be trusted to reflect what students actually want, then it at best adds no value to the ongoing debate, and at worst it obscures the truth.

Jack W. Deschler ’19 is a joint Computer Science and Government concentrator in Cabot House. He is a member of the Committee on Student Life and the Co-Chair of the Cabot House Committee.

