Though most of us have never served on a graduate program admissions committee, we can still appreciate the difficulty of their task: Given a stack of qualified applicants, choose the few that you believe will succeed.
Where do you start? Perhaps you check each applicant’s GPA, or focus just on the GPA in their science classes.
Or maybe you trust the Graduate Record Examination (GRE). After all, it’s designed to measure a student’s readiness for graduate school, right?
Because reviewers differ on which metrics they trust most, it’s worth considering a scientific approach to admissions. Are there any predictor variables that actually correlate with student outcomes?
Let Me Gaze into My Crystal Ball
That’s exactly the question taken up by Joshua Hall et al. in their recent paper titled “Predictors of Student Productivity in Biomedical Graduate School Applications.” If you’re paying close attention, you’ll notice that Joshua Hall is also the co-host of the Hello PhD podcast!
Josh works with a large biomedical training program at UNC Chapel Hill, and he and his colleagues have a vested interest in choosing the very best students from a large pool of applicants. They wanted to collect data on student outcomes, and then correlate them to the screening data included in a typical application.
Would GRE scores or extensive research experience help predict laboratory success?
Of course, “success” is subjective and difficult to define or measure. Instead, they used student publications as a proxy for productivity and a tangible outcome.
Using data from 270 graduate students admitted to the biomedical program at UNC Chapel Hill between 2008 and 2010, Josh and his colleagues correlated the following five factors to students’ publication rates:
- Undergraduate GPA
- GRE Scores
- Length of prior research experience (in months)
- Letters of recommendation (standardized answers)
- On-site faculty interviews (standardized answers)
The results? Only the recommenders’ reviews had any significant correlation with student productivity. Even faculty interview scores and longer research experience did not predict which students would publish.
Listen in to this week’s episode to hear more on each factor included in the study, and why it’s such a serious indictment of the GRE and other traditional screening metrics.
With time and additional research, we may be able to screen and interview graduate students in a fairer, more consistent way. For now, you’ll still need to shell out $200+ for that GRE score.
Feeling Hot, Hot, Hot
For this week’s Science in the News, Josh brings us down with a report that the global temperature is up.
A few years ago, climate scientists observed a ‘stall’ in the predicted warming caused by increasing CO2 levels. It turns out that may have just been due to a measurement error from data collected by ships crossing the oceans.
More recent analysis using data from a fleet of “Argo Floats” shows that warming has been consistent, with the Earth breaking heat records for three years in a row. Here’s how the floats work:
This seemingly benign story of improved instrumentation now enters the realm of political theater, as politicians and pundits raise the specter of fabricated data, media bias, and political pandering. Isn’t science fun?
With all that heat, we try to stay cool with the Jai Alai IPA from Cigar City Brewing. It’s crisp and refreshing, and a great way to introduce IPA to your friends who only drink Miller High Life.
“Oh, you wanted a High Life? Sorry, I thought you said Jai Alai. My bad. Also, you’re welcome.”