This post appears here courtesy of The James G. Martin Center. The author is Robert C. Thornett.
As students and parents shop for colleges, trying to envision what they will receive in return for tens or hundreds of thousands of dollars, one of the central questions on their minds is, "How much personal attention and access to professors does this college offer?"
No simple statistic can provide an answer. At first glance, average class size seems like a good measure. But it can be misleading, as it varies widely depending on the type of class. For example, introductory STEM courses often have 200 students or more, whereas introductory writing courses tend to be capped at between 15 and 30.
Like average class size, student-faculty ratio also has the appearance of a rough gauge of personal attention. But it can be even more misleading than average class size, for several reasons. The most obvious problem is that reporting it is based on an honor system. The National Center for Education Statistics (NCES) provides a standard formula that schools are supposed to use, but colleges do not have to report the specific details of their calculations, only the outcome. Given that U.S. News & World Report has removed schools from its annual "Best Colleges" ranking numerous times over the past decade for misreporting data, and that, this past March, the dean of Temple's business school was sentenced to 14 months in prison for sending false information to the magazine, there is reason to doubt whether the honor system produces trustworthy results.
In fact, top-ranked colleges habitually cheat in the way they calculate student-faculty ratio, according to Columbia University mathematics professor Michael Thaddeus, who explored the subject in March in an 11,000-word article. Thaddeus argues that fudging their student-faculty ratios is one of several ways that the top-ten colleges, among others, knowingly misreport data. He further states that many lower-ranked colleges are punished for following the NCES guidelines.
For example, the NCES guidelines stipulate that students and faculty should not be counted in student-faculty ratio if they study or teach "exclusively in stand-alone graduate and professional programs," which confer degrees at the graduate level only. (Examples include law schools and medical schools.) By contrast, all students and faculty should be counted if they are in schools that confer both undergraduate and graduate degrees, like engineering or business schools. Thaddeus found, however, that Columbia did not follow the NCES formula: It excluded all graduate students in engineering and arts & sciences, even though those schools also teach undergraduates. Thus, while Columbia has reported a student-faculty ratio of 6:1 every year since 2008, Thaddeus calculated that its actual ratio is somewhere between 8:1 and 11:1. He notes that all of the other top-ranked colleges bend the rules in similar fashion. Thaddeus points out that Johns Hopkins' reported student-faculty ratio somehow dropped from 10:1 in 2016 to 7:1 in 2017, indicating that there must have been a dramatic change in the formula it used.
Further problems with student-faculty ratio stem from ambiguities in the NCES guidelines, which are also used by U.S. News and the government. The guidelines state that the total number of full-time equivalent (FTE) students and faculty should be calculated as the sum of a) all full-time students plus b) one-third the number of part-time students. But since colleges often define full-time status as 12 credits or more, the NCES guidelines indicate that one part-time student taking 9 or 11 credits counts the same as another taking 3 credits. Both count as one-third FTE. The first student might have three or four teachers, while the second has only one, yet both count the same in student-faculty ratio calculations. Or to give another example, students taking 12 credits count three times as much as those taking 11. These sorts of issues can result in two schools with the exact same number of students and faculty having very different student-faculty ratios. This particularly affects colleges with a higher percentage of part-time students, like commuter schools.
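The distortion described above can be made concrete with a short sketch. The school names, enrollment figures, and faculty count below are invented for illustration; only the FTE formula (full-time students plus one-third of part-time students) comes from the NCES guidelines as described here.

```python
# NCES full-time-equivalent (FTE) student count:
#   FTE = full-time students + (1/3) * part-time students
def student_fte(full_time, part_time):
    return full_time + part_time / 3

# Two hypothetical schools, each with 10,000 enrolled students
# and an identical faculty FTE of 500.
faculty_fte = 500

# School A: mostly full-time students.
a = student_fte(full_time=9000, part_time=1000)   # 9,333.3 FTE

# School B: a commuter school with many part-time students.
b = student_fte(full_time=5000, part_time=5000)   # 6,666.7 FTE

print(f"School A ratio: {a / faculty_fte:.1f}:1")  # about 18.7:1
print(f"School B ratio: {b / faculty_fte:.1f}:1")  # about 13.3:1
```

Same headcount, same faculty, yet the two schools report ratios more than five points apart, purely because of how their part-time students are discounted.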
Similar problems occur in calculating faculty FTE. The NCES guidelines specify only that full-time faculty should be counted as one FTE and part-time faculty as one-third FTE, without giving specifics about what constitutes "full-time." For example, many full-time professors teach only a few classes, with research and/or administrative duties justifying their full-time status, yet they count the same in student-faculty ratio as other full-time professors who teach four classes. Strangely, a full-time professor teaching only six credits counts three times as much as an adjunct professor teaching the exact same six credits. Adjuncts often teach anywhere from 1 to 12 credits (or more) per year, and colleges where adjuncts teach higher individual course loads will appear to have fewer faculty per student. Thus the student-faculty ratio formula rewards colleges for hiring more adjuncts and keeping their individual course loads lower, which contributes to the widespread problem of colleges treating adjuncts as disposable, undermining both job security and the stability and continuity of faculty on campus.
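The adjunct incentive works the same way. Again, the numbers below are hypothetical; the point is that the one-third weighting counts adjunct bodies, not credits taught, so spreading the same teaching load across more adjuncts flatters the ratio.

```python
# NCES faculty FTE: each full-time faculty member counts as 1,
# each part-time (adjunct) faculty member counts as 1/3,
# regardless of how many credits either actually teaches.
def faculty_fte(full_time, part_time):
    return full_time + part_time / 3

students = 9000  # hypothetical student FTE, identical at both schools

# Both schools need the same 120 adjunct-taught credits covered.
# School A spreads them across 40 adjuncts teaching 3 credits each;
# School B concentrates them in 10 adjuncts teaching 12 credits each.
a = faculty_fte(full_time=400, part_time=40)   # 413.3 faculty FTE
b = faculty_fte(full_time=400, part_time=10)   # 403.3 faculty FTE

print(f"School A ratio: {students / a:.1f}:1")  # about 21.8:1
print(f"School B ratio: {students / b:.1f}:1")  # about 22.3:1
```

Identical teaching capacity, but School A, with more adjuncts on lighter loads, reports the better ratio.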
What the above examples show is that student-faculty ratio can fluctuate in myriad ways that have nothing to do with the level of personal attention students receive, from colleges cheating in reporting data to random trends in the number or course loads of part-time students or adjunct faculty. In short, student-faculty ratio is, as Thaddeus puts it, a "blunt instrument," and one whose name suggests more relevance than it can deliver.
In the big picture, student-faculty ratio is just one of many variables colleges tinker with in the high-stakes game of rankings. This game heavily favors colleges with the available wealth and resources to invest in manipulating the statistics.
Coming only a few years after the FBI's "Operation Varsity Blues," the 2019 college-admissions bribery scandal, Thaddeus's revelations about Columbia's misreporting present a dual irony: While parents are cheating to get their kids into colleges, colleges are cheating to get top rankings. As Malcolm Gladwell observed on his blog, "Parents weren't cheating to get their kids into Boise State. They were cheating to get their kids into schools that they'd been convinced (by systems like the U.S. News rankings) were worth cheating to get into. And now at least one of the schools that parents think are worth cheating to get into, is cheating in order to be on the list of schools worth cheating to get into."
If statistics like student-faculty ratio and average class size are not reliable indicators of personal attention and access to professors, what is? The truth is that a large part of the answer to this question lies within students. Few students take full advantage of office hours, the opportunity to engage one-on-one with professors outside of class. Students often go to office hours only in a crisis, not realizing that they can go simply to discuss an interesting topic, get ideas, or clarify something from class. Katie Snyder Martin, a teacher in Jacksonville, Texas, observes:
I worked at a university for eight years, and office hours are a game changer. Students need to learn how to advocate for themselves to be successful in college. In high school, if their grades were slipping or they weren't engaging in the lessons, a teacher would often notice and intervene, call the parent, etc. An adult did the legwork to help them. But in college, a professor typically won't check the progress of their students frequently. There is minimal to zero intervention, and no parent contact, because the student is a legal adult. So the student needs to learn how to monitor their own progress and reach out for help when they need it.
Beyond office hours, colleges have writing and tutoring centers, and classes often have online chats moderated by professors and teaching assistants. A 2011 CBE: Life Sciences Education study found that students often need guidance in forming study groups, so prospective students and parents could look for ways that colleges facilitate the formation of such groups outside of class.
As Tocqueville wrote in Democracy in America, "Sentiments and ideas renew themselves, the heart is enlarged, and the human mind is developed only by the reciprocal action of men upon one another."
Nowhere is this truer than in education. In the age of mass higher-ed, using creative strategies proactively to form relationships with professors and peers is the most effective way to make even a large college smaller and to create the personal connections that students are looking for.
Robert C. Thornett has taught in seven countries and has written for Quillette, Education Next, Front Porch Republic, Solutions Journal, American Affairs, Modern Diplomacy, Earth Island Journal, and Yale e360.