Message from President Jeffrey Herbst
The start of a student’s college career has many rites: the crowded car on move-in day, the misty emotions of saying goodbye to family, orientation, starting new classes, making new friends. In recent decades, another tradition has emerged: the onslaught of college rankings.
(illustration by James Yang)
Since U.S. News & World Report introduced its college rankings in 1983, an entire industry has developed that seeks to measure, parse, and evaluate higher education. Like U.S. News, some address the entire collegiate experience, while others are more focused. Those that have emerged in recent years include Kiplinger’s, CNBC, and PayScale (graduates’ salaries), the Princeton Review (covering everything from academics and administration to the social scene), the Sierra Club (“eco-enlightened” universities), Women’s Wear Daily (best-dressed students), and a ranking of the most wired campuses. There’s even a twist in response to the college rankings craze: the online news magazine Gawker’s 25 Most Unranked Colleges in America.
Beyond our society’s obsession with rankings of all kinds, the impetus behind this industry stems from the challenge of evaluating and choosing from among thousands of colleges and universities, and from concerns about the increasing cost, as well as the value, of higher education. Choosing a college is enormously consequential, yet there is no concrete or clearly understood metric for decision making like quarterly profits or return on investment. And because families go through the process only a few times at most, there is no learning curve. Moreover, some measures are contradictory. For instance, we can be proud of a ranking that shows a significant percentage of our graduates earn high salaries, but alumni who influence thousands of lives in less-lucrative careers, such as teaching or the nonprofit sector, must be considered great successes as well.
Educators have criticized many of these rankings, and, paradoxically, the most popular, U.S. News, is also viewed by many (myself included) as among the most problematic. U.S. News seeks to compress a huge range of measures into a single number to rank enormously complex and diverse institutions. The University of Michigan, for example, is an outstanding public institution with more than 40,000 students against whom Colgate (2,900 students) competes for applicants. The difference in student body size is only one indicator of the difficulty in comparing educational experiences and opportunities between the two. Also troubling is the fact that 22.5 percent of the U.S. News ranking is derived from reputational polls sent to senior academic administrators and high school guidance counselors. Like presidents across the country, I fill out this form, but I have sufficient knowledge of only a handful of schools, so I don’t feel my evaluation can be comprehensive. Many other U.S. News indicators measure inputs rather than outputs. For instance, one indicator is financial resources per student. That could be important; however, as former Dean of the Faculty Lyle Roelofs has pointed out repeatedly, if we received a vast sum of money and simply buried it in the ground, our students would see no benefit, but our U.S. News ranking would increase. As a result of these concerns, Colgate, among many other liberal arts schools, agreed some years ago not to use the U.S. News rankings in our publicity.
At the same time, some in higher education have gone too far the other way, suggesting that it is pointless to assess what students learn and what colleges do. I disagree; in fact, I welcome careful assessment and analysis because I believe that our faculty offer a world-class education and that we have a great story to tell. That does not mean, however, that we are perfect.
My philosophy is to continually measure defined aspects of what we do, while holding ourselves accountable for our ultimate output: the value of the education we provide. My colleagues and I devote an enormous amount of attention to the National Survey of Student Engagement, which provides student feedback on Colgate’s academic and extracurricular experiences and compares it to other schools. We also participate in other surveys that measure different aspects of Colgate compared to our peers.
Last year, working with the Board of Trustees, we developed “dashboard” indicators that are “owned” by my academic and administrative senior staff and provide important information on our progress over time and, when possible, compared to other schools. Examples include class size, faculty-student interaction, percentage of students from underrepresented groups, admissions acceptance rate, endowment-per-square-foot of physical plant, cost per dollar raised through fundraising, and student-athlete GPAs. We are also developing new ways to assess how we are doing. For example, last year, we began requiring all seniors to complete a comprehensive exit survey on their Colgate experience. The first set of responses has already yielded important findings, on topics ranging from satisfaction with advising to course availability.
The reams of data we use to guide us are not as easy to digest as rankings. The process requires nuance and respect for the complexity of higher education. By assessing ourselves against our ambitions and our peers, we will be able to engage in continuous improvement. That is how we will move Colgate forward.