Formation Continue du Supérieur
25 March 2012

Gaming in the American university ranking system

By William Patrick Leonard. International parents and students considering an undergraduate education in the United States frequently consult one or more of the big three ranking publications – Shanghai, QS and Times Higher Education. Because these reports emphasise research publication productivity and an institution's accompanying reputation, they tend to filter out all but the top-tier research institutions in any country.
With hundreds of high-quality yet sub-top-tier institutions in the United States, these premier reports are of limited utility to international parents and students comparing American institutions. US News and World Report's rankings have been considered a source of reliable and relevant consumer information that fills the gap.
Since its initial ranking report in the mid 1980s, US News and World Report (USN&WR) has expanded its portfolio of reports to include a selection of discipline- and professional-based rankings, as well as an array of regional reports.
In the United States it has retained its dominance as a relevant source of comparative institutional information. It is said that within hours of its annual autumn release, visits to its website jump into the millions. One can safely presume that a good number of international visitors are among them.
USN&WR’s potential appeal to parents and students appears to be the inclusion of more relevant consumer information in its mix of ranking metrics. The ‘big three’ and many of the other ranking systems that have sprung up in the past 30 years gauge institutional quality by a mix of metrics frequently calibrated to the generation of new knowledge and subsequent impact.
USN&WR appears to focus on a mix of metrics more directly reflecting the quality of an institution's academic programming and its graduates. Well over half of the available 100 points are allotted to admission rates, student-faculty ratio, freshman retention and graduation rates. New knowledge generation-weighted ranking systems tend to rely on one or more independent third parties for their metrics. An example is Thomson Reuters' Social Sciences Citation Index, which tracks publications in 2,474 major social sciences journals across 50 academic disciplines. An academic's research productivity and impact are affirmed by an impartial third-party source. USN&WR's rankings, by contrast, are substantially based on self-reported data provided by each institution. Thus, the validity of the data could raise concern.
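To make the contrast concrete, the following is a minimal sketch of how a points-based composite of this kind can be computed. The weights, metric names and numbers are purely hypothetical illustrations, not USN&WR's actual formula.

# Illustrative only: hypothetical weights and metrics, not USN&WR's methodology.
def composite_score(metrics, weights):
    """Weighted sum of normalised metrics (each scaled 0-1), expressed out of 100 points."""
    return 100 * sum(weights[k] * metrics[k] for k in weights)

# Hypothetical weighting in which student-focused measures carry over half the points.
weights = {
    "graduation_rate": 0.30,
    "freshman_retention": 0.20,
    "admission_selectivity": 0.10,
    "student_faculty_ratio": 0.10,  # assumed already inverted and normalised
    "reputation_survey": 0.30,
}

# Hypothetical institution, with each metric already normalised to 0-1.
example = {
    "graduation_rate": 0.85,
    "freshman_retention": 0.92,
    "admission_selectivity": 0.60,
    "student_faculty_ratio": 0.70,
    "reputation_survey": 0.75,
}

print(round(composite_score(example, weights), 1))  # approximately 79.4

The point of the sketch is simply that when most inputs to such a composite are self-reported, a small upward nudge to one or two metrics can shift the final score and, with it, the published rank.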
Gaming the rankings
A recent New York Times article, “Gaming the College Rankings”, identified a handful of relatively well-known undergraduate programmes that have been found to be, or have acknowledged, “…twisting the meanings of rules, cherry-picking data or just lying”.
Robert Morse, USN&WR’s director of research, is quoted as saying that Claremont McKenna College is “the highest-ranking school…to admit to misreporting”. The college acknowledged that a high-ranking officer had inflated the average SAT scores given to USN&WR over the past six years.
Gaming has also been found below the level of nationally ranked institutions. The New York Times further reported that when Iona College, a small institution in a New York City suburb, had its ranking of 30th among the northeast's regional universities reviewed against corrected data, it would have dropped to 50th. Professional schools have also been identified as gamers. In recent years the law schools at two universities, Villanova University and the University of Illinois, have admitted that they misreported selected statistics. The same New York Times article reported that Villanova conceded that its deception was intentional. Illinois did not acknowledge misrepresentation.
A soon-to-be-released book, Failing Law Schools by Brian Tamanaha, a former law school dean, reaches a similar conclusion. He describes a number of questionable ways a school can attempt to advance its ranking standing. Among the tactics gaming institutions have employed, Tamanaha cites selectively reporting admissions test results to pump up their image, hiring their own graduates on short-term contracts to inflate their employment statistics, and selectively reporting starting salaries.
There is, however, no reason to mistrust the USN&WR rankings because they partially rely on self-reported data. The publication does cross-check self-reported data against other public sources. Further, it adjusts its metrics and seeks to close loopholes on a continuing basis. The vast majority of reporting institutions do play by the rules and report accurate data. Still, international parents and potential students may want to consult other public comparison sources.
* William Patrick Leonard is vice dean of SolBridge International School of Business in Daejeon, Republic of Korea.