Sunday, May 13, 2007

The University of Santo Tomas

Varsitarian, the newspaper of the University of Santo Tomas (UST) in the Philippines, has published an article questioning the credibility of the THES-QS world university rankings.

The complaint appears to be valid, although the newspaper makes several errors about the rankings.

Alberto Laurito, assistant to the rector for planning and development at UST, has claimed that QS got the number of students wrong. The consultants reported 11,764 students whereas the correct number is 32,971. The university’s figure seems to be correct: an article by Guzman and Torres in the Asia Pacific Education Review reports 32,322 students in 2002-3. However, QS’s deflating of student numbers, if it were the only mistake, would work to UST’s advantage in two ways. Firstly, fewer students mean fewer students per faculty member, if the number of the latter is constant, and hence a better student–faculty ratio and a higher score on that component of the rankings. Secondly, if the number of international students is the same, fewer students overall means a higher percentage of international students.
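A quick back-of-the-envelope check illustrates the direction of both effects (a minimal sketch in Python; it holds the article’s 1,500 faculty and Laurito’s 300 international students constant purely for illustration):

```python
# Illustrative only: hold faculty and international-student counts fixed
# and vary the total student count to see which way each indicator moves.
faculty = 1500                # figure cited from the Guzman and Torres article
international_students = 300  # figure claimed by Laurito

for label, students in [("QS figure", 11764), ("UST figure", 32971)]:
    per_faculty = students / faculty                      # students per faculty member
    intl_share = 100 * international_students / students  # percentage international
    print(f"{label}: {per_faculty:.1f} students per faculty, "
          f"{intl_share:.2f}% international")
```

With the deflated total, UST appears to have fewer students per faculty member and a larger share of international students, both of which would raise its score, not lower it.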

However, this is not QS’s only error. QS reports that UST has 524 faculty, which with 11,764 students gives a student–faculty ratio of 22.45. According to the article, in 2002-3 UST had 1,500 faculty; with 32,322 students, that gives a student–faculty ratio of 21.55. QS has made two errors, and they have pretty much cancelled each other out.
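The cancellation is easy to verify from the figures quoted above (a minimal sketch; the division is all there is to it):

```python
# Student-faculty ratio implied by QS's figures versus the article's figures.
qs_ratio = 11764 / 524        # QS: 11,764 students, 524 faculty
article_ratio = 32322 / 1500  # article: 32,322 students, 1,500 faculty

print(f"QS figures:      {qs_ratio:.2f} students per faculty")       # about 22.45
print(f"Article figures: {article_ratio:.2f} students per faculty")  # about 21.55
```

Whichever pair of figures one uses, the ratio comes out at roughly 22 students per faculty member, which is why the two mistakes barely affect the score.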

Laurito then complained:

that THES-QS research on peer review was also irregular, considering that it was worth 40 per cent of the entire survey when only 1,600 universities turned in their responses or about one per cent of the 190,000 needed

The low response rate does of course invalidate the “peer review” but it was individual academics who were surveyed, not universities.

Laurito then points out that UST got a zero for research citations:

The score is obtained through a research collation database maintained by Thomson, an information-based solutions provider, called Essential Science Indicators (ESI). For every citation given to a university researcher or professor, the university would acquire a point.

The procedure is not like this at all. Laurito continues:

Based also on the survey, UST received the lowest grade on international outlook (meaning UST has no international students or faculty) when the University actually has seven international professors and 300 international students.

Again, not quite. UST gets a score from QS of 3.6 for international faculty and 0.6 for international students, representing 12 international faculty members and 47 international students.

Laurito has got the wrong end of several sticks, but the basic point remains that QS got the data for students, faculty and international students wrong.

The newspaper then quotes Laurito as saying:

We were told by the research representative (of THES-QS) that the data they used were personally given to them by a University personnel, but they were not able to present who or from what office it came from

If Laurito is reported correctly and if this is what the “research representative” told him, there is something very strange here.

If QS have a documentary record of an e-mail or a phone call to UST, how could the record not indicate the person or office involved?

If they do not, how can QS be sure that the information came from an official university source or that there was any contact at all?

2 comments:

Anonymous said...

Yes, there are flaws with the THES rankings, but so there are with any ranking system. The Shanghai Jiaotong ranking is criticized for being too science-orientated (and for following the heritage/history line), and Newsweek for being too American-orientated (it combines portions of the Jiaotong and THES rankings).

It would be good if you could write an article on the criteria needed to make a really good university ranking. You would soon find that the task is a difficult one. One possible criterion to add might be the count of mentions in popular newspapers, e.g. Newsweek or Time. Unfortunately, the choice of newspapers would itself introduce bias. Schools such as LSE might be counted even more highly than Harvard, simply because of the world’s interest in financial news. Another thing is that sometimes reputation is not really justified. For example, MIT is much more famed than Caltech, even though when it comes to citation counts, Nobel winners and so on, they are close.

Help the world improve the system. You have lamented enough about rankings; let’s do something more constructive: use your knowledge to devise a more effective ranking system.

Anonymous said...

Nice try to test my contention about honest data provision by administrators, Richard.

My page specifically on rankings is http://www.benwilbrink.nl/literature/rankings.htm.

Be sure to download the article by Van Raan (2005): http://www.cwts.nl/cwts/AvR-ShanghaiConf.pdf

Van Raan and his institute at Leiden University (no, I have no connection to my local university) now claim to be able to rank fairly. I am sceptical. The problem, of course, is not the method of ranking, but the idea of ranking itself. University teaching and academic research are not Olympic sports.

Ben Wilbrink