Friday, June 27, 2008

Why are German universities so bad?

A recent post by Steve Sailer compares the Shanghai Jiao Tong and QS-Times Higher Education Supplement rankings (although he attributes the latter to the London Times, a completely different publication).

He asserts

"This Chinese list seems less chauvinistically biased than the London Times rankings I cited in tonight's VDARE article (Harvard #1 in both, but Stanford is #2 on the Chinese list vs. #19 on the English list, behind a number of obscure provincial colleges in England). Because it's a better list, it supports the point I made in VDARE even more strongly than the previous list did: that America's exclusive universities are now enormously prestigious relative to Germany's and the rest of the world's.

German colleges that would have dominated the list 100 years ago have been hit hard by sincere, leftist anti-elitism"

That the Chinese rankings are less chauvinistic than QS-THES is absolutely correct. And if he is being deliberately offensive by describing Oxford, Cambridge, Imperial College London and University College London as "obscure provincial colleges in England", I suppose I couldn't really argue.

But surely the decline of German universities began well before leftist anti-elitism appeared on the scene? Didn't it begin with the mass expulsion of Jewish students and academics after 1933?

Tuesday, June 24, 2008

Resumption of Posting

Teaching and family affairs have kept me away from this blog for a few months. I hope to start posting regularly again soon.


QS’s Greatest Hits: Part One

For the moment, it might be interesting to review some of the more spectacular errors of QS Quacquarelli Symonds Ltd (QS), the consultants who collect the data for the Times Higher Education Supplement’s (THES) World University Rankings.

During its short venture into the ranking business QS has shown a remarkable flair for error. In terms of quantity and variety they have no peers. All rankers make mistakes now and then but so far there has been nobody quite like QS.

Here is a list of what I think are QS’s ten best errors, mainly from the rankings and their book, Guide to the World’s Top Universities (2007). Most of them have been discussed in earlier posts on this blog. The date of these posts is given in brackets. There is one error, relating to Washington University in St. Louis, from last year’s rankings.

It has to be admitted that QS seem to be doing better recently. Or perhaps I have not been looking as hard as I used to. I hope that another ten errors will follow shortly.


One: Faculty Student Ratio in Guide to the World’s Top Universities (2007). (July 27, 2007; May 11, 2007)

This is a beautiful example of the butterfly effect, with a single slip of the mouse leading to literally hundreds of mistakes.

QS’s book, Guide to the World’s Top Universities, was produced at the end of 2006, after the publication of the rankings for that year, and contained data on the student faculty ratios of over 500 ranked universities. It should have been obvious immediately that something was wrong with this data. Yale is given a ratio of 34.1, Harvard 18, Cambridge 18.9 and Pretoria 590.3. On the other hand, there are some ridiculously low figures, such as 3.5 for Dublin Institute of Technology and 6.1 for the University of Santo Tomas (Philippines).

Sometimes the ratios given flatly contradict information on the same page. So, on page 127 in the FACTFILE, we are told that Yale has a student faculty ratio of 34.3. Over on the left we are informed that Yale has around 10,000 students and 3,333 faculty, which works out to a ratio of about three.

There is also no relationship between the ratios and the scores out of 100 in the THES QS rankings for student faculty ratio, something that Matt Rayner asked about, without ever receiving a reply, on QS’s topuniversities web site.

So what happened? It’s very simple. Someone slipped three rows when copying and pasting data, and every single student faculty ratio in the book, over 500 of them, is wrong. Dublin Institute of Technology was given Duke’s ratio (more about that later), Pretoria got Pune’s, RWTH Aachen got Aberystwyth’s (Wales), and so on. Altogether, over 500 errors.
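The mechanics are easy to reproduce. Here is a minimal sketch of such a slip, assuming a spreadsheet-style pair of columns; the university names are real, but every ratio is a made-up illustrative figure, not QS’s actual data:

```python
# Sketch of a copy-and-paste slip: the ratio column is pasted back three
# rows out of step with the alphabetical list of universities, so every
# university inherits the ratio of the school three rows below it.
names = ["RWTH Aachen", "Aberdeen", "Abertay", "Aberystwyth", "Adelaide",
         "Alabama", "Albany"]
ratios = [14.1, 15.3, 12.8, 16.0, 17.2, 18.4, 19.5]  # hypothetical figures

offset = 3
shifted = dict(zip(names, ratios[offset:]))

# RWTH Aachen now displays Aberystwyth's ratio, Aberdeen shows Adelaide's,
# and so on down the entire column.
assert shifted["RWTH Aachen"] == 16.0
```

With 500-odd rows, a single slip like this yields 500-odd wrong ratios in one stroke, which is exactly the pattern seen in the book.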


Two: International Students and Faculty in Malaysian Universities.

In 2004 there was great jubilation at Universiti Malaya (UM) in Malaysia. The university had reached 89th place in the THES-QS world rankings. Universiti Sains Malaysia (USM) also did very well. Then in 2005 came disaster. UM crashed 100 places, seriously damaging the Vice-Chancellor’s career, and USM disappeared from the top 200 altogether. The Malaysian political opposition had a field day blasting away at the supposed incompetence of the university leadership.

The dramatic decline should have been no surprise at all. A Malaysian blogger had already noticed that the figures for international students and faculty in 2004 were unrelated to reality. What happened was that in 2004 QS were under the impression that large numbers of foreigners were studying and teaching at the two Malaysian universities. Actually, there were just a lot of Malaysian citizens of Indian and Chinese descent. In 2005 the error was corrected, causing the scores for international faculty and students to fall precipitously.

Later, THES referred to this as “a clarification of data”, a piece of elegant British establishment obfuscation that is almost as good as “being economical with the truth”.


Three: Duke’s student faculty ratio 2005 (October 30, 2006)

Between 2004 and 2005 Duke rose dramatically in the rankings, mainly because it had been given an incredibly low student faculty ratio in the latter year, less than two students per faculty member. This was not the best ratio in the rankings; that supposedly belonged to the Ecole Polytechnique in Paris (more of that later). But it was favourable enough to give Duke a powerful boost.

The ratio was the result of a laughable error. QS listed Duke as having 6,244 faculty, far in excess of anything claimed on the university’s web site. Oddly enough, this was exactly the number of undergraduate students enrolled at Duke in the fall of 2005. Somebody had evidently copied the figure for undergraduate students and counted them as faculty, giving Duke four times the number of faculty it should have had.
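The effect of the slip on the ratio is simple arithmetic. In this sketch only the 6,244 figure comes from the rankings data; the total student count is a round, hypothetical value for illustration:

```python
# Duke's undergraduate enrollment, mistakenly entered as its faculty count.
undergrads_fall_2005 = 6244

# Hypothetical round figure for total students (undergraduate plus graduate).
total_students = 12000

# QS's faculty figure was about four times too high, so the implied true
# faculty count is roughly a quarter of the mistaken one.
true_faculty = undergrads_fall_2005 / 4

wrong_ratio = total_students / undergrads_fall_2005  # under 2 students per faculty
true_ratio = total_students / true_faculty           # four times higher

assert wrong_ratio < 2
assert true_ratio > 7
```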


Four: Duke’s student faculty ratio 2006 (December 16, 2006)

Having made a mess of Duke’s student faculty ratio in 2005, QS pulled off a truly spectacular feat in 2006 by making an even bigger mess. The problem, I suspect, was that Duke’s public relations office had its hands full with the lacrosse rape hoax and that the web site had not been fully updated since the fall of 2005.

For students, QS apparently took undergraduate enrollment in the fall of 2005, subtracted the number of undergraduate degrees awarded and added the 2005 intake. This is a bit crude, because some students leave without taking a degree, Reade Seligmann and Collin Finnerty for example, but probably not too inaccurate. Then there was a further problem: while the number of postgraduate degrees awarded was indicated on the site, there was no reference to postgraduate admissions. So QS seem to have deducted the degrees awarded and added what they thought was the number of postgraduate students admitted, 300 of them, to the Pratt School of Engineering, which is an undergraduate, not a graduate, school. Then, in a final flourish, they calculated the number of faculty by doubling the figure on the Duke site, apparently because Duke listed the same faculty twice, classified first by department and then by status.

The result was that the number of students was undercounted and the number of faculty seriously overcounted, giving Duke the best student faculty ratio for the year. Although the ratio was higher than in 2005, Duke was now in first place for this measure because QS had calculated more realistic ratios for the Ecole Polytechnique and the Ecole Normale Superieure.
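As best I can reconstruct it, QS’s apparent arithmetic looked something like the sketch below. Only the 300 Pratt admissions and the 6,244 fall-2005 undergraduate enrollment appear in the material discussed above; every other number is a hypothetical placeholder:

```python
# Hypothetical reconstruction of QS's apparent 2006 student count for Duke.
ug_enrolled_fall_2005 = 6244   # undergraduate enrollment, fall 2005
ug_degrees_awarded = 1500      # hypothetical
ug_intake_2005 = 1600          # hypothetical
pg_degrees_awarded = 1200      # hypothetical
pratt_intake = 300             # undergraduate admissions mistaken for postgraduate

students = (ug_enrolled_fall_2005 - ug_degrees_awarded + ug_intake_2005
            - pg_degrees_awarded + pratt_intake)

# Faculty: Duke's site listed the same faculty twice (by department and by
# status), and QS appears to have added the two listings together.
faculty_listed_once = 1600     # hypothetical
faculty = faculty_listed_once * 2

ratio = students / faculty     # students undercounted, faculty doubled
assert ratio < 2               # implausibly low, as in the published table
```

Whatever the exact inputs, the shape of the error is the same: every term that should raise the student count is missing or deducted, and the faculty count is doubled, so the ratio collapses.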


Five: Omission of Kenan-Flagler from the Fortune business school rankings (March 05, 2007)

On the surface this was a trivial error compared to some that QS has committed: they mixed up the business school at the University of North Carolina with that of North Carolina State University. What made it serious is that, while most American universities seem unconcerned about what QS writes or does not write about them, business schools evidently feel that more is at stake and also have considerable influence over the magazines and newspapers that publish rankings. Kenan-Flagler protested vociferously over its omission, Fortune pulled the ranking off its site, and Nunzio Quacquarelli, director of QS, explained that it was the result of a lapse by a junior employee and stated that this sort of thing had never happened before and would never happen again.


Six: "Beijing University"

China’s best, or second best, university is Peking University. The name has not been changed to Beijing University, apparently to avoid confusion with Beijing Normal University. There are also over twenty specialist universities in Beijing: Traditional Chinese Medicine, Foreign Languages, Aeronautics and so on.

In 2004 and 2005 THES and QS referred to “Beijing University”, finally correcting it to Peking University in 2006.

This was perhaps not too serious an error except that it revealed something about QS’s knowledge of its own sources and procedures.

In November 2005, Nunzio Quacquarelli went to a meeting in Kuala Lumpur, Malaysia. Much of the meeting was about the international students and faculty at UM and USM. There was apparently also a question about how Beijing University could have got such a magnificent score on the peer review while apparently producing almost no research. The correct answer would have been that QS had been trying to find research by scholars at “Beijing University”, a university that does not exist. Quacquarelli, however, answered that “we just couldn’t find the research” because Beijing University academics published in Mandarin (Kuala Lumpur New Straits Times, 20/11/05).

This is revealing because QS’s “peer review” is actually nothing more than a survey of the subscribers to World Scientific, a Singapore-based company that publishes academic books and journals, many of them Asia-orientated and mostly written in English. World Scientific has very close ties with Peking University. If Quacquarelli knew much about the company that produces his company’s survey, he would surely have known that it had a cozy relationship with Peking University and that Chinese researchers, in the physical sciences at least, do quite a lot of publishing in English.


Seven: Student faculty ratios at Yonsei and Korea universities (November 08, 2006)

Another distinguished university administrator whose career suffered because of a QS error was the head of Yonsei University. Yonsei is a rival of Korea University and was, on most measures, its equal or superior. But on the THES-QS rankings it was way behind, largely because of a poor student faculty ratio. As it happened, the figure given for Korea University was far too favourable, much better even than the ratio admitted by the university itself. This did not, however, help Jung Chang-Young, who had to resign.


Eight: Omission of SUNY – Binghamton, Buffalo and Albany

THES and QS have apologized for omitting the British universities of Lancaster, Essex and Royal Holloway. More serious is the omission of the State University of New York’s (SUNY) university centres at Buffalo, Albany and Binghamton. SUNY has four autonomous university centres which are normally treated as independent and are now usually referred to as the University at Buffalo, the University at Albany and Binghamton University. THES-QS does include the fourth, Stony Brook University, probably under the impression that it constitutes the entirety of the SUNY system. Binghamton is ranked 82nd by USNWR and 37th among public national universities (2008). It can boast several internationally known scholars, such as Melvin Dubofsky in labour history and Immanuel Wallerstein in sociology. To exclude it from the rankings while including the likes of Dublin Institute of Technology and the University of Pune is ridiculous.


Nine: Student faculty ratio at Ecole Polytechnique (September 08, 2006)

In 2005 the Ecole Polytechnique went zooming up the rankings to become the best university in continental Europe. Then in 2006 it went zooming down again. All this was because of extraordinary fluctuations in its student faculty ratio. What happened could be determined by looking at the data on QS’s topgraduate site: clicking on the rankings for 2005 led to the data used for that year (it is no longer available). There were two sets of student and faculty data for that year, evidently one containing part-time faculty and another with only full-time faculty. It seems that part-time faculty were counted in 2005 but not in 2006.


Ten: Washington University in St Louis (November 11, 2007)

This is a leading university in every respect. Yet in 2007 QS gave it a score of precisely one for citations per faculty, behind Universitas Gadjah Mada, the Dublin Institute of Technology and the Politecnico di Milano, sending it falling from 48th to 161st in the overall rankings. What happened was that QS confused it with the University of Washington (in Seattle) and credited all of Washington University’s citations to the Seattle school.