Monday, January 29, 2007

More About Duke

On January 23rd I wrote to John O’Leary, Editor of the Times Higher Education Supplement concerning the data for Duke University in the 2006 world university rankings. I had already pointed out that the 2006 data appeared to be inaccurate and that, since Duke had the best score in the faculty–student section against which the others were benchmarked, all the scores in this section and therefore all the overall scores were inaccurate. There has to date been no reply. I am therefore publishing this account of how the data for Duke may have been constructed.

It has been clear for some time that the score given to Duke for this section of the rankings and the underlying data reported on the web sites of THES’s consultants, QS Quacquarelli Symonds, were incorrect and that Duke should not have been the highest scorer in 2006 on this section. Even the Duke administration has expressed its surprise at the published data. What has not been clear is how QS could have come up with data so implausible and so different from those provided by Duke itself. I believe I have worked out how QS probably constructed these data, which have placed Duke at the top of this part of the rankings so that it has become the benchmark for every other score.

In 2005 Duke University made an impressive ascent up the rankings from 52nd to 11th. This rise was due in large part to a remarkable score for faculty-student ratio. In that year Duke was reported by QS on their topgraduates site to have a total of 12,223 students, comprising 6,248 undergraduates and 5,975 postgraduates, and 6,244 faculty, producing a ratio of 1.96 students per faculty. The figure for faculty was clearly an error, since Duke itself claimed to have only 1,595 tenure and tenure-track faculty, and was almost certainly caused by someone entering the number of undergraduate students at Duke, 6,244 in the fall of 2005, into the space for faculty on the QS database. In any case someone should have pointed out that large non-specialist institutions, no matter how lavishly they are funded, simply do not have fewer than two students per faculty.
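
For what it is worth, the arithmetic is easy to check. Here is a minimal sketch using the published figures (the QS scoring formula itself is not public, so only the raw ratios are reproduced):

```python
# 2005 figures as published on QS's topgraduates site
qs_students_2005 = 12223   # 6,248 undergraduates + 5,975 postgraduates
qs_faculty_2005 = 6244     # almost certainly Duke's undergraduate enrollment entered as faculty

print(round(qs_students_2005 / qs_faculty_2005, 2))    # 1.96 students per faculty, the published ratio

# Using Duke's own count of tenure and tenure-track faculty instead
duke_faculty_2005 = 1595
print(round(qs_students_2005 / duke_faculty_2005, 2))  # 7.66, a far more plausible figure
```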

In 2006 the number of faculty and students listed on QS’s topuniversities site was not so obviously incredible and erroneous but was still quite implausible.

According to QS, in 2006 there were 11,106 students at Duke, of whom 6,301 were undergraduates and 4,805 postgraduates. It is unbelievable that a university could reduce the number of its postgraduate students by over a thousand, based on QS's figures, or about two thousand, based on data on the Duke web site, in the course of a single year.

There were in 2006, according to QS, 3,192 faculty at Duke. This is not quite as incredible as the number claimed in 2005 but is still well in excess of the number reported on the Duke site.

So where did the figures that placed Duke at the top of the faculty-student ratio component in 2006 come from? The problem evidently faced by whoever compiled the data is that the Duke site has not updated its totals of students and faculty since the fall of 2005 but has provided partial information about admissions and graduations, which appears to have been used in an attempt to estimate enrollment for the fall of 2006.

If you look at the Duke site you will notice that there is some information about admissions and graduations. At the start of the academic year of 2005 – 2006 (the “class of 2009”) 1,728 undergraduates were admitted and between July 1st, 2005 and June 30th, 2006 1,670 undergraduate degrees were conferred.

So, working from the information provided by Duke about undergraduate students, we have:

6,244 - 1,670 + 1,728 = 6,302

The QS site indicates 6,301 undergraduate students in 2006.

It seems likely that the number of undergraduates in the fall of 2006 was calculated by adding the number of admissions in the fall of 2005 (it should actually have been the fall of 2006) to the number enrolled in the fall of 2005 and deducting the number of degrees conferred between July 2005 and June 2006. The total thus obtained differs by one from the figure listed on the QS site, most probably because of a simple data entry error. The total obtained by this method would not, of course, be completely accurate, since it takes no account of students leaving for reasons other than receiving a degree. It would, however, probably not be too far off the correct number.
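
If this reconstruction is right, the calculation can be laid out in a few lines (a sketch only, using the figures quoted above from the Duke site):

```python
# Undergraduate reconstruction from figures published on the Duke site
enrolled_fall_2005 = 6244   # undergraduates enrolled in the fall of 2005
degrees_conferred = 1670    # undergraduate degrees conferred, July 2005 to June 2006
admitted_fall_2005 = 1728   # undergraduates admitted at the start of 2005-2006

estimate = enrolled_fall_2005 - degrees_conferred + admitted_fall_2005
print(estimate)             # 6302, one more than the 6,301 listed by QS for 2006
```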

The number of postgraduate students is another matter. It appears that there was a botched attempt to use the same procedure to calculate the number of graduate students in 2006. The problem, though, was that the Duke site does not indicate enrollment of postgraduate students in that year. In the fall of 2005 there were 6,844 postgraduate students. Between July 2005 and June 2006 2,348 postgraduate and professional degrees were awarded, according to the Duke site. This leaves 4,496 postgraduate students. The QS topuniversities site reports that there were 4,805 postgraduates in 2006. This is a difference of 309.

So where did the extra 309 postgraduates come from? Almost certainly the answer is provided by the online Duke news of September 6, 2006, which refers to a total of 1,687 first-year undergraduate students, comprising 1,378 entering the Trinity College of Arts and Sciences (Trinity College is the old name of Duke retained for the undergraduate school) and 309 entering the Pratt School of Engineering. The total number of admissions is slightly different from the number given on the Duke main page, but this may be explained by last-minute withdrawals or a data entry error.

So it looks like someone at QS took the number of postgraduate students in 2005, deducted the number of degrees awarded, added the students admitted to the Pratt School of Engineering in the fall of 2006, and came up with the total of 4,805 for 2006. This is way off the mark because the 309 students admitted to the School of Engineering are not postgraduates, as is evident from their inclusion in the class of 2010, and no postgraduate admissions of any kind were counted. The result is that Duke appears erroneously to have lost about 2,000 postgraduate students between 2005 and 2006.
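
Again, if this is what happened, the arithmetic is simple enough to reproduce (a sketch; the 6,844 and 2,348 come from the Duke site, the 309 from the Duke news item, and the 4,805 from QS):

```python
# Postgraduate reconstruction: degrees deducted, but the only "admissions" added back
# are the Pratt School freshmen, who are in fact undergraduates
postgrads_fall_2005 = 6844   # postgraduate students in the fall of 2005
degrees_awarded = 2348       # postgraduate and professional degrees, July 2005 to June 2006
pratt_freshmen = 309         # undergraduates entering the Pratt School of Engineering, fall 2006

print(postgrads_fall_2005 - degrees_awarded)                   # 4496
print(postgrads_fall_2005 - degrees_awarded + pratt_freshmen)  # 4805, the QS figure for 2006
```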

The undergraduate and postgraduate figures were then apparently combined on the QS site to produce a total of 11,106 students, about 1,000 fewer than QS reported in 2005 and about 2,000 fewer than Duke itself indicated for that year.

What about the number of faculty? Here, QS's procedure appears to get even dodgier. The Duke site refers to 1,595 tenure and tenure-track faculty. The QS site refers to 3,192 faculty. Where does the difference come from? The answer ought to be obvious and I am embarrassed to admit that it took me a couple of hours to work it out. 1,595 multiplied by 2 is 3,190, exactly 2 less than QS's figure. The slight difference is probably another data entry error or perhaps an earlier error of addition.

The Duke site contains a table of faculty classified according to school (Arts and Sciences, Engineering, Divinity and so on), adding up to 1,595, and then classified according to status (full, associate and assistant professors), again adding up to 1,595. It seems likely that someone assumed that the two tables referred to separate groups of faculty and added them together.
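
The apparent double counting is trivial to reproduce (again a sketch; the two-faculty discrepancy is presumably the data entry or addition error mentioned above):

```python
# Faculty reconstruction: the same 1,595 faculty apparently counted twice,
# once classified by school and once by rank
faculty_by_school = 1595
faculty_by_rank = 1595

print(faculty_by_school + faculty_by_rank)   # 3190, exactly 2 short of QS's figure of 3,192
```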

So, having reduced the number of students by omitting postgraduate admissions and doubled the number of faculty by counting them twice, QS seem to have come up with a ratio of 3.48 students per faculty. This gave Duke the best score for this part of the ranking, against which all other scores were calibrated. The standardized score of 100 should in fact have been given to Yale, assuming, perhaps optimistically, that Yale's ratio has been calculated correctly.
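
Putting the pieces together reproduces the published ratio almost exactly (a sketch only; how QS converts the raw ratio into the standardized score of 100 for the benchmark university is not reproduced here):

```python
# QS's 2006 figures for Duke and the resulting ratio
students_2006 = 6301 + 4805   # undergraduates plus the erroneous postgraduate figure
faculty_2006 = 3192           # the apparently double-counted faculty figure

print(students_2006)                            # 11106
print(round(students_2006 / faculty_2006, 2))   # 3.48 students per faculty, the benchmark ratio
```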

It follows that every score for the faculty-student ratio is incorrect and therefore that every overall score is incorrect.

If there is another explanation for the unbelievable Duke statistics I would be glad to hear it. But I think that if there is going to be a claim that an official at Duke provided information that is so obviously incorrect then the details of the communication should be provided. If the information was obtained from another source, although I do not see any way that it could have been, that should be indicated. Whatever the source of the error, someone at QS ought to have checked the score of the top university in each component and should have realized immediately that major universities do not reduce the number of their students so dramatically in a single year and keep it secret. Nor is it plausible that a large general university could have a ratio of 3.48 students per faculty.

To show that this reconstruction of QS’s methods is mistaken would require nothing more than indicating the source of the data and an e-mail address or citation by which it could be verified.

Friday, January 12, 2007

And Then There Were None

Something very odd has been going on at the University of Technology Sydney (UTS), if we can believe QS Quacquarelli Symonds, THES's consultants.

In 2005, according to QS, UTS had a faculty of 866, of whom 253 were international. The latter figure is definitely not real information but simply represents 29% of the total faculty, which is QS's estimate or guess for Australian universities in general. This should have given UTS a score of 53 for the international faculty component in the 2005 world university rankings, although the score actually given was 33. This was presumably the result of a data entry error. UTS was ranked 87th in the 2005 rankings.

In 2006, according to QS, the number of faculty at UTS increased dramatically to 1,224. However, the number of international faculty dropped to precisely zero. Partly as a result of this UTS's position in the rankings fell to 255.

Meanwhile, UTS itself reports that it has 2,576 full-time equivalent faculty.

How Long is an Extended Christmas Break?

On the 21st of December I received a message from John O'Leary, Editor of THES, saying that he had sent my questions about the world university rankings to QS and that he hoped to get back to me in the new year, since UK companies often have an extended Christmas break.

Assuming it started on December 25th, the break has now lasted for 18 days.

Monday, January 01, 2007

A Disgrace in Every Sense of the Word

That is the opinion of the Gadfly, a blog run by four Harvard undergraduates, of the THES world rankings. Here is a quotation:

"The Times Higher Education Supplement (THES) just released their global rankings, and it’s an utter scandal. Rife with errors of calculation, consistency and judgment, it is a testament not only to this ridiculous urge to rank everything but also to the carelessness with which important documents can be compiled."

The post concludes:

"One cannot help but think that the THES rankings are a British ploy to feel good about Oxford and Cambridge, the former of which is having a hard time pushing through financial reforms. Both are really universities who should be doing better, and are not. It may explain why Cambridge ups Harvard on the THES peer review, despite the fact that it lags behind Harvard under almost every other criteria, like citations per faculty, and citations per paper in specific disciplines."

Bangor is Very Naughty


Bangor University in Wales has apparently been fiddling about with its exam results in order to boost its position in university rankings (not, this time, the THES world rankings). One wonders how much more of this sort of thing goes on. Anyway, here is an extract from the report in the THES. Contrary to what many people in Asia and the US think, the THES and the Times are separate publications.


And congratulations to Sam Burnett.


Bangor University was accused this week of lowering its academic standards with a proposal to boost the number of first-class degrees it awards.

According to a paper leaked to The Times Higher, the university agreed a system for calculating student results that would mean that about 60 per cent of graduates would obtain either a first or an upper-second class degree in 2007, compared with about 52 per cent under the current system.

The paper, by pro vice-chancellor Tom Corns, says that the university's key local rival, Aberystwyth University, "awarded 6.7 per cent more first and upper-second class degrees than we did". At the time, this helped place Bangor eight positions below Aberystwyth in The Times 2005 league table of universities.

He says: "We must redress the balance with all expedition", meaning the reforms are likely to take effect for 2007 graduates rather than for the 2007 entry cohort.

The move prompted heavy criticism this week. Alan Smithers, director of the Centre for Education and Employment Research at Buckingham University, said: "Hitherto, universities have been trusted to uphold degree standards, but such behaviour calls into question the desirability of continuing to allow them free rein in awarding their own degrees. Perhaps there should be an independent regulatory body."

He suggested that a body such as the Qualifications and Curriculum Authority, which regulates schools' exam awards, could be set up for higher education.

Sam Burnett, president of Bangor student union, said that Bangor had been "very naughty".

"The issue isn't about the system that should be in place... University figures seem to have identified the quickest way to boost Bangor up the league tables and will cheapen degrees in the process. Maybe it would be easier just to add 5 per cent to everyone's scores next July."