Friday, November 18, 2011

Update 3 on El Naschie vs Nature

The Guardian has a substantial report on the case by Alok Jha. It seems that El Naschie believes that expert witness Neil Turok is unqualified to understand his work. It is difficult to see how this argument, even if valid, is relevant to the question of whether or not peer review took place.

Should the court decide in favour of El Naschie, it would provide some sort of justification for the methods used in the citations indicator of the Times Higher Education rankings, which gave a high score to Alexandria University largely because of the many citations of papers by El Naschie.

El Naschie is suing Nature as a result of a news article published in 2008, after the scientist's retirement as editor-in-chief of the journal Chaos, Solitons and Fractals. The article alleged that El Naschie had self-published several research papers, some of which did not seem to have been peer reviewed to an expected standard and also said that El Naschie claimed affiliations and honorary professorships with international institutions that could not be confirmed by Nature. El Naschie claims the allegations in the article were false and had damaged his reputation.
On Friday, Nature called Professor Neil Turok, a cosmologist and director of the Perimeter Institute in Canada, as an expert witness to assess some of the work published by El Naschie.
Turok described his expertise as being in cosmology. "I work at the theoretical end of cosmology … my work consists of applying unified theories, such as string theory, to the most difficult questions in cosmology, namely the beginning of the universe or the initial singularity, the moment where everything was at a single point in the conventional description."
In his evidence, Turok said he found it difficult to understand the logic in some of El Naschie's papers. The clear presentation of scientific ideas was an important step in getting an idea accepted, he said. "There are two questions – one is whether the work is clearly presented and readers would be able to understand it. It would be difficult for a trained theoretical physicist to understand [some of El Naschie's papers]. I couldn't understand it and I made a serious attempt to understand it. The second question is about the correctness of the theory and that will be decided by whether it agrees with experiments. Most theories in theoretical physics are speculative – we form a logical set of rules and deductions and we try, ultimately, to test the deductions in experiments. For me, clear presentation is the first thing in the presentation of a theory."
In response, El Naschie pointed out that even Albert Einstein had made mistakes in his publications. "Einstein is the most sloppy scientist ever. He never defined his quantities, he doesn't put in references and he made so many mistakes of mathematics and concepts. He was a very natural man when he explained something to lay people. But Einstein, whom I admire very much because he had imagination and the courage to stand up to the bloody Nazis, Einstein was an extremely sloppy man."
Later in the session, El Naschie accused Turok of having "no idea" about mathematics and being unqualified to assess his work. "If somebody doesn't understand things, it's his own limitation," El Naschie said.

Thursday, November 17, 2011

Times Higher Social Science Rankings


1.  Stanford
2.  Harvard
3.  Oxford
The Influence of Rankings

Varsity, the student newspaper at Cambridge, suggests that British universities are recruiting staff in order to improve their position in the QS rankings:


Matthew Knight, chairman of Universities HR and the University of Leeds HR director, said: “Within the context of £9,000 fees, many universities have a strategic drive to improve the quality of the student experience.
“Therefore, many are taking the opportunity to improve student staff ratios regardless of the numbers of applicants. So there’s a lot of recruitment going on at some universities, although there’s no specific pattern to this.”
As the QS World University Rankings use student-faculty ratios as the only globally comparable indicator to determine their tables, an increase in employment can be used to promote a university’s image and attract students.

Wednesday, November 16, 2011

Update 2 on El Naschie and Nature

Note that New Scientist describes El Naschie as an "independent physicist". Does this imply that he has no affiliation and that Nature was correct in questioning his claims to academic status?
Update on El Naschie and Nature

The New Scientist has provided some coverage of the trial, which is also discussed at El Naschie Watch. On November 15, this item by Chelsea Whyte appeared:


Benjamin De Lacy Costello, a materials scientist at the University of the West of England in Bristol, UK, testified yesterday that when El Naschie was editor, the peer-review process at Chaos, Solitons and Fractals was "frustrating" and unlike that of other journals.

With regard to the dispute over El Naschie's affiliations, Timothy John Pedley, former head of the department of applied mathematics and theoretical physics at the University of Cambridge, said that El Naschie was a visiting scholar with access to libraries and collaborations at the department, but was not an honorary scholar working with the privileges of a professor.

On November 16 this update appeared:
Update: Mohamed El Naschie, a former editor of the scientific journal Chaos, Solitons and Fractals, appeared in London's High Court today for the libel lawsuit he has brought against the scientific journal Nature.

El Naschie is representing himself.
During El Naschie's cross-examination of journalist Quirin Schiermeier, who wrote the 2008 article about him, Schiermeier stood by the content of the work, saying, "We wrote the article because you published 58 papers in one year in a journal where you acted as editor-in-chief. That is unusual and potentially unethical."

El Naschie responded that he felt it wasn't unheard of for journals to publish work that isn't peer-reviewed. He also said that his work had been stolen. "We published my work to secure it," he told the court. "Senior people are above this childish, vain practice of peer review."

I am not an expert, but El Naschie no longer appears to dispute that his pattern of self-publication was unusual or that there was little or no peer review. He is simply claiming that publication was necessary to preempt the theft of his work by rivals and that the absence of peer review was excused by his seniority. Whether that is inconsistent with Nature's comments is, I assume, a matter for the judge to decide.


El Naschie and Nature

The El Naschie vs Nature case is under way at the Royal Courts of Justice in London.

Briefly, Mohamed El Naschie, the former editor of the journal Chaos, Solitons and Fractals, is suing the journal Nature and the writer Quirin Schiermeier over an article commenting on the publication of many of his own papers in the journal he edited.

El Naschie is claiming that he was defamed by the suggestion that his papers were of poor quality and were published without a normal peer review process. He also claims that he had been defamed by the imputation that he had claimed academic affiliations to which he was not entitled.

The case is of vital importance to academic freedom since if successful it would mean that wealthy persons could stifle even the most balanced and temperate comments on scientific and scholarly activities.

It is also of importance to the question of international university ranking, since El Naschie's unusual self-publication and self-citation within a short period, in a field where citations are low, allowed Alexandria University to achieve an extraordinarily high score in the 2010 Times Higher Education World University Rankings. Even this year, the university had an unreasonably high score in the ranking's research impact indicator. If El Naschie were successful in his claim, then Times Higher and Thomson Reuters, who collected and analysed the data for the rankings, would be able to argue that they had uncovered a small pocket of excellence.

The case has been covered extensively in El Naschie Watch and has been discussed in the scientific press.

Updates will be provided from time to time.

Tuesday, November 15, 2011

The THE Subject Rankings

The ranking season has drawn to a close, or at least it will when we have digested the feasibility report from the European Commission's U-Multirank project. Meanwhile, to tie up some loose ends, here are the top three from each of THE's subject group rankings.

Engineering and Technology

1.  Caltech
2.  MIT
3.  Princeton

Arts and Humanities

1.  Stanford
2.  Harvard
3.  Chicago

Clinical, Pre-Clinical and Health

1.  Oxford
2.  Harvard
3.  Imperial College London

Life Sciences

1.  Harvard
2.  MIT
3.  Cambridge

Physical Sciences

1.  Caltech
2.  Princeton
3.  UC Berkeley

Social Sciences

To be posted on the 17th of November.

Monday, November 07, 2011

Conference in Shanghai

I hope to post something in a day or two on the recent World Class Universities conference in Shanghai. Meanwhile, there is an interesting comment by Alex Usher of Higher Education Strategy Associates, a Canadian consulting firm.

"In discussions like this the subject of rankings is never far away, all the more so at this meeting because its convenor, Professor Nian Cai Liu, is also the originator of the Academic Ranking of World Universities, also known as the Shanghai Rankings. This is one of three main competing world rankings in education, the others being the Times Higher Education Supplement (THES) and the QS World Rankings.

The THES and QS rankings are both commercially-driven exercises. QS actually used to do rankings for THES, but the two parted ways a couple of years ago when QS’s commercialism was seen to have gotten a little out of hand. After the split, THES got a little ostentatious about wanting to come up with a “new way” of doing rankings, but in reality, the two aren’t that different: they both rely to a considerable degree on institutions submitting unverified data and on surveys of “expert” opinion. Shanghai, on the other hand, eschews surveys and unverified data, and instead relies entirely on third-party data (mostly bibliometrics).

In terms of reliability, there’s really no comparison. If you look at the correlation between the indicators used in each of the rankings, THES and QS are very weak (meaning that the final results are highly sensitive to the weightings), while the Shanghai rankings are very strong (meaning their results are more robust). What that means is that, while the Shanghai rankings are an excellent rule-of-thumb indicator of concentrations of scientific talent around the world, the QS and THES rankings in many respects are simply measuring reputation.

(I could be a bit harsher here, but since QS are known to threaten academic commentators with lawsuits, I’ll be circumspect.)

Oddly, QS and THES get a lot more attention in the Canadian press than do the Shanghai rankings. I’m not sure whether this is because of a lingering anglophilia or because we do slightly better in those rankings (McGill, improbably, ranks in the THES’s top 20). Either way, it’s a shame, because the Shanghai rankings are a much better gauge of comparative research output, and with its more catholic inclusion policy (500 institutions ranked compared to the THES’s 200), it allows more institutions to compare themselves to the best in the world – at least as far as research is concerned. "

Some technical points. First, Times Higher Education Supplement changed its name to Times Higher Education when it converted to a magazine format in 2008.

Second, the Shanghai rankings are not entirely free from commercial pressures themselves although that has probably had the laudable effect of maintaining a stable methodology since 2003.

Third, both THE and QS accept data from institutions but both claim to have procedures to validate them. Also, the Shanghai rankings do include data from government agencies in their productivity per capita criterion and in some places that might not be any more valid than data from universities.

Fourth, until recently there was a significant difference in the expert opinion used by THE and by QS. Most of QS's survey respondents were drawn from the mailing lists of the Singapore- and London-based academic publisher World Scientific, while THE's are drawn from those who have published papers in the ISI indexes. All other things being equal, we would expect THE's respondents to be more expert. This year the difference has been reduced somewhat, as QS are getting most of their experts from the Mardev lists supplemented by a sign-up facility.

Fifth, although THE publish a list of 200 universities in print and on their site, there is an easily downloadable iPhone app that lists 400 universities.

The most important point, though, is the question of consistency. It is quite true that the various indicators in the Shanghai rankings correlate quite closely or very closely with one another (.46 to .90 in 2011, according to a conference paper by Ying Chen and Yan Wu of the Shanghai Center for World-Class Universities), while some of those in the QS and THE rankings have little or no relation to one another. However, it could be argued that if two indicators correlate highly then they are to some extent measuring the same thing and one of them is redundant. Still, that is probably better than indicators which statistically have little to do with one another.

What is more important perhaps is the consistency from one year to another. The main virtue of the Shanghai rankings is that changes in position can be assumed to reflect actual real world changes whereas those in the THE and QS rankings could easily be the result of methodological changes or, in the case of THE, omissions or inclusions.
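The point about indicator correlations can be illustrated with a toy calculation. The numbers below are invented for illustration; the point is only that highly correlated indicators make a composite score robust to weighting choices, while weakly correlated ones make the weights themselves decisive.

```python
import statistics

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two score lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical indicator scores for five universities (illustrative only).
publications = [90, 70, 50, 30, 10]
awards       = [85, 75, 45, 35, 15]   # tracks publications closely
reputation   = [20, 90, 30, 80, 40]   # barely related to either

# Near-redundant pair: any weighting gives much the same final ordering.
print(round(pearson(publications, awards), 2))
# Weakly related pair: the chosen weights largely determine the result.
print(round(pearson(publications, reputation), 2))
```

With correlations near .99 in the first case and near zero in the second, shifting weight between the first pair barely reorders the table, while shifting weight between the second pair reorders it drastically.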

Friday, October 28, 2011

An Error

This replaces an earlier post.

Last year Times Higher Education admitted to an error involving Monash University and the University of Adelaide

Also, after the launch of the World University Rankings 2010 it became apparent that, owing to a data processing error, the ranking positions of two Australian universities in the top 200 list were incorrect — the University of Adelaide and Monash University.

Both universities remain in the top 1 per cent of world universities.


This year, a representative of Adelaide commented on the error: 


Adelaide's DVCR Mike Brooks said it had been "disconcerting'' that there had been a data processing error last year in the first iteration of the revised rankings since their split from QS. "It certainly raises further questions about the credibility of the rankings,'' Professor Brooks said.

"Based on our own analysis we believe that we have a similar ranking this year to that of 2010. The shift in position is attributed to the error in the processing last year, ongoing changes in THE methodology and increased competition.''

"I think the students and the wider community are able to judge for themselves.  As South Australia's leading research-university and only member of the Group of Eight, I know that we are in an incredibly strong position for the future.''

Adelaide's fall seems to have been due very largely to a massive fall in its score for research impact. How much of this was due to the correction of the 2010 error, how much to changes in THE methodology and how much to the inherent instability of the normalisation procedure is not clear.

Monday, October 17, 2011

GLOBAL: Despite ranking changes, questions persist 

My article on the Times Higher Education World University Rankings can be accessed at University World News.

The international university ranking scene is starting to look like the heavyweight boxing division. Titles are proliferating and there is no longer an undisputed champion of the world. Caltech has just been crowned top university by Times Higher Education and Thomson Reuters, their data collectors and analysts, while QS have put Cambridge in first place. Over at Webometrics, MIT holds the number one spot. But Harvard has the consolation of remaining the top university in the Scimago and HEEACT rankings as well as the Academic Ranking of World Universities, ARWU, published by Shanghai Jiao Tong University.

Read here

Sunday, October 09, 2011

Rising Stars of the THE - TR Rankings

These are some of the universities that have risen significantly in the rankings compared to last year.

Europe

Karolinska Institute
Munich
LSE
Zurich
Leuven
Wageningen
Leiden
Uppsala
Sheffield
Humboldt

USA

UC Davis
Minnesota
PennState
Michigan State

Australia

Monash

Asia

Osaka
Tohoku
Caltech in First Place

The big news of the 2011 THE - TR rankings is that Caltech has replaced Harvard as the world's top university. So how exactly did they do it?

According to the Times Higher iPad apps for this year and last (easily downloadable from the rankings page), Harvard's total score fell from 96.1 to 93.9 and Caltech's from 96.0 to 94.8, turning a 0.1 Harvard lead into one of 0.9 for Caltech.

Harvard continued to do better than Caltech in two indicators, with 95.8 for teaching and 67.5 for international orientation, compared with 95.7 and 56.0 for Caltech.

Caltech is much better than Harvard in industry income (innovation), but that indicator has a weighting of only 2.5%.

Harvard's slight lead in the research indicator has turned into a slight lead of 0.8 for Caltech.

Caltech is still ahead for citations but Harvard caught up a bit, narrowing the lead to 0.1.

So it seems that what made the difference was the research indicator. It seems unlikely that Caltech could overcome Harvard's massive lead in reputation for research and postgraduate teaching: last year it was 100 compared with 23.5. That leaves us with research income per faculty.
 
According to Phil Baty:

"Harvard reported funding increases that are similar in proportion to those of many other universities, whereas Caltech reported a steep rise (16 per cent) in research funding and an increase in total institutional income."

This seems generally compatible with Caltech's 2008-2009 financial statement according to which:

Before accounting for investment losses, total unrestricted revenues increased 6.7% including JPL, and 14.0% excluding JPL

and

Research awards in FY 2009 reached an all-time high of $357 million, including $29 million of funds secured from the federal stimulus package. Awards from federal sponsors increased by 34.4%, while awards from nonfederal sponsors increased by 20.7%.  We also had a good year in terms of private giving, as donors continue to recognize the importance of the research and educational efforts of our outstanding faculty and students.

It seems that research income is going to be the tie-breaker at the top of the THE - TR rankings. This might not be such a good thing. Income is an input, not a product, although universities everywhere apparently think so. There will be negative backwash effects if academics devote their energies to securing grants rather than actually doing research.
Update on Alexandria

El Naschie Watch reports that Hend Hanafi, President of Alexandria University, has resigned following prolonged student protests.

Apparently she was under fire because of her links to the old regime but one wonders whether her university's apparent fall of nearly 200 places in the THE - TR rankings gave her a final push. If so, we hope that Times Higher will send a letter of apology for unrealistically raising the hopes of faculty and students. 
Meanwhile over in Alexandria
One of the strangest results of the 2010 THE - TR rankings was the elevation of Alexandria University in Egypt to the improbable status of fourth best university in the world for research impact and 147th overall. It turned out that this was almost entirely the work of precisely one marginal academic figure, Mohamed El Naschie, former editor of the journal Chaos, Solitons and Fractals, whose work was copiously cited by himself, by other authors in his journal and by those in an Israeli-published journal (now purchased by De Gruyter) of which he was an editor.

The number of citations collected by El Naschie was not outrageously high, but it was much higher than usual for his discipline, and many of them came within a year of publication. This meant that El Naschie and Alexandria University received massive credit for his citations, since Thomson Reuters' normalisation system compared them with the international average in a field where citations are low, especially in the first year after publication.

Alexandria was not the only university to receive an obviously inflated score for research impact. Hong Kong Baptist University received a score of 97.6 and Bilkent one of 95.7, although in those two cases it seems that the few papers that contributed to these scores did have genuine merit.

It should be remembered that the citation scores were averages and that a few highly cited papers could have a grossly disproportionate effect if the total number of published papers was low.
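The arithmetic behind this is worth spelling out. The figures below are invented, but they show how an averaged citation indicator lets a couple of outlier papers at a small institution swamp a large, consistently cited output elsewhere.

```python
# Average citations per paper for two hypothetical universities.

# University A: large output, solidly cited throughout.
univ_a = [10] * 1000              # 1,000 papers, 10 citations each

# University B: tiny output, two extreme outliers.
univ_b = [2] * 48 + [400, 400]    # 50 papers, two with 400 citations

avg_a = sum(univ_a) / len(univ_a)   # 10.0
avg_b = sum(univ_b) / len(univ_b)   # (96 + 800) / 50 = 17.92

# On the averaged indicator, B "outperforms" A, even though 48 of
# its 50 papers are barely cited at all.
print(avg_a, avg_b)
```

The smaller the denominator, the more violently a single highly cited paper moves the average, which is exactly the mechanism at work in the Alexandria case.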

This year Thomson Reuters went to some lengths to reduce the impact of a few highly cited papers. They have to some extent succeeded. Alexandria's score for citations is down to 61.4 (it is in 330th place overall), Bilkent's to 60.8 (222nd place overall) and HKBU's to 59.7 (290th place overall).

These scores are not as ridiculous as those of 2010, but they are still unreasonable. Are we really expected to believe that these schools have a greater research impact than the University of Sydney, Kyoto University, the London School of Economics, Monash University and Peking University, all of which have scores in the fifties for this indicator?

I for one cannot believe that a single paper or a few papers, no matter how worthwhile, can justify inclusion in the top 300 world universities.

There is another problem. Normalisation of citations by year is inherently unstable. One or two papers in a low citation discipline cited within a year of publication will give a boost to the citations indicator score but after a year their impact diminishes because the citations are now coming more than a year after publication.
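This instability can be made concrete with a sketch of baseline normalisation. The baselines below are invented, and the function is not Thomson Reuters' actual formula, only the general shape of such indicators: a paper's citations are divided by the expected count for its field and age.

```python
# Illustrative expected-citation baselines (invented numbers) for a
# low-citation field, keyed by years elapsed since publication.
expected_cites = {0: 0.2, 1: 1.0, 2: 2.5}

def normalised_impact(actual_cites, years_since_publication):
    """Citations relative to the field/year baseline: a sketch of
    baseline-normalised impact, not any ranker's exact formula."""
    return actual_cites / expected_cites[years_since_publication]

# Ten citations in the publication year, against a baseline of 0.2,
# score as 50 times the field average...
print(normalised_impact(10, 0))

# ...but the same ten citations judged against the year-1 baseline
# shrink to 10 times the average.
print(normalised_impact(10, 1))
```

So an institution's score can fall fivefold from one year to the next with no change in the underlying papers, purely because the comparison baseline grows as the papers age.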

Alexandria's score was due to fall anyway: El Naschie has published very little lately, so his contribution to the citations score would have fallen whatever methodological changes were introduced. And if he ever starts publishing again?

Also, if Thomson Reuters are normalising by field across the board, this raises the possibility that universities will be able to benefit simply by reclassifying research grants, moving research centres from one field to another, manipulating abstracts and keywords, and so on.

Friday, October 07, 2011

Who else is down ?

Just looking at the top 200 of the THE rankings, these universities have fallen quite a bit.

University of North Carolina Chapel Hill
Sydney
Ecole Normale Superieure
Ecole Polytechnique
Trinity College Dublin
University College Dublin
William and Mary College
University of Virginia
Asian Decline?

The Shanghai rankings have shown that universities in Korea, China (including Taiwan and Hong Kong) and the Middle East  have been steadily advancing over the years. Did they get it wrong?

The latest Times Higher Education - Thomson Reuters rankings appear to prove that Asian universities have inexplicably collapsed over the last year. Tokyo has dropped from 26th to 30th place. Peking has fallen twelve places to 49th. Pohang University of Science and Technology and the Hong Kong University of Science and Technology have slipped out of the top fifty. Bilkent and Hong Kong Baptist University are way down. The decline of the University of Science and Technology of China is disastrous, from 49th to 192nd. Asian universities are going to be dangerous places for the next few days, with students and teachers dodging university administrators jumping out of office windows.

Of course, massive declines like this do not reflect reality: they are simply the result of the methodological changes introduced this year. 

Anyone accessing a ranking site or downloading an iPad app should be made to click on a box reading "I understand that the methodological changes in the rankings mean that comparison with last year's ranking is pointless and I promise not to issue a public statement or say anything to anyone until I have had a cup of tea and I have made sure that everybody else understands this."

Thursday, October 06, 2011

New Arrivals in the THE Top 200.

Every time a new ranking is published there are cries for the dismissal or worse of vice-chancellors or presidents who allowed their universities to lose ground. There will no doubt be more demands as the results of this year's THE rankings are digested. This will be very unjust since there are reasons why universities might take a tumble that have nothing to do with any decline in quality.

First, Thomson Reuters, THE's data collectors, have introduced several methodological changes. In the top 20 or 30 these might not mean very much, but lower down the effect could be very large.

Second, rankers sometimes make mistakes, and so do those who collect data for institutions.

Third, many new universities have taken part this year. I counted thirteen just in the top 200, and there are certainly many more in the 200s and 300s. A university ranked 200th last year would lose 13 places even if it had exactly the same relative score.

The thirteen newcomers are Texas at Austin, Rochester, the Hebrew University of Jerusalem, the University of Florida, Brandeis, the Chinese University of Hong Kong, Nijmegen, the Medical University of South Carolina, Louvain, Université Paris Diderot (Paris VII), Queen's University (Canada), Sao Paulo and Western Australia.
Highlights of the THE rankings

Some interesting results.

57.  Ohio State University
103.  Cape Town
107.  Royal Holloway
149.  Birkbeck
184.  Iowa State
197.  Georgia Health Sciences University
201-225.  Bilkent
201-225.  University of Medicine and Dentistry of New Jersey
226-250.  Creighton University, USA
226-250.  Tokyo Metropolitan
251-275.  Wayne State
276-300.  University of Crete
276-300.  University of Iceland
276-300.  Istanbul Technical University
276-300.  Queensland University of Technology
276-300.  Tokyo Medical and Dental University
301-350.  Alexandria
301-350.  Aveiro University
301-350.  Hertfordshire
301-350.  Plymouth University, UK
301-350.  Sharif University of Technology
301-350.  National University of Ireland, Maynooth
301-350.  Taiwan Ocean University
301-350.  Old Dominion University, USA

Wednesday, October 05, 2011

THE Rankings Out


Here is the top 10.

1. Caltech
2. Harvard
3. Stanford
4. Oxford
5. Princeton
6. Cambridge
7. MIT
8. Imperial College London
9. Chicago
10. Berkeley
THE Rankings: Caltech Ousts Harvard

This is from the Peninsula in Qatar


LONDON: US and British institutions once again dominate an annual worldwide league table of universities published yesterday, but there is a fresh name at the top, unseating long-time leader Harvard.
California Institute of Technology (Caltech) knocked the famous Massachusetts institution from the summit of the Times Higher Education (THE) league table for the first time in eight years, with US schools claiming 75 of the top 200 places.
Next is Britain, which boasts 32 establishments in the top 200, but an overhaul in the way in which the country’s universities are funded has raised concerns over its continuing success.
Asia’s increasing presence in the annual table has stalled, with 30th placed University of Tokyo leading the continent’s representation.
China’s top two universities hold on to their elite status, but no more institutions from the developing powerhouse managed to break into the top 200.
THE attributed Caltech’s success to “consistent results across the indicators and a steep rise in research funding”.
THE Rankings

Caffeineblogging

The Guardian appears to have heard something.

On Thursday, the Times Higher Education publishes its global universities rankings. As usual, UK universities shine disproportionately. Altogether a dozen are in the top 100 in the world, with seven in the top 50.

Tuesday, October 04, 2011

Latin American Rankings

QS have produced their new Latin American rankings. The Top five are:

1. Universidade de Sao Paulo
2. Pontificia Universidad Catolica de Chile
3. Universidade Estadual de Campinas, Brazil
4. Universidad de Chile
5. Universidad Nacional Autonoma de Mexico (UNAM)
Suggestion

In Times Higher Education, Terrance Karran claims that universities that do well in the THE rankings (and the other ones?) are those that show more regard for academic freedom, which is equated to "compliance" with the AAUP's academic freedom statement.

Perhaps an annual prize could be awarded to the university that has the most academic freedom. I propose that it be called the Lawrence Summers Prize.
Expectation

David Willetts, the British minister for universities and science, says that he expects more British universities to be in the Times Higher Education World University Rankings top 200.

And if more British universities, then fewer........?
The US News rankings

The U.S. News rankings of American colleges and universities were released on September 13th. For more information go here.

The top 10 national universities are:

1.  Harvard
2.  Princeton
3.  Yale
4.  Columbia
5=  Caltech
5=  MIT
5=  Stanford
5=  Chicago
5=  University of Pennsylvania
10. Duke