Tuesday, March 06, 2012

O, My Felony is Rank

The Social Science Research Network has just published a paper by Morgan Cloud and George Shepherd, 'Law Deans in Jail'. Notice the absence of a question mark. Here is the abstract:

A most unlikely collection of suspects - law schools, their deans, U.S. News
& World Report and its employees - may have committed felonies by publishing
false information as part of U.S. News' ranking of law schools. The possible
federal felonies include mail and wire fraud, conspiracy, racketeering, and
making false statements. Employees of law schools and U.S. News who
committed these crimes can be punished as individuals, and under federal law
the schools and U.S. News would likely be criminally liable for their agents'
crimes. Some law schools and their deans submitted false information about the
schools' expenditures and their students' undergraduate grades and LSAT
scores. Others submitted information that may have been literally true but was
misleading. Examples include misleading statistics about recent graduates'
employment rates and students' undergraduate grades and LSAT scores. U.S.
News itself may have committed mail and wire fraud. It has republished, and
sold for profit, data submitted by law schools without verifying the data's
accuracy, despite being aware that at least some schools were submitting false
and misleading data. U.S. News refused to correct incorrect data and rankings
errors and continued to sell that information even after individual schools
confessed that they had submitted false information. In addition, U.S. News
marketed its surveys and rankings as valid although they were riddled with
fundamental methodological errors.

It is unlikely that we will ever see law school deans in jail. To secure a conviction it would presumably be necessary to show a connection between the submission or publication (or failure to retract) of false or misleading information and monetary gain, and that might be rather difficult to prove.

Also, the authors of the paper are from Emory University, a middling institution. If it ever did come to criminal proceedings, you can bet that the big law schools would be lined up behind the rankings.

Monday, March 05, 2012

Best Student Cities

QS has just announced its Best Student Cities ranking. The criteria are performance in the QS World University Rankings, student mix, quality of living, employer activity and affordability.

To be included a city must have a population of at least 250,000 and at least two institutions in the QS rankings. That explains why Oxford and Cambridge are not on the list but Cairo and Santiago are.
The top five are:

1.  Paris
2.  London
3.  Boston
4.  Melbourne
5.  Vienna

Saturday, February 18, 2012

Light Posting Ahead

For the next few weeks posting will be light as I am attending to family matters.

Wednesday, February 08, 2012

Eight years of ranking: What have we learned?

My post at the University World News World Blog can be viewed here.

Wednesday, February 01, 2012

University Ranking by Academic Performance

Perhaps too much attention is given to the big three international rankings -- ARWU (the Shanghai Rankings), the QS World University Rankings and the Times Higher Education World University Rankings.

So, here is what I hope will be the first of a number of posts about the other rankings that have appeared recently.

The University Ranking by Academic Performance is produced by the Informatics Institute of the Middle East Technical University in Ankara.

The indicators are as follows:

Number of articles -- measures current scientific productivity
Google scholar results -- measures long term overall productivity
Citations -- measures research impact
Cumulative Journal Impact -- measures scientific impact
H-index -- measures research quality
International collaboration -- measures international acceptance


These are the top ten:

1.  Harvard
2.  Toronto
3.  Johns Hopkins
4.  Stanford

5.  UC Berkeley
6.  Tokyo
7.  Michigan, Ann Arbor
8.  Washington, Seattle
9.  UCLA
10. Oxford

Think Tank Ranking

There seems to be little that cannot be ranked these days. In the UK there are primary school league tables although nowhere in the world is there a kindergarten ranking. Yet.

The University of Pennsylvania has produced a report on think tanks. These are apparently flourishing everywhere: even Andorra has one.

Here are the top five world wide:

1.  Brookings Institution, USA

2.  Chatham House Royal Institute of International Affairs, UK

3.  Carnegie Endowment for International Peace, USA

4.  Council on Foreign Relations, USA

5.  Center for Strategic and International Studies, USA

Thursday, January 26, 2012

Guest Post

Today's post is by Masturah Alatas. It is in response to a comment in the Kuala Lumpur New Straits Times by James Campbell that begins:

"Any discussion of Malaysian tertiary educational policy needs to take into account the needs of national development in a specific and historical context. Recent debates in regard to the competitive position of Malaysian higher education globally is one area where the pressures of competition and liberalisation must be balanced by the interests of inclusion and social sustainability."


Over the last few years there has been an ongoing debate between those Malaysian academics who accept the challenge of globalisation and those who are more concerned with, as Campbell puts it, "ensuring national values, addressing horizontal social inequality, rural disadvantage and looking into the needs of sustainable and inclusive economic and social development".

The comment continued by invoking the name of Syed Hussein Alatas, a Malaysian social and political theorist who had considerable  international influence, being cited by, among others, Edward Said.



"The discourse of neo-liberal globalisation is itself still arguably beholden to what Syed Hussein Alatas critiqued as the discourse of “The Lazy Native”. Higher educational institutions’ commitment to inclusion and social justice is central to their merit in society."


The following is a response to this comment by the daughter and biographer of Alatas.

Finding the clarity

by Masturah Alatas

As biographer of late Malaysian sociologist Syed Hussein Alatas (1928 – 2007), I do not consider his life story to end with the end of his life. Any continuing narrative about Alatas also has to take into account how, for example, he is talked about in the media today.

As a case in point, I refer to the article Finding the Balance (Learning Curve, New Straits Times, 20 November 2011) by Dr James Campbell, a lecturer in Education at Deakin University, Australia, and researcher with Universiti Sains Malaysia.

Even though the article carries a moribund photo of Alatas, it contains only one sentence in direct reference to him. And the sentence is difficult to understand. “The discourse of neo-liberal globalization is itself still arguably beholden to what Syed Hussein Alatas critiques as the discourse of The Lazy Native”.

The article does not explain what, if at all, neo-liberal globalization has to do with “the discourse of The Lazy Native”. And nowhere in the article is it clearly stated that many of Malaysia’s current higher education policies are neo-liberal to begin with. So Campbell is vague. He seems to be criticizing neo-liberalism in general, but not what is specifically neo-liberal about Malaysia’s higher education policies.

Campbell argues that comparisons between the National University of Singapore and Universiti Malaya may not always be valid “given important distinctions and differences in national policies and political culture between the two nations.”

But how does this reasoning square with the fact that a Malaysian sociologist like Syed Hussein Alatas taught at the National University of Singapore for over twenty years, and was appointed Head of Department of Malay studies there? What criteria of merit did NUS apply when they tenured Syed Hussein Alatas? Is Campbell suggesting that the same criteria cannot be applied in a Malaysian university? And if so, why not?
Malaysians may still remember Syed Hussein Alatas’ short-lived, controversial term (1988 – 1991) as Vice-Chancellor of Universiti Malaya when he tried to promote academic staff based on merit. For Alatas, one way to establish merit was to look at the publications of academics. The New Straits Times itself carried reports of the controversy, so it is no secret. Some of them contained statements like “five members of the students association have come out in support of Vice-Chancellor Syed Hussein Alatas’s stand on the appointment of non-Malay deans to faculties in the university” (NST, 12 March 1990).

When Campbell writes that any discussion of Malaysia’s higher education policies “needs to be placed in perspective against the needs of national development in a specific historical context”, and that notions of merit must take into account ideas of inclusion and social justice, what exactly does he mean? Inclusion, of course, necessarily entails exclusion. But who is being excluded and on what grounds are they being excluded? And whose sense of social justice are we talking about here?

Syed Hussein Alatas’ works from The Myth of the Lazy Native to his writings on corruption precisely warn against the dangers of relativising notions of social justice. So it is quite odd that Campbell would refer to Alatas in his article.

All written legacies can be appropriated, rightly and wrongly, to support a particular persuasion or agenda, and it depends on critics to call attention to what is right or wrong appropriation. One of the best writers I know, for example, who has creatively applied Syed Hussein Alatas’ ideas on mental captivity and the inability to raise original problems to the role of education and the Arts, is former New Straits Times columnist U-En Ng. “You can use entertainment to pull wool over people’s eyes and divert attention away from whatever it is you don’t want them to see or think about. Or, more positively, you can use artistic expression to build civic participation and the capacity to raise original problems,” he writes in the article Governing through the Arts (NST, 09 January 2011). “Rebellious performance art by university students tells you both what you can expect from the current education system as well as how the public might react to new ideas.”

At the same time, a young Italian poet from Osimo by the name of Andrea Palazzo reminds us that an excess of “the so-called need to express oneself can be the mortal enemy of Beauty and Truth. Overshadowed by popular opinion, great art dies, or rather it disappears, or languishes in museums.” And for Palazzo, the prospects for Philosophy in some Italian universities, which he feels are “conservative rather than selective”, are not much brighter either.

Syed Hussein Alatas believed that any process of change must necessarily be accompanied by a philosophical set of criteria for selection of what to reject, retain and strive for (see Erring Modernization, 1975). Quoting from German philosopher Friedrich Nietzsche’s The Use and Abuse of History, Alatas stressed the importance of having a horizon of thinking, “a line that divides the visible and the clear from the vague and shadowy” in order for an individual, a community and a nation to thrive. “We must know the right time to forget as well as the right time to remember, and instinctively see when it is necessary to feel historically and when unhistorically.”

It is, of course, extremely difficult to know when and what to remember and forget, especially when the spectre of Malaysia’s 1969 May 13 Racial Riots still haunts us, and when Malaysia is making itself dizzy by rushing to become a High Income Nation before eight years are up.


But one way for a line, a horizon of thinking, to become clearer is through good writing. And this may just be what will rescue Alatas’ work from languishing in the shadows.



Masturah Alatas


Sunday, January 22, 2012

Worth Reading

Ranking in Higher Education: Its Place and Impact by Jan Sadlak. Originally appeared in the Europa World of Learning 2010.

Wednesday, January 18, 2012

The Power of Small Numbers

My attention has just been drawn to the citations per paper indicator in the 2011 QS Asian University Rankings. In first place is the University of Santo Tomas in the Philippines, a very good school in some ways but not usually considered a research rival to Hong Kong or Tokyo. It seems that UST's success was the result of a single much-cited medical paper of which just one UST researcher was a co-author.

 Another highly cited many-authored medical paper seems to explain Universitas Padjadjaran's appearance in sixth place in this indicator even though the total number of papers is extremely small.

Leiden University has started offering fractional counting of publications in its rankings:

The fractional counting method gives less weight to collaborative publications than to non-collaborative ones. For instance, if the address list of a publication contains five addresses and two of these addresses belong to a particular university, then the publication has a weight of 0.4 in the calculation of the bibliometric indicators for this university. The fractional counting method leads to a more proper normalization of indicators and to fairer comparisons between universities active in different scientific fields. Fractional counting is therefore regarded as the preferred counting method in the Leiden Ranking.
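The rule quoted above is easy to sketch in code. The following is a minimal illustration of fractional counting, with invented address lists and university names (not Leiden's actual implementation):

```python
# Minimal sketch of fractional counting as described in the Leiden quote.
# Each publication is represented by its list of institutional addresses;
# a university's weight for a paper is its share of the addresses.
# Data are invented for illustration.

def fractional_counts(publications, university):
    """Sum over papers of (addresses of `university`) / (total addresses)."""
    total = 0.0
    for addresses in publications:
        if addresses:
            total += addresses.count(university) / len(addresses)
    return total

papers = [
    # Five addresses, two belonging to Univ A -> weight 0.4, as in the quote
    ["Univ A", "Univ A", "Univ B", "Univ C", "Univ D"],
    # A solo paper keeps its full weight of 1.0
    ["Univ A"],
]
print(fractional_counts(papers, "Univ A"))  # about 1.4, versus 2 under full counting
```

A heavily collaborative paper thus contributes only a sliver of credit, which is exactly what blunts the effect of the many-authored medical papers mentioned above.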

This would be one way of avoiding giving a high position to universities that produce little but manage to get researchers included as co-authors of a few papers, usually in medical journals.

FindTheBest

This is a site that has just come to my attention. There is a great variety of rankings of things like antivirus, snowboards and fertility clinics and also of colleges and universities, including business, law and medical schools.

The colleges and universities ranking is US only and includes a "smart ranking" combining statistical information with the Forbes, US News and ARWU (Shanghai) rankings. This sounds like a good idea but there does not seem to be any information about the methodology.

Tuesday, January 17, 2012

The Happiest University in Britain?

According to the Daily Telegraph it's St Mary's University College Belfast. I thought Belfast was in Ireland but then again I did not do geography in secondary school.

Update: The Journal Bubble

Jeffrey Beall has a list of predatory journals.

Note that some of the comments dispute the inclusion of some journals.

The Journal Bubble

Inside Higher Ed has a piece by Philip Altbach about the proliferation of dubious academic journals.

Clever people have figured out that there is a growing demand for outlets for scholarly work, that there are too few journals or other channels to accommodate all the articles written, that new technology has created confusion as well as opportunities, and (finally) and somewhat concerning is that there is money to be made in the knowledge communication business. As a result, there has been a proliferation of new publishers offering new journals in every imaginable field. The established for-profit publishers have also been purchasing journals and creating new ones so that they “bundle” them and offer them at high prices to libraries through electronic subscriptions.

I suspect that this is causing increasing problems for struggling but respectable journals in the academic periphery. I know of at least one journal that has had several submitters "disappear" after being asked to make modest revisions or even deal with problems identified by a Turnitin report. One wonders where the papers will reappear.

Sunday, January 15, 2012

Research Fraud in the UK

An article in Times Higher Education by Jack Grove indicates that there is a large amount of research fraud going on in UK universities, although the comments raise questions about the validity of the study.

I am wondering if there is any comparative international data available.

Primary School League Tables

The craze for rankings continues unabated. There is now a League Table for English primary schools.


The league tables show the percentage of 11-year-olds in each school reaching Level 4 – the standard expected for their age group – in both English and maths at primary school.
Officially, this means they can spell properly, start to use grammatically complex sentences and employ joined up handwriting in English. In maths, they should be able to multiply and divide whole numbers by 10 or 100 and be able to use simple fractions and percentages.

Pupils exceeding this standard are awarded a higher Level 5. Data for individual schools also shows three other measures: average points score, value-added and pupil progress.

Saturday, January 14, 2012

Who says college isn't worth it?

There is a web site called SeekingArrangement which puts sugar daddies and sugar babies in touch with one another. Apparently, large numbers of college graduates are signing up in the latter category, perhaps because of increasing difficulties in paying off student loans.

There is now a ranking of the top 20 colleges among sugar baby sign ups. I have doubts about the validity of the ranking. All that is necessary to be a certified college sugar baby, and get three times as many enquiries from sugar daddies (it's good to know that American society still values education), is an edu email address.

New York University might be number 1 because its tuition fees are so high or job prospects for graduates so meagre or maybe because the sugar daddies are in New York.

Here are the top twenty:

1. New York University (NYU) -- 185
2. University of Georgia -- 155
3. University of Phoenix -- 144
4. Tulane University -- 129
5. Temple University -- 113
6. Virginia Community College -- 108
7. University of Southern Florida -- 93
8. Arizona State University -- 85
9. Michigan State University -- 81
10. Ivy Tech Community College -- 78
11. Georgia State University -- 74
12. University of Wisconsin -- 73
13. Penn State University -- 72
14. University of Central Florida -- 67
15. Kent University -- 65
16. Maricopa Community College -- 63
17. Indiana University -- 62
18. University of California, Berkeley -- 61
19. The Art Institutes -- 60
20. Florida International University -- 59

Please note that I found this story via the Chronicle of Higher Education.

Online Education Rankings

US News has announced its rankings of American online education programs. There are six categories: Bachelor's, Business, Education, Engineering, Info tech and Nursing. Within each category there are rankings for faculty credentials and training, student services and technology, student engagement and assessment and, except for bachelor's, admissions selectivity. For bachelor programs the top universities are:

Faculty Credentials and Training
Westfield State University, Massachusetts

Student Engagement and Assessment  
Bellevue University, Nebraska

Student Services and Technology
Arizona State University, Tempe

Wednesday, January 11, 2012

The end of the university as we know it?


MIT has already been putting its course materials online for anyone to access free of charge. Now they are going a step further.

"MIT today announced the launch of an online learning initiative internally called “MITx.” MITx will offer a portfolio of MIT courses through an online interactive learning platform that will:
  • organize and present course material to enable students to learn at their own pace
  • feature interactivity, online laboratories and student-to-student communication
  • allow for the individual assessment of any student’s work and allow students who demonstrate their mastery of subjects to earn a certificate of completion awarded by MITx
  • operate on an open-source, scalable software infrastructure in order to make it continuously improving and readily available to other educational institutions.
MIT expects that this learning platform will enhance the educational experience of its on-campus students, offering them online tools that supplement and enrich their classroom and laboratory experiences. MIT also expects that MITx will eventually host a virtual community of millions of learners around the world."

There are a lot of questions that come to mind. Will students be assessed according to the same standards as conventional MIT students? If someone accumulates sufficient certificates of completion will they be entitled to an MITx degree? What will happen if employers and graduate schools start accepting MITx certificates as equivalent to standard academic credentials? If so, will MIT be able to resist the temptation to start charging hefty fees for a certificate?

MIT may, perhaps unwittingly,  have started a process that will end with universities becoming something very different.

Monday, January 02, 2012

How Did I Miss This?

The blog Registrarism has discovered a fascinating article, published in 2002 in Higher Education Quarterly, that compares university league tables (that is British university rankings) with the football (soccer to Americans) league tables.

Sunday, December 18, 2011

Leiden Ranking: Many Ways to Rate Research

My article on the Leiden Rankings in University World News can be found here.

 It looks as though a two-tier international university ranking system is emerging.

At the top we have the 'big three', Shanghai's Academic Ranking of World Universities, the QS World University Rankings and, since 2010, the Times Higher Education World University Rankings.

These receive massive attention from the media, are avidly followed by academics, students and other stakeholders and are often quoted in promotional literature. Graduation from a university included in these has even been proposed as a requirement for immigration.

Then we have the rankings by SCImago and Webometrics, both from Spain, the Performance Ranking of Scientific Papers for World Universities produced by the Higher Education Evaluation and Accreditation Council of Taiwan, and the Leiden Ranking, published by the Centre for Science and Technology Studies at Leiden University.

These rankings get less publicity but are technically very competent and in some ways more reliable than the better-known rankings.

Wednesday, December 07, 2011

Update 6 on El Naschie vs Nature

There have been no reports for several days and the trial is now over. There will be a judgement in January.

What to do about the research bust

Mark Bauerlein has an article in the Chronicle of Higher Education on the disparity between the extraordinary effort and intelligence poured into scholarly writing in the humanities and the meager attention such writing receives.

"I devised a study of literary research in four English departments at public universities—the University of Georgia, the University at Buffalo, the University of Vermont, and the University of Illinois at Urbana-Champaign—collecting data on salaries, books and articles published, and the reception of those works. The findings:
  • Those universities pay regular English faculty, on average, around $25,000 a year to produce research. According to the faculty handbooks, although universities don't like to set explicit proportions, research counts as at least one-third of professors' duties, and we may calculate one-third of their salaries as research pay. This figure does not include sabbaticals, travel funds, and internal grants, not to mention benefits, making the one-third formula a conservative estimate.
  • Professors in those departments respond diligently, producing ample numbers of books and articles in recent years. At Georgia, from 2004 to 2009, current faculty members produced 22 authored books, 15 edited books, and 200 research essays. The award of tenure didn't produce any drop-off in publication, either. Senior professors continue their inquiries, making their departments consistently relevant and industrious research centers.
  • Finally, I calculated the impact of those publications by using Google Scholar and my own review of books published in specific areas to count citations. Here the impressive investment and productivity appear in sobering context. Of 13 research articles published by current SUNY-Buffalo professors in 2004, 11 of them received zero to two citations, one had five, one 12. Of 23 articles by Georgia professors in 2004, 16 received zero to two citations, four of them three to six, one eight, one 11, and one 16. "
Bauerlein suggests that these limited citation counts are telling us something: that talented scholars might find better things to do and that society might direct resources elsewhere.

The QS World University Rankings would apparently agree. Their citations indicator simply counts the total number of citations and divides it by the total number of faculty. This is a very crude measure, especially since it counts the current number of faculty but then counts citations to articles written over a five-year period. Any university seeking a boost in the QS rankings could simply axe a few English, history and philosophy specialists and replace them with oncologists and engineers. True, the world would lose studies about Emily Dickinson's Reluctant Ecology of Place, cited once according to Google Scholar, or Negotiations of Homoerotic Tradition in Paradise Regained, but if this was accompanied by even a small advance in cancer treatment, who would really care? There would be an even better effect on the Shanghai rankings, which do not count publications or citations in the humanities but still include humanities faculty in their productivity indicator.

But there are those who would argue that while disciplines go about citing differently they must be regarded as being on the same level in all other respects. Thomson Reuters, who collect the data for the Times Higher Education rankings, now normalise their data so that citations in a specific discipline in a specific country in a specific year are benchmarked against the average for that discipline in that country in that year. That would mean that the article by the Buffalo professors with five citations might look quite good.
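A minimal sketch of that kind of normalization, with invented benchmark figures (my own illustration, not Thomson Reuters' actual procedure), shows why five citations can count for a lot in a low-citation field:

```python
# Simplified sketch of field normalization: each paper's citation count is
# divided by the average for its (discipline, country, year) cell, and the
# ratios are averaged. Benchmark numbers are invented for illustration.

def normalized_impact(papers, benchmarks):
    """Mean ratio of each paper's citations to its benchmark average."""
    ratios = [
        cites / benchmarks[(field, country, year)]
        for cites, field, country, year in papers
    ]
    return sum(ratios) / len(ratios)

benchmarks = {
    ("English", "US", 2004): 2.0,    # low-citation field
    ("Oncology", "US", 2004): 25.0,  # high-citation field
}
papers = [
    (5, "English", "US", 2004),    # 5 citations vs. an average of 2 -> 2.5
    (25, "Oncology", "US", 2004),  # 25 citations vs. an average of 25 -> 1.0
]
print(normalized_impact(papers, benchmarks))  # 1.75
```

Under raw counting the oncology paper dwarfs the English one; after normalization it is the English paper, with its five citations, that pulls the average up.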

I have a suggestion for those professors of English and other disciplines which hardly anyone seems to read anymore. Go to some Central Asian or East African republic where the papers in your field get only a few citations: the next article you write with its miserable handful of citations will be well above average for that country and your new university will suddenly perform well in the Times Higher rankings. Just make sure that your employer produces two hundred papers a year altogether.

Friday, December 02, 2011

European Universities and Rankings

Research Trends, the newsletter from Scopus, reports on a conference of European universities that discussed international rankings. The participants found positive aspects to rankings but also had criticisms:

Going through the comparison of the various methodologies, the report details what is actually measured, how the scores for indicators are measured, and how the final scores are calculated — and therefore what the results actually mean.
The first criticism of university rankings is that they tend to principally measure research activities and not teaching. Moreover, the ‘unintended consequences’ of the rankings are clear, with more and more institutions tending to modify their strategy in order to improve their position in the rankings instead of focusing on their main missions.
For some ranking systems, lack of transparency is a major concern, and the QS World University Ranking in particular was criticized for not being sufficiently transparent.
The report also reveals the subjectivity in the proxies chosen and in the weight attached to each, which leads to composite scores that reflect the ranking provider’s concept of quality (for example, it may be decided that a given indicator may count for 25% or 50% of overall assessment score, yet this choice reflects a subjective assessment of what is important for a high-quality institute). In addition, indicator scores are not absolute but relative measures, which can complicate comparisons of indicator scores. For example, if the indicator is number of students per faculty, what does a score of, say, 23 mean? That there are 23 students per faculty member? Or does it mean that this institute has 23% of the students per faculty compared with institutes with the highest number of students/faculty? Moreover, considering simple counts or relative values is not neutral. As an example, the Academic Ranking of World Universities ranking does not take into consideration the size of the institutions.
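The ambiguity about the students-per-faculty score of 23 noted in the quote arises because ranking scores are typically rescaled against the best (or worst) performer rather than reported as raw values. A toy illustration of such rescaling (my own, not any ranker's published formula):

```python
# Toy illustration of why a relative indicator score is ambiguous: a raw
# value of 23 students per faculty member becomes a different number once
# it is expressed as a percentage of the largest raw value in the table.

def rescale(value, values):
    """Express `value` as a percentage of the largest raw value."""
    return 100 * value / max(values)

students_per_faculty = [8, 15, 23, 46]
print(rescale(23, students_per_faculty))  # 50.0, not 23
```

A reader seeing "50" in a published table has no way of recovering the raw figure of 23 without knowing the rescaling rule, which is the report's point.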

I am not sure these criticisms are entirely fair. It seems that the weighting of the various indicators in the Times Higher Education rankings emerged from a good deal of to-ing and fro-ing between various stakeholders and advisers. In the end, far too much weight was given to citations, but that is not quite the same as assigning arbitrary or subjective values.

The Shanghai rankings do have an indicator, productivity per capita, that takes faculty size into account, although it is only ten per cent of the total ranking. The problem here is that faculty in the humanities are counted but not their publications.

I am not sure why QS is being singled out with regard to transparency. The THE rankings are also, perhaps in a different way, quite opaque. Aggregate scores are given for teaching environment, research and international orientation without indicating the scores that make up these criteria.

So what is to be done?


The EUA report makes several recommendations for ranking-makers, including the need to mention what the ranking is for, and for whom it is intended. Among the suggestions to improve the rankings, the following received the greatest attention from the audience:
  1. Include non-journal publications properly, including books, which are especially important for social sciences and the arts and humanities;
  2. Address language issues (is an abstract available in English, as local language versions are often less visible?);
  3. Include more universities: currently the rankings assess only 1–3% of the 17,000 existing universities worldwide;
  4. Take into consideration the teaching mission with relevant indicators.

The first of these may become feasible now that Thomson Reuters has a book citation index. The second and third are uncontroversial. The fourth is very problematical in many ways.

The missing indicator here is student quality. To be very blunt, universities can educate and instruct students but they can do very little to make them brighter.  A big contribution to any university ranking would be a comparison of the relative cognitive ability of its students. That, however, is a goal that requires passing through many minefields.

Thursday, December 01, 2011

Diversity and Rankings

Robert Morse, director of data research at the US News and World Report discusses the question of whether "diversity" should be included in the ranking of American law schools.

"I was one of the speakers on the "Closing Plenary: Reforming U.S. News Rankings to Include Diversity" panel, which discussed many of the issues pertaining to whether U.S. News should add a measure of diversity directly into the Best Law Schools rankings. I pointed out that U.S. News believes diversity is important and that is why we already publish a separate law school diversity index.

Our current index identifies law schools where law students are most and least likely to encounter classmates from a different racial or ethnic group. However, the current index does not measure how successful each law school is at meeting a diversity goal or benchmark at the school, state, local, or national level. It also gives schools enrolling one ethnic group a low score, though that school's enrolment may match its state's ethnic population or the school may be a Historically Black College or University. It's for these reasons the current index would not be appropriate to add into the rankings".

Diversity here does not mean diversity of ideology, religion, class, politics or nationality. It simply means the numbers of recognised minorities, mainly African-Americans, Hispanics and Asian Americans.

It is interesting to look at the diversity index and to see the likely effect of including diversity in the law school rankings. The most diverse law school is the University of Hawaii. The University of the District of Columbia and Florida International University also get high scores. Low scorers include Harvard, Yale and UCLA.

Somehow, I do not think that an indicator that benefited Florida International University at the expense of Harvard would add to the credibility of these rankings.

Unless it can be demonstrated that there is something magically transforming about the statistical profile of a law school reflecting that of its city, state, or nation or future nation, this proposal does not sound like a very good idea.

The Utility of Rankings

Another advantage of a good performance in international university rankings is that graduates will be able to get into Russian postgraduate programs (if their university is in a G8 country).


Russia’s education ministry is currently drawing up a list of foreign universities whose qualifications will be recognized.

The list will include only universities located within the G8 countries that enter the top 300 in the Academic Ranking of World Universities or the QS World University Rankings. Officials say there will be around 300 institutions meeting the criteria.

The reform is intended to attract more students to take part in Russian MA and PhD programs.