Formation Continue du Supérieur
11 May 2012

Comparing universities: which country comes top for higher education?

The Guardian. By Ross Williams. A new ranking of international universities and higher education systems has been compiled to give more insight into the strength of HE in different nations.

Higher education is a dynamo for economic growth, powering the supply of high-level skills and the technological advances for improving productivity and opening up new markets. Where HE flourishes, so can an economy. Until now, however, there has been little interest in the comparative strengths and qualities of national education systems around the world. Which countries and governments provide the best environment? More transparency and clarity is needed around this in order to encourage knowledge-sharing, collaboration and development of opportunities for students in all countries.
While there are a number of well-regarded global rankings of individual institutions, these don't shed any light on the broader picture of the system itself, and its state of 'health' in terms of encouraging and supporting excellence and international links. It's important for governments to be able to benchmark how they're doing. A quality higher education system is one that is well connected internationally, facilitates the introduction of new ideas, and fosters trade and other links with foreign countries through the movement of students and researchers across national frontiers. At the same time, students are increasingly choosing countries to study in as much as individual institutions.
This week saw the first publication of a new ranking of national HE systems,
based on research at the Melbourne Institute of Applied Economic and Social Research (University of Melbourne) into data from 48 countries with a developed HE offering. Organised by Universitas 21, a global network of research universities, the ranking is based on 20 different measures critical to what makes a 'good' HE system, grouped under four umbrella headings: resources (investment by government and private sector), output (research and its impact, as well as the production of an educated workforce which meets labour market needs), connectivity (international networks and collaboration which protects a system against insularity) and environment (government policy and regulation, diversity and participation opportunities). Population size is accounted for in the calculations.
For the UK it's a mixed picture, particularly for a system which continues to attract such a large proportion of international students. Ranked tenth overall, the UK is held down by a ranking of only 27th on resources, including a low rank of 41st for government expenditure. Against that, the UK is ranked only second to the United States on output. The difference in ranking between output and resources is the greatest for all 48 countries and reflects very high productivity. The UK also does well on international connectivity, ranked sixth as it has the fourth largest percentage of international students. It's ranked 13th on environment, losing points for lack of diversity and being ranked at 19 by the World Economic Forum.
While the UK looks to the world stage, many other nations are more interested in what's happening in their region. The four Nordic countries are all in the top seven; four east Asian countries (Hong Kong SAR, Japan, Taiwan and Korea) are clustered together at ranks 18 to 22; Eastern European countries (Ukraine, Czech Republic, Poland, Slovenia) are together in the middle range; and the Latin American countries (Chile, Argentina, Brazil and Mexico) also cluster together. While many countries don't feel they can be a world leader, they do want to match the standards of their neighbours.
Government funding of higher education as a percentage of GDP is highest in Finland, Norway and Denmark, but when private expenditure is added in, funding is highest in the United States, Korea, Canada and Chile. Investment in research and development is highest in Denmark, Sweden and Switzerland. The United States dominates the total output of research journal articles, but Sweden is the biggest producer of articles per head of population. The nations whose research has the greatest impact are Switzerland, the Netherlands, the United States, United Kingdom and Denmark. While the US and UK have the world's top institutions in rankings, the depth of world class higher education institutions per head of population is best in Switzerland, Sweden, Israel and Denmark.
The highest participation rates in higher education are in Korea, Finland, Greece, the United States, Canada and Slovenia. The countries with the largest proportion of workers with a higher level education are Russia, Canada, Israel, United States, Ukraine, Taiwan and Australia. Finland, Denmark, Singapore, Norway and Japan have the highest ratio of researchers in the economy.
International students form the highest proportions of total student numbers in Australia, Singapore, Austria, United Kingdom and Switzerland. International research collaboration is most prominent in Indonesia, Switzerland, Hong Kong SAR, Denmark, Belgium and Austria. China, India, Japan and the United States rank in the bottom 25% of countries for international research collaboration. In all but eight countries at least 50% of students were female, the lowest being in India and Korea. In only five countries were there at least 50% female staff; the lowest being in Japan and Iran.
Competition between individual institutions on regional and international levels is intense and growing as mobility increases and all 'markets' become more open. It's crucial for nations and the appreciation of the global HE system as a whole that attention is not bogged down in rivalries between single 'name' players in HE capable of attracting an elite. Whole country systems matter to mass populations of people, improving their lives and contributing to national and international prosperity. The Universitas 21 Ranking should be recognised as an important reference point for governments and everyone involved in HE, to keep focus and attention on how HE can be galvanised for growth.
Professor Ross Williams, Melbourne Institute of Applied Economic and Social Research,
University of Melbourne.
11 May 2012

Universitas 21 Top Countries For Higher Education - France 15th

Which countries are the best at providing higher education? (Source: science20.com)
The Universitas 21 Ranking was announced today at Lund University in Sweden. Universitas 21, a network of research universities, has developed their own ranking as a benchmark for governments, education institutions and individuals to highlight the importance of creating a strong environment for higher education institutions that will contribute to economic and cultural development, provide a high-quality experience for students and help institutions compete for overseas applicants. 
So calibrate accordingly when the metrics for 'higher education' don't actually mention education. The researchers looked at the most recent data from 48 countries across 20 different measures. The range of measures is grouped under four headings: resources (investment by government and private sector), output (research and its impact, as well as the production of an educated workforce which meets labour market needs), connectivity (international networks and collaboration which protects a system against insularity) and environment (government policy and regulation, diversity and participation opportunities). Population size is accounted for in the calculations.
Overall, in the Universitas 21 Ranking of higher education systems, the top five were found to be the United States, Sweden, Canada, Finland and Denmark. Government funding of higher education as a percentage of GDP is highest in Finland, Norway and Denmark, but when private expenditure is added in, funding is highest in the United States, Korea, Canada and Chile. Investment in Research and Development is highest in Denmark, Sweden and Switzerland. The United States dominates the total output of research journal articles, but Sweden is the biggest producer of articles per head of population.
The nations whose research has the greatest impact are Switzerland, the Netherlands, the United States, United Kingdom and Denmark. While the United States and United Kingdom have the world's top institutions in rankings, the depth of world class higher education institutions per head of population is best in Switzerland, Sweden, Israel and Denmark.
The highest participation rates in higher education are in Korea, Finland, Greece, the United States, Canada and Slovenia. The countries with the largest proportion of workers with a higher level education are Russia, Canada, Israel, United States, Ukraine, Taiwan and Australia. Finland, Denmark, Singapore, Norway and Japan have the highest ratio of researchers in the economy. International students form the highest proportions of total student numbers in Australia, Singapore, Austria, United Kingdom and Switzerland. International research collaboration is most prominent in Indonesia, Switzerland, Hong Kong SAR, Denmark, Belgium and Austria. China, India, Japan and the United States rank in the bottom 25 percent of countries for international research collaboration. In all but eight countries at least 50 percent of students were female, the lowest being in India and Korea. In only five countries were there at least 50 percent female staff; the lowest being in Japan and Iran.
Lead author Professor Ross Williams at the University of Melbourne said, "In a globalised world, a strong higher education system is essential if a nation is to be economically competitive. While there are a number of well-regarded global rankings of individual institutions, these don't shed any light on the broader picture of how well a nation's system educates its students, the environment it provides for encouraging and supporting excellence. Students choose countries to study in as much as individual institutions, and the Universitas 21 Ranking offers clear data to support decision-making."
Jane Usherwood, Secretary General of Universitas 21, said, "More transparency and clarity is needed around the comparative strengths and qualities of national education systems around the world in order to encourage knowledge-sharing, collaboration and development of opportunities for students in all countries. We hope the Universitas 21 Ranking will become an established point of reference for policy-makers, education institutions and development bodies globally."
Universitas 21 is an international research network of 24 universities and colleges. Its membership works together to encourage international mobility and engagement between staff and students.
The Universitas 21 Ranking

For each group of measures the highest scoring country is given a score of 100 and all other countries are expressed as a percentage of the highest score. Further details can be found at http://www.universitas21.com/link/U21Rankings.
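The scoring rule described above is a straightforward max-normalisation. A minimal Python sketch of that rule (the measure values below are hypothetical, not the actual Universitas 21 data):

```python
def normalise(scores):
    """Scale raw scores so the top-scoring country gets 100 and every
    other country is expressed as a percentage of that maximum."""
    top = max(scores.values())
    return {country: round(100 * value / top, 1)
            for country, value in scores.items()}

# Hypothetical raw scores for one group of measures
raw = {"United States": 8.2, "Sweden": 6.9, "Canada": 6.8}
print(normalise(raw))
```

The same normalisation would be applied within each of the four groups of measures before the overall score is formed.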

United States 100 
Sweden 84
Canada 83
Finland 82
Denmark 81
Switzerland 80
Norway 78
Australia 78
Netherlands 77
United Kingdom 77
Singapore 75
Austria 74
Belgium 74
New Zealand 73
France 71

Ireland 70
Germany 69
Hong Kong SAR 67
Israel 66
Japan 64
Taiwan 62
Korea 60
Portugal 60
Spain 60
Ukraine 59
Czech Republic 58
Poland 56
Slovenia 56
Greece 55
Italy 54
Bulgaria 53
Russian Federation 52
Romania 51
Hungary 51
Slovakia 51
Malaysia 50
Chile 49
Argentina 49
China 48
Brazil 47
Thailand 47
Iran 46
Mexico 45
Croatia 45
Turkey 44
South Africa 43
Indonesia 37
India 34

11 May 2012

New ranking system rates ‘best’ countries not universities

By Geoff Maslen, University World News. A novel system of ranking 48 countries and territories said to be the ‘best’ at providing higher education was published today by Universitas 21, the 15-year-old global network of 23 research-intensive universities.
The latest ranking system makes a welcome change from the efforts of a growing number of commercial organisations and other groups to rank individual universities according to their various abilities. The top 10 countries claimed to be best at delivering higher education are the US, Sweden, Canada, Finland, Denmark, Switzerland, Norway, Australia, Netherlands and the UK.
The Universitas 21 results were launched at an event in Sweden at Lund University, where the ranking was described as a “benchmark for governments, education institutions and individuals”.
“It aims to highlight the importance of creating a strong environment for higher education institutions to contribute to economic and cultural development, provide a high quality experience for students and help institutions compete for overseas applicants,” according to a release from the network.
The rankings were produced by researchers at the Melbourne Institute of Applied Economic and Social Research, University of Melbourne. They reviewed the most recent data from 48 countries and territories across 20 different measures grouped under four headings: resources (investment by government and private sector); output (research and its impact, as well as the production of an educated workforce to meet labour market needs); connectivity (international networks and collaboration which protects a system against insularity); and environment (government policy and regulation, diversity and participation opportunities). Population size is also taken into account in the calculations.
The researchers found that government funding of higher education as a percentage of gross national product was highest in Finland, Norway and Denmark. But when private expenditure was included, funding was highest in the US, Korea, Canada and Chile. Investment in research and development was highest in Denmark, Sweden and Switzerland, and although the US dominated the total output of research journal articles, Sweden was found to be the biggest producer of articles per head of population. According to the Melbourne team, the nations whose research has the greatest impact are Switzerland, the Netherlands, the US, the UK and Denmark. While the US and UK have the world's top institutions in rankings, “the depth of world-class higher education institutions per head of population” is best in Switzerland, Sweden, Israel and Denmark.
Countries with the highest participation rates were listed as Korea, Finland, Greece, the US, Canada and Slovenia, while those with the largest proportion of workers with a higher level education were Russia, Canada, Israel, US, Ukraine, Taiwan and Australia. Finland, Denmark, Singapore, Norway and Japan had the highest ratio of researchers in the economy. The U-21 report says international students form the highest proportions of total student numbers in Australia, Singapore, Austria, the UK and Switzerland. International research collaboration is most prominent in Indonesia, Switzerland, Hong Kong SAR, Denmark, Belgium and Austria.
China, India, Japan and the US rank in the bottom 25% of countries for international research collaboration. In all but eight countries at least 50% of students were female, the lowest being in India and Korea. In only five countries were there at least 50% female staff, the lowest being in Japan and Iran. The U-21 report says the results represent an initial attempt to rate national systems of higher education for a relatively large number of countries covering different stages of economic development. While this widened the value of the exercise, it made the data collection more complicated. The researchers hope the rankings will encourage improvements in data, both for included countries and to enable them to extend the range of countries in future updates.
“While there are a number of international rankings of universities, commencing with the seminal Shanghai Jiao Tong index in 2003, less effort has been put into quantitative rankings of national systems of higher education,” the report states.
“A notable exception is the policy brief for the Lisbon Council, in which Edereer, Schuller and Willms in 2008 developed a university systems ranking for 17 selected OECD countries.
“The international rankings of universities emphasise the peaks of research excellence. They throw no light, however, on issues such as how well a nation’s higher education system educates all its students, possessing different interests, abilities and backgrounds.
“Even for universities, [Jamil] Salmi notes that ‘what happens in the institution alone is not sufficient to understand and appreciate the full dynamics of their relative success or failure’.”
Lead author, Professor Ross Williams at the University of Melbourne, said that in a globalised world, a strong higher education system was essential if a nation was to be economically competitive. Williams has previously produced rankings of Australian universities.
“While there are a number of well-regarded global rankings of individual institutions, these don't shed any light on the broader picture of how well a nation's system educates its students, the environment it provides for encouraging and supporting excellence,” he said.
“Students choose countries to study in as much as individual institutions and the Universitas 21 ranking offers clear data to support decision-making."
* Professor Alan Gilbert, the late former vice-chancellor of Melbourne University and later of Manchester University, came up with the idea for a global network of research-intensive universities in 1997. This led to the creation of what he called Universitas 21 in 2003, which he saw as becoming a kind of global for-profit institution with offshoots around the world. Despite some early turmoil, and loss of members and millions of dollars by the founding institutions, the network settled down and now includes 23 universities in 15 countries.

6 May 2012

New THE ranking to select future Harvards and Cambridges

By Phil Baty, University World News. In a scathing attack on the annual cycle of university rankings, Daniel Lincoln, a visiting scholar at the Center for International Higher Education at Boston College in the United States, painted a wonderfully memorable image.
“Picture the year 1640,” he wrote in a blog post for a US higher education website. “You are an educated, upper-class Englishman, having a hearty laugh with your mates in London at the news that those religious fanatics in the colonies have now ‘founded their own university’ in Boston, led by the benefaction of a certain John Harvard – priceless!
“A few generations later, I’m guessing no one was laughing.”
He concluded: “Make no mistake: excellence is a longitudinal affair. By that standard, year-on-year rankings are inconsequential.”
While Lincoln’s point may have been very nicely illustrated, I believe his conclusion was wrong. Of course, barring a managerial catastrophe universities are, as the cliché goes, like oil tankers – it takes a long time to turn them round. But we live in uncertain times, and the established global hierarchies are under constant threat from many angles. Things can change quickly. Take the United Kingdom. Oxford historian Howard Hotson has described the reforms taking place to England’s universities as “the most radical experiment ever conducted on a major university system in the modern world”.
By replacing the vast bulk of public funding for university teaching with tripled student tuition fees and by ushering in market principles in a bid to drive up standards, the government has enacted “the virtual privatisation of...an entire university system at the stroke of a pen”, he said. Provisional funding allocations released in March 2012 revealed that, despite moves by the funding chiefs to smooth the transition, some English institutions will lose up to 46% of their direct grant in a single year. This is in no way “inconsequential” in terms of performance.
Similarly, when a university poaches a big name research superstar, usually with the entire team, in the ever-intensifying global academic transfer market, the effects on current and prospective students, on faculty and on potential investors, are immediate and are in no way “inconsequential”. Moreover, in a highly competitive global market, the less tangible element of a university’s profile – its academic reputation – can be subject to rapid change. A good reputation matters – it has real-world benefits, from helping to attract and retain the best students and faculty to encouraging the most generous benefactors – but it can be vulnerable in a multi-media information age.
So Times Higher Education will continue, as it has done for the past eight years, its annual World University Rankings. THE is clear that rankings have a sound utility: to students, faculty, university leaders, governments and industry. If they did not, we would not publish them – and they would not attract the many millions of internet visits they do. But to ensure we meet our obligations to our diverse global community of readers, THE is also committed to putting more rankings data into the public domain.
That is why, as well as the THE World University Rankings, which use 13 performance indicators across teaching, research, knowledge transfer and internationalisation, we also publish annually each March the World Reputation Rankings, which reveal the results of our Annual Academic Reputation Survey in isolation. And that is why I am delighted to announce this week an innovation in the field of global university rankings – the Times Higher Education 100 Under 50. The THE 100 Under 50 will, as its name suggests, rank the world’s top 100 universities under the age of 50. It will be published on 31 May 2012.
The vast majority of the world’s top research-led universities have at least one thing in common: they are old. Building upon centuries of scholarly tradition, institutions such as Oxford, which can trace its origins back to 1096, can draw on endowment income generated over many years and have been able to cultivate rich networks of loyal and successful alumni (including, in Oxford’s case, a string of British prime ministers) to help build enduring brands. Such advantages are reflected in the overwhelming dominance of older universities in the THE World University Rankings. But the focus of the THE 100 Under 50 is not on the traditional elites.
The analysis is about a new breed of global universities – those that have already managed to join the world’s top table in a matter of years, not centuries, and others that show great promise – institutions that could reach the top, in time. The 2012 THE 100 Under 50 will draw on the same comprehensive range of 13 performance indicators used to compile the THE World University Rankings, but will only rank those founded in 1962 or later. The indicators, all developed and provided by Thomson Reuters, will be carefully recalibrated to reflect the profile of younger institutions.
The report will show us which nations are challenging the US and UK as the next higher education powerhouses. It will give us a unique insight into which institutions may be the future ‘Harvard’ or ‘Cambridge’. Daniel Lincoln’s entertaining picture of the 17th century London establishment, mocking the pretensions of Harvard, demonstrates how established elites can be challenged by those who may at the time be dismissed as mere upstarts. We have seen this time and again, notably with the 1960s ‘plate glass’ universities in the UK which now rub shoulders with (and often surpass) the Victorian civic universities. We are seeing it again with a number of institutions founded in the 1980s and 1990s, notably in Asia, with a focus on science and technology backed by abundant resources and serious political will.
And the pace is stepping up. Even Lincoln, who argued that it takes “a few generations” to build world-class universities, acknowledged that “the processes of growth have accelerated enormously” since the time Harvard challenged the ancients.
Indeed, he noted that the book by Boston College’s Philip Altbach and Jamil Salmi, The Road to Academic Excellence: The making of world-class universities, “features some institutions that have made enormous advances in tiny amounts of time”.
The THE 100 Under 50 showcases such institutions – a new generation of globally competitive universities. It could offer a tantalising glimpse into the future and we look forward to it becoming a helpful addition to the annual round of rankings releases.
* Phil Baty is editor of the Times Higher Education World University Rankings.
2 May 2012

Introducing the Times Higher Education 100 Under 50

By Phil Baty. Times Higher Education today announces the launch of an exciting new addition to its World University Rankings portfolio.
The Times Higher Education 100 Under 50 will – as its name suggests – rank the world’s top 100 universities under the age of 50. The table and analysis will be published online and as a special supplement to the magazine on 31 May, 2012.
The vast majority of the world’s top research-led universities have at least one thing in common: they are old. Building upon centuries of scholarly tradition, institutions such as the University of Oxford, which can trace its origins back to 1096, can draw on endowment income generated over many years and have been able to cultivate rich networks of loyal and successful alumni (including in Oxford’s case a string of British Prime Ministers) to help build enduring brands.
Such advantages are reflected in the overwhelming dominance of older universities in the Times Higher Education World University Rankings. But the focus of the THE 100 Under 50 is not on the traditional elites.
The analysis is about a new breed of global universities – those that have already managed to join the world’s top table in a matter of years, not centuries, and others showing great promise - institutions that could reach the top, in time. In a February 2012 article for the US website Inside Higher Ed, Daniel Lincoln, a visiting scholar at the Centre for International Higher Education, Boston College, painted a memorable image.
“Picture the year 1640,” he wrote. “You are an educated, upper-class Englishman, having a hearty laugh with your mates in London at the news that those religious fanatics in the colonies have now ‘founded their own university’ in Boston, led by the benefaction of a certain John Harvard – priceless!
“A few generations later, I’m guessing no one was laughing”.
Lincoln employed his image to argue against annual university rankings, on the grounds that “excellence, like all things of abiding value, is a marathon, not a sprint”. But his amusing illustration also demonstrates how established elites can be challenged by those who may at the time be dismissed as mere upstarts. We have seen this time and again, notably with the 1960s “plate glass” universities in the UK which now rub shoulders with (and often surpass) the Victorian civic universities. We are seeing it again with a number of institutions founded in the 1980s and 1990s, notably in Asia with a focus on science and technology, backed by abundant resources and serious political will.
Even Lincoln acknowledged that “the processes of growth have accelerated enormously” since the time Harvard challenged the ancients. The 2012 THE 100 Under 50 will draw on the same comprehensive range of 13 performance indicators used to compile the THE World University Rankings but will only rank those founded in 1962 or later. The indicators, all developed and provided by Thomson Reuters, will be carefully recalibrated to reflect the profile of younger institutions. The report will show us which nations are challenging the US and UK as the next higher education powerhouses. It will give us a unique insight into who the future Harvard and Cambridge universities may be. The THE 100 Under 50 showcases a new generation of global universities and offers a tantalising and invaluable glimpse into the future. Don’t miss it.
22 April 2012

France - 2011 Erasmus Ranking of Universities

Discover the 2011 ranking of the most dynamic "Erasmus" institutions (source: europe-education-formation.fr).
The ranking of the 20 French universities that perform best on Erasmus study mobility is defined by outgoing Erasmus study mobility relative to the university's total enrolment. The Université de Savoie has held the top spot for the last three years. Download the ranking.
In the PACA region, two universities are ranked: the Université de Provence and the Université d'Avignon et des Pays de Vaucluse.
The Université d'Avignon et des Pays de Vaucluse is 8th for 2010-2011, slipping from 3rd in 2009-2010 and 5th in 2008-2009.
The Université de Provence is 19th for 2010-2011, down from 14th in 2009-2010 and 13th in 2008-2009. The 2011-2012 result will bear watching, as Provence merged on 1 January 2012 with Méditerranée and Cézanne to form Aix-Marseille Université (AMU).
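The metric behind this Erasmus ranking is a simple ratio of outgoing students to total enrolment. A minimal Python sketch (the university names and figures below are hypothetical, not the agency's actual data):

```python
def mobility_rate(outgoing, enrolment):
    """Outgoing Erasmus study mobility as a share of total enrolment,
    the measure used to order the universities."""
    return outgoing / enrolment

# Hypothetical figures: (outgoing Erasmus students, total enrolment)
universities = {"Université A": (300, 12000), "Université B": (150, 9000)}
ranking = sorted(universities,
                 key=lambda u: mobility_rate(*universities[u]),
                 reverse=True)
print(ranking)
```

Normalising by enrolment is what lets a small institution such as the Université de Savoie outrank much larger universities on this measure.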


7 April 2012

Russia Moves to Improve Its University Rankings

The New York Times. By Sophia Kishkovsky. After it was reported this month that not a single Russian university had cracked The Times Higher Education’s ranking of top 100 schools by academic reputation, Education Minister Andrei Fursenko said that Russia was in the process of creating its own rating system. A brain drain from Russia has been funneling its brightest minds to the West, while the nation’s embattled higher education system struggles to find its place in the post-Soviet world. Each new rating announcement sets off hand-wringing about the predominance of the United States and the rise of China, both sore points and models for Russia.
“Russia has had some internal debate about their academic community,” Phil Baty, the editor of the Times Higher Education rankings, said by telephone from London. “They have suffered from appalling brain drain, and there is also concern that their scholastic community is isolated.
“There are some schools that are extremely impressive, but it is also struggling, and it’s all down to resources,” Mr. Baty added.
Dr. Fursenko told the Interfax news agency that ratings were an “instrument of competitive battle and influence” and should not be monopolized. He said that Russia was working with international specialists to create its own “international and universally recognized” university rating, Interfax reported this month.
Two days earlier, the Kremlin chief of staff, Sergei Ivanov, said Russia should create its own corruption rating. The Times’s reputation rankings are based on responses by more than 17,000 academics, chosen in part according to Unesco data on the geographic spread of professors around the world. As early as February 2011, Prime Minister Vladimir Putin was calling on Dr. Fursenko and his ministry to work out its own ranking of foreign universities.
“You must know that certain experts think that these Western ratings are, in fact, an instrument for raising their competitiveness on the labor market,” Mr. Putin said at their meeting, where they discussed a law that would recognize foreign university diplomas in Russia. “That’s why we need to be very cautious about them, and work out our own objective method of evaluating the quality of education that graduates of these universities receive.”
Last August, Mr. Putin promised 70 billion rubles, or $2.38 billion, for higher education innovation in Russia over the next five years. Dr. Fursenko told Interfax that he would investigate why Russia had fallen on the Times Higher Education lists. Lomonosov Moscow State University, which is known for its mathematics and physical sciences programs, had been ranked 33rd by the Times last year, the first year it compiled a reputation ranking. Only two Russian universities — Moscow State and Saint Petersburg State University — made it onto The Times’s regular Top 400 ranking, placing in the 276-300 and 351-400 bands.
Russians take any blow against Moscow State very personally. It was founded in the 18th century by Mikhail Lomonosov, who is regarded as a Russian Leonardo da Vinci. The Stalin-era skyscraper that serves as its main building is one of the Russian capital’s landmarks, visible for kilometers around, and the vast university serves the function of Harvard, Oxford and the Massachusetts Institute of Technology all rolled into one. Viktor Sadovnichy, the rector of Moscow State, told Nezavisimaya Gazeta, a Moscow newspaper, that the university had suffered in the rankings because the quality of research at universities was weighted over teaching, but that such a year-on-year drop was too precipitous.
It was most likely set off, he said, by respondents being asked a new set of questions. Mr. Sadovnichy told the newspaper that the only question mentioned in the methodology described on The Times Higher Education’s Web site is “Which university would you send your most talented graduates to for the best postgraduate supervision?” which places Russian universities at an immediate disadvantage since there is no official “postgraduate” category in Russia.
In 2009, the Russian government designated a group of research universities for an influx of funds and development, something that China did years ago, said Martin Gilman, who was director of the International Monetary Fund’s Moscow office in the 1990s and has been director of the Higher School of Economics’ Center for Advanced Studies since 2006. The Higher School of Economics, one of the designated research universities and widely regarded as Russia’s most Western-style school, has lured 25 new faculty from the international academic job market — both overseas Russians and foreign-born nationals — with the promise of research opportunities and a smaller teaching burden than they would most likely encounter in the United States.
New hires are offered tenure-track positions and the resources to publish and to travel to conferences, which, in turn, raises the university’s international profile, said Dr. Gilman. The university has been transitioning to teaching and publishing in English, as well as introducing practices like blind peer-reviewed publications that are not yet the norm in Russia.
“We know that this is not going to have big payoffs in the short term in terms of international rankings, but we are hopeful that given our strategy of 2020” — by which time H.S.E. aims to be a world-class research university — “that over the longer term, this will be a much more solid basis in creating the kind of critical mass of faculty in certain disciplines,” Dr. Gilman said.
Russia has also faced revamping its primary and secondary education systems in the wake of the collapse of Communism. The introduction of a standardized college entrance exam, similar to the SAT in the United States, has been controversial but is an important step toward introducing national standards in Russia, Dr. Gilman said. At the university level, the social sciences were devastated during the Soviet era and are being rebuilt virtually from scratch.
“Those involved in higher education have a formidable challenge in this country because the dead weight of the past is enormous,” Dr. Gilman said.
Yefim Pivovar, rector of the Russian State University for the Humanities, one of the most prestigious liberal arts universities, known as R.G.G.U., said that Russian universities were still far behind in physical infrastructure, which, he said, also affected rankings. He said that R.G.G.U. had been exchanging students with Laval University in Canada for 20 years.
“They have kilometers of underground passageways between buildings,” he said. “We don’t have a single university with such passageways. I’m talking about the material base. I think they have seven rinks for Canadian hockey. That’s what we need to be doing. It’s not a question of ratings, but of the quality of our material base,” he said.
Dr. Pivovar said that Russia could not ignore rankings, but that R.G.G.U.’s 200 exchanges and agreements with schools like the University of California, Berkeley, and the universities of Bochum and Freiburg in Germany also proved its connection to international academia. Joyce Hor-Chung Lau contributed reporting.

6 April 2012

What use are the rankings and league tables of the grandes écoles?

Interview by Brice Ancelin. At the end of March 2012, the agency Quatre Vents organised a morning of discussion on the question of rankings of the grandes écoles, in particular those produced by the media. How are these rankings compiled? Are they rigorous? To what extent can they be relied on? A look back at these questions.
“Much has been said about these rankings, some of it rather fantasised. It should not be forgotten that they are only a tool.” Gilbert Azoulay, host of the morning, set the tone. He acknowledged that these rankings are rich in information, but also complex. While he praised the work done by the journalists who produce them, he also pointed to the profitability requirements of these editorial products, which are expensive to produce and must necessarily “find their audience”. He added: “The principle of a ranking produced by a newspaper is to provide information. It is therefore consistent, both editorially and commercially, for a major general or specialist newspaper to publish a ranking.” First of all, “one needs to know who produces the rankings and how to compare them,” the specialist advised (on this point, see the main rankings in the practical guide of our Relations écoles section). Thus the Shanghai ranking, the Mines ranking and the QS World University Rankings take a different approach from that of the media (Usine Nouvelle, L’Expansion, Le Figaro, Le Point, etc.).
Rankings under influence
Rankings produced by the media thus raise the question of the closeness between schools and journalists. “Yes, good relations with journalists can play a part, but that is not enough to overturn a ranking,” Gilbert Azoulay replied. “You also have a whole set of quantitative data on the institution.” Jean-Pierre Helfer, former president of the CEFDG (Commission d’évaluation des formations et diplômes de gestion), qualified this: “When a school submits its file for a ranking, it presents itself in the best light. It is materially impossible for journalists to go and check every employment contract to verify the number of professors, for example. There therefore has to be a certain degree of trust between the auditors and the audited.” Cécile Maillard, the journalist covering higher education for L’Usine Nouvelle, added: “The criteria of our ranking can also evolve with demand from companies. Last year, for example, we gave more weight to international experience because companies told us it was no longer possible to build an engineering career without it.” In this context, the question of how the data in a ranking are weighted proves central.
Useful rankings

Should the baby be thrown out with the bathwater? Far from it. “All of this works because it is useful,” argued Jean-Pierre Helfer. “When something has no use, it does not get done. The people who buy these rankings find additional information in them.” Cécile Maillard continued: “These rankings are a good signal to companies of what is changing in the schools, and how.” Several voices in the room spoke along the same lines. “Nor can schools inflate their declared graduate salaries; otherwise, when students find themselves in front of an HR manager, it is a harsh return to reality,” said one school-relations manager. Another added: “We look at the rankings with caution. They change a lot from one year to the next. We also have our own internal criteria to counterbalance them.” Gilbert Azoulay pointed to a development that partly sidesteps the question of how the various criteria are weighted: “The Financial Times, L’Etudiant, Le Nouvel Observateur and Le Point let readers build their own ranking, according to their own criteria and priorities.”
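The “build your own ranking” tools mentioned above amount to letting the reader choose the weights applied to each criterion. A minimal sketch, with invented school names and scores (all data here are hypothetical), shows how the same inputs produce different orderings under different weightings:

```python
# "Build your own ranking": schools are re-ordered by a weighted sum of
# criterion scores. All names and figures are invented for illustration.

def rank(schools, weights):
    """Order schools by the weighted sum of their criterion scores."""
    def score(criteria):
        return sum(weights[c] * v for c, v in criteria.items())
    return sorted(schools, key=lambda s: score(s["criteria"]), reverse=True)

schools = [
    {"name": "School A", "criteria": {"salaries": 0.9, "international": 0.4}},
    {"name": "School B", "criteria": {"salaries": 0.6, "international": 0.9}},
]

# A reader who prioritises international exposure puts School B on top...
by_international = rank(schools, {"salaries": 0.2, "international": 0.8})
# ...while one who prioritises graduate salaries puts School A on top.
by_salaries = rank(schools, {"salaries": 0.8, "international": 0.2})
```

The point made during the debate follows directly: the ranking is not a property of the data alone but of the weights chosen, which is why personalised rankings partly defuse the weighting controversy.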
The real danger, according to Jean-Pierre Helfer, “is when a school director is dismissed because the school is poorly ranked or loses an accreditation. This large family of criteria pushes schools towards identical development strategies.” The last word went to the audience. One HR professional wondered: “I ask myself whether we really need a ranking to know which schools are the best. I would be interested in a ranking by programme or by sector of activity…”
To explore this subject further, read our analysis of the rankings with Studyrama Grandes Écoles here.

6 April 2012

International rankings: issues, methodologies and prospects for French universities

A new e-publication from the OST on international rankings. This first issue of the OST’s e-publication series “Résultats et recherches” brings together four articles on the theme “International rankings: issues, methodologies and prospects for French universities”.
Editorial

International rankings have become a major tool for comparing universities worldwide. Within a few years, rankings (such as the Shanghai ranking) have not only come to serve multiple audiences (students, public policy-makers, institutions) but have also attracted growing media attention. In the face of this success, they raise many questions: what methodologies are used and how are they evolving? Where do French and European universities stand in the rankings, and how can their positions be explained? Can new ranking methodologies be envisaged, and if so, on what basis?
This first issue of the “Résultats et recherches” collection of the OST’s e-publications aims to contribute to the debate on rankings. Four articles offer the reader different perspectives on this eminently complex subject. The article by G. Filliatreau (OST) traces the dynamics behind the success of rankings, and discusses the challenges ahead and the work under way at European level to propose alternative rankings. The article by N. Carayol (GREThA and OST), A. Lahatte (OST) and G. Filliatreau (OST) proposes a new methodology for ranking universities, based on a theory of dominance relations; applied to American universities, it shows how alternative ranking methods can be devised. The article by P. Vidal (OST) analyses the position of French institutions in three international rankings, and explains their performance compared with that of their British and German counterparts. The fourth article, by J.-A. Héraud (BETA), analyses the position of institutions in the cross-border European region of the Upper Rhine. The diversity of their positions reflects the diversity of the institutions within the region, for many of which the Shanghai criteria are ill-suited to capturing their role in research, innovation and knowledge transfer within a territory.
Download the document “Les classements internationaux: enjeux, méthodologies et perspectives pour les universités françaises”.


1 April 2012

Classification of university types is key to building strength in diversity

By Claudia Reyes and Pedro Rosso. Over the past few decades, particularly during the 1980s and 1990s, most university systems in the developing world underwent an impressive transformation – with several-fold increases in the number of students enrolled and the opening of many new, mostly private, universities. One of the consequences of this expansive change has been a marked increase in the heterogeneity of the institutions comprising the various systems.
Beyond its academic dimensions, heterogeneity poses serious problems to systems attempting to classify the universities for research, ranking or public policy purposes. Chile is a good example. The first attempt to classify national universities – based on selectivity, size, prestige and nature (public or private) – resulted in eight categories. Despite some of its merits, this classification was criticised on conceptual and practical grounds, including the fact that the categories were not exclusive ones.
Other observers have tried to classify Chilean universities, using selectivity and annual publications as primary criteria, and the number of students and the years of accreditation granted to the institution as secondary criteria. They described seven categories of institutions – an improvement on the earlier attempt. However, this classification was also flawed on several counts, including the use of selectivity as a main criterion. For example, one category listed selective research universities, while another group was described as non-selective, teaching, large-size and low-accreditation institutions.
A recent approach faced the challenge of classifying Chilean universities – using as main criteria the existence and number of accredited doctoral programmes and the annual number of internationally indexed publications. Applying the first criterion, the universities were divided into two groups: (1) without accredited doctoral programmes; and (2) with doctoral programmes. Then, the former were further divided, according to the number of publications, into two categories: (1a) with fewer than 20 annual publications; and (1b) with 20 or more annual publications.
The first category (1a) was named ‘teaching university’ and comprised 23 institutions. The second, called ‘teaching university with limited research’ (1b), included 11 universities. In turn, the universities with accredited PhD programmes were divided into two categories: (2a) those with up to five programmes, and (2b) those with more than five doctoral programmes.
The first category (2a) was called ‘university with research and doctoral programmes in selected areas’, and 11 institutions met this criterion. The second (2b) was named ‘research and doctoral programmes university’ and comprised six universities. As expected, the four categories had marked differences in the mean values of the variables used as ‘primary classification criteria’.
Thus, the ‘teaching university’ group (1a) averaged four publications per year; the ‘teaching university with limited research’ group (1b) averaged 41 publications per year; the ‘university with research and doctoral programmes in selected areas’ group (2a) averaged 94 annual publications; and the ‘research and doctoral programmes university’ group (2b) averaged 636 publications per year.
In turn, while the average number of doctoral programmes was 2.2 in the group of ‘university with research and doctoral programmes in selected areas’ (2a), it averaged 18.5 in the group of ‘research and doctoral programmes university’ (2b). Consequently, the primary classification criteria had successfully grouped Chilean universities in markedly different categories.
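The two-step rule described above can be sketched as a small function. The cut-offs (20 annual publications; five doctoral programmes) are taken from the text, while the function and variable names are of course hypothetical:

```python
# Sketch of the Chilean classification described above: universities are
# first split by whether they have any accredited doctoral programmes,
# then by the stated cut-offs on publications or programme counts.

def classify(doctoral_programmes, annual_publications):
    """Return the category code (1a, 1b, 2a or 2b) for a university."""
    if doctoral_programmes == 0:
        # No accredited doctoral programmes: split on 20 publications/year.
        return "1a" if annual_publications < 20 else "1b"
    # With doctoral programmes: split on more than five programmes.
    return "2a" if doctoral_programmes <= 5 else "2b"

labels = {
    "1a": "teaching university",
    "1b": "teaching university with limited research",
    "2a": "university with research and doctoral programmes in selected areas",
    "2b": "research and doctoral programmes university",
}
```

Feeding in the group averages reported above (for instance, 18.5 doctoral programmes and 636 publications) returns category 2b, as expected.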
Particularly striking was the tenfold difference in the number of publications between the two teaching-university categories – indicating that in this respect the ‘teaching university’ (1a) category is indeed quite different from its ‘teaching university with limited research’ (1b) counterpart. On the other hand, this difference implies that in approximately 30% of Chilean universities practically no research is conducted.
The four categories were also compared in terms of the values of institutional size and academic performance (accreditation) – unrelated to the publications and doctoral programmes indicators used to define the four categories. The statistical significance of variations in mean values between categories was tested using a one-way analysis of variance. This test provides a method to establish whether or not the means of several groups are statistically different.
The analysis of variance was complemented with post hoc tests, which identify more specifically which means differ significantly from each other. The results indicated major diversity in mean values across most of the indicators explored, including: number of students, number of faculty members, percentage of faculty with advanced degrees, number of faculty per study programme, percentage of accredited study programmes and years of institutional accreditation. The main differences were found between the ‘teaching university’ (1a) and ‘research and doctoral programmes university’ (2b) categories, with the mean values of the other two categories falling in between.
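The one-way analysis of variance used here compares between-group and within-group variability. A minimal, self-contained sketch of the F statistic on invented data follows; a real analysis would also consult the F distribution for a p-value and run post hoc tests (e.g. Tukey’s HSD) to locate the pairwise differences:

```python
# One-way ANOVA F statistic: the ratio of between-group variance to
# within-group variance. The sample data below are invented and loosely
# mimic annual publication counts for three hypothetical categories.

def f_statistic(*groups):
    """Compute the one-way ANOVA F statistic for two or more groups."""
    k = len(groups)                        # number of groups
    n = sum(len(g) for g in groups)        # total number of observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Sum of squares between groups, weighted by group size.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Sum of squared deviations from each group's own mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Groups with very different means yield a large F (means clearly differ).
F = f_statistic([3, 5, 4], [38, 45, 40], [600, 650, 660])
```

When the group means are far apart relative to the spread within each group, as in the publication counts reported above, F is very large and the null hypothesis of equal means is rejected.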
The categories defined by the new classification are associated with basic institutional characteristics and academic performances. Thus, for comparison purposes the institutions included within a given category could be considered to be ‘academic peers’. The latter seems a relevant point, since most of the available comparative studies – including national and international university rankings – generally overlook this aspect. From this perspective, it is unfortunate that the research universities, especially those considered to be ‘world class’, have become the paradigm of academic quality. While recognising the need for any country to have a ‘critical mass’ of those institutions, from the standpoint of diversity and their intrinsic value, the only paradigm that a university should have is the best institution within its own category. The new classification used for Chilean universities can be applied in other countries, with some adaptation to local realities. For example, other cut-off points for annual publications or number of doctoral programmes accredited by a national agency could be used.
The new classification might also provide an overall diagnosis of a system, in terms of the percentage of teaching and research institutions present. In university systems diversity represents a value in itself, since it implies, both for students and faculty, more options for deciding where to study or work. When classifying and comparing universities, particularly in developing systems, it should be remembered that all classifications freeze in time what are essentially dynamic situations. In the future, many institutions will change category as their research activities expand and new postgraduate programmes are created. By the same token, and faithful to their missions, many other universities will remain in the same category while improving their academic performance.
Ultimately, in the academic world what really counts is coherence between mission, human and financial resources and the will to achieve the highest possible quality standards. Thus, it is crucial to classify universities properly.
* Claudia Reyes is executive director of Red Universitaria Cruz del Sur in Santiago, Chile; Pedro Rosso is rector emeritus and professor of paediatrics at Pontificia Universidad Católica de Chile, in Santiago, Chile. email: Claudia Reyes and Pedro Rosso: barriga@uc.cl. This is an edited version of the article “A New Approach for Classifying Chilean Universities” published in International Higher Education, No 67, Spring, 2012. http://www.bc.edu/research/cihe/.