Formation Continue du Supérieur
11 June 2012

Ranking the 100 under 50

Most of the major rankings tend to be dominated by large research-intensive institutions, which also tend to be among the older institutions in their respective countries. To examine what is going on with the newer institutions, Times Higher Education today launched a new ranking, calling it the “100 under 50”. In many ways the formulation recalls lists of young promising entrepreneurs, and one could perhaps argue that this resemblance is not completely coincidental.
In the accompanying THE magazine these institutions are presented as having upward trajectories, little institutional baggage and the capacity to respond rapidly to societal needs – they are presented as somehow different, as “doing their own thing”. Knowing the one Norwegian institution on that list – the University of Tromsø – this “being different” motto was quite prominent during its establishment in 1968, when it was referred to as a regional experiment.
However, as the magazine also points out, this newness brings its own challenges: research cultures may not yet be established, and the institution needs to find its place in the local and global higher education landscape. Overall, well under half of these young universities – in fact only 19 of them – rank among the top 200 in the world (according to the World University Rankings). So does this imply that, when building a world-class institution, age matters?
Another aspect is that this group includes institutions from 30 countries, 20 of them from the UK. Depending on whether one counts Turkey as part of Europe, 50 or 51 of the institutions are European. Australia can congratulate itself on 14 institutions in this category, whereas the US has only nine – quite a different picture from the more general rankings, where US institutions dominate. So, at least when examining the up-and-coming institutions, Europe is definitely not lagging behind. However, Asia is the big winner overall, with the best institution on the list (Postech in the Republic of Korea) and six institutions in the top 20.
However, one should also bear in mind that this to an extent reflects national policy landscapes and the structural reforms that higher education systems have gone through. In some smaller countries the higher education system and the number of institutions have long been settled, leaving little space for new institutions. This is the case in Switzerland, for example, where only one institution made the list.
Another factor to consider is that the list uses the same 13 indicators as the general ranking – perhaps implying that it does not quite capture the innovative nature of the new institutions, and instead measures which of them do well on the same scale as the “old ones”. A number of articles on the topic are available in the THE magazine here, and a PDF table with the whole list can be viewed here.
See also World's top 100 universities under the age of 50: ranked by Times Higher Education.
1 June 2012

World's top 100 universities under the age of 50: ranked by Times Higher Education

South Korea's Pohang University of Science and Technology has been ranked the best university under the age of 50. Find out which universities have made the list.
South Korea's Pohang University of Science and Technology has topped a list of the best universities under the age of 50. The inaugural rankings by Times Higher Education (THE) aim to show "which nations are challenging the US and UK as the next higher education powerhouses". Switzerland's École Polytechnique Fédérale de Lausanne follows in second place. Six countries are represented in the top 10 – South Korea, Switzerland, Hong Kong, the US, France and the UK – making it more diverse than the traditional world top 100, which is usually dominated by US and UK institutions. Universities in East Asia stand out in the rankings, with South Korea taking two of the top five places. The Hong Kong University of Science and Technology comes in at third place, and in total Hong Kong has four institutions within the top 50. The accompanying Google Fusion map shows the locations of all the universities in the top 100 under 50 rankings and includes individual scores for teaching, research, citations, income from industry and international mix.

1 June 2012

Rankings rivals slug it out over new universities

By David Jobbins. The main rivals in the international higher education rankings business went head to head this week to launch league tables of the world’s top newer universities. Hours before Times Higher Education magazine was due to publish its Top 100 ranking of universities under 50 years old with data supplied by Thomson Reuters, QS leapt in with its own Top 50.
Both illustrate the impact of higher education investment in emerging economies, with universities from East Asia in particular challenging the dominant university systems of the United States and Britain. The QS ranking is headed by two Hong Kong universities – the Chinese University of Hong Kong and the Hong Kong University of Science and Technology (HKUST).
Other Far East universities dominate the top 10: Singapore’s Nanyang Technological University is fourth, the Korea Advanced Institute of Science and Technology is fifth, Korea’s Pohang University of Science and Technology is seventh, and a third Hong Kong institution, City University of Hong Kong, is ninth. UK universities perform strongly, with the universities of Warwick and York third and sixth respectively. Maastricht University in The Netherlands is the only continental European university and the University of California, Irvine (10th) is the only US university in the top 10.
The THE ranking places Korea's Pohang, a private university founded in 1986, at the top, with HKUST third and the Korea Advanced Institute of Science and Technology fifth. All other top 10 universities are from Europe and the US. In contrast with the QS ranking, European universities are well represented in the THE top 10. Switzerland’s École Polytechnique Fédérale de Lausanne is second, France’s Université Pierre et Marie Curie is sixth, and the UK’s York, Lancaster and East Anglia universities are eighth, ninth and 10th respectively. UC Irvine is fourth and the University of California, Santa Cruz is seventh.
The QS ranking seems to draw directly on the data used for its World University Ranking. Twenty-three countries are represented, led by Australia with 10 universities, followed by the UK with seven. While Asian universities are solidly represented at the top of the table, it is Australia that dominates in terms of the number of institutions listed in the Top 50, reflecting its economic position at the crossroads between East and West. In contrast, North America is represented by just one US university and three from Canada.
THE says it has 'recalibrated' its World University Rankings data “to better reflect the profile of younger institutions”. Although its best 'Under 50' universities fail to top the table, the UK has more institutions – 20 – in the THE list than any other nation. Australia follows with 14, while the US – which dominates the traditional World University Rankings – has just nine representatives.
In total, 30 countries or regions are represented in the top 100 – compared to just 26 in the THE World University Rankings top 200. Phil Baty, editor of the THE rankings, said the selective ranking was a clear warning to the traditional elites in the US and UK that “new powers in higher education and research are quickly emerging”.
“The heritage institutions need to watch their backs. With focused investment, innovation, strategic vision and lots of talent, some institutions have managed to achieve in a matter of years what the traditional elite universities have developed over many generations. The landscape is changing quickly and the old global hierarchies cannot rest on their laurels.
“Asian institutions are showing great strength, and investment taking place in the Gulf, for example, is very promising.”
He suggested that the Under 50 ranking was an “extraordinary example” to all those nations who aspire to develop world-class research-led global universities.
“Those at the top of this list show what can be achieved in a short time with the political will and the right resources, while those lower down give a real insight into which institutions could be future global stars.”
Ben Sowter, head of the QS Intelligence Unit, said: “Asia’s superior performance compared to Western universities established within the same time frame is testament to Asia’s dynamism.”
He added: “After Australia, the UK is the most represented country in this table – although this will change during the next few years as the youngest of the UK entrants were established in 1966.
“Perhaps the most interesting question is whether there is a next generation of post-1992 British universities ready to make a mark on the QS World University Rankings and whether recent funding reforms in British higher education will either help, or hinder, their ambitions.”
The Under 50 rankings from the two rivals are clearly directed less at would-be students, to whom the age of a university is likely to be much less relevant than its academic performance, and more at policy-makers – and, potentially, marketing departments.
31 May 2012

Rankings Without Reason

By Phil Baty. It was a rare spectacle: a senior administrator of a leading international university, speaking at a conference of peers, issued a public "thank you" to those who compile university rankings. The rankers – me included – more typically face criticism of the power and influence we wield.
But Chen Hong, director of the office of overseas promotion at China's Tsinghua University, told the World 100 Reputation Network conference in Washington in May: "We should thank those organizations who publish these indicators. At least we can find something for comparison and benchmark our own performance."
Reflecting the approach that my magazine, Times Higher Education (THE), has taken to disaggregate the overall composite ranking scores in our publications, she explained: "What is useful for us is the detailed indicators within those rankings. We can find out comparable data, benchmarking various universities and use them for planning."
Indeed, there is growing evidence that global rankings – controversial as they are – can offer real utility. But those of us who rank must also be outspoken about the abuses, not just the uses, of our output. There is no doubt that global rankings can be misused. It was reported recently, for example, that a $165 million Russian Global Education program would see up to 2,000 Russian students each year offered “very generous” funding to attend institutions around the world – but that qualification for the generous scholarships will be dependent on the students attending an institution in the top 300 of the Times Higher Education World University Rankings. Brazil’s hugely ambitious Science Without Borders scholarship program to send 100,000 Brazilian students overseas similarly links the scholarships to THE-ranked institutions.
While such schemes offer a welcome endorsement of the rigor of THE’s rankings data (provided by Thomson Reuters) and its ranking methodology, speaking as the (rather flattered) editor of the THE rankings I'd still suggest that they are ill-advised. Global university ranking tables are inherently crude, as they reduce universities to a single composite score. Such rigid adherence to the rankings tables risks missing the many pockets of excellence in narrower subject areas not captured by institutionwide rankings, or in areas of university performance, such as knowledge transfer, that are simply not captured well by any ranking.
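Why composite scores are "inherently crude" can be seen in a small sketch: once each university is reduced to one weighted sum, an institution with an outstanding score on a single indicator can still land below a merely solid all-rounder. The universities, indicator scores and weights below are entirely hypothetical, chosen only to illustrate the arithmetic, not THE's actual data or methodology.

```python
# Hypothetical weights for four indicators (they sum to 1.0).
WEIGHTS = {"teaching": 0.30, "research": 0.30, "citations": 0.30, "industry_income": 0.10}

# Invented indicator scores on a 0-100 scale.
universities = {
    "Uni A": {"teaching": 90, "research": 88, "citations": 85, "industry_income": 60},
    "Uni B": {"teaching": 55, "research": 50, "citations": 98, "industry_income": 95},
}

def composite(scores, weights=WEIGHTS):
    """Weighted sum of indicator scores - the single number a league table sorts on."""
    return sum(weights[k] * scores[k] for k in weights)

# Sort by composite score, best first.
ranked = sorted(universities, key=lambda u: composite(universities[u]), reverse=True)
for name in ranked:
    print(name, round(composite(universities[name]), 1))
```

Uni B's near-perfect citation impact and industry income are drowned out by its weaker teaching and research scores, so the table shows only that it trails Uni A overall – exactly the masking of "pockets of excellence" described above.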
One of the great strengths of global higher education is its extraordinarily rich diversity, which can never be captured by the THE World University Rankings, which deliberately seek only to compare those research-intensive institutions competing in a global marketplace and which include less than 1 percent of the world’s higher education institutions. In this context, a new declaration from a consortium of Latin American university rectors, agreed in Mexico City last week, must be welcomed as a sensible and helpful contribution to the rankings debate. The declaration, agreed at a two-day conference at the National Autonomous University of Mexico entitled "Latin American Universities and the International Rankings: Impact, Scope and Limits," noted with concern that "a large proportion of decision makers and the public view these classification systems as offering an exhaustive and objective measure of the quality of the institutions."
The rectors’ concern is of course well placed – no ranking can ever be objective, as all rankings reflect their creators’ subjective decisions about which indicators to use and what weighting to give them. Those of us who rank need to work with governments and policy makers to make sure that they are as aware of what rankings do not – and can never – capture as of what they can, and to encourage them to dig deeper than the composite scores that can mask real excellence in specific fields or areas of performance. That is why I was delighted to be in Mexico City last week to join the debate. The meeting, which drew together rectors and senior officials from 65 universities in 14 Latin American countries, issued a call to policy makers to "avoid using the results of the rankings as elements in evaluating the institution’s performance, in designing higher education policy, in determining the amount of finance for institutions and in implementing incentives and rewards for institutions and academic personnel."
I would – to a large extent – agree. Responsibly and transparently compiled rankings like THE’s can of course play a very useful role in allowing institutions, like Tsinghua and many, many others, to benchmark their performance and help them plan their strategic direction. They can help governments better understand some of the modern policy challenges of mass higher education in the knowledge economy, and compare the performance of their very best research-led institutions with those of rival nations. The rankings can help industry identify potential investment opportunities and help faculty members make career and collaboration decisions. But they should inform decisions – never drive them.
The Mexico declaration said: "We understand the importance of comparisons and measurements at an international level, but we cannot sacrifice our fundamental responsibilities in order to implement superficial strategies designed to improve our standings in the rankings."
Some institutional leaders are not as sensible as those in Latin America. Speaking at the same Washington conference where Chen Hong gave thanks to the rankers, Pauline van der Meer Mohr, president of the executive board at Erasmus University Rotterdam, confirmed frankly that proposals for a merger between her institution and its Dutch counterparts, the University of Leiden and the Delft University of Technology, were “all about the rankings.” The three Dutch institutions had calculated, she explained, that merged into one they would make the top 25 of the world rankings, while separately they languish lower down the leagues. "Why would you do it if it doesn't do anything for the rankings?" she asked.
But the merger did not take place.
It was dropped because of a mix of political unease, fierce alumni loyalty to the existing “brands,” and an “angry” response from research staff. Researchers at all three institutions, van der Meer Mohr admitted, had asked: "You are not going to merge universities just to play the rankings game?" To do so, they had concluded, would be "ridiculous."
I believe that those Dutch academics were quite right.
23 May 2012

Measuring to rank – graduate employment indicators and university rankings

The labour market as a site for evaluating training policies? "Evaluation, Training, Employment" exchange day, Net.Doc, no. 92, 2012, 97 p. Can the labour market serve as the place where vocational training, and more generally public policies on training, are evaluated? Answering this question first invites reflection on the relevance of evaluating training programmes through the outcomes their graduates obtain on the labour market. It then leads to questions about what such evaluation contributes to political decision-making. This document gathers the summaries of the contributions presented at an exchange day devoted to these questions. The day was the first event of an interdisciplinary research network spanning economics and management, sociology, psychology and education sciences on the theme Evaluation – Training – Employment. Download the publication.
Measuring to rank – graduate employment indicators and university rankings
Isabelle BORRAS
Céreq Associated Centre, Grenoble, Université Pierre Mendès France

In recent years, a new use has developed for the graduate employment indicators that universities produce: ranking and comparing programmes and institutions against one another. Until recently, these indicators were mainly used for steering purposes or for communication with various audiences.
This new use of the indicators, which originated with the Shanghai ranking, has provoked reactions from the scientific community and from the producers of the indicators. Within the scientific community (experts, statisticians, economists of education...), reactions concentrate mainly on the relevance of the chosen indicators and on how they are constructed, and propose developing more sophisticated indicators. As for the producers of the indicators (essentially the observatories), they denounce the instrumentalisation of the observatories and of indicators that were never originally intended to form a ranking.
Two questions arise:
What expectations does this new use of the external evaluation of higher education programmes answer, which actors promote it, and what is at stake? One must analyse how its deployment fits with other uses and calls into question the organisation of information systems and the relations between local and central observatories.
What evaluative stance leads to measuring in order to compare and rank? The question here is what the alternative stances might be.
To answer these questions, one must analyse the link between external evaluation and ranking, combining theoretical reflection on evaluative stances with an analysis of existing arrangements (an overview of national and international graduate surveys, as well as of international rankings). One must also grasp how this new use of the indicators challenges the practices of university observatories, their organisation and their relations with the Ministry (a survey of the RESOSUP network of university observatories).
21 May 2012

Educators Debate Negative Effects of International Rankings on Latin American Universities

By Steven Ambrus. The limitations of international higher-education rankings and their negative effects on universities in Latin America were key themes of a two-day conference held at the National Autonomous University of Mexico, in Mexico City.
The conference, "Latin American Universities and the International Rankings: Impact, Scope, and Limits," brought together 74 leaders of public and private universities from around Latin America, as well as representatives from some of the world's principal organizations that rank institutions of higher learning. The conference focused largely on the negative consequences that comparisons based on global rankings can have on Latin American universities, especially when used by the news media and governments to evaluate a university's overall performance. Participants said such a focus could affect not only universities' ability to attract students but also the public financing they receive.
Imanol Ordorika, the academic coordinator of the conference and director general of institutional evaluation at the National Autonomous University, recalled how several years ago his university found itself in an adversarial relationship with Mexico's Congress. The fact that the institution had a relatively high position in international rankings at the time played a significant role in keeping its financing at healthy levels.
"The problem is that when you go down in the rankings, the media can be critical, and policy makers view you negatively," he said. That is the case, he said, even when a drop in the rankings has nothing to do with performance but rather a change in the ranking's methodology or other universities' improvement in weighted indicators. Latin America has faired poorly in the global rankings. Though 8.5 percent of the world's people live in the region, only 11 of the world's top 500 universities—2.2 percent—are in Latin America, according to the most recent edition of Shanghai Jiao Tong University's closely watched Academic Ranking of World Universities. Only three universities from the region—the University of São Paulo, the State University of Campinas, in Brazil, and the Catholic University of Chile—make the top 400 universities in the Times Higher Education ranking, with the University of São Paulo placing highest, at 178.
Participants at the meeting here said that less-than-impressive showing was a reflection of the rankings' bias toward elite universities in the English-speaking world, which have lots of money to spend on natural sciences, medicine, and engineering. Most of the rankings, they pointed out, give a great weight to the number of publications and citations in bibliographical databases like the Thomson Reuters Web of Knowledge and SciVerse Scopus, where English-language articles in science dominate, or, in the case of the Shanghai ranking, to the number of Nobel Prizes awarded to alumni and faculty. But that methodology neglects the strengths of many Latin American universities in teaching, the social sciences, and the humanities, and in the training of future government leaders and the development of national institutions and culture, they said.
Rankings could also affect a university's core mission in reducing inequality and poverty. The Federal University of ABC, in the state of São Paulo, for example, was founded in 2006 with a mission to help lower- and middle-income students from Brazil's academically weak public schools gain access to higher education. The university reserves half its places for such students and spends much of its budget on scholarships for them. But there is talk now in Brazil's federal system of using rankings as a criterion for government financing, and the university's rector, Helio Waldman, is nervous. "Because we are committed to social inclusion, as well as academic excellence, we have to be less selective and spend less money on scientific research in favor of scholarships. If we are forced to emphasize our positions in the rankings, we might have to sacrifice that commitment."
Phil Baty, editor of the rankings for Times Higher Education, said readers of the rankings should keep in mind that THE was looking at a particular type of institution. It would be a mistake, he said, for governments not to look at broader sets of data. "It wouldn't be appropriate for an extremely large, regional, teaching-focused university to be ranked with a set of criteria that are really designed for the globally competitive, research-intensive ones. You have to examine in detail what the indicators are and what they really show and draw on a wider range of materials in making decisions."
Nonetheless, global rankings can "hypnotize" policy makers in developing nations and make them forget that the rankings favor universities from wealthy countries with the resources to do high-end research in science, said Simon Marginson, a professor of higher education at the University of Melbourne. That is a mistake, he said. For those nations without fully industrialized economies, rankings do not provide a competition based on merit.
"Until a nation has the economic capacity to sustain a broad scientific infrastructure, it should use regional rankings and local benchmarks to drive improvement. Not global rankings," he said.
20 May 2012

HEIK academic seminar on rankings and organisation of universities

This video features a presentation by Dr Kerstin Sahlin titled “A rising interest in management and governance of universities: Rankings and organization models on the move”.
In this presentation, Sahlin examines two influential global themes: the expansion of rankings and assessments, and the way universities have become organisational actors. The two themes are interrelated and connected to a number of other global developments; multilevel analysis is employed to explain why universities have lately become subject to such intense reforms of governance and organisation.
Kerstin Sahlin is currently a professor of business administration at Uppsala University, and has extensive first hand knowledge about higher education governance in Nordic countries. She has earlier held the position of prorector at Uppsala University and her main research interests are linked to the organizational change in the public sector and the transnationalisation of management ideas.
The lecture was recorded in April 2012 as part of the academic seminar series of the research group HEIK (Higher Education: Institutional Dynamics and Knowledge Cultures) at the University of Oslo.
See also New HEIK working paper on institutional transformation of a new university.
19 May 2012

List ranks colleges by prominent alumni

By Cheng Yingqi. China's top colleges have become a breeding ground for the wealthy, with more than 1,500 billionaires appearing in an alumni-ranking list released on Wednesday. The list was released by cuaa.net, a website that provides services for college graduate associations across the country.
Based on research since 2003 on the careers of alumni of China's leading universities, the website ranks universities according to the number of their graduates who have become billionaires, top scientists, political leaders, and other prominent figures. Tsinghua University won the title of "the cradle of billionaires", with 84 super rich who studied there - and who have a combined wealth of 300 billion yuan ($47.4 billion). Besides rich people, Tsinghua produced 49 political leaders, the highest number among the 30 universities on the list.
Nevertheless, Peking University ranked first. Though it produced fewer billionaires and politicians, it fostered 182 social scientists and 144 members of the Chinese Academy of Sciences or the Chinese Academy of Engineering - compared with Tsinghua's 18 social scientists and 141 academy members. After Tsinghua University and Peking University, the list showed figures from 28 other universities with successful alumni, including Renmin University of China and Fudan University. Zhang Ming, a professor at Renmin University of China and an education columnist, said the rich alumni reflect trends in society.
"We are in an era in which people's greatest pursuit is wealth," Zhang said. "The universities listed happen to be the most prestigious universities in China, so they naturally attract the elite from across the country.
"Although these universities may not have taught them the skills to create wealth, their campuses develop into platforms to allow the students to expand their social relationships, especially with wealthy people," Zhang said.
But Zhang also said that the primary mission of universities is not to teach students how to make more money.
"Most Chinese universities still need to improve, in cultivating students' ability for innovation and for scientific research," he said.
Xiang Danni, a graduate of Peking University, thinks it is not proper to use distinguished alumni as a standard for rating a university.
"People succeed for many reasons beyond university education. A university should never ignore its fundamental responsibility - education and research. So this ranking has little meaning, although it might briefly attract some readers' attention," Xiang said.
Zhao Wanwei, also a graduate of Peking University, finds the ranking unconvincing.
"The list discloses neither the names of the alumni nor its rating criteria, only the number of wealthy alumni from each university. But it does reflect one hot issue - the cultivation of society's elite," Zhao said.
Tang Shi contributed to this story.
12 May 2012

UK is 10th among best global environments for universities

By Simon Baker. The UK has been placed 10th in a ranking of the world's best higher education systems, with the US topping the list.
Researchers from the University of Melbourne applied 20 different measures to data collected from 48 countries and territories to construct the ranking for Universitas 21, an international network of 23 research-intensive institutions. The ranking aims to show which countries create a “strong environment” that allows universities to contribute to growth, provide a high-quality student experience and help institutions compete globally.
The top 10 in the overall ranking were, in order, the US, Sweden, Canada, Finland, Denmark, Switzerland, Norway, Australia, the Netherlands and the UK. Measures used to compile the ranking were grouped into four broad areas: public and private investment; research and workforce output; international connectivity; and environment (such as government policy). Population size was also taken into account. As well as the overall results, the survey also found that investment in research and development was highest in Denmark, Sweden and Switzerland.
The US dominated total output of research journal articles, but Sweden led the measure of percentage of articles per head of population. According to the study’s authors – from the Melbourne Institute of Applied Economic and Social Research at the University of Melbourne – there is a strong relationship between investment and output. Of the top eight countries for output, only the UK and Australia are not among the top eight for resources.
Meanwhile, international students form the highest proportions of total student numbers in Australia, Singapore, Austria, the UK and Switzerland. International research collaboration is most prominent in Indonesia, Switzerland, Hong Kong, Denmark, Belgium and Austria. The report can be found at: http://bit.ly/JmRUaf.
11 May 2012

U21 Rankings of National Higher Education Systems

A ranking of higher education systems based on resources, environment, connectivity and output. New research into national education systems gives the first ranking of countries and territories which are the "best" at providing higher education.
Universitas 21 has developed the ranking as a benchmark for governments, education institutions and individuals. It aims to highlight the importance of creating a strong environment for higher education institutions to contribute to economic and cultural development, provide a high-quality experience for students and help institutions compete for overseas applicants. Researchers at the Melbourne Institute of Applied Economic and Social Research, University of Melbourne, looked at the most recent data from 48 countries and territories across 20 different measures. The measures are grouped under four headings: resources (investment by government and the private sector), output (research and its impact, as well as the production of an educated workforce that meets labour market needs), connectivity (international networks and collaboration, which protect a system against insularity) and environment (government policy and regulation, diversity and participation opportunities). Population size is accounted for in the calculations.
Overall, in the Universitas 21 Ranking of higher education systems, the top five were found to be the United States, Sweden, Canada, Finland and Denmark. Further details can be found under "more information" below.
Government funding of higher education as a percentage of GDP is highest in Finland, Norway and Denmark, but when private expenditure is added in, funding is highest in the United States, Korea, Canada and Chile. Investment in research and development is highest in Denmark, Sweden and Switzerland. The United States dominates the total output of research journal articles, but Sweden is the biggest producer of articles per head of population. The nations whose research has the greatest impact are Switzerland, the Netherlands, the United States, the United Kingdom and Denmark. While the United States and the United Kingdom have the world's top institutions in rankings, the depth of world-class higher education institutions per head of population is best in Switzerland, Sweden, Israel and Denmark. The highest participation rates in higher education are in Korea, Finland, Greece, the United States, Canada and Slovenia. The countries with the largest proportion of workers with a higher-level education are Russia, Canada, Israel, the United States, Ukraine, Taiwan and Australia. Finland, Denmark, Singapore, Norway and Japan have the highest ratio of researchers in the economy. International students form the highest proportions of total student numbers in Australia, Singapore, Austria, the United Kingdom and Switzerland. International research collaboration is most prominent in Indonesia, Switzerland, Hong Kong SAR, Denmark, Belgium and Austria.
China, India, Japan and the United States rank in the bottom 25 per cent of countries for international research collaboration. In all but eight countries at least 50 per cent of students were female, the lowest being in India and Korea. In only five countries were there at least 50 per cent female staff; the lowest being in Japan and Iran. More information, including the full report, a breakdown of the results and a commentary on the various measures used in these rankings will be available shortly.
Executive summary and full report; Measure 1: Resources; Measure 2: Environment; Measure 3: Connectivity; Measure 4: Output; menu of measures and data tables.