21 April 2013

Asian higher education revolution a long way off

By Richard Holmes. The Times Higher Education Asian University Rankings are out. Since they are based on data already gathered for the 2012 World University Rankings, there are no surprises among the top 57, which were already included in the world’s top 400 universities. There are, however, some interesting things in the bottom 43, since the scores for those universities have not previously been made public. Unlike QS, Times Higher Education and Thomson Reuters have used the same methodology for their world and Asian rankings. This is a pity, since they have missed an opportunity to experiment with methodological changes, particularly to the citations indicator, which has been throwing up some surprising results. These rankings show some differences from others, such as the QS Asian and World University Rankings, the Academic Ranking of World Universities (ARWU) published by Shanghai Jiao Tong University, and the University Ranking by Academic Performance (URAP) published by Middle East Technical University. Read more...

Posted by pcassuto at 10:14 - Permalink


Pressing need for more sophisticated rankings – EUA

By David Haworth. This month’s annual conference of the European University Association (EUA) debated how ranking systems need to become more sophisticated benchmarking exercises as higher education worldwide becomes ever more internationalised. Speaking to University World News after the event, held at the University of Ghent in Belgium, EUA Secretary General Lesley Wilson said that while “everyone has a different view” about rankings, there was a need to deliver comprehensive benchmarking systems with which universities could compare themselves against other higher education institutions.
“In future it should be easier to see where one stands in relation to one’s peers and the regions in which they work,” she said.
Wilson agreed with a comment made at the conference by European Union Education Commissioner Androulla Vassiliou, that the predominant focus of existing research-weighted rankings did not necessarily help to improve the quality of higher education – which is about far more than just research excellence. She admitted that this was mainly because of a current lack of output indicators regarding the quality of teaching. Read more...

Posted by pcassuto at 10:06 - Permalink

Multidimensional university ranking system developed

By Marina Larionova. Russian universities, like universities from other countries, increasingly compete not only at the national level but also globally. This trend is reflected in the growing interest in global university rankings. Despite criticism, rankings are in demand and influence universities’ positioning in the international higher education market. In Russia almost 30 different approaches to ranking have been developed recently and tested in a bid to satisfy the needs of various stakeholders. All of these approaches are single-dimensional rankings that use a composite indicator and weight coefficients. They have drawn the interest of prospective students, universities and the academic community. They have also been criticised by various stakeholders. Read more...

Posted by pcassuto at 10:04 - Permalink

20 April 2013

New Ranking Rules

By Scott Jaschik. Quacquarelli Symonds, one of the major groups conducting international rankings of universities, has banned universities from recruiting people to participate in the peer review surveys conducted for the evaluations of institutions. QS accepts academic volunteers to participate in its rankings reviews. Up until now, QS has permitted universities to recruit volunteers, provided that the institutions don't suggest how they should evaluate the universities. The action by QS, as the company is known, follows the news that the president of University College Cork sent a letter to all faculty members urging them each to ask three people they know at other universities -- people who would understand the university and its need to move up in the rankings -- to participate in the QS process. Read more...

Posted by pcassuto at 17:37 - Permalink

19 April 2013

Global university rankings and their impact II

A new report entitled “Global university rankings and their impact II” was published by EUA and launched in a special session during the EUA Annual Conference, on 12 April.
PART II: Methodological changes and new developments in rankings since 2011

1. The SRC ARWU rankings

The SRC ARWU World University Ranking (SRC ARWU) is the most established of the popular university-based global rankings. There have been no changes in the core methodology of this ranking since 2010.
2. National Taiwan University Ranking: performance ranking of scientific papers for world universities

The NTU Ranking aims to be a ranking of scientific papers, i.e. it deliberately uses publication and citation indicators only; the underlying data are therefore reliable. However, as no field normalisation is used, the results are skewed towards the life sciences and the natural sciences. The original ranking strongly favours large universities. The “reference ranking” option changes the indicators to relative ones, but only shows the overall score, not the per-academic-staff scores for individual indicators.
3. Times Higher Education

THE descriptions of methodology customarily refer solely to the methodological tools used, without always providing enough information about how the scores are actually calculated from the raw data (Berlin Principle 6). Overall, there were several – albeit discreet – changes to the methodology of the THE World University Ranking in 2010 and 2011, but none since then. Most of them represent improvements, such as the slight decrease (from 34.5% to 33%) in the total weight of the reputation indicators, which thus account for one third of the overall score. The reputation indicators in the THE World University Ranking and the 2012 THE Reputation Survey are discussed in more detail in the next section.
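To illustrate how such weights shape a final score, here is a minimal sketch of a weighted-sum composite calculation. Only the one-third total reputation weight echoes the text; the four-way indicator breakdown, the function name and the example scores are invented for illustration and are not THE's actual 13-indicator scheme.

```python
# Minimal sketch of assembling a composite ranking score from weighted
# indicator scores (all on a 0-100 scale). The breakdown is hypothetical;
# only the 0.33 total reputation weight comes from the text above.

weights = {
    "teaching_reputation": 0.15,
    "research_reputation": 0.18,  # reputation indicators: 0.33 in total
    "citations": 0.30,
    "other_indicators": 0.37,
}

def composite(scores):
    """scores: indicator name -> value on a 0-100 scale."""
    return sum(weights[k] * scores[k] for k in weights)

example = {
    "teaching_reputation": 60,
    "research_reputation": 70,
    "citations": 80,
    "other_indicators": 50,
}
print(round(composite(example), 1))  # weighted sum of the four indicators
```

The point the sketch makes concrete is that a small change in a weight (e.g. reputation from 34.5% to 33%) shifts every university's overall score, which is why even "discreet" methodological changes matter.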
4. Thomson Reuters’ Global Institutional Profiles Project

The Thomson Reuters Global Institutional Profiles Project (GPP) is a proprietary Thomson Reuters initiative. Its aim is to create portraits of globally significant institutions in terms of their reputation, scholarly outputs, funding levels, academic staff characteristics and other information, in one comprehensive database (Thomson Reuters, 2012a). The GPP is not a ranking as such; however, one of the parameters used is the ranking position of institutions, taken from the THE rankings.
5. Quacquarelli Symonds rankings

Comparisons between universities on a subject basis (QS, 2012f) can be much more useful to them than global university league tables that try to encapsulate entire institutions in a single score. Furthermore, comparisons made within a single subject lessen the field bias caused by the different publishing cultures and citation practices of different fields of research. In 2012 the QS subject rankings covered 29 of the 52 subject areas defined. These rankings are strongly based on reputation surveys. The methodology used is not sufficiently transparent for users to repeat the calculations, and various mathematical adjustments are made before the final score is reached. In relation to the academic reputation survey, QS admits that a university may occasionally be nominated as excellent and ranked in a subject in which it “neither operates programmes nor research” (QS, 2011b, p.11). In an attempt to address this, QS specifies thresholds and conducts a final screening to ensure that listed institutions are, indeed, active in the subject concerned. This demonstrates that academics risk nominating universities on the basis of their previous reputation, or their reputation in other areas, rather than on their own real knowledge of the institution. While the measures taken may help to eliminate inappropriate choices, they cannot prevent academics from nominating universities which have programmes, but no capacity or strength, in a given subject.
6. CWTS Leiden Ranking

The identification of the bias in MNCS indicators, given their unusual sensitivity to publications with extremely high citation counts, and the introduction of indicator stability intervals to detect high citation scores possibly resulting from such publications (rather than from citations across a university’s entire publication output), are both positive developments. Yet they are also a warning that new indicators always introduce fresh biases, so that rankings are constantly liable to distortion. Only time will tell whether the new indicator – the proportion of top 10% publications (PPtop 10%) – which currently seems the most reliable, will remain the best in the long term or will create fresh problems. However, the inclusion of both full counting and proportional counting methods does enable users to select further options as they see fit.
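As a rough illustration of the PPtop 10% idea, the sketch below computes the share of a university's papers that reach the top-10% citation threshold of their field. The function name, the simple threshold rule and all the figures are invented for illustration; the actual CWTS indicator works with field- and year-specific reference sets and handles ties fractionally.

```python
# Toy sketch of a "proportion of top 10% publications" indicator.
# All data are hypothetical.

def pp_top10(univ_citations, field_citations):
    """univ_citations: citation counts of one university's papers.
    field_citations: citation counts of ALL papers in the same field
    and year, used to locate the top-10% threshold."""
    ranked = sorted(field_citations, reverse=True)
    cutoff_index = max(1, len(ranked) // 10)   # size of the top 10%
    threshold = ranked[cutoff_index - 1]       # lowest count still in the top 10%
    top = sum(1 for c in univ_citations if c >= threshold)
    return top / len(univ_citations)

field = [0, 1, 1, 2, 3, 5, 8, 13, 40, 90]      # 10 papers in the field
university = [2, 5, 90]                        # 3 of them are this university's
share = pp_top10(university, field)            # 1 of 3 papers reaches the cutoff
```

Unlike MNCS, one extreme outlier (the 90-citation paper here) can raise this indicator by at most one paper's worth, which is why the text calls it the more reliable option.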
7. Webometrics Ranking of World Universities

The increased coverage of Webometrics, which now includes over 20,000 higher education institutions, allows nearly all higher education institutions worldwide to compare themselves with others. Apart from the “excellence” indicator based on SCImago bibliometric data, all the indicators used by Webometrics are based on web analysis, and are considerably less direct proxies than the indicators used by academic rankings. Webometrics’ continued focus thus remains on providing a rough indication of how an institution performs compared to others.
8. U-Map

According to the report on U-Map in Estonia (Kaiser et al., 2011), the resulting U-Map profiles largely match the expectations of higher education institutions and the Ministry of Education, while the most interesting differences and diversity are observable in the “knowledge exchange” and “international orientation” profiles. However, the report concedes that, because U-Map is designed as a European transparency tool, it is not fully compatible with all national institutional needs. Both Estonia and Portugal acknowledge that it has raised awareness among institutions of their own profile.
9. U-Multirank

If U-Multirank meets its objectives, based on the experience of the feasibility study, and given that the intention is to integrate the already-tested U-Map classification tool, it will be substantially different from existing global rankings. The implementation phase was launched in January 2013 with the financial support of the European Commission, and the first rankings are expected in early 2014.
10. U21 Rankings of National Higher Education Systems

While the development of a systems-level ranking is an interesting new approach, as indicated in Part I there are many open questions. For example, the weights of the individual indicators in the overall ranking have not been provided, and the description of indicator weights is confusing, so it is very hard to determine which indicators have the greatest and least impact on the overall score. The required calculations were therefore performed, and the weight of each indicator added, in the course of preparing the present report. It has been assumed that the two “connectivity” indicators are equal in weight, as nothing is said about them either in the overall report (Williams et al., 2012) or on the U21 website.
11. SCImago Rankings

The tools offered by SCImago are useful and available free of charge. One key feature of SCImago is that it covers more than 3,000 institutions, thus allowing a large group of institutions to compare themselves with others. Users will nevertheless have to take into account that SCImago does not distinguish between universities and other research organisations. SCImago tools make it possible to compare institutions or countries: in total, by 27 subject areas and numerous narrower subject categories, and by countries or regions. The journal rankings are important in the choice of a journal for publication. SCImago also has its limitations: for example, only bibliometric data are used, and most indicators are absolute numbers, which means that SCImago favours large institutions.
12. University Ranking by Academic Performance

The greater inclusiveness of URAP compared to the most popular global university rankings is of interest. Its results should be reliable, because its content is drawn solely from international bibliometric databases. At the same time, and despite the words “academic performance” in its name, URAP uses indicators concerned exclusively with research. No indicators related to teaching are included; once more, therefore, the focus is on research-oriented institutions. Furthermore, its six ranking indicators are absolute values and therefore size-dependent. As a result, URAP is strongly biased towards large universities.
13. EUMIDA

The development of EUMIDA corresponds to the growing need of policy makers for more extensive, Europe-wide, comparable data collection. EUMIDA can therefore be seen as a positive development. In principle, the aggregation of its results into an index would constitute a ranking.
14. AHELO

EUA has been closely involved in monitoring the progress of this feasibility study, along with its partner associations in the US and Canada. The joint concerns of the three associations were raised in a letter sent to the OECD in July 2012 on behalf of the university communities in all three regions.
15. IREG ranking audit

The success of the audits will no doubt depend greatly on the qualifications of audit team members and their willingness to explore ranking methodologies in depth, as well as on their ability to access the websites of the ranking organisations and, specifically, details of the methodology applied. Experience to date, as explained in the first EUA report, has shown that there are frequent gaps in the published methodologies, most notably in the explanation of how indicator values are calculated from the raw data. As a result, those wishing to repeat the calculations to verify the published results in the ranking tables have been unable to do so. There are also cases in which the methodological content posted in more than one section of a ranking provider’s website is not consistent. While such variations are usually attributable to content relevant to ranking tables from different years, the precise years concerned are not clearly specified. Other rankings refer to the “normalisation” of data without stating what kind of “normalisation” is meant. The term could denote many different things, ranging from the field normalisation of bibliometric indicators, to the “normalisation” of indicators to make them relative rather than size-dependent, to “normalisation” involving the division of a university’s result by that of the “best” university to make the former “dimensionless”. It is to be hoped that the IREG audit will be thorough, will take these concerns into account, and will lead to substantial improvements in ranking methodologies and the quality of the information provided. More will be known about how this works in practice only when the first audit results are available.
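The three distinct senses of "normalisation" listed above can be made concrete with a small sketch; all university names and figures below are invented for illustration.

```python
# Three different operations that rankings may all call "normalisation".
# All data are hypothetical.

citations = {"Univ A": 12000, "Univ B": 3000}
staff     = {"Univ A": 2000,  "Univ B": 400}

# (1) Size normalisation: divide by academic staff numbers, turning a
#     size-dependent absolute count into a relative indicator.
per_staff = {u: citations[u] / staff[u] for u in citations}
# Univ B (7.5 per staff member) now beats Univ A (6.0),
# even though Univ A has four times as many citations in total.

# (2) "Dimensionless" normalisation: divide every value by that of the
#     "best" university, so the leader scores 1.0 and the rest a fraction.
best = max(citations.values())
dimensionless = {u: citations[u] / best for u in citations}

# (3) Field normalisation (bibliometrics): divide a paper's citation count
#     by the world average for its field and year, so that fields with
#     different citation cultures become comparable.
world_avg = {"medicine": 20.0, "mathematics": 4.0}
paper = {"field": "mathematics", "citations": 8}
field_normalised = paper["citations"] / world_avg[paper["field"]]
# A maths paper with 8 citations is twice the world average for its field,
# while a medicine paper with 8 citations would be well below average.
```

A methodology that only says its data are "normalised" leaves the reader unable to tell which of these three very different operations was applied, which is exactly the transparency gap the paragraph above describes.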
SEE PART II: Methodological changes and new developments in rankings since 2011
1. The SRC ARWU rankings
ARWU Ranking Lab and Global Research University Profiles (GRUP)
Macedonian University Rankings
Greater China Ranking
2. National Taiwan University Ranking: performance ranking of scientific papers for world universities
3. Times Higher Education
Times Higher Education World University Ranking
THE academic reputation surveys and THE World Reputation Ranking
THE 100 under 50 ranking
4. Thomson Reuters’ Global Institutional Profiles Project
5. Quacquarelli Symonds rankings
QS World University Ranking
Additional league table information
The QS classification
QS Stars
QS World University Rankings by subject
QS Best Student Cities Ranking
QS top-50-under-50 Ranking
6. CWTS Leiden Ranking
7. Webometrics Ranking of World Universities
8. U-Map
9. U-Multirank
10. U21 Rankings of National Higher Education Systems
11. SCImago Rankings
SCImago Institutional Rankings
Other SCImago rankings and visualisations
12. University Ranking by Academic Performance
13. EUMIDA
14. AHELO
15. IREG ranking audit.

Posted by pcassuto at 17:58 - Permalink


EUA Annual Conference focuses on internationalisation strategies and global rankings

Around 450 university leaders and higher education representatives gathered at Ghent University last week (11-12 April) for the 2013 EUA Annual Conference entitled “European Universities - Global Engagement”.
Discussions throughout the conference confirmed that internationalisation affects all elements of the university mission, which is why the development of strategic approaches has become a necessity for all European universities. Internationalisation will therefore continue to be an integral part of EUA’s membership activities in the years to come.
To feed into the conference discussions, EUA published the results of a survey of its member universities on HE internationalisation, which also gauged their expectations for EUA’s future international activities and for the European Union’s forthcoming strategy for the internationalisation of higher education, which is due to be presented in the coming months. This strategy will focus in particular on European higher education engagement beyond European borders, with global partners.
The first session focused on “New models of internationalisation: European policies, national priorities and institutional strategies” and European Commissioner Androulla Vassiliou was invited to present a European perspective on the topic.
She highlighted the need to be prepared to take on educational challenges that go beyond national borders (such as changes in the labour market). The Commissioner added that universities needed broader strategies that go beyond mobility and cover many other types of academic cooperation, such as joint degrees, support for capacity-building, joint research projects and distance learning programmes. The concept of "internationalisation at home" continued to be key to ensuring that the majority of students who are not in a position to study abroad can nevertheless enjoy the benefits associated with international exposure.
Institutional perspectives were then provided by Ihron Rensburg, Vice-Chancellor of the University of Johannesburg, South Africa, and Luc Soete, Rector of Maastricht University in the Netherlands. The former provided a view of higher education growth in Africa and a snapshot of what internationalisation means in particular to South Africa before concluding with a series of reflections on the necessity to engage in mutually beneficial higher education partnerships which value the perspectives and the contributions of all actors, whether they be big or small, or from the North or South. Luc Soete, meanwhile, addressed the multifaceted nature of the globalisation of higher education and research, focusing in particular on the importance of tackling global research challenges and the enormous impact of communications and technological developments.
The working groups on the second day were an opportunity for HE representatives to discuss in more depth how they were implementing institutional internationalisation strategies and positioning themselves in the global research landscape. Contributions and case studies were provided by a variety of European university leaders, including the Rector of Ghent University Paul van Cauwenberge.
In the final plenary, which focused on responses to international competition, the audience was given an overview of the Monash Warwick Alliance, launched in 2012 by the University of Warwick (UK) and Monash University (Australia). Its director, Andrew Coats, described the development of this collaboration, its aims over the coming years, and the potential risks and challenges. He was followed by Thomas Schöck, Chancellor of Friedrich-Alexander University Erlangen-Nürnberg (FAU), who presented the university’s internationalisation strategy (including the FAU campus in Busan, South Korea) in the context of more general internationalisation developments in Bavaria and in Germany.
The last session of the conference was dedicated to the launch presentation and discussion on EUA’s new report on global university rankings and their impact. The results of the second EUA report on this topic were provided by author Andrejs Rauhvargers, whose presentation was followed by a discussion with participants on a wide range of issues relating to the methodologies, impact and institutional responses to rankings. Presentations from the conference are available on the conference website.
EUA is also pleased to announce that next year’s Annual Conference will take place at the Université Libre de Bruxelles (ULB), Belgium, from 3 to 4 April 2014.

Posted by pcassuto at 17:54 - Permalink

EUA publishes second rankings review report

A new report entitled “Global university rankings and their impact II” was published by EUA and launched in a special session during the EUA Annual Conference, on 12 April.
Authored by Andrejs Rauhvargers, the report underlines that there have been significant new developments in the field of international rankings since EUA’s first rankings review report, in 2011. It reveals that the number of international university rankings and other “transparency tools” continues to grow, with the arrival of new rankings and the development of new products by ranking providers. The growing volume of information that is being gathered on universities and the new “products” on offer also strengthen both the influence of the ranking providers and the potential impact of rankings.
The report shows that rankings are also having an impact on public policy making. The developments outlined in the report indicate the need for all stakeholders to reflect on the extent to which global rankings are no longer a concern only for a small number of elite institutions, but have become a reality for a much broader spectrum of universities as they seek to be included in, or to improve their position in, one ranking or another.
Discussions that followed the presentation also underlined the continued lack of indicators for addressing teaching quality in an appropriate way, and concluded that it is difficult to conceive a totally objective ranking. Nevertheless, it was noted that some ranking providers have themselves started to draw attention to the biases and flaws in the data underpinning rankings, and thus to the dangers of misusing rankings. EUA will now take this work on rankings forward with its new pan-European project (RISP), designed to study the impact of rankings on institutional strategies in more detail and to provide recommendations on how rankings can promote institutional development, while also identifying potential pitfalls that universities should avoid. The Global University Rankings and Their Impact Report II is available here.
The EUA Rankings Review project was made possible by funding from the Robert Bosch Stiftung and the Calouste Gulbenkian Foundation.

Posted by pcassuto at 17:51 - Permalink

14 April 2013

Are university rankings too powerful?

By Alan Osborn. A new report by the European University Association (EUA) on global university rankings confirms what most higher education leaders will have known for some time: the dramatic growth in the number and scope of ranking tables in recent years has begun to shape the ways in which higher education is developing worldwide. The EUA report, Global University Rankings and their Impact – Report II, says that besides increasing the pressure on universities – and the risk of overburdening them – the rankings “are also now beginning to impact on public policy making”.
The new publication was released at the association’s annual conference at Ghent University in Belgium last Friday. Read more...

Posted by pcassuto at 10:41 - Permalink

Japan leads in new Asia top 100 university rankings

By Yojana Sharma. Japan is Asia’s top country for higher education and research, according to the ‘inaugural’ Asian top 100 university rankings unveiled last week by Times Higher Education magazine, which also produces annual global university rankings. In a league table dominated by specialised science institutions, Japan has 22 institutions in the top 100 – more than any other Asian country – with the University of Tokyo the region’s number one institution. But several countries are snapping at Japan’s heels. Read more...

Posted by pcassuto at 10:37 - Permalink

The Rankings in Institutional Strategies and Processes (RISP) project

Call for Participation in the Rankings in Institutional Strategies and Processes (RISP) project
The European University Association (EUA) has issued a call to participate in an online survey in the context of the Rankings in Institutional Strategies and Processes (RISP) project.
As part of its recently launched project entitled “Rankings in Institutional Strategies and Processes” (RISP), the European University Association (EUA) – together with its partners the Dublin Institute of Technology, the French Rectors’ Conference and the Latvian Academic Information Centre – aims to analyse the impact of rankings on institutional decision-making. This is the first pan-European study of the influence of rankings on European universities.
An increasing number of university rankings are published every year, and there is a growing consensus that rankings have become part of the higher education landscape. All higher education institutions are invited to complete the survey by 17 June 2013.
To fill in the survey, follow this link.
For more information, follow this link.

Posted by pcassuto at 01:41 - Permalink