Formation Continue du Supérieur
Asia-Pacific
30 June 2012

More education does not make you more employable

By Anna McHugh. Anna Bellamy-McIntyre's situation (HES, June 20) is similar to my own, but what response does Australia offer those humanities postgraduates who can neither find a job in universities nor find a welcome in the secondary education sector?
After migrating from Scotland in 1994, I scored 99.95 in my HSC. I took a first-class honours degree in English at the University of Sydney and an Australian Postgraduate Award funded me through a PhD there. I loved doing my degree, and at 24, I wasn't too upset when it became pretty clear that there was no job at the end of a doctorate about Chaucer.
Oxford seemed like a good idea, so I dragged myself and a long-suffering husband to England for another doctorate (in 15th century history, this time). I was thrilled to be given a junior research fellowship at Oxford, but less so when I found out that it came with the princely salary of £5000 ($7700). After 18 months of trying to live on that, I gave up and returned (without the husband) to Australia.
I enjoyed the teaching aspect of my fellowship, but I wanted to teach younger students (if you've ever taught Oxford undergraduates you'll know it's a largely over-rated experience). I decided that, if I worked and studied, my bank account could bear one last qualification. A GradDipEd by distance from UNE completed a string of letters longer than my own name.
I discovered quite quickly that if one PhD makes you undesirable, two are just plain unfortunate. State schools (at least in NSW) are a lot less impressed with high-flown academic experience; they want knowledge and use of current pedagogical theory. Very sensible, except that it's passion for the subject, and the personal experience which takes graduates to dizzy doctoral heights, that help teachers to connect with those students who most need to be inspired.
Independent schools weren't much better. Very few heads of department want a 33-year-old woman with two PhDs under them, even if she's as new to teaching as anyone 10 years younger and as much in need of their help. I found this out in my first teaching job, which was nasty, brutish, and short.
I was (or rather, my qualifications were) offered a job teaching English in an independent boys' school in the inner west. I was the principal's project, much to the dismay of the head of department (a man older than me who apparently had no formal teaching qualifications). He left me alone with enough rope to hang myself and after a horrendous term, I almost did.
As an example of a bullying culture, it left Oxford in the shade. Unfortunately, it left me in the lurch and I'm now on the bench with no job and a portfolio of library cards to flip through.
Bellamy-McIntyre is right; you can let the lack of societal follow-through on the research they've funded make you jaded and disillusioned. My heart goes out to scientists who can only remain current in their field through research performed in expensive laboratory settings.
But humanities postgraduates are in a slightly different boat. Unemployed in my field after two PhDs, I've come to a few conclusions:
You write a doctoral thesis for yourself. 'Mnemonic theory in fourteenth-century poetry' isn't going to set the world on fire, but it interests and fulfils me. It's not a ticket to an academic position but a chance to investigate truth, wisdom, and virtue -- the things that learning was once about.
That said, the abstract skills you've developed will be useful in a great many jobs. You'll work smarter, faster, and probably harder. If you become a teacher, it'll help you see what the most exciting parts of your subject are. But you should be judicious and realistic about what you can offer and expect from your colleagues. PhDs may seem ten-a-penny now, but most people don't have one (let alone two). If you cherished dreams of being grovelled to as Doctor Fantastic, you're likely to meet some resistance in a non-academic setting.
Get used to meeting society's self-centredness and contrariety. Unless you've researched the secret to eternal youth, you'll have to prove yourself to every new potential employer. You're a living paradox: overqualified but underskilled, you have valuable critical thinking skills but are expected to walk off any job that doesn't exercise them. So when someone wants you, be happy! Don't think of it as second-best to an academic job (chances are, you won't have to publish half-baked stuff just to bump up your research points).
Remember that the academic industry is a wheel. Your field will come back into vogue, even if it looks slightly different. You can keep up with it by reading your peers and contributing sensible things yourself. This is the real test of your stamina and passion as a scholar: do you do it for love of your subject, or for your own glory?
Our generation could become very miserly towards our parents' generation, which nibbles at our stipend of certainty and satisfaction even as it ages and demands our care. We could easily regret investing lonely, difficult, impecunious years in study with no return in home ownership, children, or careers.
I chased academic success for 15 years but only found myself as a scholar when schools and universities had no job for me. I study for love of my subject, in which I find a greater truth.
The flush of academic glory comes with the title and disappears with the funding. In love of your subject is wisdom and excellence, which no one can take from you. These belong to a person, not an institution, and travel with that person wherever they go.
12 May 2012

The Nomura Center for Lifelong Integrated Education

The Nomura Center for Lifelong Integrated Education has organised international conferences since 1977: the first International Forum was held in Tokyo and was followed, in 1978, by a conference at UNESCO in Paris, in which UNESCO Headquarters and UIL participated; the Forum has been organised every four years since.
The Centre celebrated the 50th anniversary of its foundation by Ms. Yoshiko Nomura, on 4 March, in Tokyo. The Centre is based on humanistic and spiritual ideas about human development and its harmony with nature.
Japan is still focusing much of its effort on the aftermath of the earthquake and the destruction of the Fukushima Nuclear Plant in 2011. In his congratulatory speech, UIL's director Arne Carlsen stressed that Japan is also concentrating on the field of adult education in 2012, since the recommendation adopted by CONFINTEA III in Tokyo in 1972 was developed into the Recommendation on the Development of Adult Education, adopted by the UNESCO General Conference (Nairobi, 1976). The Executive Board of UNESCO decided in February 2012 to adopt a Plan of Action that makes monitoring this Recommendation's actual implementation part of the CONFINTEA VI follow-up.
Lifelong Integrated Education as a Creator of the Future: Human Restoration of the 21st Century
The Collection of Records of the 10th commemorative International Forum on Lifelong Integrated Education.
To leave the beautiful planet to our children

... All of us living in the 21st century have an inescapable duty to create the future notwithstanding the collapse of human values we endure at the micro level and the destruction of the global environment we face at the macro level.
How precious it is to have a place where we can come together from different backgrounds and cultures and to be able to communicate candidly...
We invite you to join us at the commemorative 10th Forum to address together this epic theme that spans the world and all walks of life across the generations.
Nomura Center for Lifelong Integrated Education
Yoyogi 1-47-13, Shibuya-ku, Tokyo 151-0053 JAPAN
Tel: +81 (0) 3-3320-1861 / Fax: +81 (0) 3-3320-0360
Email: intl@nomuracenter.or.jp
16 April 2012

Skills reforms coast through

By John Ross. ANTICIPATED skills reforms sailed through Friday's Council of Australian Governments meeting without a hitch, despite state complaints that the federal government's new funding deal left them hundreds of millions of dollars out of pocket.
Western Australia agreed to the new reforms, which include HECS-style loans for diploma students and a guaranteed ‘entitlement’ to government-subsidised training, despite threatening to scuttle the deal over a $75m funding shortfall.
Unknown quantity Queensland, whose new government is yet to reveal its skills policies, also approved the reforms. A spokeswoman for Education and Training Minister John-Paul Langbroek said Queensland supported measures to reduce upfront cost to students as a means of increasing participation in training.
“But [Queensland] needs to be assured that the commonwealth’s proposals are affordable, will address skills shortages in Queensland and do not represent a financial risk to the state,” she added.
The reforms also include a pilot of “independent validation of training provider assessments” – essentially, third-party checks that training graduates have the skills their colleges claim to have taught them. COAG also agreed to implement strategies enabling TAFEs “to operate effectively in an environment of greater competition”, as well as improving information about the vocational training system. A communiqué issued after the meeting said the reforms would help an extra 375,000 students gain qualifications over the next five years.
This included “improving training enrolments and completions in high-level skills and among key groups of disadvantaged students, including indigenous Australians”.
“These reforms will support Australian businesses and drive improvements in productivity by growing the pool of skilled workers, encouraging existing workers to upskill and supporting higher levels of workforce participation.”
TAFE Directors Australia said the “historic COAG agreement” made public providers the key drivers of the reform agenda. But CEO Martin Riordan said the extension of income contingent loans should be “accompanied by transparent and sustained investment by government, as a co-contribution to training”.
“We will be seeking input into the negotiations between commonwealth and the states on the detail of the final national partnership agreement,” Mr Riordan said.
The Australian Industry Group said COAG had reached “an important agreement that ramps up the effort in tackling endemic skill shortages and forges overdue reforms to our national training system”.
Chief executive designate Innes Willox congratulated the commonwealth, states and territories for putting aside their differences “to achieve essential reforms that will underpin the development of Australia’s current and future skilled workforce”.
“Industry will need to be involved with all levels of government to advance and shape these reforms, ensuring the required quality improvement measures deliver the outcomes sought by both industry and individuals,” Mr Willox said. The Australian.
25 March 2012

Training tracking up

By John Ross. MORE Australians are undertaking government-funded training, and they're training later in life and for higher qualifications, according to the 2010 annual national report on the vocational education and training system.
The soon-to-be-published report shows that the number of people in training jumped by over 100,000 in 2010, boosting the training participation rate by 3 per cent in a single year. And on top of a 10 per cent rise in students over four years, which pushed overall numbers above 1.75 million, the proportion in medium or high level courses also rose 10 percentage points to 58 per cent.
Diploma-level study increased particularly sharply, with a shift by women to higher-level study raising the overall proportion of diploma students from 10 to 13 per cent. The percentage of students aged over 25 also rose, while the proportion of teenage students contracted slightly. The federal government said the report proved its skills funding was paying dividends and that its skills reforms were on track.
“We have invested almost $4 billion more in vocational education and training than the Howard Government did in its last three years, and it is paying off,” said Tertiary Education Minister Chris Evans.
“Our investment has resulted in more Australians than ever before undertaking vocational studies and, importantly, we are seeing an increase in those finishing their studies and getting the qualifications and skills they need to enter the workforce.”
However the figures don’t tell the whole story because they exclude full-fee training by private colleges.
Consequently it’s not clear how much of the increase simply reflects privately funded training being shifted onto the public purse. RMIT University policy analyst Gavin Moodie said the move to improve data collection and dissemination, to overcome this type of problem, was one of the most significant of the federal government’s skills reform proposals outlined on Monday. The government wants to provide data on course enrolments and completions for all accredited training, irrespective of whether it’s publicly or privately funded. But Dr Moodie said it wasn’t clear how or when the government proposed to achieve this.
“Comprehensive data collection has been resisted strenuously by many private companies as adding to red tape, and has substantial methodological challenges,” he said.
Dr Moodie also endorsed the federal government’s plans to develop the unique student identifier into a national student record, allowing students to keep track of their own qualifications as well as helping in analysis and fund distribution.
“While it would face several bureaucratic obstacles and technical issues it would be an important development for students,” he said.
The report shows that the proportion of fee-for-service training by TAFEs declined slightly in 2010, possibly because Victoria’s open-training market made full-fee training less attractive in that state.
8 March 2012

A new era for research education in Australia?

The latest issue of Australian Universities' Review, vol. 54, no. 1, is now available online at www.aur.org.au. Special issue: Contemporary issues in doctoral education.

A new era for research education in Australia? Helene Marsh, James Cook University, Bradley Smith, James Cook University, Max King, Monash University, Terry Evans, Deakin University, pp. 83-93.

Use of the Australian research assessment exercise, Excellence in Research for Australia (ERA), to influence the policy and practice of research education in Australia will undoubtedly have many consequences, some of them unintended and potentially deleterious. ERA is a retrospective measure of research quality; research education is prospective. There is a lack of alignment between the 2-digit and especially the 4-digit Fields of Research used for ERA and university organisational units. While numerous Fields of Research were rated as world class in multiple institutions in the capital cities of New South Wales, Victoria and Queensland, the other states and regional Australia have significant gaps. The Sciences, Technology, Engineering and Medical (STEM) fields were generally rated higher than the Humanities, Arts and Social Sciences (HASS) disciplines. Thus using ERA results to allocate higher degree by research places will have highly variable consequences in different disciplines and locations, given the obstacles to the mobility of the largely mature-aged doctoral cohort and the forecast academic skills shortage. ERA provides an incentive for Australian academics to eschew publishing in low impact journals and is likely to disadvantage some research students for whom co-authorship in a lower impact journal is more advantageous than no publication. There are many ways in which ERA results could be used to improve the quality of research education in Australia. Nonetheless, simplistically limiting doctoral education to Fields of Research where an institution scored at or better than national or world averages in ERA is unlikely to be in the national interest, because our future research and academic workforce needs to be well prepared to operate across the nation in areas of emerging research, including cross-disciplinary and applied research.
Excellence in Research for Australia (ERA) is designed to provide a comprehensive review of the quality of research undertaken in Australian higher education institutions at regular intervals. The first ERA was conducted in 2010 (Australian Research Council, 2011a), the second will be conducted in 2012 and the third is planned for 2016. ERA was a successor to the Research Quality Framework (RQF) (DEST, 2005), an initiative prompted by political scepticism about the claims that universities made about the value of and returns on national investment in research. In implementing ERA, Australia follows several other countries, including the United Kingdom (RAE, 2008), New Zealand (PBRF, 2012) and Hong Kong (French, Massy & Young, 2001), which have conducted national assessments of the quality of research based on various criteria. These overseas assessment exercises have been used to guide research funding in response to concerns about the affordability of funding all higher education institutions for research as higher education has moved from an elite to a mass system (Elton, 2000). However, the outcomes have not always been as policy makers intended. For example, in the United Kingdom, the exercise, which was aimed at concentrating research in fewer institutions and departments, confirmed that many of the newer universities were producing quality research, and many universities used their freedom of virement to fund lower-rated departments at the expense of higher-rated ones (Elton, 2000).
In ERA 2010, each of the 41 Australian Higher Education Providers was invited to provide evidence of research quality, volume, application and esteem across eight disciplinary clusters: (1) Physical, Chemical and Earth Sciences; (2) Humanities and Creative Arts; (3) Engineering and Environmental Sciences; (4) Social, Behavioural and Economic Sciences; (5) Mathematical, Information and Computing Sciences; (6) Biological Sciences and Technology; (7) Biomedical and Clinical Health Sciences; (8) Public and Allied Health Sciences. The disciplines within each cluster were defined by the 2 and 4-digit Fields of Research identified by the Australian and New Zealand Standard Research Classification (ANZSRC, 2008).
ERA 2010 was an academic rather than an end-user evaluation of Australia’s research. The evaluation was undertaken by eight Research Evaluation Committees, each of which was broadly representative of its discipline cluster group. Each committee’s assessment was based on a ‘dashboard’ of indicators of research quality, research volume and activity, research applications and recognition (Australian Research Council, 2011a). Each Field of Research was evaluated on a five-point scale ranging from ‘1’ (well below world standard) to ‘5’ (well above world standard), with a rating of ‘3’ representing world standard. If an institution did not meet the low volume threshold for critical mass for a Field of Research, it was rated as ‘not assessed’ for that field. The indicators were largely metric-based, with an emphasis on citation analysis in the vast majority of Sciences, Technology, Engineering and Medical (STEM) disciplines and on peer review by international experts in the remaining discipline clusters. Thus the range of disciplines was split into peer-review disciplines and citation disciplines. The evaluation processes were not transparent and attempts to determine the relative importance of the input factors through retrospective analysis have largely failed. Some bodies, including the Australian Academy for the Technological Sciences and Engineering (ATSE, 2009), expressed concern that applied and cross-disciplinary research would be undervalued, a concern supported by analyses of British Research Assessment Exercises (e.g. Elton, 2000).
The ways in which ERA will be incorporated into the drivers that determine the Research Training Scheme, the block grant provided to Australian universities to fund research training, have yet to be determined. In ‘Research skills for an innovative future’ (DIISR, 2011a), the Australian government stated that the Excellence in Research for Australia (ERA) initiative will support the ‘identification and recognition of research strengths within universities’ as a vital component of research education (page 23). Despite intuitive appeal, this approach may have the unintended consequence of reducing research education in areas of national or regional importance, especially areas of applied, cross-disciplinary or emerging research. The purpose of our paper is to explore possible consequences of ERA for research education in Australia and to suggest ways in which ERA results could be used to enhance research education in Australia while minimising deleterious, unintended consequences ‘before they become apparent, let alone researchable’ (Elton, 2000).
Methods

Our analysis is largely based on the National Report of ERA 2010 (Australian Research Council, 2011a). ERA 2010 scores were based on 25 2-digit and 157 4-digit Fields of Research as defined by the ANZSRC classification (ANZSRC 2008), a pragmatic taxonomy of research across all research and development sectors in Australia and New Zealand including industry, Government agencies, private not for profit organisations and universities. This classification was not designed as a taxonomy of university research per se and includes Fields of Research that are largely undertaken outside the sector e.g., automotive engineering and medical biotechnology. Thus it is questionable whether an analysis such as ours should include all these fields. Twenty-two of the 4-digit codes are ‘XX99’ or ‘other’ codes e.g., 699 Other Biological Sciences and 1499 Other Economics. There were only 28 Units of Evaluation (a 2-digit or 4-digit Field of Research for one institution) across the 22 ‘other’ Fields of Research compared with 1708 Units of Evaluation for the substantive Fields of Research (Commonwealth of Australia 2011a). The purpose of the ‘other’ codes is to pick up research not adequately captured by the main 4-digit Fields of Research. Therefore, including these 22 Fields of Research in an analysis of ERA distorts consideration of breadth, as a ‘not assessed’ within these codes simply indicates there is adequate alignment of research codes and actual activity, whereas a ‘not assessed’ for a substantive code indicates that either there is no research activity at that Higher Education Provider, or if there is, it has not produced the requisite outputs to meet the threshold for assessment.
There is also an argument that 1802 Maori Law should not be included in Australian assessments, as the inclusion of this code is a function of ANZSRC being a joint classification for Australia and New Zealand. No Higher Education Provider met the threshold for assessment for Maori Law in ERA 2010.
In addition, nine 4-digit Fields of Research did not record any assessment. Whether that result indicates real gaps in the fabric of Australian Higher Education Research is beyond the scope of this paper. Thus ERA 2010 was not, in practice, an analysis of 157 4-digit Fields of Research but of 125 – 134 Fields of Research depending on whether the fields for which no returns were received are included. We used 134 Fields of Research in our analysis below by omitting the 22 ‘other’ Fields of Research and Maori Law.
Results and Discussion
Challenges of ERA for research education
Temporal scale mismatch
ERA is a retrospective measure of research quality, volume, application and esteem aggregated into an overall performance rating. Based on data from eligible staff at each institution employed at the census date of 31 March 2010, ERA 2010 applied to research outputs from 1 January 2003 to 31 December 2008; research income, commercialisation and esteem measures between 1 January 2006 and 31 December 2008; and citation measures from 1 January 2003 to 1 March 2010. Thus some of the research assessed must have predated the publications reference period by several years. The reference periods for ERA 2012 will be updated (for example, publications will be limited to the period 1 January 2005 to 31 December 2010); however, the exercise is inevitably retrospective.
Most universities are investing in emerging areas of research to meet perceived future needs in the context of their institutional mission. Current doctoral candidates are the researchers of the future and their research should be aligned with research needs of the future rather than the research strengths of the past. Doctoral candidates should be well represented in an institution’s areas of emerging research including applied and cross-disciplinary research. Experience in the United Kingdom suggests that these areas may not rate well (or at all) in ERA (Elton, 2000).
Organisational scale mismatch

There is a lack of alignment between the 2-digit and especially the 4-digit Fields of Research used for ERA and university organisational units. Most Australian universities are now organised in large multi-disciplinary schools that conduct research in many Fields of Research (e.g., Environmental Science staff at Griffith University contributed to 82 Fields of Research in ERA 2010; Tony Shiel, pers comm 2011). Similarly, at James Cook University, all of the assessed Fields of Research relied on inputs from at least two and typically five to eight of that institution’s 25 academic organisational units (Chris Cocklin, pers comm 2011). In ERA 2010, this organisational scale mismatch was exacerbated by the inevitable attempt by every university to optimise its ERA returns. As a result, many staff, particularly those undertaking cross-disciplinary research, contributed to their university’s return in several different Fields of Research, which may have received very different ERA evaluations. Alternatively, some institutions score well in Fields of Research not represented by their organisational units. For example, the Australian National University was rated as world class in Education at the 2-digit level without having a unit in this discipline (Margaret Kiley, pers comm 2011).
Although ERA 2012 will incorporate changes designed to improve the capacity to accommodate cross-disciplinary research (Australian Research Council, 2011b), the changes are unlikely to improve this mismatch of organisational scale. The revised methodology will allow each institution to code journal articles with significant content (66 per cent or greater) not represented by a journal’s Fields of Research to another appropriate Field of Research code of its choice (Australian Research Council, 2011b). However, institutions will still code publications to maximise their ERA scores rather than to align with organisational units. Thus using ERA results as a blunt instrument to define the fields, in which a university may offer doctorates or award Australian Postgraduate Awards for example, will almost certainly increase the perverse incentive to ‘optimise’ the coding of the Fields of Research in which research higher degree candidates are working, reducing the robustness of the data on this important topic.
Perverse incentives

ERA 2010 produced at least one perverse incentive that anecdotal evidence indicates has had an impact on research training already. Because ERA 2010 was focused on all publications (or research outputs), it was perceived as emphasising publishing in highly ranked (A* and A) journals in the case of the peer-review disciplines or journals with high impact factors in the case of citation disciplines. The Research Evaluation Committees were presented with percentages of A*, A, B and C publications in their dashboards, along with other research indicators. Consequently until recently, some Australian academics were strongly encouraged to publish only in A* and A journals by senior university staff concerned that any publications in lower ranked journals inevitably reduced the percentages of publications in A* and A journals for the relevant ERA Unit of Evaluation. Thus some academics, particularly in the peer-review disciplines, perceived a strong disincentive to publish with a research higher degree candidate in a B or C journal. For the citation disciplines, there was a similar disincentive to publish in low impact journals.
ERA 2012 will not use the controversial system of ranking journals used in ERA 2010 (Australian Research Council, 2011b). Rather the Australian Research Council will use a refined journal quality indicator and evaluation committees will use their expert judgement to assess the appropriateness of the journals for the disciplinary unit concerned. This new approach is less transparent than its predecessor and is unlikely to change the unwillingness of some supervisors to publish with their research students if it means publishing in low impact journals or their equivalent.
Showing a research higher degree candidate how to publish is very much part of good practice in research training. Consequently, some doctoral programmes require all research students to publish a paper (or in some cases two papers) in order to satisfy the requirements for the degree. Research does not always work out as planned – there is an element of risk. When research does not work out or yields negative results, it is typically not possible to publish the results in high impact journals. This reflects the interest of the results to the readers of the journal, rather than the quality of the research. Journals are ranked on the basis of impact factor and it is inevitable that this information will be used in ERA 2012. Because ERA is currently an assessment of all publications, any publication in a journal with a relatively low impact factor (including most journals in emerging fields and many journals that publish cross-disciplinary research) will still have the potential to dilute the quality of publications in the eyes of a Research Evaluation Committee. Thus many supervisors may be reluctant to publish in such journals with their research students, a practice that is likely to disadvantage the student. In addition, established journals can be quite conservative and reluctant to publish new work in emerging, cross-disciplinary or applied areas.
Systemic variables affecting the use of ERA in research education
There are three broad variables associated with ERA outcomes that will have consequences if ERA is used to allocate higher degree by research places or government-funded stipend scholarships: institutional grouping, geography and discipline. We consider each of these variables below.
Institutional Grouping

The performance of Australia’s 41 Higher Education Providers was predictably uneven in ERA 2010 (Table 1), although all but two universities were rated at world class or better in at least one Field of Research, indicating that, as in the United Kingdom (Elton, 2000; RAE, 2008), some of the newer universities are producing some ‘outstanding’ research (at least one university outside the Group of Eight achieved a maximum score in eight of the 18 2-digit Fields of Research).
As expected, ERA confirmed the research standing of the Group of Eight universities which were collectively assessed in 692 Units of Evaluation of which 91.3 per cent were rated at world standard or better. The seven Innovative Research Universities collectively had 62.5 per cent of 296 Units of Evaluation rated at world class or better, a result similar to that of the five Australian Technology Network universities (59.8 per cent of 225 Units of Evaluation rated at world class or better). The performance of the 21 non-aligned institutions (42.9 per cent of 496 Units of Evaluation rated at world class or better), was more diverse, ranging from Macquarie with 75.6 per cent of 45 Units of Evaluation rated as world class, to Batchelor Institute of Indigenous Tertiary Education, University of Notre Dame and the University of the Sunshine Coast with none. The lowest performing 15 universities were assessed for 234 Units of Evaluation although only 20.9 per cent of these were at world standard or better with the discipline of Nursing being the strongest performer with four of six universities being rated at or above world class in this Field of Research.
Geography
Geography matters. While New South Wales, Victoria and Queensland have numerous Units of Evaluation rated as world class in their capital cities, the other States (Table 2) and regional Australia have significant gaps. South Australia does not have any institutions rated world class in two 2-digit Fields of Research: (1) Education and (2) Commerce, Management, Tourism and Services. In the three 4-digit Fields of Research in the discipline of Education, only two of the eight South Australian Units of Evaluation were rated as world class and only three of the nine Units of Evaluation across the seven 4-digit codes in the Commerce cluster were considered world class. There are no world class providers in Western Australia in Law. There was only one institution (Murdoch) rated at world class in Studies in Human Society at the 2-digit level and only three of 15 Units of Evaluation were rated as world class across the eight 4-digit Fields of Research in the Commerce discipline-cluster. We analysed the performance of 14 ‘regional’ higher education providers: Ballarat, Batchelor, Central Queensland, Charles Darwin, Charles Sturt, Deakin, James Cook, Newcastle, New England, Southern Cross, Southern Queensland, Sunshine Coast, Tasmania and Wollongong. This grouping is a heterogeneous mix as it includes four institutions with no or only one world-class rating and three members of the Innovative Research University grouping (Charles Darwin, James Cook and Newcastle), while Tasmania and Wollongong are well-established non-aligned research universities. There were 15 Fields of Research where the ‘regional’ universities scored relatively well, including Analytical Chemistry and Environmental Science and Management. Of the 33 world class Units of Evaluation across these 15 Fields, all but five were located at the older institutions: Deakin, James Cook, Newcastle, Tasmania or Wollongong. The five 4-digit Fields of Research with the highest number of Units of Evaluation in regional institutions are listed in Table 3; only 10 of 61 (16 per cent) Units of Evaluation were rated as world class or above. The result for Business and Management was particularly concerning; this Field of Research was not rated as world class at any of the 13 regional institutions that claimed critical mass.
Discipline Matters

One feature of ERA 2010 was the generally higher rating of the Sciences, Technology, Engineering and Medical (STEM) fields compared with the Humanities, Arts, and Social Sciences (HASS). The extent to which this result is an artefact of ERA methodology or reflects levels of maturity and/or investment in those fields is beyond our consideration. Thus using ERA results to allocate higher degree by research places will have highly variable consequences in different disciplines (Table 4).
All Units of Evaluation were rated as world class or better for 40 (32 per cent) of 4-digit Fields of Research; 66 Fields of Research (49 per cent) had >80 per cent of Units of Evaluation rated at world class or higher (Commonwealth of Australia 2011a). For example, both Chemical Sciences (100 per cent world class or better at the 2-digit level) and Earth Sciences (100 per cent world class or better at the 4-digit level) would be largely unaffected by limiting higher degree by research students to institutions rated as world class in these disciplines. The alternative approach of limiting higher degree by research places to institutions performing at or above the national average in these disciplines would deprive world class groups of research students, a policy that could not be in the national interest.
However, less than half the Units of Evaluation were rated as world class for 18 Fields of Research, including some fields that were offered by numerous institutions: 13 of these 18 low-rated Fields of Research were offered by between 27 and 39 institutions, one was offered by 22 institutions and four were offered by between five and 13 institutions (Australian Research Council, 2011a). The 4-digit Fields of Research with the lowest percentage of world class ratings were Policy and Administration (18.5 per cent - 27 Units of Evaluation), Marketing (27.6 per cent - 29 Units of Evaluation), Education Systems (31.3 per cent - 32 Units of Evaluation), Applied Economics (33.3 per cent - 33 Units of Evaluation), and Business and Management (33.3 per cent - 39 Units of Evaluation). Thus any mechanistic application of ERA to research education is likely to significantly affect Economics, Commerce, Management, Tourism and Services and Studies in Human Society. Limiting access to Australian Postgraduate Awards to institutions scoring a world class ERA rating would clearly be problematic, especially as 61.9 per cent of doctoral candidates in 2009 were older than 30 (Table 5) and often have family arrangements that limit mobility. Although institutions could award university scholarships to doctoral candidates in the disciplines in which they did not score well in ERA, this practice would reduce the attractiveness of Australia to international research students because of the consequential reduction in the number of scholarships available to them. This approach would be counter-productive public policy because of the well documented impending shortage of academics in Australia (Edwards, 2010; Edwards, Bexley & Richardson, 2011; Edwards, Radloff & Coates, 2009; Edwards & Smith, 2010; Hugo, 2008; Hugo & Morriss, 2010), the planned expansion of the sector (DEEWR, 2009; DIISR, 2009) and the increased international competition for the best and the brightest doctoral students.
This problem is exemplified by the discipline of Education, in which 3415 doctoral candidates were enrolled in 2009; 7.7 per cent of all Australian doctoral candidates (Table 6). Nearly 60 per cent of research students in Education surveyed in 2010 (Edwards, Bexley & Richardson, 2011) were aged above 40, suggesting limited mobility. Only 15 of 39 institutions scored at or above the world average for the 2-digit Education Field of Research; no unit of evaluation received a maximum score. Thirty to 50 per cent of the Units of Evaluation for each of the four 4-digit codes were also assessed at less than world average (Table 7). Our comparison of the ERA 2010 data at the 2-digit level with official higher education statistics purchased from the Australian government indicates that about one third of the total research students in Education were enrolled at institutions that were not rated as world class in ERA 2010, including 80 per cent of the domestic research students studying at regional institutions.
Thus limiting research education in Education to institutions rated as world class at the 2-digit level will not only require the world class institutions to service a significant additional supervisory load (>1000 extra doctoral students) but would risk seriously downgrading Education research outside the mainland capital cities, particularly in Tasmania and regional Queensland. Given the importance of Australian educational practice being evidence-based and the impending shortage of academics in this field (64.9 per cent of staff are aged above 50; Edwards, Bexley & Richardson 2011), we consider that it is important to introduce mechanisms to promote high quality doctoral training in Education across the nation rather than to limit it based on past performance, a conclusion that we consider applies to many other disciplines as well.
In ERA 2010, world-class critical mass was limited to five or fewer institutions in 39 4-digit Fields of Research (Australian Research Council 2011a). Nine 4-digit Fields of Research including Classical Physics had only one institution with a world class ERA rating. Only seven institutions were rated as world class in Atomic, Molecular, Nuclear, Particle and Plasma Physics, a Field that is likely to be very important to Australia’s clean energy future and in which doctoral study should presumably be encouraged.
To ensure that there was a ‘meaningful amount of data’ to be evaluated, ERA 2010 had a low volume threshold for each Unit of Evaluation (Australian Research Council, 2009). This threshold meant that an unknown number of ‘isolated scholars’ were not assessed, particularly in the Humanities, where single scholars are the norm, and in small institutions. There is anecdotal evidence that at least some of these scholars are very successful doctoral supervisors. Critical mass is very important in doctoral education to protect the interests of research higher degree candidates, especially if the principal supervisor becomes unavailable. However, cross-institutional supervision using virtual technologies and visits is an increasingly recognised practice, recently endorsed by changes to the Research Training Scheme to allow the recognition of joint completions (DIISR, 2011b). We question the wisdom of excluding high performing scholars who were not rated in ERA from research supervision and suggest that they should be encouraged to engage in cross-institutional supervision, as discussed further below.
Possible solutions
Changes to ERA to reduce the perverse student publication incentive

A simple solution to overcome the negative impact of ERA on research student publications would be to require institutions to submit all publications (or research outputs) as at present, but to present the data on only the top 80 per cent of publications for each Unit of Evaluation to the Research Evaluation Committees. Such a change would enable supervisors to publish a less interesting paper with a research student in a low impact journal without a negative consequence when the relevant Unit of Evaluation is assessed for ERA. This reform could be introduced for ERA 2012.
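To make the mechanics of this proposal concrete, the following minimal sketch (in Python) filters a hypothetical Unit of Evaluation down to its top 80 per cent of outputs. The publication records and the numeric "quality" score are invented assumptions standing in for whatever indicator a Research Evaluation Committee might use; the sketch is illustrative only and is not part of the ERA methodology.

```python
# Illustrative sketch only. The publication records and the "quality" score are
# hypothetical assumptions; ERA defines no such score here. The point is the
# proposed rule: all outputs are submitted, but only the top 80 per cent of each
# Unit of Evaluation would be presented to the Research Evaluation Committee.

def top_80_per_cent(publications):
    """Return the best-scoring 80 per cent of a Unit of Evaluation's outputs."""
    ranked = sorted(publications, key=lambda p: p["quality"], reverse=True)
    cutoff = max(1, round(len(ranked) * 0.8))  # always present at least one output
    return ranked[:cutoff]

if __name__ == "__main__":
    unit_of_evaluation = [
        {"title": "Paper A", "quality": 9.1},
        {"title": "Paper B", "quality": 7.4},
        {"title": "Paper C", "quality": 6.8},
        {"title": "Paper D", "quality": 3.2},
        {"title": "Student co-authored paper", "quality": 2.9},  # low-impact outlet
    ]
    # With five outputs, only the top four are presented, so the low-impact
    # student co-authored paper is simply not counted against the unit.
    for pub in top_80_per_cent(unit_of_evaluation):
        print(pub["title"])
```

Under such a rule, the lowest-scoring 20 per cent of outputs, including a paper co-authored with a research student in a low impact journal, would not weigh on the unit's assessed profile.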
Using ERA to improve institutional practice in research education

The research environment is a necessary but not sufficient component of quality research education as acknowledged by the basket of indicators of doctoral training quality being developed by the Australian Council of Deans and Directors of Graduate Studies (Table 8). We consider that the planned revision of the Research Training Scheme, the establishment of the Tertiary Education Quality Standards Agency (TEQSA), and the Compacts Process, together offer an opportunity for the Australian Government to require universities to explicitly take the quality of the research environment into account in developing their policy and practices for research education and to audit their response. However, any policy change that uses the data from ERA should be designed to explicitly address the challenges outlined above.
Mission-based Compacts are three-year agreements that show how each university’s mission contributes to the Australian Government’s goals for higher education, and include details of major higher education and research funding and performance targets (DEEWR & DIISR, 2009). Requiring universities to stipulate how they plan to take their ERA results into account when awarding Australian Postgraduate Awards in their Compact Agreement and to audit this through the Tertiary Education Quality Standards Agency would enable Higher Education Providers to respond in a more nuanced and positive way than if they were banned from awarding Australian Postgraduate Awards to doctoral candidates in Fields of Research that had been retrospectively evaluated by ERA as below world standard. Universities should also be able to identify emerging Fields of Research that currently are ‘not assessed’ or assessed below world standard, provide strategic reasons why they wish to accept research higher degree candidates or allocate Australian Postgraduate Awards to those Fields of Research, indicate how the research students will be provided with an appropriate research environment and negotiate how their performance should be evaluated. Several recent initiatives could be used in conjunction with ERA to improve doctoral education in fields of research in which there is a national or regional lack of critical mass. ERA offers a mechanism to identify such fields. Groups of universities can now share completions under the Research Training Scheme (DIISR, 2011b). Although this initiative has removed a significant barrier to cross-institutional co-operation in research education in Australia, it is likely to provide a niche rather than an institutional solution to the problems identified here. For example, over the last seven years, the Australian National University has developed several Memoranda of Understanding regarding joint PhDs with other Australian universities. To date, there has only been one cross-institutional PhD enrolment (Mandy Thomas, pers comm 2011), although recent Collaborative Research Network agreements should improve this situation. Experience with developing joint degrees between Australian and overseas institutions indicates that the uptake and success of these arrangements are dependent on established individual collaborations rather than institutional Memoranda of Understanding. Institutional improvements to the quality of research education in a discipline could be achieved using structures that are less formal than joint degrees:
• Joint arrangements for embedded students with other providers, e.g. CSIRO;
• Sharing of physical and virtual resources;
• Incentives to encourage cross-institutional supervision and mentoring;
• International collaboration with established research centres;
• On-going collaboration (joint grants, papers, students, Collaborative Research Network agreements);
• External input to milestones, e.g. Confirmation of Candidature proposals;
• External input into courses/skills development for research students.
Nonetheless, such initiatives are likely to be expensive and need to be factored into the revision of the funding for research training. In particular, research higher degree candidates may need assistance to travel between geographically separate institutions when distances are large, an inevitable feature of arrangements involving institutions in different states, especially the isolated regional institutions.
Doctoral Training Centres are an increasingly recognised approach to improving the quality of doctoral education by training cohorts of students while emphasising transferable skills. In the United Kingdom, the Engineering and Physical Sciences Research Council and the Economic and Social Research Council have committed to fund more than 70 such centres, many in cross-disciplinary and applied areas. The five Australian Technology Network (ATN) universities (only three of which were rated as world class in 2-digit Mathematics) have recently established a national Industry Doctoral Training Centre in Mathematical Sciences (ATN, 2011) and its initial cohort of 20-25 PhD students will commence in early 2012, in nodes across the five ATN universities. Cross-institutional supervision can also be achieved using less formal structures but research higher degree candidates will need travel assistance as explained above.
ERA as a block fund moderator
The Australian government has indicated that the results of ERA will inform the allocation of funding to support the costs of research through the Sustainable Research Excellence Programme and research education through a modified Research Training Scheme. The Research Training Scheme is the most valuable of the research block funding schemes, representing 41 per cent of the total allocation in 2011. For the Research Training Scheme, Australian Postgraduate Awards, and International Postgraduate Research Scheme, the calculation methodology (DIISR, 2011c) is relative institutional performance in research income (40 per cent), publications (10 per cent) and research student completions (50 per cent), and it is expected that ERA results will be used to moderate these drivers. However, at present there is no agreed method of assessing overall institutional performance in ERA and some of the measures used are simplistic, including measures such as the percentage of Fields of Research at world standard or better used here. Of particular concern, especially for the large research intensive universities, is the failure of the present ERA rating scheme to include any measure of critical mass provided that the institution meets the low volume threshold. An institution that achieves an ERA rating of ‘5’ based on 50 publications in a Field of Research will provide a very different environment for research higher degree candidates to an institution that achieves the same rating based on 1000 publications. Nonetheless, bigger is not necessarily better, which is why a basket of indicators of research education quality is needed (Table 8).
However, the volume of output that has gone into achieving an ERA rating has to be taken into account in the funding formula. It will be challenging to develop an agreed measure of overall institutional performance in ERA and use it to have a positive impact on research training while taking the following additional factors into account: (1) most universities in Australia produce some excellent research outputs, as ERA 2010 demonstrated, (2) as in the United Kingdom (Elton, 2000), universities are likely to use their freedom of virement to fund lower-rated departments at the expense of higher-rated ones, (3) the challenges of Australia’s dispersed geography, (4) the impending shortage of academic staff identified by various scholars (Edwards, 2010; Edwards, Bexley & Richardson, 2011; Edwards, Radloff & Coates, 2009; Edwards & Smith, 2010; Hugo, 2008; Hugo & Morriss, 2010), and (5) the need for quality academic staff to service the planned expansion of the sector (DEEWR, 2009; DIISR, 2009).
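As a rough illustration of the weighted drivers quoted above (research income 40 per cent, publications 10 per cent, completions 50 per cent), the sketch below computes hypothetical shares of a block grant pool in Python. The institution figures and the simple pro-rata sharing rule are assumptions made for illustration only; the sketch deliberately leaves out any ERA-based moderation of the drivers, since no agreed method for that yet exists.

```python
# Minimal sketch of the block grant drivers quoted above: research income 40%,
# publications 10%, research student completions 50%. The institution data and
# the pro-rata sharing rule are hypothetical assumptions, not the DIISR formula;
# any ERA-based moderation of the drivers is deliberately left out.

WEIGHTS = {"income": 0.40, "publications": 0.10, "completions": 0.50}

def funding_shares(institutions):
    """Return each institution's share of the pool under the weighted drivers."""
    totals = {k: sum(inst[k] for inst in institutions.values()) for k in WEIGHTS}
    return {
        name: sum(w * inst[k] / totals[k] for k, w in WEIGHTS.items())
        for name, inst in institutions.items()
    }

if __name__ == "__main__":
    # Hypothetical relative performance data for three institutions.
    data = {
        "Uni A": {"income": 120.0, "publications": 800, "completions": 300},
        "Uni B": {"income": 60.0, "publications": 500, "completions": 220},
        "Uni C": {"income": 20.0, "publications": 200, "completions": 80},
    }
    for name, share in funding_shares(data).items():
        print(f"{name}: {share:.1%} of the research training pool")
```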
Conclusions

ERA 2010 was a comprehensive academic evaluation of the research conducted by Australia’s higher education providers in the first decade of the 21st century and subsequent rounds promise similar insights. Nonetheless, use of ERA to influence the policy and practice of research education in Australia will undoubtedly have many unintended consequences, some potentially deleterious. It is important to anticipate deleterious consequences before they become apparent. Our analysis of the results of ERA 2010 demonstrates a lack of alignment between the Fields of Research and university organisational units and that using ERA results to allocate higher degree by research places will have variable consequences in different locations as a result of Australia’s geography and in different disciplines. In addition, ERA provides an incentive for Australian academics to eschew publishing in low impact journals, a practice which is likely to disadvantage some research students for whom co-authorship in a lower ranked journal is more advantageous than not publishing.
Given these challenges, simplistically limiting doctoral education to Fields of Research where an institution scored at or better than national or world averages in ERA is unlikely to be in the national interest, especially given that ERA is retrospective and will not reflect the current situation. Doctoral students should be well represented in areas of emerging research, including applied and cross-disciplinary research.
There are many ways in which ERA results could be used to improve the quality of research education in Australia. We suggest that requiring Higher Education Providers to describe how they plan to deliver quality research education in all disciplines relevant to their mission in their Compact Agreement with the Commonwealth would be a positive reform. Institutions could also be required to report on their research education inputs and outcomes against an agreed basket of quality training indicators for each of these disciplines to the Tertiary Education Quality Standards Agency.
Helene Marsh is Distinguished Professor of Environmental Science and Dean, Graduate Research Studies, James Cook University, Queensland. Bradley Smith is the Manager of Research Strategy, Division of Research and Innovation, James Cook University, Queensland. Maxwell King is Pro Vice-Chancellor (Research and Research Training) and a Sir John Monash Distinguished Professor at Monash University, Victoria. Terry Evans is a Professor of Education at Deakin University, Victoria.
31 May 2011

End of an ERA: journal rankings dropped

By Jill Rowbotham. JOURNALS will no longer be assigned rankings in a radical shake-up of the Excellence in Research for Australia initiative, announced by Innovation, Industry, Science and Research Minister Kim Carr today.
The ranking of journals as A*, A, B and C was the most contentious aspect of the ERA exercise devised and administered by the Australian Research Council, with the first results published in January.
"I wished to explore ways in which we could improve ERA so the aspects of the exercise causing sector disquiet, especially issues around the ranked journals list, could be minimised or even overcome,'' Senator Carr said in a ministerial statement.
He chastised the research community, saying: "There is clear and consistent evidence that the rankings were being deployed inappropriately within some quarters of the sector, in ways that could produce harmful outcomes, and based on a poor understanding of the actual role of the rankings.
"One common example was the setting of targets for publication in A and A* journals by institutional research managers.
"In light of these two factors - that ERA could work perfectly well without the rankings, and that their existence was focussing ill-informed  undesirable behaviour in the management of research -  I have made the decision to remove the rankings, based on the ARC's expert advice.''
Senator Carr said lists of journals would still be important, and each journal would be provided with a publication profile, that is, an indication of how often it was chosen as the forum of publication by academics in a given field.
"These reforms will strengthen the role of the ERA Research Evaluation Committee members in using their own, discipline-specific expertise to make judgments about the journal publication patterns for each unit of evaluation.''
ARC chief executive Margaret Sheil said the change empowered "committee members to use their expert judgement to take account of nuances in publishing behaviour''.
"This approach will allow experts to make judgements about the quality of journals in the context of each discipline,'' Professor Sheil said.
Other changes announced include: increasing the capacity to accommodate multi-disciplinary research and investigating strategies to strengthen the peer review process, including improved methods of sampling and review assignment.
18 October 2010

Franco-Chinese Higher Education Forum

Intended to foster academic dialogue between France and China, the first Franco-Chinese higher education forum will be held on 22 October 2010 in Shanghai (Tongji University), as officially announced by the President of the Republic during his state visit to China last May. The forum is organised by CampusFrance and the French Embassy in China, in partnership with the supervising ministries, the CPU, the CDEFI, the CGE and the China Education Association for International Exchange (CEAIE). See the programme.
Franco-Chinese Higher Education Forum. Tongji University, Shanghai, 22 October 2010. Provisional programme:
Addresses by the Ministers: Mr YUAN Guiren, Chinese Minister of Education, and Ms Valérie Pécresse, French Minister of Higher Education and Research.
Introduction by Mr Gérard Binder, President of the CampusFrance Agency
M. ZHOU Qifeng, président de l’Université de Pékin : les défis auxquels sont confrontés les établissements d’enseignement supérieur chinois dans le processus de la mondialisation
Jean-Pierre Gesson, Président de l’Université de Poitiers et Président de la Commission des Relations Internationales et Européennes de la CPU : les conséquences des réformes sur l’internationalisation des universités françaises
La mobilité étudiante entre la Chine et la France
Modérateur: DONG Qi, vice-président de l’Université Tongji, Rapporteur : Sophie Béjean, Présidente de l’Université de Bourgogne. Thèmes : reconnaissance des diplômes ; transferts de crédits ; mobilité des étudiants français vers la
Chine et des étudiants chinois vers la France
Modèles de coopération franco-chinoise en Chine. Modérateur: TANG Xiaoqing, vice-présidente de Beihang, Rapporteur: Michel Mudry, ancien Président de l’Université d’Orléans et Délégué général de la CDEFI. Thème : exemples de coopérations engagées.
Les formations professionnelles
Modérateur: Pierre Tapie, Directeur général de l’ESSEC et Président de la CGE, Rapporteur : JIANG Guoping, directeur de l’Institut des Technologies industrielles de Nankin. Thèmes : l’insertion professionnelle: de la Licence au Doctorat.
Le lien formation / valorisation de la recherche/ innovation

Modérateur : Gilbert Casamatta, Président de l’INP Toulouse, Rapporteur: OU Jinping, président de l’Université des Technologies de Dalian. Thèmes : la recherche, les transferts de technologies.

15 December 2009

AERES received the China Education Association for International Exchange

On 1 December 2009, AERES received a Chinese delegation led by the China Education Association for International Exchange. Also represented were the Shanghai Education Evaluation Institute and the Division of International Cooperation and Exchange of the Zhejiang Provincial Education Department.

Paying particular attention to European systems for evaluating higher education and research, the Chinese representatives wished to situate the French system in relation to the Anglo-Saxon model. Following a presentation of how French practices in this area have evolved, the meeting provided an opportunity to discuss the agency's missions and the principle of integrated evaluation that sets it apart. The AERES approach particularly interested the Chinese delegation because of the respect for institutional diversity it requires and the responsibility it places on institutions where an evaluation culture is still not well established. More...
13 November 2009

Franco-Vietnamese partnership: a new university of science and technology in Vietnam

Ministère de l'Enseignement Supérieur et de la Recherche. A partnership agreement for the creation of a new university of science and technology at Hoa Lac (U.S.T.H.) was signed by Valérie Pécresse and Thien Nhan Nguyen, Vietnam's Minister of Education and Training. Six multidisciplinary scientific themes have been selected: biotechnology and pharmacology; aeronautics and space; energy; information and communication science and technology (STIC); materials and nanotechnologies; environment/water/oceanography.
The minister recalled that more than thirty French higher education institutions have formed a consortium to consider and spell out what France could contribute to the creation of this university, particularly in terms of governance and the orientation of its teaching. She again stressed the "win-win" nature of this scientific project for French research teams which, already firmly established in Vietnam, will benefit from the dynamism of this emerging country while also spreading French influence there.
Franco-Vietnamese university cooperation: 5,031 Vietnamese students were enrolled in French universities in 2008-2009, 45% at bachelor's level, 41% at master's level and 14% at doctoral level. Their distribution by discipline is dominated by management, law and economics (47% of enrolments), followed closely by science and technology (40%); arts, languages and humanities account for only 13%. These numbers have tripled since 2001-2002, with an 11% increase over 2007-2008. Among foreign students hosted in French universities, Vietnam ranks second among Asian countries, after China. France is the fourth destination country for Vietnamese students, after Australia, the United States and Great Britain. There are 165 Franco-Vietnamese inter-university agreements (out of 890 for the Asia region). More...
25 September 2009

UTSEUS: the Sino-European University of Technology at Shanghai University

In China, the hardest thing for a higher education institution is not signing a partnership but keeping it alive. A look back at the experience of UTSEUS, the institution launched jointly by the three French universities of technology and Shanghai University.
After four years of discreet existence, this engineering school, which enrols 750 Chinese students, now wants to open up to research and to international partners.
Every two months, the presidents of the French universities of technology travel to Shanghai to push forward UTSEUS (the Sino-European University of Technology at Shanghai University), the engineering school launched jointly by the universities of technology of Belfort-Montbéliard, Compiègne and Troyes and Shanghai University in China. In early July 2009 it was the turn of Ronan Stéphan, then president of UTC (Université de technologie de Compiègne), and Louis Coté, his adviser and since then the university's interim administrator, to come to China and meet as many contacts as possible: academics from Shanghai University, French industrialists and representatives of the city's incubator. A marathon of meetings to take a new step in the development of UTSEUS.
Founded in 2005, this "university" within the university modestly occupies the third floor of one of the buildings on the Baoshan campus, forty-five minutes from central Shanghai: a modern, leafy, American-style campus of nearly one square kilometre. Across all cohorts, nearly 750 Chinese students are currently pursuing their engineering studies at UTSEUS. They enter after sitting the Gaokao, the national university entrance examination, and spend their first three years in Shanghai. The 140 best students then have the opportunity to complete their two master's years at one of the three French UTs; the others remain at Shanghai University. In parallel, some twenty French students complete their final year in Shanghai. More...