To learn about the VAE process (validation des acquis de l'expérience, the French system for accrediting prior experience) and to get advice on its feasibility given your experience, various networks can support you in this first step towards the diploma, professional title or certificate of professional qualification (CQP) you are interested in. This type of service is generally offered free of charge near where you live.
Only one condition is required: you must have at least 36 months of experience, continuous or not, acquired through salaried, self-employed or voluntary activity and related to the content of the certification (diploma, professional title or branch CQP) you are considering. General public: information relay network. Companies: CAPEB network.
Process - Step 2: Compile and submit my VAE application - Admissibility decision
You may already have received advisory support to help you choose the certification closest to your experience, and you now wish to formally submit a VAE application to the certifying body recommended to you. To do so, contact that body and request an admissibility file. Once completed, the file must be sent to the address you are given, together with all the required supporting documents.
In return, the certifying body will send you an admissibility decision:
* Favourable: your experience is sufficient to continue the VAE process, but this does not guarantee that you will obtain the certification you are aiming for. The notification letter will give you useful information for the next steps. The admissibility decision is valid for one year only.
* Unfavourable: your professional experience itself is not in question. Contact your certifying body to find out the reason and the alternatives open to you (advisory support to better target the certification, a skills assessment, training, etc.).
Process - Step 3: Finance my support - Funding and support
As soon as the certifying body issues a favourable decision, it will tell you about the support services available to help you prepare your evidence file. This service has a cost (which varies with the certifying body and the certification chosen) and, depending on your status and professional situation, may be covered in full or in part by your employer or by a funding body (for example Pôle Emploi, the Conseil régional or the Fongecif). Your funding request must be made before the support hours begin.
Process - Step 4: Receive support - Support services
Delivered as individual interviews or working groups, this type of support is strongly recommended by certifying bodies. These services are optional but paid for; they aim to help you complete the process under good conditions and, above all, to ensure that your evidence file and account of your experience meet the jury's expectations, thereby increasing your chances of obtaining the certification in full.
It is strongly recommended to secure funding for this type of service before committing to it.
Process - Step 5: Prepare the evidence file - Evidence file
You now have a favourable decision from the certifying body, which will also have given you useful information for the rest of the VAE process. The next step, known as "validation", varies by certifying body but always aims to verify the reality of your experience.
You will have to compile an "evidence file" in which you describe, in precise detail, the professional or extra-professional activities related to the certification you are aiming for. The file is called "evidence" because it combines your written account, which showcases your experience, with the supporting documents that attest to it. The jury will take all of these elements into consideration.
To help you with your VAE, the certifying body may offer you support (an optional but paid service) to prepare your evidence file and ready you for the interview with the jury. Support services may be funded by various organisations depending on your status.
Process - Step 6: Meet the jury and post-jury follow-up - Jury and post-jury
At this stage of your VAE, you submit your evidence file, together with any additional information you can provide in person, to the jury during an interview. The jury is by definition sovereign: on behalf of the certifying body it represents, it decides whether your experience warrants full validation of the certification (diploma, professional title or CQP), partial validation, or a refusal.
The jury is also "joint", that is, made up of teachers and of professionals from your trade, who must ensure that you have the professional competences required by the certification and/or activity reference framework.
This is not a thesis defence but a discussion that allows the jury to check that your competences match the certification and/or activity reference framework. The length of the interview therefore varies considerably (it often depends on what you have to say and on the quality of your file).
In the event of partial validation, post-jury follow-up can be arranged to help you acquire the additional competences needed for full validation of the certification. You will have five years to resubmit your file with the additional competences requested by the jury, following the procedure specified by the certifying body.
The latest issue of Australian Universities' Review, vol. 54, no. 1, is now available online at www.aur.org.au. Special issue: Contemporary issues in doctoral education.
A new era for research education in Australia? Helene Marsh, James Cook University, Bradley Smith, James Cook University, Max King, Monash University, Terry Evans, Deakin University, pp. 83-93.
Use of the Australian research assessment exercise, Excellence in Research for Australia (ERA), to influence the policy and practice of research education in Australia will undoubtedly have many consequences, some of them unintended and potentially deleterious. ERA is a retrospective measure of research quality; research education is prospective. There is a lack of alignment between the 2- and especially the 4-digit Fields of Research used for ERA and university organisational units. While numerous Fields of Research were rated as world class in multiple institutions in the capital cities of New South Wales, Victoria and Queensland, the other states and regional Australia have significant gaps. The Sciences, Technology, Engineering and Medical (STEM) fields were generally rated higher than the Humanities, Arts, and Social Sciences (HASS) disciplines. Thus using ERA results to allocate higher degree by research places will have highly variable consequences in different disciplines and locations, given the obstacles to the mobility of the largely mature-aged doctoral cohort and the forecast academic skills shortage. ERA provides an incentive for Australian academics to eschew publishing in low impact journals and is likely to disadvantage some research students for whom co-authorship in a lower impact journal is more advantageous than no publication. There are many ways in which ERA results could be used to improve the quality of research education in Australia. Nonetheless, simplistically limiting doctoral education to Fields of Research where an institution scored at or better than national or world averages in ERA is unlikely to be in the national interest because our future research and academic workforce needs to be well prepared to operate across the nation in areas of emerging research, including cross-disciplinary and applied research.
Excellence in Research for Australia (ERA) is designed to provide a comprehensive review of the quality of research undertaken in Australian higher education institutions at regular intervals. The first ERA was conducted in 2010 (Australian Research Council, 2011a), the second will be conducted in 2012 and the third is planned for 2016. ERA was a successor to the Research Quality Framework (RQF) (DEST, 2005), an initiative prompted by political scepticism about the claims that universities made about the value of and returns on national investment in research. In implementing ERA, Australia follows several other countries, including the United Kingdom (RAE, 2008), New Zealand (PBRF, 2012) and Hong Kong (French, Massy & Young 2001), which have conducted national assessments of the quality of research based on various criteria. These overseas assessment exercises have been used to guide research funding in response to concerns about the affordability of funding all higher education institutions for research as higher education has moved from an elite to a mass system (Elton, 2000). However, the outcomes have not always been as policy makers intended. For example, in the United Kingdom, the exercise, which was aimed at concentrating research in fewer institutions and departments, confirmed that many of the newer universities were producing quality research, and many universities used their freedom of virement to fund lower-rated departments at the expense of higher-rated ones (Elton, 2000).
In ERA 2010, each of the 41 Australian Higher Education Providers was invited to provide evidence of research quality, volume, application and esteem across eight disciplinary clusters: (1) Physical, Chemical and Earth Sciences; (2) Humanities and Creative Arts; (3) Engineering and Environmental Sciences; (4) Social, Behavioural and Economic Sciences; (5) Mathematical, Information and Computing Sciences; (6) Biological Sciences and Technology; (7) Biomedical and Clinical Health Sciences; (8) Public and Allied Health Sciences. The disciplines within each cluster were defined by the 2- and 4-digit Fields of Research identified by the Australian and New Zealand Standard Research Classification (ANZSRC, 2008).
ERA 2010 was an academic rather than an end-user evaluation of Australia's research. The evaluation was undertaken by eight Research Evaluation Committees, each of which was broadly representative of its discipline cluster group. Each committee's assessment was based on a 'dashboard' of indicators of research quality, research volume and activity, research applications and recognition (Australian Research Council, 2011a). Each Field of Research was evaluated on a five-point scale ranging from '1' (well below world standard) to '5' (well above world standard), with a rating of '3' representing world standard. If an institution did not meet the low volume threshold for critical mass for a Field of Research, it was rated as 'not assessed' for that field. The indicators were largely metric-based, with an emphasis on citation analysis in the vast majority of Sciences, Technology, Engineering and Medical (STEM) disciplines and on peer review by international experts in the remaining discipline clusters. Thus the range of disciplines was split into peer-review disciplines and citation disciplines. The evaluation processes were not transparent, and attempts to determine the relative importance of the input factors through retrospective analysis have largely failed. Some bodies, including the Australian Academy for the Technological Sciences and Engineering (ATSE, 2009), expressed concern that applied and cross-disciplinary research would be undervalued, a concern supported by analyses of British Research Assessment Exercises (e.g. Elton, 2000).
The ways in which ERA will be incorporated into the drivers that determine the Research Training Scheme, the block grant provided to Australian universities to fund research training, have yet to be determined. In ‘Research skills for an innovative future’ (DIISR, 2011a), the Australian government stated that the Excellence in Research for Australia (ERA) initiative will support the ‘identification and recognition of research strengths within universities’ as a vital component of research education (page 23). Despite intuitive appeal, this approach may have the unintended consequence of reducing research education in areas of national or regional importance, especially areas of applied, cross-disciplinary or emerging research. The purpose of our paper is to explore possible consequences of ERA for research education in Australia and to suggest ways in which ERA results could be used to enhance research education in Australia while minimising deleterious, unintended consequences ‘before they become apparent, let alone researchable’ (Elton, 2000).
Our analysis is largely based on the National Report of ERA 2010 (Australian Research Council, 2011a). ERA 2010 scores were based on 25 2-digit and 157 4-digit Fields of Research as defined by the ANZSRC classification (ANZSRC 2008), a pragmatic taxonomy of research across all research and development sectors in Australia and New Zealand, including industry, government agencies, private not-for-profit organisations and universities. This classification was not designed as a taxonomy of university research per se and includes Fields of Research that are largely undertaken outside the sector, e.g., automotive engineering and medical biotechnology. Thus it is questionable whether an analysis such as ours should include all these fields. Twenty-two of the 4-digit codes are 'XX99' or 'other' codes, e.g., 699 Other Biological Sciences and 1499 Other Economics. There were only 28 Units of Evaluation (a 2-digit or 4-digit Field of Research for one institution) across the 22 'other' Fields of Research compared with 1708 Units of Evaluation for the substantive Fields of Research (Commonwealth of Australia 2011a). The purpose of the 'other' codes is to pick up research not adequately captured by the main 4-digit Fields of Research. Therefore, including these 22 Fields of Research in an analysis of ERA distorts consideration of breadth, as a 'not assessed' within these codes simply indicates there is adequate alignment of research codes and actual activity, whereas a 'not assessed' for a substantive code indicates that either there is no research activity at that Higher Education Provider, or if there is, it has not produced the requisite outputs to meet the threshold for assessment.
There is also an argument that 1802 Maori Law should not be included in Australian assessments, as the inclusion of this code is a function of ANZSRC being a joint classification for Australia and New Zealand. No Higher Education Provider met the threshold for assessment for Maori Law in ERA 2010.
In addition, nine 4-digit Fields of Research did not record any assessment. Whether that result indicates real gaps in the fabric of Australian Higher Education Research is beyond the scope of this paper. Thus ERA 2010 was not, in practice, an analysis of 157 4-digit Fields of Research but of 125–134 Fields of Research, depending on whether the fields for which no returns were received are included. We used 134 Fields of Research in our analysis below by omitting the 22 'other' Fields of Research and Maori Law.
Results and Discussion
Challenges of ERA for research education
Temporal scale mismatch
ERA is a retrospective measure of research quality, volume, application and esteem aggregated into an overall performance rating. Based on data from eligible staff at each institution employed at the census date of 31 March 2010, ERA 2010 applied to research outputs from 1 January 2003 to 31 December 2008; research income, commercialisation and esteem measures between 1 January 2006 and 31 December 2008; and citation measures from 1 January 2003 to 1 March 2010. Thus some of the research assessed must have predated the publications reference period by several years. The reference periods for ERA 2012 will be updated, for example publications will be limited to the period 1 January 2005 – 31 December 2010; however, the exercise is inevitably retrospective.
Most universities are investing in emerging areas of research to meet perceived future needs in the context of their institutional mission. Current doctoral candidates are the researchers of the future and their research should be aligned with research needs of the future rather than the research strengths of the past. Doctoral candidates should be well represented in an institution’s areas of emerging research including applied and cross-disciplinary research. Experience in the United Kingdom suggests that these areas may not rate well (or at all) in ERA (Elton, 2000).
Organisational scale mismatch
There is a lack of alignment between the 2- and especially the 4-digit Fields of Research used for ERA and university organisational units. Most Australian universities are now organised in large multi-disciplinary schools that conduct research in many Fields of Research (e.g., Environmental Science staff at Griffith University contributed to 82 Fields of Research in ERA 2010; Tony Shiel, pers comm 2011). Similarly, at James Cook University, all of the assessed Fields of Research relied on inputs from at least two and typically five to eight of that institution's 25 academic organisational units (Chris Cocklin, pers comm 2011). In ERA 2010, this organisational scale mismatch was exacerbated by the inevitable attempt by every university to optimise its ERA returns. As a result, many staff, particularly those undertaking cross-disciplinary research, contributed to their university's return in several different Fields of Research, which may have received very different ERA evaluations. Alternatively, some institutions score well in Fields of Research not represented by their organisational units. For example, the Australian National University was rated as world class in Education at the 2-digit level without having a unit in this discipline (Margaret Kiley, pers comm 2011).
Although ERA 2012 will incorporate changes designed to improve the capacity to accommodate cross-disciplinary research (Australian Research Council, 2011b), the changes are unlikely to improve this mismatch of organisational scale. The revised methodology will allow each institution to code journal articles with significant content (66 per cent or greater) not represented by a journal’s Fields of Research to another appropriate Field of Research code of its choice (Australian Research Council, 2011b). However, institutions will still code publications to maximise their ERA scores rather than to align with organisational units. Thus using ERA results as a blunt instrument to define the fields, in which a university may offer doctorates or award Australian Postgraduate Awards for example, will almost certainly increase the perverse incentive to ‘optimise’ the coding of the Fields of Research in which research higher degree candidates are working, reducing the robustness of the data on this important topic.
ERA 2010 produced at least one perverse incentive that anecdotal evidence indicates has had an impact on research training already. Because ERA 2010 was focused on all publications (or research outputs), it was perceived as emphasising publishing in highly ranked (A* and A) journals in the case of the peer-review disciplines or journals with high impact factors in the case of citation disciplines. The Research Evaluation Committees were presented with percentages of A*, A, B and C publications in their dashboards, along with other research indicators. Consequently until recently, some Australian academics were strongly encouraged to publish only in A* and A journals by senior university staff concerned that any publications in lower ranked journals inevitably reduced the percentages of publications in A* and A journals for the relevant ERA Unit of Evaluation. Thus some academics, particularly in the peer-review disciplines, perceived a strong disincentive to publish with a research higher degree candidate in a B or C journal. For the citation disciplines, there was a similar disincentive to publish in low impact journals.
ERA 2012 will not use the controversial system of ranking journals used in ERA 2010 (Australian Research Council, 2011b). Rather the Australian Research Council will use a refined journal quality indicator and evaluation committees will use their expert judgement to assess the appropriateness of the journals for the disciplinary unit concerned. This new approach is less transparent than its predecessor and is unlikely to change the unwillingness of some supervisors to publish with their research students if it means publishing in low impact journals or their equivalent.
Showing a research higher degree candidate how to publish is very much part of good practice in research training. Consequently, some doctoral programmes require all research students to publish a paper (or in some cases two papers) in order to satisfy the requirements for the degree. Research does not always work out as planned – there is an element of risk. When research does not work out or yields negative results, it is typically not possible to publish the results in high impact journals. This reflects how interesting the results are to the readers of the journal, rather than the quality of the research. Journals are ranked on the basis of impact factor and it is inevitable that this information will be used in ERA 2012. Because ERA is currently an assessment of all publications, any publication in a journal with a relatively low impact factor (including most journals in emerging fields and many journals that publish cross-disciplinary research) will still have the potential to dilute the quality of publications in the eyes of a Research Evaluation Committee. Thus many supervisors may be reluctant to publish in such journals with their research students, a practice that is likely to disadvantage the student. In addition, established journals can be quite conservative and reluctant to publish new work in emerging, cross-disciplinary or applied areas.
Systemic variables affecting the use of ERA in research education
There are three broad variables associated with ERA outcomes that will have consequences if ERA is used to allocate higher degree by research places or government funded stipend scholarships: institutional grouping, geography and discipline. We consider each of these variables below.
The performance of Australia's 41 Higher Education Providers was predictably uneven in ERA 2010 (Table 1), although all but two universities were rated at world class or better in at least one Field of Research, indicating that, as in the United Kingdom (Elton, 2000; RAE, 2008), some of the newer universities are producing some 'outstanding' research (at least one university outside the Group of Eight universities achieved a maximum score in eight of the 18 2-digit Fields of Research).
As expected, ERA confirmed the research standing of the Group of Eight universities which were collectively assessed in 692 Units of Evaluation of which 91.3 per cent were rated at world standard or better. The seven Innovative Research Universities collectively had 62.5 per cent of 296 Units of Evaluation rated at world class or better, a result similar to that of the five Australian Technology Network universities (59.8 per cent of 225 Units of Evaluation rated at world class or better). The performance of the 21 non-aligned institutions (42.9 per cent of 496 Units of Evaluation rated at world class or better), was more diverse, ranging from Macquarie with 75.6 per cent of 45 Units of Evaluation rated as world class, to Batchelor Institute of Indigenous Tertiary Education, University of Notre Dame and the University of the Sunshine Coast with none. The lowest performing 15 universities were assessed for 234 Units of Evaluation although only 20.9 per cent of these were at world standard or better with the discipline of Nursing being the strongest performer with four of six universities being rated at or above world class in this Field of Research.
Geography matters. While New South Wales, Victoria and Queensland have numerous Units of Evaluation rated as world class in their capital cities, the other States (Table 2) and regional Australia have significant gaps. South Australia does not have any institutions rated world class in two 2-digit Fields of Research: (1) Education and (2) Commerce, Management, Tourism and Services. In the three 4-digit Fields of Research in the discipline of Education, only two of the eight South Australian Units of Evaluation were rated as world class, and only three of the nine Units of Evaluation across the seven 4-digit codes in the Commerce cluster were considered world class. There are no world class providers in Western Australia in Law. There was only one institution (Murdoch) rated at world class in Studies in Human Society at the 2-digit level, and only three of 15 Units of Evaluation were rated as world class across the eight 4-digit Fields of Research in the Commerce discipline-cluster. We analysed the performance of 14 'regional' higher education providers: Ballarat, Batchelor, Central Queensland, Charles Darwin, Charles Sturt, Deakin, James Cook, Newcastle, New England, Southern Cross, Southern Queensland, Sunshine Coast, Tasmania and Wollongong. This grouping is a heterogeneous mix: it includes four institutions with no or only one world-class rating and three members of the Innovative Research University grouping (Charles Darwin, James Cook and Newcastle), while Tasmania and Wollongong are well-established non-aligned research universities. There were 15 Fields of Research where the 'regional' universities scored relatively well, including Analytical Chemistry and Environmental Science and Management. Of the 33 world class Units of Evaluation across these 15 Fields, all but five were located at the older institutions: Deakin, James Cook, Newcastle, Tasmania or Wollongong.
The five 4-digit Fields of Research with the highest number of Units of Evaluation in regional institutions are listed in Table 3; only 10 of 61 (16 per cent) Units of Evaluation were rated as world class or above. The result for Business and Management was particularly concerning; this Field of Research was not rated as world class at any of the 13 regional institutions that claimed critical mass.
One feature of ERA 2010 was the generally higher rating of the Sciences, Technology, Engineering and Medical (STEM) fields compared with the Humanities, Arts, and Social Sciences (HASS). The extent to which this result is an artefact of ERA methodology or reflects levels of maturity and/or investment in those fields is beyond our consideration. Thus using ERA results to allocate higher degree by research places will have highly variable consequences in different disciplines (Table 4).
All Units of Evaluation were rated as world class or better for 40 (32 per cent) of 4-digit Fields of Research; 66 Fields of Research (49 per cent) had >80 per cent of Units of Evaluation rated at world class or higher (Commonwealth of Australia 2011a). For example, both Chemical Sciences (100 per cent world class or better at the 2-digit level) and Earth Sciences (100 per cent world class or better at the 4-digit level) would be largely unaffected by limiting higher degree by research students to institutions rated as world class in these disciplines. The alternative approach of limiting higher degree by research places to institutions performing at or above the national average in these disciplines would deprive world class groups of research students, a policy that could not be in the national interest.
However, less than half the Units of Evaluation were rated as world class for 18 Fields of Research, including some fields that were offered by numerous institutions: 13 of these 18 low-rated Fields of Research were offered by between 27 and 39 institutions, one was offered by 22 institutions and four were offered by between five and 13 institutions (Australian Research Council, 2011a). The 4-digit Fields of Research with the lowest percentage of world class ratings were Policy and Administration (18.5 per cent - 27 Units of Evaluation), Marketing (27.6 per cent - 29 Units of Evaluation), Education Systems (31.3 per cent - 32 Units of Evaluation), Applied Economics (33.3 per cent - 33 Units of Evaluation), and Business and Management (33.3 per cent - 39 Units of Evaluation). Thus any mechanistic application of ERA to research education is likely to significantly affect Economics, Commerce, Management, Tourism and Services and Studies in Human Society. Limiting access to Australian Postgraduate Awards to institutions scoring a world class ERA rating would clearly be problematic, especially as 61.9 per cent of doctoral candidates in 2009 were older than 30 (Table 5) and often have family arrangements that limit mobility. Although institutions could award university scholarships to doctoral candidates in the disciplines in which they did not score well in ERA, this practice would reduce the attractiveness of Australia to international research students because of the consequential reduction in the number of scholarships available to them. This approach would be counter-productive public policy because of the well documented impending shortage of academics in Australia (Edwards, 2010; Edwards, Bexley & Richardson, 2011; Edwards, Radloff & Coates, 2009; Edwards & Smith, 2010; Hugo, 2008; Hugo & Morriss, 2010), the planned expansion of the sector (DEEWR, 2009; DIISR, 2009) and the increased international competition for the best and the brightest doctoral students.
This problem is exemplified by the discipline of Education, in which 3415 doctoral candidates were enrolled in 2009; 7.7 per cent of all Australian doctoral candidates (Table 6). Nearly 60 per cent of research students in Education surveyed in 2010 (Edwards, Bexley & Richardson, 2011) were aged above 40, suggesting limited mobility. Only 15 of 39 institutions scored at or above the world average for the 2-digit Education Field of Research; no Unit of Evaluation received a maximum score. Thirty to 50 per cent of the Units of Evaluation for each of the four 4-digit codes were also assessed at less than world average (Table 7). Our comparison of the ERA 2010 data at the 2-digit level with official higher education statistics purchased from the Australian government indicates that about one third of the total research students in Education were enrolled at institutions that were not rated as world class in ERA 2010, including 80 per cent of the domestic research students studying at regional institutions.
Thus limiting research education in Education to institutions rated as world class at the 2-digit level will not only require the world class institutions to service a significant additional supervisory load (>1000 extra doctoral students) but would risk seriously downgrading Education research outside the mainland capital cities, particularly in Tasmania and regional Queensland. Given the importance of Australian educational practice being evidence-based and the impending shortage of academics in this field (64.9 per cent of staff are aged above 50; Edwards, Bexley & Richardson 2011), we consider that it is important to introduce mechanisms to promote high quality doctoral training in Education across the nation rather than to limit it based on past performance, a conclusion that we consider applies to many other disciplines as well.
In ERA 2010, world-class critical mass was limited to five or fewer institutions in 39 4-digit Fields of Research (Australian Research Council 2011a). Nine 4-digit Fields of Research including Classical Physics had only one institution with a world class ERA rating. Only seven institutions were rated as world class in Atomic, Molecular, Nuclear, Particle and Plasma Physics, a Field that is likely to be very important to Australia’s clean energy future and in which doctoral study should presumably be encouraged.
To ensure that there was a ‘meaningful amount of data’ to be evaluated, ERA 2010 had a low volume threshold for each Unit of Evaluation (Australian Research Council, 2009). This threshold meant that an unknown number of ‘isolated scholars’ were not assessed, particularly in the Humanities, where single scholars are the norm, and in small institutions. There is anecdotal evidence that at least some of these scholars are very successful doctoral supervisors. Critical mass is very important in doctoral education to protect the interests of research higher degree candidates, especially if the principal supervisor becomes unavailable. However, cross-institutional supervision using virtual technologies and visits is an increasingly recognised practice, recently endorsed by changes to the Research Training Scheme that allow the recognition of joint completions (DIISR, 2011b). We question the wisdom of excluding high performing scholars who were not rated in ERA from research supervision and suggest that they should be encouraged to engage in cross-institutional supervision as discussed further below.
Changes to ERA to reduce the perverse student publication incentive
A simple solution to overcome the negative impact of ERA on research student publications would be to require institutions to submit all publications (or research outputs) as at present, but to present the data on only the top 80 per cent of publications for each Unit of Evaluation to the Research Evaluation Committees. Such a change would enable supervisors to publish a less interesting paper with a research student in a low impact journal without a negative consequence when the relevant Unit of Evaluation is assessed for ERA. This reform could be introduced for ERA 2012.
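As an illustration, the proposed rule could work by ranking each Unit of Evaluation's submitted outputs by some quality score and presenting only the top 80 per cent to the Research Evaluation Committees. The following sketch assumes a simple per-publication score; the function name, the scoring values and the rule of keeping at least one output are all assumptions for illustration, not part of the ERA methodology.

```python
import math

def top_outputs(publications, fraction=0.8):
    """Return the top `fraction` of publications ranked by score.

    `publications` is a list of (title, score) pairs; the scoring
    metric is a hypothetical stand-in for whatever quality measure
    an evaluation scheme might use.
    """
    ranked = sorted(publications, key=lambda p: p[1], reverse=True)
    # Round the cut-off up and keep at least one output so that
    # small Units of Evaluation are never emptied entirely.
    keep = max(1, math.ceil(len(ranked) * fraction))
    return ranked[:keep]

# Hypothetical Unit of Evaluation with five scored outputs: the
# lowest-rated paper (e.g. a student co-authored paper in a low
# impact journal) is submitted but not presented for assessment.
unit = [("A", 9.1), ("B", 7.4), ("C", 6.8), ("D", 5.0), ("E", 2.2)]
presented = top_outputs(unit)  # keeps A, B, C and D
```

Under such a rule, the weakest 20 per cent of outputs would be invisible to assessors, removing the disincentive to co-publish with students in lower impact journals.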
Using ERA to improve institutional practice in research education
The research environment is a necessary but not sufficient component of quality research education as acknowledged by the basket of indicators of doctoral training quality being developed by the Australian Council of Deans and Directors of Graduate Studies (Table 8). We consider that the planned revision of the Research Training Scheme, the establishment of the Tertiary Education Quality Standards Agency (TEQSA), and the Compacts Process, together offer an opportunity for the Australian Government to require universities to explicitly take the quality of the research environment into account in developing their policy and practices for research education and to audit their response. However, any policy change that uses the data from ERA should be designed to explicitly address the challenges outlined above.
Mission-based Compacts are three-year agreements that show how each university’s mission contributes to the Australian Government’s goals for higher education, and include details of major higher education and research funding and performance targets (DEEWR & DIISR, 2009). Requiring universities to stipulate how they plan to take their ERA results into account when awarding Australian Postgraduate Awards in their Compact Agreement and to audit this through the Tertiary Education Quality Standards Agency would enable Higher Education Providers to respond in a more nuanced and positive way than if they were banned from awarding Australian Postgraduate Awards to doctoral candidates in Fields of Research that had been retrospectively evaluated by ERA as below world standard. Universities should also be able to identify emerging Fields of Research that currently are ‘not assessed’ or assessed below world standard, provide strategic reasons why they wish to accept research higher degree candidates or allocate Australian Postgraduate Awards to those Fields of Research, indicate how the research students will be provided with an appropriate research environment and negotiate how their performance should be evaluated.
Several recent initiatives could be used in conjunction with ERA to improve doctoral education in fields of research in which there is a national or regional lack of critical mass. ERA offers a mechanism to identify such fields. Groups of universities can now share completions under the Research Training Scheme (DIISR, 2011b). Although this initiative has removed a significant barrier to cross-institutional co-operation in research education in Australia, it is likely to provide a niche rather than an institutional solution to the problems identified here. For example, over the last seven years, the Australian National University has developed several Memoranda of Understanding regarding joint PhDs with other Australian universities.
To date, there has only been one cross-institutional PhD enrolment (Mandy Thomas, pers. comm. 2011), although recent Collaborative Research Network agreements should improve this situation. Experience with developing joint degrees between Australian and overseas institutions indicates that the uptake and success of these arrangements depend on established individual collaborations rather than institutional Memoranda of Understanding. Institutional improvements to the quality of research education in a discipline could be achieved using structures that are less formal than joint degrees:
• Joint arrangements for embedded students with other providers, e.g. CSIRO;
• Sharing of physical and virtual resources;
• Incentives to encourage cross-institutional supervision and mentoring;
• International collaboration with established research centres;
• On-going collaboration (joint grants, papers, students, Collaborative Research Network agreements);
• External input to milestones, e.g. Confirmation of Candidature proposals;
• External input into courses/skills development for research students.
Nonetheless, such initiatives are likely to be expensive and need to be factored into the revision of the funding for research training. In particular, research higher degree candidates may need assistance to travel between geographically separate institutions when distances are large, an inevitable feature of arrangements involving institutions in different states, especially the isolated regional institutions.
Doctoral Training Centres are an increasingly-recognised approach to improving the quality of doctoral education by training cohorts of students while emphasising transferable skills. In the United Kingdom, the Engineering and Physical Sciences Research Council and the Economic and Social Research Council have committed to fund more than 70 such centres, many in cross-disciplinary and applied areas. The five Australian Technology Network (ATN) universities (only three of which were rated as world class in 2-digit Mathematics) have recently established a national Industry Doctoral Training Centre in Mathematical Sciences (ATN, 2011) and its initial cohort of 20-25 PhD students will commence in early 2012, in nodes across the five ATN universities. Cross-institutional supervision can also be achieved using less formal structures but research higher degree candidates will need travel assistance as explained above.
ERA as a block fund moderator
The Australian government has indicated that the results of ERA will inform the allocation of funding to support the costs of research through the Sustainable Research Excellence Programme and research education through a modified Research Training Scheme. The Research Training Scheme is the most valuable of the research block funding schemes, representing 41 per cent of the total allocation in 2011. For the Research Training Scheme, Australian Postgraduate Awards, and International Postgraduate Research Scheme, the calculation methodology (DIISR, 2011c) is relative institutional performance in research income (40 per cent), publications (10 per cent) and research student completions (50 per cent), and it is expected that ERA results will be used to moderate these drivers. However, at present there is no agreed method of assessing overall institutional performance in ERA and some of the measures used are simplistic, including measures such as the percentage of Fields of Research at world standard or better used here.
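The stated calculation methodology — relative institutional performance weighted across research income (40 per cent), publications (10 per cent) and completions (50 per cent) — amounts to a simple weighted index. The sketch below illustrates only that stated weighting; the function name and the institutional shares are invented for illustration.

```python
# Weights from the stated calculation methodology (DIISR, 2011c):
# research income 40%, publications 10%, completions 50%.
WEIGHTS = {"income": 0.40, "publications": 0.10, "completions": 0.50}

def performance_index(shares):
    """Combine an institution's share of each driver into one index.

    `shares` maps each driver to the institution's share of the
    national total (a value between 0 and 1).
    """
    return sum(WEIGHTS[k] * shares[k] for k in WEIGHTS)

# Hypothetical institution holding 5% of national research income,
# 8% of publications and 6% of completions:
idx = performance_index({"income": 0.05,
                         "publications": 0.08,
                         "completions": 0.06})
# idx = 0.4*0.05 + 0.1*0.08 + 0.5*0.06 = 0.058
```

How ERA ratings would moderate such an index is exactly the open question: no agreed method yet exists for folding ERA results into these drivers.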
Of particular concern, especially for the large research intensive universities, is the failure of the present ERA rating scheme to include any measure of critical mass provided that the institution meets the low volume threshold. An institution that achieves an ERA rating of ‘5’ based on 50 publications in a Field of Research will provide a very different environment for research higher degree candidates from an institution that achieves the same rating based on 1000 publications. Nonetheless, bigger is not necessarily better, which is why a basket of indicators of research education quality is needed (Table 8).
However, the volume of output that has gone into achieving an ERA rating has to be taken into account in the funding formula. It will be challenging to develop an agreed measure of overall institutional performance in ERA and use it to have a positive impact on research training while taking the following additional factors into account: (1) most universities in Australia produce some excellent research outputs, as ERA 2010 demonstrated, (2) as in the United Kingdom (Elton, 2010), universities are likely to use their freedom of virement to fund lower-rated departments at the expense of higher-rated ones, (3) the challenges of Australia’s dispersed geography, (4) the impending shortage of academic staff identified by various scholars (Edwards, 2010; Edwards, Bexley & Richardson, 2011; Edwards, Radloff & Coates, 2009; Edwards & Smith, 2010; Hugo, 2008; Hugo & Morriss, 2010), and (5) the need for quality academic staff to service the planned expansion of the sector (DEEWR, 2009; DIISR, 2009).
ERA 2010 was a comprehensive academic evaluation of the research conducted by Australia’s higher education providers in the first decade of the 21st century and subsequent rounds promise similar insights. Nonetheless, use of ERA to influence the policy and practice of research education in Australia will undoubtedly have many unintended consequences, some potentially deleterious, and it is important to anticipate them before they become apparent. Our analysis of the results of ERA 2010 demonstrates a lack of alignment between the Fields of Research and university organisational units, and shows that using ERA results to allocate higher degree by research places will have variable consequences in different locations as a result of Australia’s geography and in different disciplines. In addition, ERA provides an incentive for Australian academics to eschew publishing in low impact journals, a practice which is likely to disadvantage some research students for whom co-authorship in a lower ranked journal is more advantageous than not publishing.
Given these challenges, simplistically limiting doctoral education to Fields of Research where an institution scored at or better than national or world averages in ERA is unlikely to be in the national interest, especially given that ERA is retrospective and will not reflect the current situation. Doctoral students should be well represented in areas of emerging research, including applied and cross-disciplinary research.
There are many ways in which ERA results could be used to improve the quality of research education in Australia. We suggest that requiring Higher Education Providers to describe how they plan to deliver quality research education in all disciplines relevant to their mission in their Compact Agreement with the Commonwealth would be a positive reform. Institutions could also be required to report on their research education inputs and outcomes against an agreed basket of quality training indicators for each of these disciplines to the Tertiary Education Quality Standards Agency.
Helene Marsh is Distinguished Professor of Environmental Science and Dean, Graduate Research Studies, James Cook University, Queensland. Bradley Smith is the Manager of Research Strategy, Division of Research and Innovation, James Cook University, Queensland. Maxwell King is Pro Vice-Chancellor (Research and Research Training) and a Sir John Monash Distinguished Professor at Monash University, Victoria. Terry Evans is a Professor of Education at Deakin University, Victoria.
Eve is a tool for the dissemination and exploitation of results of projects supported by programmes managed by the European Commission. Through Eve you can access a wide range of learning materials, handbooks, websites, policy papers, photos … and much more! Eve has something for everyone!
What is Eve
Eve is the electronic platform for the dissemination and exploitation of results of projects supported by programmes managed by the European Commission in the fields of Education, Training, Culture, Youth and Citizenship.
Eve is a new tool available to project beneficiaries of the "Lifelong Learning", "Culture", "Youth in Action" and "Citizenship" programmes, giving them visibility on the European Union website. Eve will acquire more information as projects develop and their results are introduced by the project coordinators.
Through its collaborative approach, the Eve platform is an innovative tool offering users a centralized vision of the majority of funded projects.
Eve is not only a tool for the future: from its inception, projects funded in the past have been introduced in the platform. Thus, Eve already contains hundreds of projects funded under the 2000-2006 Education and Culture Programmes: Leonardo da Vinci, Culture 2000, European Active Citizenship, Youth and Socrates (including Comenius, Grundtvig, Erasmus, ...).
Information you can find
Eve hosts information about projects and results from the Education and Culture DG, such as:
* Learning materials, handbooks, manuals, CDs,...
* Project websites and links to different databases
* Personal testimonials from project participants
* Documents and guidelines
* Associations and European partnerships
For more specific information on a particular project and its results, you can contact the project co-ordinator, whose contact details are included in the project's record.
Origin of information
In the medium term, the main source of information for Eve will be direct contributions from project beneficiaries within Eve. Information also comes from specific sub-programme databases:
1. ADAM for Leonardo da Vinci multilateral projects
2. EST for projects from Comenius, Grundtvig and Leonardo da Vinci partnerships.
Advantages of Eve
The Eve platform...
* is a promotional tool for project coordinators and the Education and Culture DG
* offers a single access point to results with plenty of useful information about projects funded by all the different programmes
* provides multiple benefits for project promoters and their results:
o Better visibility for projects
o Enhanced exploitation and improved dissemination of results
o Rich source of information
o Tool for improved networking.
By Olivier Rollot. International Women's Day is an opportunity to take stock of the study choices made by girls and boys. And it remains clear that, when it comes to studies and guidance, they are not equal. "Girls anticipate their choices better and involve their families more, who then support them more smoothly. Boys often let things slide and only realise too late that they have missed their chance of ever entering the field of their dreams," analyses Christine Ducamp-Mayolle, a former careers guidance counsellor and psychologist, now a coach specialising in academic and professional support. More diligent at school, girls cope better with institutional constraints, whereas boys often rebel against them.
Subject choices at the lycée remain gendered
A few telling statistics drawn from "Filles et garçons sur le chemin de l'égalité, de l'école à l'enseignement supérieur" (Girls and boys on the path to equality, from school to higher education) demonstrate that girls do much better than boys throughout their schooling:
* at age 14, 69% of girls in school are in troisième (the final year of lower secondary school), compared with 60% of boys of the same age;
* at age 17, 38% of girls in school are in the final year of the general and technological streams, compared with 26% of boys of the same age;
* 69% of girls obtain the baccalauréat, compared with 58% of boys.
A rebalancing in higher education
Things can reverse in higher education, where boys who had until then been dilettantes can mature within a few months. "They refocus on their professional goal while, conversely, many young women show more interest in their personal lives," continues Christine Ducamp-Mayolle. The pyramid of success then rebalances.
While 55% of students are women, the feminisation of higher education programmes remains very uneven. Women are in the majority at university, with over 59% of enrolments across all disciplines, but account for only a little over 40% in the IUTs (university institutes of technology), a little under 43% in the classes préparatoires, and just over a quarter in engineering schools, where their numbers are even falling: they were 26.6% in 2009. In BTS programmes they make up a little over half of enrolments, but are far more numerous in services (69%) than in production (41%).
At university, while it is in languages (75%) that they reach their highest proportion, they are also the most numerous in the most prestigious disciplines, medicine-dentistry and law (nearly 65%), whereas they are a small minority in fundamental sciences (30%). Despite promotional campaigns such as "Tu seras ingénieure" ("You will be an engineer"), women remain largely absent from most scientific fields, medicine and biology excepted. The same lack of interest is found in the classes préparatoires, where women are a large majority in humanities (75%!) and economics (55%) but make up only 30% in sciences...
Most universities derive income from a broad range of sources, such as knowledge transfer, commercial operations, public-private partnerships and philanthropic giving. The latest HESA Finance data (2008/09) show that 'other income' comprised close to 20%, or nearly £5 billion, of total UK HEI income. There seems little doubt that financial pressures will increase in future, as funding body grants reduce, tuition fee income becomes less stable and the demands to pay for new and better facilities continue to grow. And then there are the pension fund liabilities.
A well-structured and resourced strategic marketing team is essential to help any institution decide where the best opportunities for developing new income streams lie. In many organisations, marketing either encompasses or sits alongside business development, and its processes and disciplines can help the development team structure its approach. In an academic environment, many of the ideas for new businesses and incomes will emerge from academic staff engaged in research. Marketing can add value by helping staff turn their ideas into fully fledged business propositions, well positioned to take advantage of the market opportunities available.
Support for business development
An experienced marketing team, with a strategic capability, can add value to business development in a number of ways:
A well-structured market assessment process will help identify:
• Who are the potential customers?
• Where is the competition and what are they offering?
• How does our proposition stack up against others?
• What is the scale of the market?
• What market trends are apparent?
When developing a proposition the university must take into account market demand, feasibility and sustainable competitive advantage. A full marketing launch takes careful planning and execution. The marketing and communications team can help staff plan how to get from a proposition, idea or prototype to a full marketing launch. They should also be able to advise on the on-going level of resource needed to manage engagement with the market once the business is up and running.
Is this happening already?
The answer is probably 'Yes and No'. 'Yes', in that many of these activities are being conducted, although on a piecemeal rather than planned and structured basis. 'No', in that a market- and customer-oriented perspective is not typically the starting point. The more usual approach is for organisations to try to 'sell' their existing capabilities ('inside-out'), rather than understanding market needs and tailoring their product or service accordingly ('outside-in'). Also, the marketing and communications team is not generally included in an institution's business development, despite the fact that they can often provide the necessary insights.
What are the benefits of this approach?
Firstly, diversifying income streams beyond the well understood core university activities of teaching students and winning research grants is still relatively new territory for many institutions; and it's clearly important to gain a good understanding of potential customers. After all, they are going to pay for your product and service in future.
Second, it's valuable to know who else is already targeting the market, particularly if you're planning to launch something new. What services already exist, what are customers paying for and where are the gaps?
Finally, it helps to have a longer-term perspective and consider how sustainable income can be built beyond the initial product or service. How might services need to adapt and grow in the future; how can a portfolio be built?
All of this can be achieved without marketing expertise and resources, but shouldn't it be an integral part of the marketing team's role? It's certainly part of the rationale for having a strategic marketing as well as a communications capability. William Annandale is managing partner at Quadrant Consultants, a strategy and marketing consultancy.