Ministerial statement to the Senate Economics Legislation Committee: Improvements to Excellence in Research for Australia (ERA)

30 May 2011

After several years of development, the first round of the Excellence in Research for Australia (ERA) initiative was run in 2010, with results published by the Australian Research Council (ARC) earlier this year in the ERA National Report. The exercise has been an overwhelming success in meeting its objective of providing institutions, researchers, industry and students with a sound, evidence-based means of identifying areas of strength and potential, as well as areas where we need to do better. These assessments were made against international benchmarks using the indicators that have been developed over time—in many instances over many decades—by the disciplines themselves. This has underpinned the strong support for the ERA methodology across the higher education research sector.

I have said all along that we are keen to undertake meaningful consultation. We remain open to suggestions for enhancements to what we know to be a very good scheme. I have been aware for some time of concerns within the sector about certain aspects of the exercise, particularly the ranked journal lists. These concerns have been communicated to me directly, reported in the sector media, and voiced in the ARC’s extensive consultations with the sector in preparation for the second iteration of ERA in 2012. Additional matters that have been raised include the strength of the peer review process and the capacity of ERA to adequately capture applied and interdisciplinary research.

The ARC has advised me that consultation revealed a widespread preference for limited change, to ensure that ERA 2010 and ERA 2012 outcomes can be compared. Overall, however, the ARC considers that a small number of changes to the ERA 2010 methodology could substantially enhance the integrity and acceptance of the ERA evaluation exercise without compromising comparability.

As always, we are in the business of making refinements that improve the operation of ERA. I therefore commissioned the ARC to produce an options paper outlining the different ways the ERA indicators might be used to address these concerns, and considering the implications of adopting any alternatives. I placed particular emphasis on the absolute need to maintain the rigour of the ERA exercise, to ensure the comparability of the results of the next iteration with ERA 2010, and to pay close attention to the detailed concerns of the sector. Within those parameters, however, I wished to explore ways in which we could improve ERA so that the aspects of the exercise causing sector disquiet, especially the ranked journal lists, could be minimised or removed altogether.

As a result of this process, I have approved a set of enhancements recommended by the ARC that deal substantially with those sector concerns while maintaining the rigour and comparability of the ERA exercise. These improvements are:

  • The refinement of the journal quality indicator to remove the prescriptive A*, A, B and C ranks
  • The introduction of a journal quality profile, showing the most frequently published journals for each unit of evaluation
  • Increased capacity to accommodate multi-disciplinary research, allowing articles with significant content from a given discipline to be assigned to that discipline regardless of where they are published (a method successfully trialled in the Mathematical Sciences in ERA 2010)
  • The alignment of the low-volume threshold at 50 outputs across the board, bringing peer-reviewed disciplines (previously 30 outputs) into line with citation disciplines
  • The relaxation of rules on the attribution of patents, plant breeders’ rights and registered designs, to allow those granted to eligible researchers to also be submitted
  • The modification of fractional staff eligibility requirements to 0.4 FTE (up from 0.1 FTE), while maintaining the right to submit for staff below this threshold where affiliation is shown (through use of a by-line, for instance).

I have also asked the ARC to continue investigating strategies to strengthen the peer review process, including improved methods of sampling and review assignment.

As with some other aspects of ERA, the rankings themselves were inherited from the discontinued Research Quality Framework (RQF) process of the previous government, and were developed on the basis of expert bibliometric advice. Patterns of their utilisation by the Research Evaluation Committees (RECs) and detailed analysis of their performance in the ERA 2010 exercise, however, have made it clear that the journal lists themselves, not the rankings within them, are the key contributor to the judgements made.

There is clear and consistent evidence that the rankings were being deployed inappropriately within some quarters of the sector, in ways that could produce harmful outcomes, and based on a poor understanding of the actual role of the rankings. One common example was the setting of targets for publication in A and A* journals by institutional research managers.

In light of these two factors, namely that ERA could work perfectly well without the rankings and that their existence was encouraging ill-informed, undesirable behaviour in the management of research, I have made the decision to remove the rankings, based on the ARC’s expert advice.

The journal lists will still be of great utility and importance, but the removal of the ranks and the provision of the journal quality profile will ensure that they are used descriptively rather than prescriptively.

These reforms will strengthen the role of REC members in using their own discipline-specific expertise to make judgements about the journal publication patterns for each unit of evaluation.

It is important to note that these changes will be exposed to public comment during July as part of the draft submission guidelines. I am confident that these improvements will strengthen the ERA methodology and minimise the unintended consequences arising from inappropriate external use of the indicators, while maintaining the comparability of future rounds with the ERA 2010 results. 

I would like to thank the ARC, led by Professor Margaret Sheil, for the extensive development work that went into producing these improvements, and the ERA 2010 REC members and other key academic leaders for their invaluable advice. I particularly thank the university research sector, whose detailed feedback informed this work and whose overall support for ERA has been so strong.

Media contact:
Aban Contractor, Minister’s office, 0457 989 842