ARC Assessment Process

ARC Assessment Cycle

This flowchart shows the ARC’s assessment cycle.

Grant Guidelines

  • Grant Guidelines published

Applications

  • Application submission

Assessment

  • Detailed assessment
  • Rejoinder
  • General assessment

Selection

  • Considered by Selection Advisory Committee

Approval of funding

  • Approval of funding
  • Outcomes released to Administering Organisations under embargo 

ARC Assessment Cycle (detail including National Interest Test)

Grant Guidelines

  • Grant Guidelines are approved by the Minister for Education
  • Published on GrantConnect in the Forecast and Grant Opportunity
  • Sector is advised of availability via network messages, GrantConnect notifications and Twitter

Applications

  • Applications are submitted by applicants (Eligible Organisations) by the relevant scheme closing date

Assessment

  • Applications are considered against eligibility criteria and compliance with the Grant Guidelines
  • Applications are assessed by independent assessors in the initial assessment (Conflict of Interest applies)
  • Applicants are given the opportunity to respond to assessors' written comments
  • Applications are assessed by the Selection Advisory Committee (SAC) (Conflict of Interest applies)

Selection

  • The SAC considers all applications, ranks each application relative to other applications and recommends budgets for the highly ranked applications (Conflict of Interest applies)
  • The recommendations from the SAC are provided to the ARC CEO
  • The ARC CEO considers the recommendations from the SAC and the applicant’s response to the National Interest Test (the ARC CEO will seek information from Administering Organisations on applications where there is concern about how they meet the National Interest Test, based on the information provided in the application form)

Approval of funding


  • The ARC CEO provides recommendations to the Minister for Education, identifying applications to be approved for funding, applications not recommended for funding, and the level and duration of funding for each project (only applications that satisfy the National Interest Test and are eligible for funding will be recommended for approval)
  • The Minister for Education considers the CEO’s recommendations and determines which applications will be funded (the Minister may consider the National Interest Test in determining which applications to approve)
  • Outcomes released to Administering Organisations under embargo
  • The Minister announces funding outcomes (if any applications are recommended to but not funded by the Minister, applicants will be notified)

ARC Assessment Process

Flow chart of the ARC Assessment Process.

Application > Panel > External Assessment > Selection meeting > Outcomes

[back to top]

Assessment Overview

  • The peer review process is designed to be fair, thorough and transparent
  • All Applications are assessed against the selection criteria, and in accordance with the weightings for that scheme
  • Applications are generally assigned to two types of assessors:
    • at least two General Assessors called Carriages (with the lead assessor termed Carriage 1), and 
    • at least two Detailed Assessors (external assessors)
  • ARC staff assess eligibility etc., but do not decide which Applications should be funded

[back to top]

ARC College of Experts

  • Play a key role in identifying research excellence, moderating external assessments and recommending fundable Applications
  • Assist the ARC in:
    • assigning Detailed Assessors, 
    • identifying new assessors, especially international nominees,
    • providing feedback/advice on ARC processes, and 
    • implementing peer review reforms in established and emerging disciplines as well as interdisciplinary areas
  • Experts of international standing drawn from the Australian research community: from higher education, industry and public sector research organisations.
  • Nominations usually open around May each year.
  • www.arc.gov.au » About » ARC Profile » ARC Committees »  ARC College of Experts

[back to top]

The Role of General Assessors

  • General Assessors form the Selection Advisory Committee(s) (SAC; the number varies by scheme), which may consist of:
    • members of the ARC College of Experts, or 
    • other experts as required, e.g. industry members
  • In the Assessment phase, General Assessors:
    • assign Detailed Assessors to allocated Applications (not for all schemes),
    • consider the Application, the Detailed Assessors’ input and the Applicant’s Rejoinder, and 
    • assign their own scores for the Application
  • In the selection meeting, General Assessors:
    • review all submitted Applications, and
    • provide recommendations on which Applications to approve.

[back to top]

Number of Panels


[back to top]

ARC Assessors

  • We encourage every active researcher to apply to become an assessor for the ARC
  • If you are not currently an assessor for the ARC and would like to become one then send your CV to:  
    ARC-Peer_Review@arc.gov.au
  • ARC Executive Directors review all applications, looking at employment history, publications, previously held grants, and overall experience and standing in the research field of speciality.

[back to top]

Detailed assessments

  • Detailed Assessors are drawn from the Australian and international research community (approximately 25% are international)
  • Detailed Assessors complete in-depth assessments of Applications by providing scores and comments against the scheme specific selection criteria
  • These assessments are then taken into consideration by General Assessors in the later stages of the peer review process.

[back to top]

How are Assessors assigned? 

  • RMS generates a “word cloud” of an Application based on the:
    • Application summary
    • Application title
    • Impact statement
    • FoR codes
    • SEO codes
  • RMS generates Assessor suggestions based on Assessor codes, expertise and history
    Make sure your RMS profile is up to date
  • No assignments are made “automatically”. This information is provided to ARC Executive Directors and College of Experts/SAC members to inform their judgment.
  • Factors considered by an assigner may include (all things being equal):
    • Breadth of perspectives
    • Institutional spread
    • Gender balance
    • Assessor experience
  • For Fellowship/Award schemes, Applicants in that round cannot assess others
  • As with Assessors, RMS proposes a broad discipline panel for each Application, but these suggestions are reviewed and can be changed.

[back to top]

A Quality Assessment…

  • is a thoughtful, meaningful and balanced assessment that provides fair and objective information about the key merits or otherwise of the application with respect to the assessment criteria
  • includes relevant comments and criticisms that are justified and aligned to the scores (enables understanding of strengths and weaknesses)
  • enables applicants to undertake an informative and reasonable rejoinder
  • enables the Selection Advisory Committee to make informed decisions about funding recommendations
  • does not include any inappropriate elements as outlined in the Assessor Handbook.

[back to top]

Conflict of Interest

  • In addition to institutional conflicts, an Assessor may be deemed to have a C.o.I. with a named participant on a funding Application for a number of reasons including, but not limited to, if that Assessor:
    • has a close personal relationship (including enmity) with that named participant
    • has a professional relationship with that named participant including:
      • currently holds, or has held within the past two years, funding conjointly with that named participant
      • has a current Application or is negotiating an Application for funding with that named participant
      • has been a collaborator or co-author with that named participant on a research output within the past four years
      • has been a co-editor with that named participant of a book, journal, compendium, or conference proceedings within the past two years
      • has been a postgraduate student or supervisor of that named participant within the past five years
      • could otherwise be perceived to benefit materially from the awarding of funding to the Application involving that named participant
  • www.arc.gov.au  » Policies & Strategies » Policy » ARC Conflict of Interest and Confidentiality Policy
  • RMS flags many conflicts automatically using the data it holds (e.g. institutional affiliations), but it cannot detect every conflict
  • Assessors reviewing ARC Applications who identify a conflict of interest must reject the Application in RMS
  • If in any doubt, contact the ARC to confirm whether a conflict exists under our policies
  • Assessing Applications despite a conflict of interest is in breach of ARC rules and of the Australian Code for the Responsible Conduct of Research
  • www.arc.gov.au » Policies & Strategies » Strategy » ARC Research Integrity Policy

[back to top]

Scoring Matrix

Each rating maps to assessment criteria and a funding recommendation:

  • A (Outstanding): Of the highest quality and at the forefront of research in the field. Approximately 10% of Applications should receive ratings in this band. Recommendation: recommended unconditionally.
  • B (Excellent): Of high quality and strongly competitive. Approximately 15% of Applications should receive ratings in this band. Recommendation: strongly support recommendation of funding.
  • C (Very Good): Interesting, sound and compelling. Approximately 20% of Applications should receive ratings in this band. Recommendation: support recommendation of funding with reservation.
  • D (Good): Sound, but lacks a compelling element. Approximately 35% of Applications are likely to fall into this band. Recommendation: unsupportive of recommendation for funding.
  • E (Uncompetitive): Has significant weaknesses or fatal flaws. Approximately 20% of Applications are likely to fall into this band. Recommendation: not recommended for funding.
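
The indicative band percentages above can be checked mechanically. The sketch below (the names and data layout are illustrative assumptions, not an ARC tool) compares one assessor's spread of ratings against those targets.

```python
# Indicative targets from the scoring matrix: rating band -> expected fraction.
TARGET_BANDS = {"A": 0.10, "B": 0.15, "C": 0.20, "D": 0.35, "E": 0.20}

def rating_distribution(ratings):
    """Fraction of Applications an assessor placed in each rating band."""
    return {band: ratings.count(band) / len(ratings) for band in TARGET_BANDS}
```

For example, an assessor who rates 20 Applications as 2 A, 3 B, 4 C, 7 D and 4 E matches the indicative spread exactly.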

[back to top]

Research Opportunity and Performance Evidence (ROPE)

  • The ARC is committed to ensuring all eligible researchers have fair access to competitive funding through the National Competitive Grants Program
  • The ARC considers that Research Opportunity comprises two separate elements:
    • career experiences (relative to opportunity), and
    • career interruptions
  • Performance Evidence is designed to provide assessors with information that will enable them to contextualise research outputs relative to the opportunity of a participant 
  • www.arc.gov.au » Policies & Strategies » Policy » The ROPE Statement (released Feb 2014).

[back to top]

Rejoinder

  • Where the ARC seeks Detailed assessments, Applicants are given the opportunity to submit a Rejoinder to respond to comments made by Detailed assessors
  • Rejoinders are not viewed by the Detailed Assessors but are considered by the General Assessors
  • The ARC respects the peer review process; interventions are rare and carefully considered.

[back to top]

What happens after Rejoinder?

  • General Assessors view all Detailed assessments and consider the Rejoinder carefully
  • General Assessors confer with each other to finalise scores
  • General Assessors submit their final scores and ranks for their group of Applications
  • General Assessors’ scores in large schemes are then normalised to reduce the impact of different “marking styles”
  • Applications are then ranked within the discipline panel (where relevant; some schemes have a single panel).
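
The slides do not specify how normalisation is performed; one common approach is per-assessor z-score standardisation, sketched below (the function name and data layout are illustrative assumptions, not the ARC's actual method).

```python
from statistics import mean, pstdev

def normalise_scores(scores_by_assessor):
    """Per-assessor z-score standardisation: re-express each assessor's raw
    scores relative to their own mean and spread, so that habitually 'hard'
    and 'easy' markers become comparable.

    scores_by_assessor: {assessor_id: {application_id: raw_score}}
    Returns: {application_id: {assessor_id: normalised_score}}
    """
    normalised = {}
    for assessor, scores in scores_by_assessor.items():
        mu = mean(scores.values())
        sigma = pstdev(scores.values())
        for app, raw in scores.items():
            # If an assessor gave identical scores throughout, spread is zero.
            z = 0.0 if sigma == 0 else (raw - mu) / sigma
            normalised.setdefault(app, {})[assessor] = z
    return normalised
```

After this transformation, an assessor who scores everything high and one who scores everything low contribute comparably to each Application's aggregate.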

[back to top]

Application Score/Rank Calculation

  • ‘Grouped Average’ of all submitted assessments for the Application
  • This calculation results in an “Application Score”
  • Application ranks are derived for each panel
  • Any Applications (within same panel) with equal Application Scores will have equal ranks
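
The mechanics above can be sketched as follows. A plain mean stands in for the ARC's "Grouped Average" (whose exact weighting is not given here), and equal scores share a rank within the panel.

```python
def panel_scores_and_ranks(assessments):
    """assessments: {application_id: [assessor scores]} for one panel.
    Averages each Application's scores into an Application Score, then
    assigns ranks so that equal scores within the panel share a rank."""
    scores = {app: sum(vals) / len(vals) for app, vals in assessments.items()}
    ordered = sorted(scores.items(), key=lambda item: -item[1])
    ranks = {}
    for position, (app, score) in enumerate(ordered, start=1):
        if position > 1 and score == ordered[position - 2][1]:
            ranks[app] = ranks[ordered[position - 2][0]]  # tie: share the rank
        else:
            ranks[app] = position
    return scores, ranks
```

With this tie-handling, two Applications scoring 5.0 both rank 1st and the next Application ranks 3rd.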

 

[back to top]

Before a Selection Meeting

  • Panels are given access to final scores and rankings, and can review all (non-conflicted) Applications, not just those they have had carriage of
  • Panel members are encouraged to note any issues they believe may have skewed the assessment/ranking of a particular Application, or are noteworthy for panel discussion
  • Members are also invited to closely scrutinise ROPE issues
  • Panel members’ attention is drawn particularly to Applications around the likely funding cut-off, as these will need detailed discussion.

[back to top]

The National Interest Test

  • The text box on “Benefit & Impact” in ARC application forms has been replaced with a text box on “National Interest”
  • 100 to 150 words, in plain English
  • National interest is defined as:
    “the extent to which the research contributes to Australia’s national interest through its potential to have economic, commercial, environmental, social or cultural benefits to the Australian community”
  • Applications that satisfy the National Interest Test and score highly will be recommended to the Minister for funding.

[back to top]

Feedback on unsuccessful Applications

  • For most schemes the ARC provides two types of feedback:
    • Overall ranking band
    • Ranking band within specific scheme selection criteria
  • An example:

  • Proposal ID: DPxxxxxx
  • Lead CI: Example, Prof G.
  • Outcome: Not Recommended
  • Unsuccessful Band: this Application is in the band 26% to 50% of unsuccessful proposals within the discipline panel
  • Investigator(s) (40%): band 26% to 50% of unsuccessful Applications within the discipline panel
  • Project Quality and Innovation (25%): band 26% to 50% of unsuccessful Applications within the discipline panel
  • Feasibility and Benefit (20%): top 10% of unsuccessful Applications within the discipline panel
  • Research Environment (15%): band 11% to 25% of unsuccessful Applications within the discipline panel

  • There are different approaches to feedback for some schemes due to their size and application process. 
  • For example:
    • For the Linkage Projects scheme, the ARC provides a statement about whether or not the Application was fast-tracked or went to selection meeting
    • For Industrial Transformation Research Program, each Application receives written feedback.
[back to top]

Feedback on unsuccessful Applications—Linkage Projects

 

Situations are listed from higher to lower assessment score, each with the standard Linkage Projects feedback comment for unsuccessful Applications:

  • Applications with a score high enough to be progressed to a meeting:
    “As a result of assessment scores, Application LPxxxxxxxxx progressed to a selection meeting, however it was not recommended for funding. Whilst of merit it did not meet the threshold for ARC funding at this time.”
  • Applications which had a lower score but were progressed to a meeting for discussion:
    “As a result of assessment scores, Application LPxxxxxxxxx progressed to a selection meeting, however it was not recommended for funding.”
  • Applications which were fast-tracked as low-ranked applications, but which had a score close to progressing to a meeting:
    “As a result of assessment scores, Application LPxxxxxxxxx was fast-tracked as a low ranked application, did not progress to a selection meeting, and was not recommended for funding. This Application fell just short of the threshold for progressing to a selection meeting at this time.”
  • Applications which were fast-tracked as low-ranked applications:
    “As a result of assessment scores, Application LPxxxxxxxxx was fast-tracked as a low-ranked application, did not progress to a selection meeting, and was not recommended for funding.”

[back to top]

Peer Review section on the ARC website