ARC Assessment Process

NCGP Assessment Cycle

Grant Guidelines

  • Grant Guidelines are approved by the Minister for Education
  • Published on GrantConnect in the Forecast and Grant Opportunity
  • The sector is advised of availability via network messages, GrantConnect notifications and Twitter


  • Applications are submitted by Eligible Organisations by the relevant scheme closing date


  • Applications are considered against eligibility criteria and compliance with the Grant Guidelines
  • Applications are assessed by independent assessors in the initial assessment (Conflict of Interest applies)
  • Applicants may be given the opportunity to respond to assessors' written comments
  • Applications are assessed by the Selection Advisory Committee (SAC) (Conflict of Interest applies)


  • The relevant SAC considers all applications, ranks each application relative to other applications and recommends budgets for the highly ranked applications (Conflict of Interest applies)

Approval of funding

  • ARC CEO provides recommendations to the Minister for Education with applications to be approved for funding, applications not recommended for funding, and the level of funding and duration of projects
  • Minister for Education considers recommendations and approves and announces funding outcomes

ARC Assessment Process

This diagram shows the five stages of the ARC assessment process.

  1. Application
  2. Panel 
  3. External Assessment
  4. Selection Meeting
  5. Outcomes


Application assessment—overview

  • The peer review process is designed to be fair, thorough and transparent
  • All Applications are assessed against the selection criteria, and in accordance with the weightings for that scheme
  • Applications are generally assigned to two types of assessors:
    • at least two General Assessors (usually College of Experts members), and 
    • at least two Detailed Assessors
  • ARC staff assess eligibility etc., but do not decide which Applications should be funded

ARC College of Experts

  • Play a key role in identifying research excellence, moderating external assessments and recommending fundable Applications
  • Assist the ARC in recruiting and assigning Detailed Assessors and in implementing peer review reforms in established and emerging disciplines as well as interdisciplinary areas
  • Experts of international standing drawn from the Australian research community: from higher education, industry and public sector research organisations
  • Nominations open usually around May each year
  • ARC College of Experts

General assessment

  • General assessors are members of
    • the College of Experts or
    • a Selection Advisory Committee

 (NB: expanded College—not all members sit on all panels)

  • General Assessors
    • consider the Application, the ratings and comments provided by Detailed Assessors, and the Applicant’s Rejoinder; and
    • assign their own ratings against the relevant scheme selection criteria
  • Once all assessments have been submitted to the ARC, the Detailed and General assessments and Rejoinders are considered by the panels at the final selection meeting (more on this later)

Forming selection panels

  • The ARC recognises the need to have a flexible approach to suit volume and disciplinary spread in each scheme
  • The number of discipline panels varies by scheme.
    • (Funds are apportioned according to demand)
      For example, Discovery Projects typically has five panels:
      • BSB (Biological Sciences and Biotechnology)
      • EIC (Engineering, Information and Computing Sciences)
      • HCA (Humanities and Creative Arts)
      • MPCE (Mathematics, Physics, Chemistry and Earth Sciences)
      • SBE (Social, Behavioural and Economic Sciences)
        However, applications can be assigned across two panels to ensure appropriate expertise, and assigned to a breadth of detailed reviewers.
  • Some other schemes use a single multi-disciplinary panel (e.g. Australian Laureate Fellowships, ITRP).

National Competitive Grants Program

Graphical representation of schemes in the ARC's National Competitive Grants Program. The colour code shows how many selection panels are allocated to assess applications in each scheme.

ARC Assessors

  • We encourage every active researcher to become an assessor for the ARC.
  • If you are not currently an assessor for the ARC and would like to become one, send:
    • a brief CV
    • a list of five recent publications, or
    • a web link to this information.


Detailed assessments

  • Detailed assessors are drawn from the Australian and international research community (≈ 25%)
  • Detailed assessors complete in-depth assessments of applications by providing scores and comments against the scheme specific selection criteria
  • These assessments are then taken into consideration by General assessors in the later stages of the peer review process (more on this later).

How are assessors assigned? 

  • RMS generates a “word cloud” of an application based on:
    • Application summary
    • Application title
    • Impact statement
    • FoR codes
    • SEO codes.
  • RMS generates assessor suggestions based on assessor codes, expertise and history – make sure your RMS profile is up to date
  • No assignments are made “automatically”. This information is provided to ARC Executive Directors and College of Experts/SAC members to inform their judgment
  • Factors considered by an assigner may include (all things being equal):
    • Breadth of perspectives
    • Institutional spread
    • Gender balance
    • Assessor experience.
  • For fellowship/award schemes, applicants in that round cannot assess others
  • As with assessors, RMS makes a suggestion about the broad discipline panel for each application, but these suggestions are reviewed and can be changed.

Conflict of Interest

  • In addition to institutional conflicts, an assessor may be deemed to have a CoI with a named participant on a funding application for a number of reasons including, but not limited to, if that assessor:
    • has a close personal relationship (including enmity) with that named participant;
    • has a professional relationship with that named participant including:
      • currently holds, or has held within the past two years, funding conjointly with that named participant;
      • has a current application or is negotiating an application for funding with that named participant;
      • has been a collaborator or co-author with that named participant on a research output within the past four years;
      • has been a co-editor with that named participant of a book, journal, compendium, or conference proceedings within the past two years;
      • has been a postgraduate student or supervisor of that named participant within the past five years;
    • could otherwise be perceived to benefit materially from the awarding of funding to the application involving that named participant.
  • ARC Conflict of Interest and Confidentiality Policy
  • RMS takes into account a great deal of data (e.g. institutional), but it doesn’t know everything
  • Assessors reviewing ARC applications who identify a conflict of interest must reject the application in RMS
  • If in any doubt, contact the ARC to confirm whether a conflict exists under our policies
  • Assessing applications despite a conflict of interest is in breach of ARC rules and of the Australian Code for the Responsible Conduct of Research
  • ARC Research Integrity and Research Misconduct Policy.

Rating Scale

  • Outstanding: Of the highest quality and at the forefront of research in the field. Approximately 10% of Applications should receive ratings in this band. (Recommended unconditionally)
  • Excellent: Of high quality and strongly competitive. Approximately 15% of Applications should receive ratings in this band. (Strongly support recommendation of funding)
  • Very Good: Interesting, sound and compelling. Approximately 20% of Applications should receive ratings in this band. (Support recommendation of funding with reservation)
  • Good: Sound, but lacks a compelling element. Approximately 35% of Applications are likely to fall into this band. (Unsupportive of recommendation for funding)
  • Uncompetitive: Has significant weaknesses or fatal flaws. Approximately 20% of Applications are likely to fall into this band. (Not recommended for funding)
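The guidance percentages above imply cumulative cutoffs (10%, 25%, 45%, 80%, 100%). As a minimal sketch, here is how an Application's percentile would map to a rating band if ratings exactly followed that distribution; this is an idealisation for illustration, not an ARC rule, and the function name is an assumption.

```python
# Cumulative share of Applications expected in each rating band,
# derived from the guidance percentages (10, 15, 20, 35, 20).
BANDS = [
    (10, "Outstanding"),
    (25, "Excellent"),
    (45, "Very Good"),
    (80, "Good"),
    (100, "Uncompetitive"),
]

def expected_band(percentile):
    """Rating band an Application would fall in if ratings exactly
    followed the guidance distribution (an idealisation, not a rule)."""
    for cutoff, band in BANDS:
        if percentile <= cutoff:
            return band
    raise ValueError("percentile must be between 0 and 100")
```

For example, an Application around the 30th percentile would sit in the Very Good band, and one around the 50th percentile in the Good band.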

Research Opportunity and Performance Evidence (ROPE)

  • The ARC is committed to ensuring all eligible researchers have fair access to competitive funding through the National Competitive Grants Program.
  • The ARC considers that Research Opportunity comprises two separate elements:
    • Career experiences (relative to opportunity)
    • Career interruptions
  • Performance Evidence is designed to provide assessors with information that will enable them to contextualise research outputs relative to the opportunity of a participant. 
  • The ROPE Statement (released Feb 2014) is online.

Rejoinders


  • Where the ARC seeks Detailed assessments, applicants are often given the opportunity to submit a rejoinder to respond to comments made by Detailed assessors
  • Rejoinders are not viewed by the Detailed assessors but are considered by the General assessors
  • The ARC prefers not to interfere in the assessment process. Only rarely will we agree to requests to remove assessments. 

What happens after rejoinder?

  • General assessors view all Detailed assessments and consider the rejoinder carefully
  • General assessors confer with each other to finalise scores
  • General assessors submit their final scores and ranks for their group of applications
  • General assessors’ scores in large schemes are then normalised to reduce the impact of different “marking styles”
  • Applications are then ranked within the discipline panel (where relevant—some schemes have a single panel).
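The ARC does not spell out its normalisation method here; a minimal sketch of one common approach, standardising each assessor's raw scores so that harsh and generous markers become comparable, might look like the following. The z-score method and the function name are assumptions for illustration, not the ARC's published algorithm.

```python
from statistics import mean, stdev

def normalise(scores_by_assessor):
    """Rescale each assessor's scores to zero mean and unit spread,
    so that systematically harsh or generous marking styles do not
    skew the combined ranking (z-score standardisation; assumed method)."""
    normalised = {}
    for assessor, scores in scores_by_assessor.items():
        values = list(scores.values())
        mu = mean(values)
        sd = stdev(values) if len(values) > 1 else 0.0
        sd = sd or 1.0  # avoid dividing by zero when all scores are equal
        normalised[assessor] = {app: (s - mu) / sd for app, s in scores.items()}
    return normalised

# A harsh and a generous assessor who agree on the ordering end up
# with identical normalised scores:
raw = {
    "harsh":    {"app1": 1.0, "app2": 3.0},
    "generous": {"app1": 5.0, "app2": 7.0},
}
z = normalise(raw)
```

After normalisation, only each assessor's relative ordering and spread matter, which is exactly the point of reducing the impact of different "marking styles".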

ARC Assessment Process

This diagram shows the five stages of the ARC assessment process.

  1. Application
  2. Panel (can go directly to Selection Meeting)
  3. External Assessment
  4. Selection Meeting
  5. Outcomes

Application Score/Rank Calculation

  • ‘Grouped Average’ of all submitted assessments for the Application
  • This calculation results in a “Proposal Score”
  • Application ranks are derived for each panel
  • Any Applications (within same panel) with equal Proposal Scores will have equal ranks.

Flow chart showing how the average of General scores and Detailed scores determines the final rank.
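Under one plausible reading of the "Grouped Average" (average within each assessor group first, then average the group means), and with standard competition ranking so that equal Proposal Scores share a rank, the calculation can be sketched as follows. The exact averaging formula is an assumption for illustration.

```python
from statistics import mean

def proposal_score(general_scores, detailed_scores):
    # "Grouped average" (assumed interpretation): average each assessor
    # group separately, then average the two group means, so that one
    # group's size does not outweigh the other.
    return mean([mean(general_scores), mean(detailed_scores)])

def panel_ranks(scores):
    # Competition ranking within a panel: equal Proposal Scores get
    # equal ranks, and the next distinct score skips the shared places.
    ordered = sorted(scores.items(), key=lambda item: -item[1])
    ranks, last_score, last_rank = {}, None, 0
    for position, (app, score) in enumerate(ordered, start=1):
        if score != last_score:
            last_rank, last_score = position, score
        ranks[app] = last_rank
    return ranks

scores = {
    "A": proposal_score([4.0, 5.0], [3.0, 4.0]),  # -> 4.0
    "B": proposal_score([5.0, 4.0], [4.0, 3.0]),  # -> 4.0 (tie with A)
    "C": proposal_score([3.0, 3.0], [3.0, 4.0]),  # -> 3.25
}
ranks = panel_ranks(scores)  # A and B share rank 1, C is rank 3
```

Note that, as the source states, Applications within the same panel with equal Proposal Scores receive equal ranks.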

Before a Selection Meeting

  • Panels are given access to final scores and rankings, and can review all (non-conflicted) Applications, not just those they have had carriage of
  • Panel members are encouraged to note any issues they believe may have skewed the assessment/ranking of a particular Application, or are noteworthy for panel discussion
  • Members are also invited to closely scrutinise ROPE issues
  • Panel members’ attention is drawn particularly to proposals around the likely funding cut-off, as these will need detailed discussion

Feedback on unsuccessful Applications

  • Recently the ARC has provided two types of feedback:
    • Overall ranking band
    • Ranking band within specific scheme selection criteria.


Example feedback for an unsuccessful Application (Proposal ID: Example; Lead CI: Prof G.; outcome: Not Recommended):

  • Unsuccessful Band: band 26% to 50% of unsuccessful applications within the discipline panel
  • Investigator(s) (40%): band 26% to 50% of unsuccessful applications within the discipline panel
  • Project Quality and Innovation (25%): band 26% to 50% of unsuccessful applications within the discipline panel
  • Feasibility and Benefit (20%): top 10% of unsuccessful applications within the discipline panel
  • Research Environment (15%): band 11% to 25% of unsuccessful applications within the discipline panel
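The ranking bands used in this feedback can be sketched as a simple percentile lookup. The band boundaries come from the labels above; the label for applications below the 50% mark is an assumption, as are the function name and the rounding of the percentile.

```python
import math

def feedback_band(rank_among_unsuccessful, total_unsuccessful):
    """Map an unsuccessful application's rank within its discipline panel
    to the feedback band reported to the applicant (band boundaries from
    the example feedback; lowest band label assumed)."""
    percentile = math.ceil(100 * rank_among_unsuccessful / total_unsuccessful)
    if percentile <= 10:
        return "Top 10% of unsuccessful applications"
    if percentile <= 25:
        return "Band 11% to 25% of unsuccessful applications"
    if percentile <= 50:
        return "Band 26% to 50% of unsuccessful applications"
    # The label below the 50% mark is not shown in the example; assumed.
    return "Band 51% to 100% of unsuccessful applications"
```

For example, the 5th-ranked of 100 unsuccessful applications would be reported as being in the top 10%, while the 40th-ranked would fall in the 26% to 50% band.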

  • There are different approaches to feedback for some schemes due to their size and application process. 
  • For example:
    • For the Linkage Projects scheme, the ARC provides a statement about whether or not the Application was fast-tracked or went to selection meeting
    • For Industrial Transformation Research Program, each Application receives written feedback

NEW Peer Review section on the ARC website

Designed to support our 20,000-strong assessor community.

Peer Review section.