Continuous Linkage and NCGP assessment processes
University of the Sunshine Coast
6 February 2017
Dr Fiona Cameron

Outline

  • Linkage Projects
    • Continuous Assessment
  • NCGP Assessment Processes (generally)


Linkage Projects

  • The Linkage Projects scheme provides funding to Eligible Organisations to support research and development (R&D) projects which:
    • are collaborative between higher education researchers and other parts of the national innovation system
    • are undertaken to acquire new knowledge, and
    • involve risk or innovation.
  • Proposals for funding under the Linkage Projects scheme must include at least one Partner Organisation.
  • The Partner Organisation must make a contribution in cash and/or in kind to the project. The combined Partner Organisation contributions for a Proposal (i.e. the total of the cash and in-kind contributions of the Partner Organisations) must at least match the total funding requested from the ARC.


Linkage Projects—return and success rates

 

N.B. 2016 does not include continuous LP 2016 funding.

This chart shows the number of proposals received and funded, and the success and return rates, for the Linkage Projects scheme for the years 2009–2016.


Gender of First CI: LP16 

 

This chart shows the success rate of proposals in Linkage Projects 2016, divided by gender of first Chief Investigator (CI).


Continuous application, assessment and funding process

 

  1. Proposal submitted to ARC
  2. Assessment/Rejoinders Process
  3. Overall Score Calculated
  4. Score compared to LP16 April list
  5. Preliminary Screening Meeting
  6. Ranking
    1. High 
    2. Low
  7. Selection Meeting
  8. Funding Recommendations to the Minister
  9. Announcement
  10. Projects start


The Annual Cycle (Approximate dates)

Old Linkage:

  • January: Assessment
  • June: Announcement
  • Refresh Rules
  • September: Submissions Open
  • November: Submissions Close
 
Continuous Linkage:

  • January: General & Detailed Assessment
  • Regular Panels
  • Regular Announcements
  • December: Refresh rules


Continuous Linkage Projects


NCGP Lifecycle

Funding Rules

  • Funding Rules are approved by the Minister
  • Published on the ARC website
  • The sector is advised of their availability

Proposals

  • Instructions to applicants, sample application form and FAQs published on ARC website
  • Eligibility Exemption Request and Request Not to Assess processes may be available
  • Applications submitted by Eligible Organisations by the relevant scheme closing date

Assessment

  • Proposals are considered against eligibility criteria and compliance with the Funding Rules.
  • Proposals are assessed by Detailed Assessors (with the exception of some Special Research Initiatives)
  • Applicants are given the opportunity to respond to Detailed Assessors’ written comments via a Rejoinder Process
  • Proposals are assessed by General Assessors taking into account the Detailed Assessments and Rejoinders

Selection meeting

  • The Selection Advisory Committee (General Assessors) considers all proposals, recommends proposals to be funded and recommends the level at which successful proposals should be funded. 

Approval of funding

  • The ARC CEO provides recommendations to the Minister identifying proposals to be approved for funding, proposals not recommended for funding, and the level of funding and duration of projects.
  • The Minister considers the recommendations and approves and announces funding outcomes.


ARC Assessment Process


This pictorial graph shows the ARC assessment process.

  1. Application
  2. Panel (can go directly to Selection Meeting)
  3. External Assessment
  4. Selection Meeting
  5. Outcomes


ARC College of Experts

  • Plays a key role in identifying research excellence, moderating external assessments and recommending fundable proposals
  • Assists the ARC in recruiting and assigning assessors and in implementing peer review reforms in established and emerging disciplines as well as interdisciplinary areas
  • Comprises experts of international standing drawn from the Australian research community: from higher education, industry and public sector research organisations
  • Nominations generally open in May each year
  • More information: ARC College of Experts


Proposal assessment—overview

  • The peer review process is designed to be fair, thorough and transparent
  • All proposals are assessed against the selection criteria, and in accordance with the weightings for that scheme
  • Proposals are generally assigned to two types of assessors:
    • at least two General assessors (usually College of Experts members), and 
    • at least two Detailed assessors
  • ARC staff assess eligibility etc., but do not decide which proposals should be funded


Assessment

  • General assessors are members of 
    • the College of Experts &/or 
    • a Selection Advisory Committee (SAC)
  • They:
    • assign their own ratings for a proposal against the relevant scheme selection criteria
    • consider the proposal, the ratings and comments provided by all Detailed assessors, and the applicant’s rejoinder; and
    • attend a final selection meeting to make a funding recommendation based on discussion of Detailed and General assessments and Rejoinders


ARC Assessors

  • We encourage every active researcher to become an assessor for the ARC.
  • If you are not currently an assessor for the ARC and would like to become one, send the following to ARCAssessorUpdate@arc.gov.au:
    • a brief CV
    • a list of five recent publications
    • or a web link to this information


Detailed assessments

  • Detailed assessors are drawn from the Australian and international research community (≈ 25%)
  • Detailed assessors complete in-depth assessments of proposals by providing scores and comments against the scheme specific selection criteria
  • These assessments are then taken into consideration by General assessors in the later stages of the peer review process


How are assessors assigned? 

  • RMS generates a “word cloud” visualisation of a proposal based on:
    • Proposal summary
    • Proposal title
    • Impact statement
    • FoR codes
    • SEO codes
  • RMS generates assessor suggestions and word cloud commonalities based on assessor codes, expertise and history – make sure your RMS profile is up to date
  • No assignments are made “automatically”. This information is provided to ARC Executive Directors and College of Experts/SAC members to inform their decisions
  • Factors considered by an assigner may include (all things being equal):
    • Breadth of perspectives
    • Institutional diversity
    • Gender balance
    • Assessor experience
  • For fellowship/award schemes, applicants in that round cannot assess others
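As an illustration only (the ARC does not publish RMS's matching logic), the idea of "word cloud commonalities" can be sketched as a simple keyword-overlap comparison between a proposal's text fields and each assessor's profile. The function names, profiles and Jaccard similarity below are hypothetical stand-ins, not the actual RMS algorithm:

```python
import re


def keywords(text):
    """Reduce free text to a lowercase bag of words (a crude 'word cloud')."""
    return set(re.findall(r"[a-z]+", text.lower()))


def suggest_assessors(proposal_text, assessor_profiles, top_n=2):
    """Rank assessors by Jaccard overlap with the proposal's keywords.

    Hypothetical stand-in for RMS's suggestion step: the real system also
    uses FoR/SEO codes, expertise and assessment history, and its output
    informs human assigners rather than making assignments automatically.
    """
    prop = keywords(proposal_text)

    def jaccard(profile_text):
        prof = keywords(profile_text)
        union = prop | prof
        return len(prop & prof) / len(union) if union else 0.0

    ranked = sorted(assessor_profiles,
                    key=lambda a: jaccard(assessor_profiles[a]),
                    reverse=True)
    return ranked[:top_n]


# Hypothetical assessor profiles:
profiles = {
    "A": "coral reef ecology marine biology",
    "B": "quantum computing error correction",
    "C": "marine ecosystems fisheries biology",
}
print(suggest_assessors("reef fish ecology in marine ecosystems", profiles))
# → ['A', 'C']
```

This also shows why keeping your RMS profile up to date matters: the overlap can only reflect the expertise keywords actually recorded against you.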


Conflict of Interest

  • In addition to institutional conflicts, an assessor may be deemed to have a CoI with a named participant on a funding proposal for a number of reasons including, but not limited to, if that assessor:
    • has a close personal relationship (including enmity) with that named participant;
    • has a professional relationship with that named participant; or
    • could otherwise be perceived to benefit materially from the awarding of funding to the proposal involving that named participant.
  • ARC Conflict Of Interest And Confidentiality Policy

  • RMS takes into account a great deal of data (e.g. institutional), but it doesn’t know everything
  • Assessors reviewing ARC proposals who identify a conflict of interest must reject the proposal in RMS
  • If in any doubt, contact the ARC to confirm whether a conflict exists under our policies
  • ARC Research Integrity And Research Misconduct Policy


Rating Scale

  • A – Outstanding: Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band. Recommendation: Recommended unconditionally.
  • B – Excellent: Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band. Recommendation: Strongly support recommendation of funding.
  • C – Very Good: Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band. Recommendation: Support recommendation of funding with reservation.
  • D – Good: Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band. Recommendation: Unsupportive of recommendation for funding.
  • E – Uncompetitive: Has significant weaknesses or one or more fatal flaws. Approximately 20% of Proposals are likely to fall into this band. Recommendation: Not recommended for funding.


Rejoinder

  • Where the ARC seeks Detailed assessments, applicants are normally given the opportunity to submit a rejoinder to respond to comments made by Detailed assessors
  • Rejoinders are not viewed by the Detailed assessors but are considered by the General assessors
  • The ARC prefers not to interfere in the assessment process. Only rarely will we agree to requests to remove assessments.


What happens after rejoinder?

  • General assessors view all Detailed assessments and consider the rejoinder carefully
  • General assessors confer with each other to finalise scores
  • General assessors’ scores in large schemes are then normalised to reduce the impact of different “marking styles”
  • Proposals are ranked within the discipline panel (where relevant—some schemes have a single panel)
  • If there is an “outlier” score, this is where your rejoinder has particular value.
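The ARC does not publish its normalisation formula, but the effect described above can be illustrated with a standard z-score adjustment: each assessor's raw scores are re-expressed relative to that assessor's own mean and spread, so a harsh marker and a generous marker who rank proposals the same way end up contributing the same normalised scores. The function name and data below are hypothetical:

```python
from statistics import mean, pstdev


def normalise(scores_by_assessor):
    """Map each assessor's raw scores to z-scores within that assessor.

    Illustrative sketch only: the ARC's actual normalisation method for
    reducing differences in "marking styles" is not published.
    """
    normalised = {}
    for assessor, scores in scores_by_assessor.items():
        mu = mean(scores.values())
        sigma = pstdev(scores.values())
        normalised[assessor] = {
            proposal: (raw - mu) / sigma if sigma else 0.0
            for proposal, raw in scores.items()
        }
    return normalised


# A harsh assessor and a generous one who agree on the ordering:
raw = {
    "harsh":    {"P1": 55, "P2": 60, "P3": 50},
    "generous": {"P1": 85, "P2": 90, "P3": 80},
}
print(normalise(raw))  # both assessors now yield identical z-scores per proposal
```

After an adjustment like this, a proposal's relative standing reflects where each assessor placed it among their own assignments, not how generous their raw numbers were.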


Proposal Score/Rank Calculation

  • “Grouped Average” of all submitted assessments for the proposal
  • This calculation results in a “Proposal Score”
  • Proposal ranks are derived for each panel
  • Any proposals (within same panel) with equal Proposal Scores will have equal ranks.

 

Flow chart showing how the average of General scores and Detailed scores determines the final rank.
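The steps above can be sketched in code. The exact "Grouped Average" weighting between General and Detailed assessments is not stated here, so the 50/50 split, function names and example scores below are assumptions for illustration; the tie-handling matches the rule that equal Proposal Scores within a panel share a rank:

```python
from statistics import mean


def proposal_score(general_scores, detailed_scores, w_general=0.5):
    """Weighted mean of General and Detailed scores (weighting is assumed)."""
    return (w_general * mean(general_scores)
            + (1 - w_general) * mean(detailed_scores))


def rank_within_panel(scores):
    """Rank proposals by score, descending; equal scores share a rank."""
    ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    ranks, prev_score, prev_rank = {}, None, 0
    for position, (pid, score) in enumerate(ordered, start=1):
        rank = prev_rank if score == prev_score else position
        ranks[pid] = rank
        prev_score, prev_rank = score, rank
    return ranks


# Hypothetical panel of three proposals (P1 and P3 receive identical inputs):
panel = {
    "P1": proposal_score([80, 85], [70, 75, 90]),
    "P2": proposal_score([90, 88], [85, 87, 86]),
    "P3": proposal_score([80, 85], [75, 70, 90]),
}
print(rank_within_panel(panel))  # → {'P2': 1, 'P1': 2, 'P3': 2}
```

Note that P1 and P3 share rank 2 and no proposal receives rank 3, which is one common convention for equal-score ranking.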


Before a Selection Meeting

  • Panels are given access to final scores and rankings, and can review all (non-conflicted) proposals, not just those they have had carriage of
  • Panel members are encouraged to note any issues they believe may have skewed the assessment/ranking of a particular proposal, or are noteworthy for panel discussion
  • Members are also invited to closely scrutinise ROPE issues
  • Panel members’ attention is drawn particularly to proposals around the likely funding cut-off, as these will need detailed discussion


Feedback on unsuccessful proposals

  • Recently the ARC has provided two types of feedback:
    • Overall ranking band
    • Ranking band within specific scheme selection criteria
  • Example:

  • Proposal ID: DPxxxxxx
  • Lead CI: Example, Prof G.
  • Outcome: Not Recommended
  • Unsuccessful Band: This proposal is in the band 26% to 50% of unsuccessful proposals within the discipline panel.
  • Investigator(s) (40%): Band 26% to 50% of unsuccessful proposals within the discipline panel.
  • Project Quality and Innovation (25%): Band 26% to 50% of unsuccessful proposals within the discipline panel.
  • Feasibility and Benefit (20%): Top 10% of unsuccessful proposals within the discipline panel.
  • Research Environment (15%): Band 11% to 25% of unsuccessful proposals within the discipline panel.


NEW Peer Review section on the ARC website

Designed to support our 20,000-strong assessor community.

Peer Review section