Being an Assessor
University of South Australia
14 July 2017
Dr Fiona Cameron

""

Outline

  • The NCGP life cycle
  • College of Experts 
  • Why are more assessors needed?
  • Conflict of Interest
  • Good assessments
  • Available information
  • RMS Profile

NCGP Lifecycle

Funding Rules

  • Funding Rules are approved by the Minister
  • Published on the ARC website
  • Sector is advised of availability

Proposals

  • Instructions to applicants, sample application form and FAQs published on ARC website
  • Eligibility Exemption Requests and Request Not to Assess Processes may be available 
  • Applications are submitted by Eligible Organisations by the relevant scheme closing date.

Assessment

  • Proposals are considered against eligibility criteria and compliance with the Funding Rules.
  • Proposals are assessed by Detailed Assessors (with the exception of some Special Research Initiatives)
  • Applicants are given the opportunity to respond to Detailed Assessors’ written comments via a Rejoinder Process
  • Proposals are assessed by General Assessors taking into account the Detailed Assessments and Rejoinders

Selection

  • The Selection Advisory Committee (General Assessors) considers all proposals, recommends proposals to be funded and recommends the level at which successful proposals should be funded. 

Approval of funding

  • The ARC CEO provides recommendations to the Minister, identifying proposals recommended for funding, proposals not recommended for funding, and the level and duration of funding for each project
  • Minister considers recommendations and approves and announces funding outcomes

Proposal assessment—overview

  • The peer review process is designed to be fair, thorough and transparent
  • All proposals are assessed against the selection criteria, and in accordance with the weightings for that scheme
  • Proposals are generally assigned to two types of assessors:
    • at least two General assessors (usually College of Experts members), and 
    • at least two Detailed assessors (external to the ARC)
  • ARC staff assess eligibility and compliance, but do not decide which proposals should be funded

ARC Assessment Process

This pictorial graph shows the ARC assessment process.

  1. Application
  2. Panel (can go directly to Selection Meeting)
  3. External Assessment
  4. Selection Meeting
  5. Outcomes

Detailed assessments

  • Detailed assessors are drawn from the Australian and international research community (approximately 25% international)
  • Detailed assessors complete in-depth assessments of proposals by providing scores and comments against the scheme specific selection criteria
  • These assessments are then taken into consideration by General assessors in the later stages of the peer review process

General assessment

  • General assessors are members of 
    • the College of Experts or 
    • a Selection Advisory Committee
      (NB: expanded College—not all members sit on all panels)
  • General assessors 
    • consider the proposal, the ratings and comments provided by Detailed assessors, and the applicant’s rejoinder; and 
    • assign their own ratings against the relevant scheme selection criteria
  • Once all assessments are submitted to the ARC, the Detailed and General assessments and Rejoinders are considered by the panels at the final selection meeting (more on this later)

ARC College of Experts

  • plays a key role in identifying research excellence, moderating external assessments and recommending fundable proposals
  • assists the ARC in recruiting and assigning assessors and in implementing peer review reforms in established and emerging disciplines as well as interdisciplinary areas
  • experts of international standing drawn from the Australian research community: from higher education, industry and public sector research organisations
  • Nominations usually open around May each year
  • Link: ARC College of Experts

ARC College of Experts: who are they?

  • Nominated generally by a University
  • Top researchers
  • Broad expertise
  • Good committee and assessor experience
  • Balanced College/Balanced Panels:
    • FoR code demands
    • Gender
    • Experience (1-3 years on College)

Why do we need more (good) assessors?—some examples 

The ARC is grateful to a large number of extremely hard-working assessors who conduct the peer review process:

 

Scheme                     Detailed assessments   Proposals   Average assessments per proposal
DECRA 2015                 4,578                  1,394       3.3
Discovery Projects 2015    12,173                 3,689       3.3
Linkage Projects 2015      2,294                  710         3.2
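
The last column is simply detailed assessments divided by proposals. As a quick illustration (a minimal sketch; figures taken from the table above):

    # Average assessments per proposal = detailed assessments / proposals
    schemes = {
        "DECRA 2015": (4578, 1394),
        "Discovery Projects 2015": (12173, 3689),
        "Linkage Projects 2015": (2294, 710),
    }
    for name, (assessments, proposals) in schemes.items():
        print(f"{name}: {assessments / proposals:.1f}")   # 3.3, 3.3, 3.2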

Assessment response rates

Bubble Graph showing assessment response rates of individual assessors (bubbles), measured as fraction of submitted assessments vs requests to assess. This is plotted against dollars awarded to the assessor (as lead CI) in the last 5 years.

What’s in it for me?
Assessor history/performance 1

  • Improve own grant writing
  • Increased visibility into research activity
  • Service to the sector
  • Contractual obligation for grantees

What’s in it for me?
Assessor history/performance 2

  • 2016—the carrot
    • Data on assessors provided to unis
    • 10 best assessors
    • Uni rewards
  • The stick…

Assessors

  • The ARC has published a Statement of Support for Assessors—including both detailed and general assessors
  • As reported last November, we are also transitioning to a system of recognising assessor work by annual reporting to their institutions
  • The first report will contain information on the level of contribution made by assessors at individual institutions to the ARC’s peer review processes.

Conflict of Interest

  • In addition to institutional conflicts, an assessor may be deemed to have a CoI with a named participant on a funding proposal for a number of reasons including, but not limited to, if that assessor:
    • has a close personal relationship (including enmity) with that named participant;
    • has a professional relationship with that named participant including:
      • currently holds, or has held within the past two years, funding conjointly with that named participant;
      • has a current application or is negotiating an application for funding with that named participant;
      • has been a collaborator or co-author with that named participant on a research output within the past four years;
      • has been a co-editor with that named participant of a book, journal, compendium, or conference proceedings within the past two years;
      • has been a postgraduate student or supervisor of that named participant within the past five years;
    • could otherwise be perceived to benefit materially from the awarding of funding to the proposal involving that named participant.
  • Link: ARC Conflict Of Interest And Confidentiality Policy

Conflict of Interest (cont'd)

  • RMS takes into account a great deal of data (e.g. institutional), but it doesn’t know everything
  • Assessors reviewing ARC proposals who identify a conflict of interest must reject the proposal in RMS
  • If in any doubt, contact the ARC to confirm whether a conflict exists under our policies
  • Assessing proposals despite a conflict of interest is in breach of ARC rules and of the Australian Code for the Responsible Conduct of Research
  • Link: ARC Research Integrity and Research Misconduct Policy

How do I provide a good Detailed assessment?

  • Objective comments
  • Detailed comments (one or two sentences are rarely sufficient)
  • Sufficient information to allow applicants to provide a rejoinder to your comments
  • Comments match scores—for example, if you have given significant criticisms, an “A” rating is unlikely to be appropriate.
  • Observe conflict of interest rules and declare anything you are concerned about to the ARC

Rating Scale

  • A: Outstanding. Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band. Recommendation: recommended unconditionally.
  • B: Excellent. Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band. Recommendation: strongly support recommendation of funding.
  • C: Very Good. Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band. Recommendation: support recommendation of funding with reservation.
  • D: Good. Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band. Recommendation: unsupportive of recommendation for funding.
  • E: Uncompetitive. Has significant weaknesses or fatal flaws. Approximately 20% of Proposals are likely to fall into this band. Recommendation: not recommended for funding.

Actual rating distribution—DP15

Detailed Assessors                 A      B      C      D      E
Target                             10%    15%    20%    35%    20%
Feasibility and Benefit            25%    34%    25%    12%    3%
Investigator(s)                    40%    37%    17%    5%     1%
Project Quality and Innovation     25%    32%    25%    14%    4%
Research Environment               54%    32%    11%    3%     1%
Total                              36%    34%    20%    9%     2%

General Assessors                  A      B      C      D      E
Target                             10%    15%    20%    35%    20%
Feasibility and Benefit            5%     24%    38%    24%    9%
Investigator(s)                    13%    34%    33%    15%    5%
Project Quality and Innovation     7%     25%    37%    23%    8%
Research Environment               22%    40%    24%    10%    3%
Total                              12%    31%    33%    18%    6%

Research Opportunity and Performance Evidence (ROPE)

  • The ARC is committed to ensuring all eligible researchers have fair access to competitive funding through the National Competitive Grants Program.
  • The ARC considers that Research Opportunity comprises two separate elements:
    • Career experiences (relative to opportunity)
    • Career interruptions
  • Performance Evidence is designed to provide assessors with information that will enable them to contextualise research outputs relative to the opportunity of a participant. 

ROPE
(Research Opportunity and Performance Evidence)

  • ROPE recognises research excellence in the context of the diversity of career and life experiences
  • Takes into account the quality rather than simply the volume or size of the research contribution
  • Research Opportunity comprises two separate elements:
    • Career experiences (relative to opportunity)
    • Career interruptions
  • All assessors should be familiar with the full statement:  ROPE Statement (released Feb 2014)

Assessment: additional issues to note

  • Value for money is a selection criterion for all schemes; commitment of taxpayer funds must be justified.
    • Pay close attention to the budget justification within the proposal
  • Research Environment is no longer a selection criterion

Assessment: DP example (1)

Investigators (A)

“The Chief Investigators comprise an outstanding team of researchers with complementary skill sets, and extensive experience in researching the four country case studies at the heart of this proposal. I also note that the Project Leader has maximised her research-only position to produce high quality research outputs, including a major review of aid effectiveness commissioned by UNESCO. One CI has had two career interruptions for the birth of her two children, and has published high quality articles in The International Journal of Demography and Population. A third CI has undertaken two secondments to the Asian Development Bank to advise the Bank on best practice for the delivery of aid on the ground.”

Assessment: DP example (2)

Project Quality and Innovation (A)

“The project plan outlines a highly ambitious study of the role of international aid agencies in reducing poverty in four similar internal regions in the four different countries. It will utilise a cutting edge Mixed Methods approach to triangulate data from the 2014 global survey on aid effectiveness, with data mining of relevant documentation, plus ethnographic studies of donors, managers and recipients. The plan is innovative and highly promising and should generate exciting new data and insights into those case studies with generalisable results.”

Assessment: DP example (3)

Feasibility and Benefit (B)

“This important project may have underestimated the time required to undertake the fieldwork, especially as it’s not clear from the proposal whether the preliminary work necessary to undertake the research such as liaising with regional governors has been undertaken. Such access may be problematic, and lack of access may delay or seriously compromise the project."

Assessment: DP example (4)

Research Environment (A)

“The project is located in the Research Centre for Regional Studies, a cross-institutional centre which is housed in a major Australian University, with membership drawn from two other Australian universities, and the National Universities of two of the countries to be studied. It includes collaborative research and exchange programs providing exciting training and research opportunities."

What not to do in an assessment

  • Include your ratings in the text
  • Write assessments that are so brief as to be unhelpful
  • Identify yourself, either directly or indirectly, or refer to other researchers or proposals in a way that can identify them
  • Include text that appears to be defamatory or distastefully irrelevant (such as gratuitous criticism of a researcher)
  • Include comments about the potential ineligibility of a Proposal. This information should be provided to the ARC by email, as eligibility considerations are kept strictly separate from the assessment process

What not to do—example 1

Investigator(s)
“Good relevant experience”
 
Project Quality and Innovation
“Not at the forefront but solid”
 
Feasibility and Benefit
“Generally OK”

Research Environment
“Appropriate”

What not to do—example 2

Don’t just quote the rubric!
 

Rating Scale

  • A: Outstanding. Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band. Recommendation: recommended unconditionally.
  • B: Excellent. Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band. Recommendation: strongly support recommendation of funding.
  • C: Very Good. Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band. Recommendation: support recommendation of funding with reservation.
  • D: Good. Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band. Recommendation: unsupportive of recommendation for funding.
  • E: Uncompetitive. Has significant weaknesses or fatal flaws. Approximately 20% of Proposals are likely to fall into this band. Recommendation: not recommended for funding.

What not to do—example 3

Investigator(s)

“I have serious doubts about the lead CI’s ability to deliver this project. I was unfortunate to collaborate with him on a number of publications in recent years, and partnered on an ill-fated DP proposal last year which fell apart due to his overbearing and tendentious manner. If this is how he behaves towards other researchers it is unlikely he will be able to bring this project, which requires extensive collaboration across institutions, to fruition.”

Proposal Score/Rank Calculation

  • “Grouped Average” of all submitted assessments for the proposal
  • This calculation results in a “Proposal Score”
  • Proposal ranks are derived for each panel
  • Any proposals (within the same panel) with equal Proposal Scores will have equal ranks (see the sketch below)

Flow chart showing how the average of General assessor scores and Detailed assessor scores determines the final rank.
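
The exact “grouped average” weighting is not spelled out in this presentation, so the following is only a minimal sketch, assuming an unweighted mean within each assessor group and standard competition ranking for ties; all names and figures are hypothetical.

    # Illustrative sketch only: assumes a simple unweighted mean within each
    # assessor group (General, Detailed), averaged across groups. Ties share a rank.
    def proposal_score(general_scores, detailed_scores):
        groups = [g for g in (general_scores, detailed_scores) if g]
        group_means = [sum(g) / len(g) for g in groups]
        return sum(group_means) / len(group_means)

    def panel_ranks(scores_by_proposal):
        ordered = sorted(scores_by_proposal.items(), key=lambda kv: kv[1], reverse=True)
        ranks, rank, previous = {}, 0, None
        for position, (proposal, score) in enumerate(ordered, start=1):
            if score != previous:
                rank = position            # competition ranking: 1, 2, 2, 4, ...
            ranks[proposal] = rank
            previous = score
        return ranks

    # Example: two proposals with equal scores share the same rank within a panel.
    scores = {
        "P1": proposal_score([88, 92], [90, 85, 87]),
        "P2": proposal_score([75, 80], [78, 82]),
        "P3": proposal_score([88, 92], [90, 85, 87]),
    }
    print(panel_ranks(scores))   # {'P1': 1, 'P3': 1, 'P2': 3}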

NEW Peer Review section on the ARC website

Designed to support our 20,000-strong assessor community. 

Peer Review section

 

I’ve had an RMS account for years…

Q: I never/rarely get any requests to assess; why?

A: Your RMS profile plays a big part in matching proposals to assessors

How are assessors assigned? 

  • RMS generates a “word cloud” visualisation of a proposal based on:
    • Proposal summary
    • Proposal title
    • Impact statement
    • FoR codes
    • SEO codes
  • RMS generates assessor suggestions based on assessor codes, expertise and history (see the sketch after this list)
  • No assignments are made “automatically”. This information is provided to ARC Executive Directors and College of Experts/SAC members to inform their judgment
  • Factors considered by an assigner may include (all things being equal):
    • Breadth of perspectives
    • Institutional spread
    • Gender balance
    • Assessor experience
  • For fellowship/award schemes, applicants in that round cannot assess others
  • As with assessors, RMS makes a suggestion about the broad discipline panel for each proposal, but these suggestions are reviewed and can be changed
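
The matching described above can be pictured as a similarity score between a proposal’s text and codes and each assessor’s profile. RMS’s actual algorithm is not documented here, so this is only a rough sketch, assuming a toy score built from shared FoR codes and shared keywords; the names, weights and sample data are hypothetical.

    # Illustrative sketch only: a toy assessor-matching score combining FoR-code
    # overlap with keyword overlap between the proposal summary and expertise text.
    import re

    def keywords(text):
        # Lowercase word set, ignoring very short tokens.
        return {w for w in re.findall(r"[a-z]+", text.lower()) if len(w) > 3}

    def match_score(proposal, assessor, code_weight=2.0, text_weight=1.0):
        shared_codes = len(set(proposal["for_codes"]) & set(assessor["for_codes"]))
        shared_words = len(keywords(proposal["summary"]) & keywords(assessor["expertise"]))
        return code_weight * shared_codes + text_weight * shared_words

    proposal = {
        "summary": "Molecular evolution and genomic data analysis of marine microbes",
        "for_codes": ["060102", "060309"],
    }
    assessors = [
        {"name": "A", "expertise": "molecular evolution, bioinformatics, genomic data analysis",
         "for_codes": ["010401", "060102", "060309"]},
        {"name": "B", "expertise": "habitat selection by vertebrates, herpetology",
         "for_codes": ["050101", "060801"]},
    ]

    # Suggestions are ranked for a human assigner to review, never auto-assigned.
    for a in sorted(assessors, key=lambda a: match_score(proposal, a), reverse=True):
        print(a["name"], match_score(proposal, a))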

RMS profile example 1.

1. Useful but could be better.

  • Expertise Text
    • Molecular evolution, bioinformatics, genomic data analysis, computational biology
  • Classifications
    • 010401, 010406, 060102, 060301, 060309, 060409, 060411

RMS profile example 2.

2. Not so useful—CV style

  • Expertise Text
    • Assessor is a Professor of Microbiology with over 100 peer reviewed publications in national and international journals of high repute. They currently have research funded by industry and the ARC to study microbial responses to sewage effluent and changes in nitrogen genes in marine environments. Leads the Environmental Unit which is a research and commercial unit within the Research Institute for the Environment at XX University.
  • Classifications
    • 060501, 060503, 060504

RMS profile example 3.

 3. Really useful, maybe a little long.

  • Expertise Text
    • I have expertise in the following areas: I have a general interest in habitat selection by vertebrates, and in anthropological effects on vertebrate habitats and habitat selection; disease ecology of amphibians; predation and effects of predators on behaviour of prey; life-history strategies (especially costs of reproduction and trade-offs between offspring size and number), sex ratio theory, general behavioural ecology and population biology (especially movement and dispersal) of vertebrates. My taxonomic area of greatest expertise is herpetology, ie. reptile and amphibian ecology and behaviour, but I have worked on birds and mammals as well. I am also interested in the ecology of vertebrate pests (ie., deer, cane toads), and in effects of land use and climate on vertebrates.
  • Classifications
    • 050101, 050103, 050202, 050211, 060208, 060801, 060806, 060809, 060899

RMS profile example 4.

4. Ideal

  • Expertise Text
    • My major area of research expertise is in x,y,z.
    • I also have experience in research a, b, c.
    • I would also be able to assess in the areas of blah blah.
  • Classifications
    • 6 to 10 six-digit FoR codes

ARC Assessors

  • If you are not currently an assessor for the ARC and would like to become one, send the following to ARCAssessorUpdate@arc.gov.au:
    • a brief CV
    • a list of five recent publications
    • or a web link to this information
