Discovery Projects, DECRA & Future Fellowships, ROPE, Assessment
The University of Newcastle
22 August 2017
Professor Stephen Buckman

""

Commonwealth Investment in R&D 2016–17 (%)

Pie chart showing Commonwealth Investment in R&D 2015–16 ($m).


National Competitive Grants Program

Graphical representation of schemes in the ARC's National Competitive Grants Program. Each scheme is a rectangle with the area of the rectangle representing  ARC funding (new and ongoing projects) for 2016.


ARC NCGP funding by Fields of Research 2009–2017 

Stacked area chart showing ARC NCGP funding (new and ongoing) by two-digit Fields of Research, 2009–2017 (2017 excl. LP and LASP). Projects for which no FoR code is nominated are excluded.


ARC NCGP funding by scheme 2009–2016

Stacked area chart showing ARC funding (new and ongoing) by scheme, 2009–2016. Discovery Projects makes up the largest component.


Discovery Program

The ARC's Discovery funding schemes recognise the importance of fundamental research to the national innovation system.

Schemes: Discovery Projects, Discovery Early Career Researcher Award (DECRA), Discovery Indigenous, Future Fellowships and Australian Laureate Fellowships.


Discovery Projects

The Discovery Projects scheme provides funding for research projects that can be undertaken by individual researchers or research teams.

The objectives of the Discovery Projects scheme are to:

  • support excellent basic and applied research by individuals and teams
  • encourage high-quality research and research training 
  • enhance international collaboration in research
  • expand Australia’s knowledge base and research capability
  • enhance the scale and focus of research in the Science and Research Priorities.


Discovery Projects—return and success rates

 

Discovery Projects scheme return and success rates 2009–2017.


Participation and success rate of Chief Investigators (CIs) in Discovery Projects 2017 by gender and career age

Source: Discovery Projects Selection Report 2017 Figure 1

Chart showing participation and success rate of Chief Investigators (CIs) in Discovery Projects 2017 by gender and career age.


Discovery Early Career Researcher Award (DECRA)

The objectives of the Discovery Early Career Researcher Award (DECRA) scheme are to:

  • support excellent basic and applied research by early career researchers; 
  • advance promising early career researchers and promote enhanced opportunities for diverse career pathways;
  • enable research and research training in high quality and supportive environments; 
  • expand Australia’s knowledge base and research capability; and
  • enhance the scale and focus of research in the Science and Research Priorities.


DECRA commitments

  • The DECRA Recipient is expected to spend a minimum of 20 per cent of her/his time on activities at the Administering Organisation, and 80 per cent of her/his time on research activities related to the proposed DECRA. (D8.2.2 and D8.2.3)
  • The DECRA Recipient may not engage in other professional employment for the duration of the DECRA without prior approval from the ARC under subsection D8.2.5. (D8.2.4) 
  • The DECRA Recipient may spend up to 0.2 (20 per cent of Full Time Equivalent) of her/his time annually on teaching activities. The DECRA will not be extended to accommodate any periods of teaching. Supervision of honours or postgraduate students is not included in this limit. (D8.2.8)
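
These commitments can be read as simple constraints on a recipient's annual time allocation. The sketch below is illustrative only (the function name and the idea of expressing the rules as a checklist are not the ARC's); it simply restates the thresholds from the bullets above.

```python
def decra_time_issues(admin_org_fte: float, decra_research_fte: float,
                      teaching_fte: float) -> list[str]:
    """Return any of the above commitments that a proposed annual time
    allocation (expressed as fractions of full-time) would breach.
    Illustrative sketch only; the categories may overlap."""
    issues = []
    if admin_org_fte < 0.20:
        issues.append("under 20% of time at the Administering Organisation (D8.2.2)")
    if decra_research_fte < 0.80:
        issues.append("under 80% of time on DECRA-related research (D8.2.3)")
    if teaching_fte > 0.20:
        issues.append("teaching above 0.2 FTE (D8.2.8; honours/postgraduate supervision excluded)")
    return issues

# A full-time recipient spending 0.8 FTE on DECRA research and 0.1 FTE teaching:
print(decra_time_issues(admin_org_fte=1.0, decra_research_fte=0.8, teaching_fte=0.1))  # []
```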


DECRA—return and success rates

Discovery Early Career Researcher Award scheme return and success rates 2009–2017.


Participation and success rate of DECRA 2017 Candidates by gender and career age*


* Career age is calculated as years since PhD.

Chart showing participation and success rate of DECRA 2017 candidates by gender and career age.


Future Fellowships

The aim of Future Fellowships is to attract and retain the best and brightest mid-career researchers to conduct their research in Australia.

Up to 100 four-year Future Fellowships will be awarded each year, providing salary support at one of three levels, ranging from $150,376 to $213,693 per year (including on-costs).

An additional $50,000 per year may be provided to the Administering Organisation that may be used for higher degree by research stipends and expenditure on field research and travel costs essential to the project.
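
As a rough, illustrative calculation using the figures above (and assuming the additional $50,000 is provided in each of the four years), the total value of a Fellowship to the Administering Organisation ranges from about $0.8 million to about $1.05 million:

```python
# Illustrative arithmetic only, using the amounts quoted above.
YEARS = 4
SALARY_LEVELS = (150_376, 213_693)   # lowest and highest annual salary level (including on-costs)
EXTRA_PER_YEAR = 50_000              # additional annual funding to the Administering Organisation

low, high = (YEARS * (salary + EXTRA_PER_YEAR) for salary in SALARY_LEVELS)
print(f"Four-year value: ${low:,} to ${high:,}")  # Four-year value: $801,504 to $1,054,772
```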


The objectives of the Future Fellowships scheme are to:

  • ensure that outstanding mid-career researchers are recruited and retained by Administering Organisations in continuing academic positions
  • build collaboration across industry and/or research organisations and/or disciplines
  • support research in national priorities that will result in economic, environmental, social and/or cultural benefits for Australia
  • strengthen Australia’s research capacity by supporting innovative, internationally competitive research.


Future Fellows—success rates

Chart showing Future Fellowships success rates 2009–2017.


 Participation and success rate of Future Fellowships 2017 Candidates by gender and career age*


* Career age is calculated as years since PhD.

Chart showing participation and success rate of Future Fellowships 2017 Candidates by gender and career age.


Average Age of Lead CI
Discovery Projects, DECRA and Future Fellowships

Chart showing the average age of the lead CI in Discovery Projects, DECRA and Future Fellowships, 2009–2017.


Research Opportunity and Performance Evidence (ROPE)

  • The ARC is committed to ensuring all eligible researchers have fair access to competitive funding through the National Competitive Grants Program.
  • The ARC considers that Research Opportunity comprises two separate elements:
    • Career experiences (relative to opportunity)
    • Career interruptions
  • Performance Evidence is designed to provide assessors with information that will enable them to contextualise research outputs relative to the opportunity of a participant.
  • The ROPE Statement (released February 2014) is available online.
  • Circumstances taken into account include:
    • Childcare and/or parental care
    • Illness
    • Other forms of employment; unemployment
    • Remote location
    • Move from overseas.
  • Priorities include gender and equity, for example funding to enable carers to attend conferences.


NCGP Lifecycle

Funding Rules

  • Funding Rules are approved by the Minister.
  • They are published on the ARC website.
  • The sector is advised of their availability.

Proposals

  • Instructions to applicants, sample application form and FAQs published on ARC website.
  • Eligibility Exemption Requests and Request Not to Assess Processes may be available.
  • Applications submitted by Eligible Organisations by the relevant scheme closing date.

Assessment

  • Proposals are considered against eligibility criteria and compliance with the Funding Rules.
  • Proposals are assessed by Detailed Assessors (with the exception of some Special Research Initiatives).
  • Applicants are given the opportunity to respond to Detailed Assessors’ written comments via a Rejoinder Process.
  • Proposals are assessed by General Assessors taking into account the Detailed Assessments and Rejoinders.

Selection meeting

  • The Selection Advisory Committee (General Assessors) considers all proposals, recommends proposals to be funded and recommends the level at which successful proposals should be funded. 

Approval of funding

  • The ARC CEO provides recommendations to the Minister, covering the proposals recommended for approval, proposals not recommended for funding, and the level of funding and duration of projects.
  • The Minister considers the recommendations and approves and announces funding outcomes.


ARC College of Experts

  • plays a key role in identifying research excellence, moderating external assessments and recommending fundable proposals
  • assists the ARC in recruiting and assigning assessors and in implementing peer review reforms in established and emerging disciplines as well as interdisciplinary areas
  • comprises experts of international standing drawn from the Australian research community, across higher education, industry and public sector research organisations.


Forming selection panels

  • The ARC recognises the need to have a flexible approach to suit volume and disciplinary spread in each scheme
  • The number of discipline panels varies by scheme
    • For example, Discovery Projects typically has five panels:
      • BSB (Biological Sciences and Biotechnology)
      • EIC (Engineering, Information and Computing Sciences)
      • HCA (Humanities and Creative Arts)
      • MPCE (Mathematics, Physics, Chemistry and Earth Sciences)
      • SBE (Social, Behavioural and Economic Sciences).
    • However, proposals can be assigned across two panels to ensure appropriate interdisciplinary expertise, and assigned to a breadth of detailed reviewers
  • Some other schemes use a single multi-disciplinary panel (e.g. Australian Laureate Fellowships, ITRP)
    • LIEF has one multi-disciplinary panel.


Proposal assessment—overview

  • The peer review process is designed to be fair, thorough and transparent
  • All proposals are assessed against the selection criteria, and in accordance with the weightings for that scheme
  • Proposals are generally assigned to two types of assessors:
    • at least two General assessors (usually College of Experts members), and 
    • at least two Detailed assessors
  • ARC staff assess eligibility etc., but do not decide which proposals should be funded.
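
As a purely illustrative sketch of what "assessed against the selection criteria, in accordance with the weightings for that scheme" means in practice, the snippet below combines per-criterion scores using scheme weightings. The criterion names and weights are invented for the example; each scheme's Funding Rules define the real criteria and weightings.

```python
# Minimal sketch of a weighted assessment score. The criteria and weights
# below are invented for illustration; each scheme's Funding Rules define
# the real selection criteria and weightings.
EXAMPLE_WEIGHTINGS = {
    "investigators": 0.40,
    "project_quality_and_innovation": 0.40,
    "feasibility_and_benefit": 0.20,
}

def weighted_score(criterion_scores: dict[str, float],
                   weightings: dict[str, float] = EXAMPLE_WEIGHTINGS) -> float:
    """Combine per-criterion scores (e.g. 0-100) into a single weighted score."""
    return sum(weightings[c] * criterion_scores[c] for c in weightings)

# One assessor's scores for a single proposal:
print(weighted_score({
    "investigators": 85,
    "project_quality_and_innovation": 90,
    "feasibility_and_benefit": 75,
}))  # 85*0.4 + 90*0.4 + 75*0.2 = 85.0
```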


General assessment

  • General assessors are assigned by the Executive Directors of the ARC. They are members of:
    • the College of Experts or 
    • a Selection Advisory Committee.
       (NB: expanded College—not all members sit on all panels)
  • General assessors:
    • assign Detailed assessors (this is done by the General assessor with carriage of the proposal, i.e. Carriage 1)
    • assign their own ratings against the relevant scheme selection criteria
    • consider the proposal, the ratings and comments provided by Detailed assessors, and the applicant’s rejoinder.
  • Once all assessments are submitted to the ARC, the Detailed and General assessments and Rejoinders are considered by the panels at the final selection meeting (more on this later).


ARC Assessment Process

This pictorial graph shows the ARC assessment process.

  1. Application
  2. Panel (can go directly to Selection Meeting)
  3. External Assessment
  4. Selection Meeting
  5. Outcomes


Detailed assessments

  • Detailed assessors are drawn from the Australian and international research community (approximately 25 per cent international).
  • Detailed assessors complete in-depth assessments of proposals by providing scores and comments against the scheme specific selection criteria.
  • These assessments are then taken into consideration by General assessors in the later stages of the peer review process (more on this later).


ARC Assessors

  • We encourage every active researcher to become an assessor for the ARC.
  • If you are not currently an assessor for the ARC and would like to become one, send the following to ARCAssessorUpdate@arc.gov.au:
    • a brief CV
    • a list of five recent publications
    • or a web link to this information.


How are assessors assigned? 

  • RMS generates a “word cloud” visualisation of a proposal based on:
    • Proposal summary
    • Proposal title
    • Impact statement
    • FoR codes
    • SEO codes
  • RMS generates assessor suggestions and word cloud commonalities based on assessor codes, expertise and history – make sure your RMS profile is up to date
  • No assignments are made “automatically”. This information is provided to ARC Executive Directors and College of Experts/SAC members to inform their decisions
  • Factors considered by an assigner may include (all things being equal):
    • Breadth of perspectives
    • Institutional diversity
    • Gender balance
    • Assessor experience
  • For fellowship/award schemes, applicants in that round cannot assess others
  • As with assessors, RMS makes a suggestion about the broad discipline panel for each proposal, but these suggestions are reviewed and can be changed.


Conflict of Interest

  • In addition to institutional conflicts, an assessor may be deemed to have a CoI with a named participant on a funding proposal for a number of reasons including, but not limited to, if that assessor:
    • has a close personal relationship (including enmity) with that named participant;
    • has a professional relationship with that named participant including:
      • currently holds, or has held within the past two years, funding conjointly with that named participant;
      • has a current application or is negotiating an application for funding with that named participant;
      • has been a collaborator or co-author with that named participant on a research output within the past four years;
      • has been a co-editor with that named participant of a book, journal, compendium, or conference proceedings within the past two years;
      • has been a postgraduate student or supervisor of that named participant within the past five years;
    • could otherwise be perceived to benefit materially from the awarding of funding to the proposal involving that named participant.
  • See the ARC Conflict of Interest and Confidentiality Policy.
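
The time-window triggers above lend themselves to a simple rule check. The sketch below is illustrative only; the record structure and field names are assumptions, and the real policy also covers institutional conflicts and other circumstances.

```python
from dataclasses import dataclass

CURRENT_YEAR = 2017  # year of the round being assessed

@dataclass
class Relationship:
    """Last year in which each kind of professional relationship occurred
    between an assessor and a named participant (None if never)."""
    joint_funding: int | None = None
    co_authorship: int | None = None
    co_editing: int | None = None
    supervision: int | None = None
    active_joint_application: bool = False
    close_personal_relationship: bool = False

def conflict_reasons(r: Relationship, year: int = CURRENT_YEAR) -> list[str]:
    """Return the CoI triggers (from the list above) that apply. Illustrative only."""
    recent = lambda last, window: last is not None and year - last <= window
    reasons = []
    if r.close_personal_relationship:
        reasons.append("close personal relationship (including enmity)")
    if recent(r.joint_funding, 2):
        reasons.append("conjoint funding within the past two years")
    if r.active_joint_application:
        reasons.append("current or negotiated joint funding application")
    if recent(r.co_authorship, 4):
        reasons.append("collaborator or co-author on a research output within the past four years")
    if recent(r.co_editing, 2):
        reasons.append("co-editor within the past two years")
    if recent(r.supervision, 5):
        reasons.append("postgraduate student or supervisor relationship within the past five years")
    return reasons

print(conflict_reasons(Relationship(co_authorship=2015)))  # co-authorship in 2015 is within four years
```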


Rating Scale

  • A (Outstanding): Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band. Recommendation: recommended unconditionally.
  • B (Excellent): Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band. Recommendation: strongly support recommendation of funding.
  • C (Very Good): Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band. Recommendation: support recommendation of funding with reservation.
  • D (Good): Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band. Recommendation: unsupportive of recommendation for funding.
  • E (Uncompetitive): Uncompetitive and has significant weaknesses or fatal flaws. Approximately 20% of Proposals are likely to fall into this band. Recommendation: not recommended for funding.
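
The indicative band percentages can be used to sanity-check how an individual assessor's ratings are spread across a round. A minimal sketch (illustrative only, not an ARC tool):

```python
from collections import Counter

# Indicative share of proposals expected in each rating band, per the scale above.
EXPECTED_SHARE = {"A": 0.10, "B": 0.15, "C": 0.20, "D": 0.35, "E": 0.20}

def rating_spread(ratings: list[str]) -> dict[str, tuple[float, float]]:
    """Compare an assessor's actual rating spread with the indicative band shares."""
    counts = Counter(ratings)
    total = len(ratings)
    return {band: (counts[band] / total, EXPECTED_SHARE[band]) for band in EXPECTED_SHARE}

# Hypothetical set of ratings from one assessor across a round.
ratings = list("AABBBCCCCDDDDDDDEEEE")
for band, (actual, expected) in rating_spread(ratings).items():
    print(f"{band}: actual {actual:.0%}, indicative {expected:.0%}")
```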


How do I provide a good Detailed assessment?

  • Objective comments
  • Detailed comments (one or two sentences are rarely sufficient)
  • Sufficient information to allow applicants to provide a rejoinder to your comments
  • Comments match scores—for example, if you have given significant criticisms, an “A” rating is unlikely to be appropriate.
  • Observe conflict of interest rules and declare anything you are concerned about to the ARC.


Why do we need more (good) assessors?
NCGP Stats in 2016

  • Number of schemes: 10
  • Proposals submitted: 6,236
  • Number of detailed assessments: 20,813
  • Number of general assessments: 12,795
  • Total assessments: 33,608
  • Grants awarded: 1,256
  • Variations to Funding agreements: 2,248 
  • End of Year Reports (EOYRs): 9,443.
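
Two illustrative ratios follow directly from these figures (simple arithmetic on the numbers above):

```python
proposals = 6_236
grants = 1_256
total_assessments = 33_608

print(f"Overall success rate: {grants / proposals:.1%}")                 # ~20.1%
print(f"Assessments per proposal: {total_assessments / proposals:.1f}")  # ~5.4
```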


What’s in it for me?
Assessor history/performance 

  • Improve own grant writing
  • Increased visibility into research activity
  • Service to the sector
  • Contractual obligation for grantees.


NEW Peer Review section on the ARC website

Designed to support our 20,000-strong assessor community.

Peer Review section.
