Discovery Early Career Researcher Awards (DECRA) and Future Fellowships
ARC Industrial Transformation Training Centre for Mine Site Restoration
Curtin University
25 August 2017
Professor Therese Jefferson

""

National Competitive Grants Program

Graphical representation of schemes in the ARC's National Competitive Grants Program. Each scheme is a rectangle whose area represents ARC funding (new and ongoing projects) for 2016.


NCGP Lifecycle

Infographic—NCGP Lifecycle timeline from development of funding rules to final report.


Discovery Early Career Researcher Award (DECRA)

The objectives of the DECRA scheme are to:

  • support excellent basic and applied research by early career researchers;
  • advance promising early career researchers and promote enhanced opportunities for diverse career pathways;
  • enable research and research training in high quality and supportive environments;
  • expand Australia’s knowledge base and research capability; and
  • enhance the scale and focus of research in the Science and Research Priorities.

The DECRA scheme provides focused support for early career researchers in both teaching-and-research and research-only positions.

Researchers may be eligible to apply if they have been awarded a PhD within the past five years (or longer if combined with periods of significant career interruption).

Up to 200 three-year DECRAs may be awarded each year, providing a researcher with up to $139,369 per annum.

    • This is composed of $99,369 in salary and on-costs, and up to $40,000 in project costs. 


 DECRA commitments

  • The DECRA Recipient is expected to spend a minimum of 20 per cent of her/his time on activities at the Administering Organisation, and 80 per cent of her/his time on research activities related to the proposed DECRA (D8.2.2 and D8.2.3).
  • The DECRA Recipient may not engage in other professional employment for the duration of the DECRA without prior approval from the ARC under subsection D8.2.5. (D8.2.4).
  • The DECRA Recipient may spend up to 0.2 (20 per cent of Full Time Equivalent) of her/his time annually on teaching activities. The DECRA will not be extended to accommodate any periods of teaching. Supervision of honours or postgraduate students is not included in this limit. (D8.2.8).


DECRA—return and success rates

Discovery Early Career Researcher Award scheme return and success rates, 2009–2017.


First-time awardees by scheme 2009–17

Chart showing the numbers and percentage of “first time” awardees in Discovery Early Career Researcher Award (DECRA) and Discovery Projects (DP) schemes, 2009–2017.


Average Age of Lead CI
Discovery Projects, DECRA and Future Fellowships

Chart showing the average age of the lead CI in DP, DECRA and Future Fellowships, 2009–2017.


 Future Fellowships

The objectives of the Future Fellowships scheme are to:

  • ensure that outstanding mid-career researchers are recruited and retained by Administering Organisations in continuing academic positions
  • build collaboration across industry and/or research organisations and/or disciplines
  • support research in national priorities that will result in economic, environmental, social and/or cultural benefits for Australia
  • strengthen Australia’s research capacity by supporting innovative, internationally competitive research.

The aim of Future Fellowships is to attract and retain the best and brightest mid-career researchers, to conduct their research in Australia.

Up to 100 four-year Future Fellowships will be awarded each year, providing salary support at one of three levels, between $150,376 and $213,693 per year (including on-costs).

An additional $50,000 per year may be provided to the Administering Organisation, which may be used for higher degree by research stipends and for field research and travel costs essential to the project.


Future Fellowships—success rates

Chart showing Future Fellowships success rates 2009–2017.


Comparison of Future Fellowships success rates between female and male participants from 2009 to 2016

Comparison of Future Fellowships success rates between female and male participants from 2009 to 2016.

Source: Gender equity data.


ARC Assessment Process

This diagram shows the ARC assessment process:

  1. Application
  2. Panel (can go directly to Selection Meeting)
  3. External Assessment
  4. Selection Meeting
  5. Outcomes


 General and detailed assessors

Your proposal arrives…

It is assigned by an ARC Executive Director to (usually) two General Assessors (usually College of Experts members).

The General Assessors then assign at least two Detailed assessors (usually more).


 General Assessors

  • General Assessors are usually members of the College of Experts but can also include external experts
  • The ARC recognises the need to have a flexible approach to suit volume and disciplinary spread in each scheme
  • The number of discipline panels varies by scheme.
  • Funds are apportioned according to demand. For example, Discovery Projects typically has five panels:
    • BSB (Biological Sciences and Biotechnology)
    • EIC (Engineering, Information and Computing Sciences)
    • HCA (Humanities and Creative Arts)
    • MPCE (Mathematics, Physics, Chemistry and Earth Sciences)
    • SBE (Social, Behavioural and Economic Sciences).
  • Proposals can be assigned across two panels to ensure appropriate expertise, and assigned to a breadth of detailed reviewers.


 Detailed assessments

  • Detailed assessors are drawn from the Australian and international research community (approximately 25 per cent international)
  • Detailed assessors complete in-depth assessments of proposals by providing scores and comments against the scheme specific selection criteria
  • These assessments are then taken into consideration by General assessors in the later stages of the peer review process (more on this later).


 How are assessors assigned? 

  • RMS generates a “word cloud” of a proposal based on:
    • 100 word summary
    • Proposal title
    • Impact statement
    • FoR codes
    • SEO codes.
  • RMS generates assessor suggestions based on assessor codes, expertise and history—if you are an assessor, make sure your RMS profile is up to date
  • No assignments are made “automatically”. This information is provided to ARC Executive Directors and College of Experts/SAC members to inform their judgment
  • Factors considered by an assigner may include (all things being equal):
    • Breadth of perspectives
    • Institutional spread
    • Gender balance
    • Assessor experience.
  • For fellowship/award schemes, applicants in that round cannot assess others
  • As with assessors, RMS makes a suggestion about the broad discipline panel for each proposal, but these suggestions are reviewed and can be changed.


 Conflict of Interest

  • In addition to institutional conflicts, an assessor may be deemed to have a CoI with a named participant on a funding proposal for a number of reasons including, but not limited to, if that assessor:
    • has a close personal relationship (including enmity) with that named participant;
    • has a professional relationship with that named participant including:
      • currently holds, or has held within the past two years, funding conjointly with that named participant;
      • has a current application or is negotiating an application for funding with that named participant;
      • has been a collaborator or co-author with that named participant on a research output within the past four years;
      • has been a co-editor with that named participant of a book, journal, compendium, or conference proceedings within the past two years;
      • has been a postgraduate student or supervisor of that named participant within the past five years;
    • could otherwise be perceived to benefit materially from the awarding of funding to the proposal involving that named participant.
  • ARC Conflict of Interest and Confidentiality Policy 
  • RMS takes into account a great deal of data (e.g. institutional), but it doesn’t know everything
  • Assessors reviewing ARC proposals who identify a conflict of interest must reject the proposal in RMS
  • If in any doubt, contact the ARC to confirm whether a conflict exists under our policies
  • Assessing proposals despite a conflict of interest is in breach of ARC rules and of the Australian Code for the Responsible Conduct of Research
  • ARC Research Integrity and Research Misconduct Policy.


 Research Opportunity and Performance Evidence (ROPE)

  • The ARC is committed to ensuring all eligible researchers have fair access to competitive funding through the National Competitive Grants Program.
  • The ARC considers that Research Opportunity comprises two separate elements:
    • Career experiences (relative to opportunity)
    • Career interruptions.
  • Performance Evidence is designed to provide assessors with information that will enable them to contextualise research outputs relative to the opportunity of a participant
  • ROPE Statement (released Feb 2014).


 Rating Scale

Rating Scale | Criteria | Recommendation
A | Outstanding: Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band. | Recommended unconditionally
B | Excellent: Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band. | Strongly support recommendation of funding
C | Very Good: Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band. | Support recommendation of funding with reservation
D | Good: Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band. | Unsupportive of recommendation for funding
E | Uncompetitive: Has significant weaknesses or fatal flaws. Approximately 20% of Proposals are likely to fall into this band. | Not recommended for funding


 Rejoinder

  • Where the ARC seeks Detailed assessments, applicants are often given the opportunity to submit a rejoinder to respond to comments made by Detailed assessors
  • Rejoinders are not viewed by the Detailed assessors but are considered by the General assessors
  • The ARC prefers not to interfere in the assessment process. Only rarely will we agree to requests to remove assessments. 


 What happens after rejoinder?

  • General assessors view all Detailed assessments and consider the rejoinder carefully
  • General assessors confer with each other to finalise scores
  • General assessors submit their final scores and ranks for their group of proposals
  • General assessors’ scores in large schemes are then normalised to reduce the impact of different “marking styles”
  • Proposals are then ranked within the discipline panel (where relevant—some schemes have a single panel).
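The normalisation step is not specified in detail here; a common approach is to rescale each assessor's scores to a shared mean and spread so that habitually harsh and generous markers become comparable. A minimal sketch, assuming simple z-score standardisation (the ARC's actual method may differ):

```python
from statistics import mean, pstdev

def normalise(scores_by_assessor):
    """Rescale each assessor's raw scores to zero mean and unit spread,
    reducing the impact of different 'marking styles'."""
    normalised = {}
    for assessor, scores in scores_by_assessor.items():
        mu = mean(scores.values())
        sigma = pstdev(scores.values()) or 1.0  # guard: all scores identical
        normalised[assessor] = {
            proposal: (raw - mu) / sigma for proposal, raw in scores.items()
        }
    return normalised

# A harsh marker (A) and a generous one (B) give comparable normalised scores:
raw = {
    "A": {"DP1": 60, "DP2": 70, "DP3": 80},
    "B": {"DP1": 85, "DP2": 90, "DP3": 95},
}
z = normalise(raw)
```

After normalisation both assessors rate DP3 equally far above their own average, even though their raw scores differ by 15 points.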


 Proposal Score/Rank Calculation

  • “Grouped Average” of all submitted assessments for the proposal
  • This calculation results in a “Proposal Score”
  • Proposal ranks are derived for each panel
  • Any proposals (within same panel) with equal Proposal Scores will have equal ranks.

 

Flow chart showing how the average of General and Detailed assessor scores determines the final rank.
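A simplified sketch of this calculation, assuming a plain average over all submitted assessments and shared ("competition") ranks for tied scores; the ARC's actual grouped-average weighting of General and Detailed scores is not reproduced here:

```python
def proposal_ranks(proposal_scores):
    """Rank proposals by average assessor score, descending.
    Proposals with equal Proposal Scores receive equal ranks."""
    averages = {
        pid: sum(scores) / len(scores) for pid, scores in proposal_scores.items()
    }
    ordered = sorted(averages.items(), key=lambda kv: kv[1], reverse=True)
    ranks, prev_score, prev_rank = {}, None, 0
    for position, (pid, score) in enumerate(ordered, start=1):
        if score == prev_score:
            ranks[pid] = prev_rank  # tie: share the earlier rank
        else:
            ranks[pid] = prev_rank = position
            prev_score = score
    return averages, ranks

# Within one hypothetical panel:
scores = {"DP1": [85, 90], "DP2": [88, 87], "DP3": [70, 75]}
averages, ranks = proposal_ranks(scores)
# DP1 and DP2 both average 87.5, so they share rank 1; DP3 takes rank 3.
```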


Before a Selection Meeting

  • Panels are given access to final scores and rankings, and can review all (non-conflicted) proposals
  • Panel members are encouraged to note any issues they believe may have affected the assessment/ranking of a particular proposal, or are noteworthy for panel discussion
  • Members are also invited to closely scrutinise ROPE issues
  • Panel members’ attention is drawn particularly to proposals around the likely funding cut-off, as these will need detailed discussion.


 Selection Advisory Committee (SAC) Meetings - Outcomes

  • General Assessors meet as a panel
  • Go through proposals—recommend for funding (or not)
  • Recommend budgets
  • Recommendations forwarded to ARC CEO
  • CEO’s Recommendations forwarded to Minister and decisions made
  • Funding awarded to Eligible Institutions on basis of Minister’s decisions
  • Announcements made—including your 100 word project summary.


Feedback on unsuccessful proposals

  • Recently the ARC has provided two types of feedback:
    • Overall ranking band
    • Ranking band within specific scheme selection criteria
  • Example:

Proposal ID: DPxxxxxx
Lead CI: Example, Prof G.
Outcome: Not Recommended
Unsuccessful Band: This proposal is in the band 26% to 50% of unsuccessful proposals within the discipline panel.
Investigator(s) (40%): Band 26% to 50% of unsuccessful proposals within the discipline panel.
Project Quality and Innovation (25%): Band 26% to 50% of unsuccessful proposals within the discipline panel.
Feasibility and Benefit (20%): Top 10% of unsuccessful proposals within the discipline panel.
Research Environment (15%): Band 11% to 25% of unsuccessful proposals within the discipline panel.
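The band reported for each criterion can be derived from a proposal's rank among the unsuccessful proposals in its panel. A minimal sketch of that mapping, assuming percentile cut-offs matching the example above (the bottom "51% to 100%" band and the exact calculation are assumptions, not stated in ARC documentation):

```python
def feedback_band(rank, total_unsuccessful):
    """Map a proposal's rank among unsuccessful proposals in its panel
    to the feedback band reported to the applicant (assumed cut-offs)."""
    percentile = 100 * rank / total_unsuccessful
    if percentile <= 10:
        return "Top 10% of unsuccessful proposals"
    if percentile <= 25:
        return "Band 11% to 25% of unsuccessful proposals"
    if percentile <= 50:
        return "Band 26% to 50% of unsuccessful proposals"
    return "Band 51% to 100% of unsuccessful proposals"

# e.g. a proposal ranked 30th of 100 unsuccessful proposals in its panel:
band = feedback_band(30, 100)  # "Band 26% to 50% of unsuccessful proposals"
```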


 NEW Peer Review section on the ARC website

Designed to support our 20,000-strong assessor community.

Peer Review section.
