ARC Schemes, Continuous LP, Assessment Processes, Competitive Proposals, Engagement & Impact
RMIT University
5 May 2017
Professor Stephen Buckman


Outline

  • National Competitive Grants Program Overview
  • Linkage Projects
    • Continuous Linkage
    • LP17 Changes
  • Assessment Processes
  • Competitive Proposals
  • Engagement and Impact—update


National Competitive Grants Program

  • Through the NCGP, the ARC supports the highest-quality fundamental and applied research and research training across all disciplines*. 
  • The ARC encourages partnerships between researchers and industry, government, community organisations and the international community.
  • The NCGP comprises two main elements—Discovery and Linkage—under which the ARC funds a range of complementary schemes to:
    • support researchers at different stages of their careers
    • build Australia’s research capability
    • expand and enhance research networks and collaborations
    • develop centres of research excellence.

*Clinical and other medical research is primarily supported by the National Health and Medical Research Council.


National Competitive Grants Program


Figure: schemes in the ARC's National Competitive Grants Program. The area of each rectangle represents ARC funding by scheme (new and ongoing projects) for 2016. NB: LP16 figures do not include Continuous Linkage.


ARC NCGP funding by scheme 2009–2016



NCGP Lifecycle


Infographic—NCGP Lifecycle timeline from development of funding rules to final report.


Linkage Program

The ARC's Linkage funding schemes aim to encourage and extend cooperative approaches to research and improve the use of research outcomes by strengthening links within Australia’s innovation system and with innovation systems internationally.

Schemes:

  • Linkage Projects
  • Industrial Transformation Research Program
  • Linkage Infrastructure, Equipment and Facilities
  • ARC Centres of Excellence
  • Special Research Initiatives
  • Linkage Learned Academies Special Projects.


Linkage Projects

The Linkage Projects scheme provides funding to Eligible Organisations to support research and development (R&D) projects which:

  • are collaborative between higher education researchers and other parts of the national innovation system
  • are undertaken to acquire new knowledge, and
  • involve risk or innovation.

Proposals for funding under the Linkage Projects scheme must include at least one Partner Organisation. The Partner Organisation must make a contribution in cash and/or in kind to the project. The combined Partner Organisation contributions for a Proposal (i.e. the total of the cash and in-kind contributions of the Partner Organisations) must at least match the total funding requested from the ARC.
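The matching requirement above is a simple arithmetic rule. As a minimal illustration (the function name and dollar figures below are hypothetical, not from the Funding Rules):

```python
def meets_matching_requirement(arc_funding_requested: float,
                               partner_cash: float,
                               partner_in_kind: float) -> bool:
    """Combined Partner Organisation contributions (cash plus in-kind)
    must at least match the total funding requested from the ARC."""
    return partner_cash + partner_in_kind >= arc_funding_requested

# Hypothetical example: $300,000 requested; $120,000 cash + $200,000 in kind
print(meets_matching_requirement(300_000, 120_000, 200_000))  # True
```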


Linkage Projects

The objectives of the Linkage Projects scheme are to:

  1. support the initiation and/or development of long-term strategic research alliances between higher education organisations and other organisations, including industry and other research end-users, in order to apply advanced knowledge to problems and/or to provide opportunities to obtain national economic, commercial, social or cultural benefits
  2. provide opportunities for internationally competitive research projects to be conducted in collaboration with organisations outside the higher education sector, targeting those who have demonstrated a clear commitment to high-quality research
  3. encourage growth of a national pool of world-class researchers to meet the needs of the broader Australian innovation system
  4. build the scale and focus of research in the national Science and Research Priorities.


Linkage Projects—return and success rates


Linkage Projects (LP) scheme return and success rates 2009–2016. NB LP 2016 does not include continuous LP funding.


Comparison of Linkage Projects success rates between female and male participants from 2010 to 2016

NB: LP 2016 does not include continuous LP funding.

Source: LP Selection Report 2016, Figure 3.


Continuous Linkage

  • Commenced in July 2016
  • Closed 22 December 2016
  • Uneven pattern of submissions
    • majority received in the last two weeks
  • 225 Proposals submitted
  • Six-month turnaround (under NISA)
  • Assessment almost finalised


Continuous Linkage—Assessment

  • Selection Advisory Committee (SAC)
    • Interdisciplinary
    • Drawn from College of Experts
  • Assessment of proposals
    • 3 General Assessors (Carriages), drawn from the SAC
    • 2–6 Detailed Assessors
    • Rejoinders to detailed reports
    • Proposals “fast-tracked”


Continuous Linkage

  • Fast Tracking
    • Two “thresholds”
    • Score above high threshold
      • recommendation to fund
    • Score below low threshold
      • recommendation not to fund
  • Selection Meeting
    • Proposals with scores between thresholds plus any others flagged for discussion
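The two-threshold triage described above can be sketched in a few lines (the function name and threshold values are hypothetical; the ARC does not publish the actual thresholds here):

```python
def triage(score: float, low: float, high: float) -> str:
    """Fast-track proposals whose scores fall outside the two thresholds;
    everything in between goes to the Selection Meeting."""
    if score > high:
        return "recommend to fund"         # fast-tracked
    if score < low:
        return "recommend not to fund"     # fast-tracked
    return "discuss at Selection Meeting"

# Hypothetical thresholds
print(triage(92, low=40, high=85))  # recommend to fund
```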


Continuous Linkage

Selection Meetings

  • Videoconference
  • All Carriage 1s attend
    • required to discuss proposals with Carriages 2 & 3
  • Independent Chair


Summary of LP17 Changes

  • Addition of definitions for ‘Active Project’, ‘Candidate’, ‘GrantConnect’, ‘Project Initialisation Date’ and ‘Research Output’ (A3).
  • Addition of provision for reasonable essential extraordinary costs to allow a researcher who is a carer, or who themselves require care or assistance, to undertake travel essential to the Project (A5.2.1(k)).
  • Clarification of cross-scheme Project limits (A7.4) and Eligibility process (A7.5).
  • Revised eligibility for all schemes to be considered in terms of the life of a Project (Active Project), rather than only the funding years set out in the original Funding Agreement (D10.2).
  • For the purposes of eligibility, CIs on Industrial Transformation Research Hubs commencing in 2015 or later, or Industrial Transformation Training Centres commencing in 2016 or later will each count as one Linkage Project (A7.4.3).
  • Addition of Exempt Small Business as a New Partner Organisation exempt from the Cash Contribution requirements (D3.1).
  • Update to the Request Not to Assess process (A9.3).
  • Update to Publication and Dissemination of Research Outputs section to include Research Data (A12.5).
  • Adjustments to the Selection Criteria, including the weightings (D6).

*This list contains the major changes to the rules and is not exhaustive. Please consult the Funding Rules for schemes under the Linkage Program (2016 edition) or contact your research office for further clarification.


NCGP Lifecycle

Funding Rules

  • Funding Rules are approved by the Minister
  • Published on the ARC website
  • Sector is advised of availability

Proposals

  • Instructions to applicants, sample application form and FAQs published on ARC website
  • Eligibility Exemption Requests and Request Not to Assess Processes may be available
  • Applications submitted by Eligible Organisations by the relevant scheme closing date

Assessment

  • Proposals are considered against eligibility criteria and compliance with the Funding Rules
  • Proposals are assessed by Detailed Assessors (with the exception of some Special Research Initiatives)
  • Applicants are given the opportunity to respond to Detailed Assessors’ written comments via a Rejoinder Process
  • Proposals are assessed by General Assessors taking into account the Detailed Assessments and Rejoinders

Selection meeting

  • The Selection Advisory Committee (General Assessors) considers all proposals, recommends proposals to be funded and recommends the level at which successful proposals should be funded. 

Approval of funding

  • The ARC CEO provides recommendations to the Minister, identifying proposals to be approved for funding, proposals not recommended for funding, and the level of funding and duration of projects
  • Minister considers recommendations and approves and announces funding outcomes


ARC College of Experts

  • plays a key role in identifying research excellence, moderating external assessments and recommending fundable proposals
  • assists the ARC in recruiting and assigning assessors and in implementing peer review reforms in established and emerging disciplines as well as interdisciplinary areas
  • experts of international standing drawn from the Australian research community: from higher education, industry and public sector research organisations


Forming selection panels

  • The ARC recognises the need to have a flexible approach to suit volume and disciplinary spread in each scheme
  • The number of discipline panels varies by scheme.
    • For example, Discovery Projects typically has five panels:
      • BSB (Biological Sciences and Biotechnology)
      • EIC (Engineering, Information and Computing Sciences)
      • HCA (Humanities and Creative Arts)
      • MPCE (Mathematics, Physics, Chemistry and Earth Sciences)
      • SBE (Social, Behavioural and Economic Sciences)
    • However, proposals can be assigned across two panels to ensure appropriate interdisciplinary expertise, and assigned to a breadth of detailed reviewers
  • Some other schemes use a single multi-disciplinary panel (e.g. Australian Laureate Fellowships, ITRP)
    • LIEF has one multi-disciplinary panel.


ARC Assessment Process


This pictorial graph shows the ARC assessment process.

  1. Application
  2. Panel (can go directly to Selection Meeting)
  3. External Assessment
  4. Selection Meeting
  5. Outcomes


Proposal assessment—overview

  • The peer review process is designed to be fair, thorough and transparent
  • All proposals are assessed against the selection criteria, and in accordance with the weightings for that scheme
  • Proposals are generally assigned to two types of assessors:
    • at least two General assessors (usually College of Experts members), and 
    • at least two Detailed assessors
  • ARC staff assess eligibility etc., but do not decide which proposals should be funded


General assessment

  • General assessors are assigned by the Executive Directors of the ARC. They are members of:
    • the College of Experts or 
    • a Selection Advisory Committee.
       (NB: expanded College—not all members sit on all panels)
  • General assessors:
    • Carriage 1 General assessors assign Detailed assessors
    • assign their own ratings against the relevant scheme selection criteria
    • consider the proposal, the ratings and comments provided by Detailed assessors, and the applicant’s rejoinder
  • Once all assessments are submitted to the ARC, Detailed and General assessments and Rejoinders are considered by the panels at the final selection meeting (more on this later)


Detailed assessments

  • Detailed assessors are drawn from the Australian and international research community (approximately 25% international)
  • Detailed assessors complete in-depth assessments of proposals by providing scores and comments against the scheme specific selection criteria
  • These assessments are then taken into consideration by General assessors in the later stages of the peer review process (more on this later)


ARC Assessors

  • We encourage every active researcher to become an assessor for the ARC.
  • If you are not currently an assessor for the ARC and would like to become one, send the following to ARCAssessorUpdate@arc.gov.au:
    • a brief CV
    • a list of five recent publications
    • or a web link to this information


How are assessors assigned? 

  • RMS generates a “word cloud” visualisation of a proposal based on:
    • Proposal summary
    • Proposal title
    • Impact statement
    • FoR codes
    • SEO codes
  • RMS generates assessor suggestions and word cloud commonalities based on assessor codes, expertise and history – make sure your RMS profile is up to date
  • No assignments are made “automatically”. This information is provided to ARC Executive Directors and College of Experts/SAC members to inform their decisions
  • Factors considered by an assigner may include (all things being equal):
    • Breadth of perspectives
    • Institutional diversity
    • Gender balance
    • Assessor experience
  • For fellowship/award schemes, applicants in that round cannot assess others
  • As with assessors, RMS makes a suggestion about the broad discipline panel for each proposal, but these suggestions are reviewed and can be changed


Conflict of Interest

  • In addition to institutional conflicts, an assessor may be deemed to have a CoI with a named participant on a funding proposal for a number of reasons including, but not limited to, if that assessor:
    • has a close personal relationship (including enmity) with that named participant;
    • has a professional relationship with that named participant including:
      • currently holds, or has held within the past two years, funding conjointly with that named participant;
      • has a current application or is negotiating an application for funding with that named participant;
      • has been a collaborator or co-author with that named participant on a research output within the past four years;
      • has been a co-editor with that named participant of a book, journal, compendium, or conference proceedings within the past two years;
      • has been a postgraduate student or supervisor of that named participant within the past five years;
    • could otherwise be perceived to benefit materially from the awarding of funding to the proposal involving that named participant.
  • See the ARC Conflict of Interest and Confidentiality Policy.


Rating Scale

  • A. Outstanding: Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band. Recommendation: recommended unconditionally.
  • B. Excellent: Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band. Recommendation: strongly support recommendation of funding.
  • C. Very Good: Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band. Recommendation: support recommendation of funding with reservation.
  • D. Good: Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band. Recommendation: unsupportive of recommendation for funding.
  • E. Uncompetitive: Uncompetitive, with significant weaknesses or fatal flaws. Approximately 20% of Proposals are likely to fall into this band. Recommendation: not recommended for funding.


How do I provide a good Detailed assessment?

  • Objective comments
  • Detailed comments (one or two sentences are rarely sufficient)
  • Sufficient information to allow applicants to provide a rejoinder to your comments
  • Comments match scores—for example, if you have given significant criticisms an “A” rating is unlikely to be appropriate. 
  • Observe conflict of interest rules and declare anything you are concerned about to the ARC


Research Opportunity and Performance Evidence (ROPE)

  • The ARC is committed to ensuring all eligible researchers have fair access to competitive funding through the National Competitive Grants Program.
  • The ARC considers that Research Opportunity comprises two separate elements:
    • career experiences (relative to opportunity)
    • career interruptions
  • Performance Evidence is designed to provide assessors with information that will enable them to contextualise research outputs relative to the opportunity of a participant. 
  • The ROPE Statement (released Feb 2014) is online


NEW Peer Review section on the ARC website

Designed to support our 20,000-strong assessor community.

Peer Review section


Important Information When Writing a Grant Application

  • Funding Rules
    • objectives
    • selection criteria
    • budget items
    • eligibility
  • Frequently Asked Questions (FAQs)
  • Instructions to Applicants (ITAs)


Grant process and writing tips

  • All successful grants should deliver exciting new outcomes and represent an excellent investment
  • Decisions will align with Scheme Objectives
  • Not all excellent proposals can get funding; most applicants will be disappointed


Insights into grants process 

  • Consider where to apply for funding; choose a scheme. 
  • Pay attention to eligibility and ARC cross scheme limits
  • The scheme objectives and the selection criteria—address every one of them
  • Choose Field of Research codes carefully—6-digit FoR codes help the ARC assign the right assessors
  • Track Record—career interruption—the ROPE provision
  • The scale of assessment 
    • The external assessor—1–2 proposals
    • The ARC panel member—10–100
    • ARC Panel meeting—150–400
  • The rejoinder
  • Understand the research field and international context. Develop your ideas to solve a research problem. 
  • Network with leaders in the field. Consider the research environment when applying. 
  • Apply by yourself or as a team member….
  • Career interruptions—making a case for ROPE
  • Seek mentors on writing good grant applications
  • Your first grant application 
    • Write for your peers—write so that someone broadly in your field will understand your project
    • Write for the public—write a plain English statement
  • Don’t over-inflate authorship claims but don’t undersell yourself either


Low-ranked proposals:

  • Use too much technical jargon
  • Make grandiose and implausible claims about outcomes
  • Don't support claims of excellence or progress with evidence
  • Relate to research areas without momentum
  • Are weakly linked into national and international research networks


Low-ranked proposals also:

  • Emphasise the collection of data rather than the resolution of controversies
  • Set a negative or despondent tone about the state of the subject in Australia
  • Contain a high rate of spelling and grammatical errors
  • Are badly structured and difficult to follow


Responding to an assessment/rejoinder

  • Read the assessments then wait at least a day before starting the rejoinder
  • Approach it constructively
  • The rejoinder helps the College of Experts by capturing the applicant’s views on criticisms made by peers
  • Don’t get angry at the assessor—you’re wasting valuable space to address important concerns


Definitions

Engagement

  • Research engagement is the interaction between researchers and research end-users (including industry, government, non-governmental organisations, communities and community organisations), for the mutually beneficial exchange of knowledge, technologies and methods, and resources in a context of partnership and reciprocity.

Impact

  • Research impact is the contribution that research makes to the economy, society and environment, beyond the contribution to academic research.


 Engagement and Impact assessment

  • For 2018
    • all 42 Australian universities (as defined by the Higher Education Support Act 2003) will be eligible to participate
    • all research disciplines will be involved
    • the methodology will not advantage one discipline over another
    • evaluations will be conducted by committees of experts from the university sector and industry


Development

  • Assessment is being developed by ARC and Department of Education and Training as a companion to ERA
  • The methodology is being developed with extensive consultation with the university sector, industry and other research end-users
  • A Steering Committee—which includes higher education and industry leaders—is overseeing the development of the framework for the assessment. The Committee is supported by two working groups:
    • Technical Working Group provides expert advice on the development of indicators that will support the engagement and impact assessment 
    • Performance and Incentives Working Group provides advice to the ARC about the potential incentive effects of the preferred model


Consultation

  • Two formal consultations have been undertaken:
    • Engagement and Impact Assessment Consultation Paper, public consultation, 2 May—24 June 2016
    • End-user survey, 2 May—1 June 2016, industry and other end-users or beneficiaries of university research were surveyed
  • A series of face-to-face meetings has also been held between the ARC CEO and industry leaders


Timeline

  • Methodology being developed in 2016, including extensive sector consultation
  • Framework and guidelines to be developed for 2017
  • Pilot assessments in mid 2017
  • Full assessment in 2018 as a companion to ERA
  • Nominations for 2018 open Feb/Mar 2017


ERA/EI

  • ERA and EI will run as companion exercises
  • ERA will continue to assess research quality
  • ERA acknowledges and encourages ‘blue sky’ research
  • Engagement and impact assessment will consider:
    • research interactions with industry, Government, non-governmental organisations, communities and community organisations, and 
    • research contributions to economy, society and environment


Engagement and Impact Assessment Pilot 

Development is in progress:

  • It will run in mid-2017
  • Two separate pilot studies: engagement and impact
  • All Australian universities are eligible to participate; participation is voluntary
  • 40 Australian universities (as defined by the Higher Education Support Act 2003) are participating in the pilot
  • Evaluations will be conducted by committees of experts from the university sector and industry/end-users of research


Pilot assessment framework

For each Unit of Assessment, the pilot applies two parallel streams:

  • Engagement: a matrix of metrics/indicators plus a narrative, leading to a Rating for Engagement
  • Impact: an impact study, leading to a Rating for Impact


Pilot assessment framework

Disciplines in the pilot, by two-digit Field of Research code:

Engagement

  • Group A: 03 Chemical Sciences; 11 Medical and Health Sciences
  • Group B: 21 History and Archaeology; 22 Philosophy and Religious Studies

Impact

  • Group C: 05 Environmental Sciences; 07 Agricultural and Veterinary Sciences; 09 Engineering
  • Group D: 13 Education; 19 Studies in Creative Arts and Writing; 20 Language, Communication and Culture

Engagement—a narrative

  • Provides context for the engagement indicators
  • Allows institutions to describe engagement activities where the engagement indicators are not sufficient
  • Allows institutions to provide additional quantitative information, where relevant, which may be used by the ARC to develop indicators for subsequent rounds


Impact studies

  • Impact will be assessed primarily using qualitative information in the form of impact studies
  • Impact studies typically detail:
    • Research underpinning the impact
    • Approach to impact
    • Details of the impact
  • Impact studies will allow for the inclusion of measures or indicators of impact