
Table of Contents


Addendum
1 How to use this Publication
2 Background
2.1 Objectives of ERA
2.2 Definition of Research
2.3 FoR codes
2.3.1 Two-digit FoR code
2.3.2 Four-digit FoR code
2.3.3 Six-digit FoR code
2.3.4 Implications of the FoR code hierarchy
2.4 Unit of Evaluation (UoE)
2.4.1 Low Volume Threshold
2.4.2 Low volume and national benchmarks
2.5 Interdisciplinary and multidisciplinary research
2.5.1 Institutional coding
2.6 Reference Period
2.7 ERA Submission Journal List
2.8 ERA Indicator Development
2.9 Development of arrangements for ERA 2015
3 ERA Roles and Responsibilities
3.1 Expert Review
3.2 Peer Review
3.3 Responsibilities of the Research Evaluation Committee (REC)
3.4 Responsibilities of a REC member
3.5 Responsibilities of a REC Chair
3.6 Responsibilities of a Peer Reviewer
3.7 Review of ERA processes and feedback
3.8 ERA Scrutiny Committee
3.9 Confidentiality
3.10 Conflict of interest (COI)
3.11 Research Integrity and Research Misconduct
3.12 Other sensitivities
3.12.1 Commercially Sensitive research outputs
3.12.2 Culturally Sensitive research outputs
3.12.3 Australian Government Security Classified research outputs
3.13 Assignment outside area of expertise
3.14 Copyright
4 The ERA Evaluation Process
4.1 ERA phases
4.1.1 Submission
4.1.2 Assignment
4.1.3 Evaluation and moderation
4.1.4 Reporting
5 The ERA Indicators: Background
5.1 Introduction to the ERA Indicator Suite
5.2 The ERA Indicator Principles
5.3 ERA Rating Scale
5.3.1 Notes on the Rating Scale
5.4 A Dashboard of Indicators
5.5 Drilldowns
5.6 Explanatory Statements
5.7 Volume and Activity vs. Quality
5.8 Assignment of FoRs to Research Outputs
5.9 FTE and Headcount
5.10 Research Income and Research Commercialisation Income
5.11 Applied Measures (excluding Research Commercialisation Income)
5.12 Esteem Measures
5.13 SEER warnings
6 The ERA Indicators: Detail
6.1 Indicator contextual information
6.1.1 Interdisciplinary profile
6.1.2 Intradisciplinary profile
6.2 UoE Indicator Summary
6.3 Volume and Activity
6.3.1 Research Outputs
6.3.2 FTE Profile by Academic Level
6.3.3 Research Output by Year
6.4 Publishing profile
6.5 Citation Analysis
6.5.1 Relative Citation Impact (RCI) Profile
6.5.2 Centile Analysis Profile
6.5.3 Distribution of papers by RCI Classes
6.6 Peer Review
6.7 Research Income
6.8 Applied Measures
6.8.1 Research Commercialisation Income
6.8.2 Patents
6.8.3 Registered Designs
6.8.4 Plant Breeder’s Rights
6.8.5 NHMRC Endorsed Guidelines
6.9 Esteem Measures
7 Glossary
8 Abbreviations
9 Discipline Clusters
Appendix 1: Research Output Drilldowns
Appendix 2: Peer Review Drilldowns and Peer Reviewer template
Appendix 3: HERDC Category 1 Research Income Drilldown
Appendix 4: Applied Measure Drilldowns
Appendix 5: Citation Benchmark Methodology
Appendix 6: ERA 2015 Discipline Matrix by Cluster
Appendix 7: Fields of Research code summary
Appendix 8: Aboriginal and Torres Strait Islander Studies
Appendix 9: Eligible ERA Institutions



Addendum


A number of units were deemed to be unassessable by the RECs during the evaluation meeting. These are identified in the National Report as: n/r – not rated due to coding issues.
  1. How to use this Publication


The Excellence in Research for Australia (ERA) 2015 Evaluation Handbook has been written for Research Evaluation Committee (REC) members to assist in their evaluation of the quality of research undertaken in eligible higher education institutions (‘institutions’).

The Handbook discusses the ERA approach, outlines the evaluation process and the principles of the ERA indicator suite and provides detailed information about each of the indicators. The Handbook is organised into five sections:



Background

This section discusses the underlying ERA framework including: the ERA objectives, definition of research, the Fields of Research codes (FoR) and the Unit of Evaluation (UoE). It also summarises changes to the ERA approach for 2015.
ERA Roles and Responsibilities

This section discusses ERA Expert Review and ERA Peer Review. It also outlines the roles and responsibilities of the REC, as well as detailing how issues such as conflict of interest (COI), copyright and confidentiality are addressed in ERA.
The ERA Evaluation Process

This section outlines the four stages of the ERA process—submission, assignment, evaluation and reporting.
The ERA Indicators: Background

This section introduces the ERA Indicator Suite. It includes the ERA Indicator Principles and the ‘dashboard’ approach to evaluation. It also details how FoR codes are apportioned to submission data in ERA.
The ERA Indicators: Detail

This section provides an in depth description of each of the ERA indicators, including the graphical presentation of data, Australian and world benchmarks (where applicable) and a description of the role of various indicators in the evaluation process.

This Handbook should be read in conjunction with the policy documents in the list below.


Policy documents

  • ERA 2015 Submission Guidelines—provides guidance to institutions about ERA 2015 submission rules and components.

  • ERA 2015 Discipline Matrix—provides indicator applicability for disciplines. It is available in Appendix 6 of this Handbook or in Excel format from the ARC website.

  • ERA 2015 Peer Reviewer Handbook—outlines the evaluation process for ERA Peer Reviewers and provides information about the conduct of peer review.


Technical documents

  • ERA-SEER 2015 Technical Specifications—provides technical instruction for institutions preparing and submitting ERA 2015 submissions.

  • ERA-SEER 2015 Technology Pack—comprises technical documentation, Code Tables and XML schema related to the ERA 2015 submission process.

Further information about ERA is available on the ARC website. The ERA Team can be contacted by email at era@arc.gov.au or phone 02 6287 6755.


  2. Background

    2.1 Objectives of ERA


The objectives of ERA are to:

  1. establish an evaluation framework that gives government, industry, business and the wider community assurance of the excellence of research conducted in Australia's higher education institutions

  2. provide a national stocktake of discipline-level areas of research strength and areas where there is opportunity for development in Australian higher education institutions

  3. identify excellence across the full spectrum of research performance

  4. identify emerging research areas and opportunities for further development

  5. allow for comparisons of research in Australia, nationally and internationally, for all discipline areas.
    2.2 Definition of Research


For the purposes of ERA, research is defined as the creation of new knowledge and/or the use of existing knowledge in a new and creative way so as to generate new concepts, methodologies, inventions and understandings. This could include synthesis and analysis of previous research to the extent that it is new and creative.

Institutions must ensure that all research outputs submitted to ERA meet this definition of research. Outputs that do not meet this definition may be excluded during the ERA submission process or, where they are not excluded, their inclusion may adversely affect the quality rating assigned during evaluation.


    2.3 FoR codes


A Unit of Evaluation (UoE) in ERA is the discipline within an institution. For the purposes of ERA, disciplines are defined as four-digit and two-digit FoRs as identified in the Australian and New Zealand Standard Research Classification (ANZSRC). The ANZSRC provides 22 two-digit FoR codes, 157 four-digit FoR codes, and an extensive range of six-digit codes. The ANZSRC was released in 2008 by the Australian Bureau of Statistics (ABS) and Statistics New Zealand. It provides important information about each four-digit and two-digit FoR. The ANZSRC is available in full from the ABS website.
      2.3.1 Two-digit FoR code


This is the highest level of the ANZSRC hierarchy. A two-digit FoR code relates to a broad discipline field, such as 02 Physical Sciences. A two-digit FoR code consists of a collection of related four-digit FoR codes, such as 0201 Astronomical and Space Sciences, 0203 Classical Physics, and all other four-digit FoRs within the 02 Physical Sciences code.
      2.3.2 Four-digit FoR code


This is the second level of the ANZSRC hierarchy. A four-digit FoR code is a specific discipline field of a two-digit FoR code, for example, 0201 Astronomical and Space Sciences. A four-digit FoR code consists of a collection of related six-digit FoR codes. Institutions submit data for ERA at the four-digit FoR level.
      2.3.3 Six-digit FoR code


This is the lowest level of the hierarchy of ANZSRC codes. A six-digit FoR code is a further breakdown of a four-digit FoR code, for example, 020101 Astrobiology is within 0201 Astronomical and Space Sciences. Six-digit FoR data is not collected in ERA and evaluation is not conducted at this level.
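
The hierarchy is positional, so the parent codes of any FoR code can be read directly off the code string. The following minimal sketch is illustrative only and is not part of any ERA system; the codes used are the examples given above.

# Illustrative sketch: ANZSRC FoR parent codes derived by truncating the code string.
def parent_codes(six_digit_for: str) -> tuple:
    """Return the (two-digit, four-digit) parents of a six-digit FoR code."""
    return six_digit_for[:2], six_digit_for[:4]

# 020101 Astrobiology sits within 0201 Astronomical and Space Sciences,
# which sits within 02 Physical Sciences.
assert parent_codes("020101") == ("02", "0201")
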
      2.3.4 Implications of the FoR code hierarchy


ERA has been designed to provide flexibility for, and recognition of, discipline-specific research behaviours at both the four-digit and two-digit FoR code levels.

Although six-digit FoR codes are not assessed in ERA, it is important that REC members are aware of the diversity of six-digit FoR codes beneath the four-digit FoR codes. For many disciplines, the six-digit FoR codes represent a wide and diverse range of sub-disciplines which may have quite different publishing practices. For this reason, the profile for a particular four-digit FoR code for one institution may look very different from another institution’s because of the differences in focus at the six-digit level. For example, FoR 0502 Environmental Science and Management includes 12 diverse six-digit fields.

This means that the 0502 UoE at an institution with a focus on 050209 Natural Resource Management may have very different publishing behaviours and research outlets to another 0502 UoE at an institution which focuses primarily on 050201 Aboriginal and Torres Strait Islander Environmental Knowledge.

Similarly, REC members must be cognisant of the six-digit codes which sit beneath the 99 (‘other’) codes. In many cases, important sub-disciplines with significant research activity may be represented in the 99 (‘other’) codes. For example, FoR 1699 (Other Studies In Human Society) includes six separate six-digit fields, such as Gender Specific Studies and Studies of Asian Society.

For some broad discipline areas, related disciplines are located in different parts of the ANZSRC. For example, some areas of Materials Science can be found in 02 Physical Sciences, 03 Chemical Sciences, 09 Engineering and 10 Technology. REC members should ensure they are aware of the boundaries of their allocated FoR codes, and of the interaction between related FoR codes. Please refer to Appendix 6: ERA 2015 Discipline Matrix by Cluster for further information.

    2.4 Unit of Evaluation (UoE)


ERA evaluation occurs at both the four-digit and two-digit FoR code levels. A UoE for ERA is the research discipline, as defined by the ANZSRC four-digit and two-digit FoR codes, for an eligible institution (Appendix 9). UoEs do not correspond to named disciplines, departments or research groups within an institution.

Data for ERA is submitted by institutions at the four-digit FoR code level, and is aggregated to create four-digit and two-digit UoEs. Research Evaluation Committees (RECs) are formed around broad discipline groupings for the purpose of administering the ERA evaluations. RECs will evaluate both four-digit and two-digit UoEs.

The four-digit FoR codes generally align with their two-digit code within the same REC, with the exception of the four-digit FoR codes beneath 10 Technology, which are split across three RECs. The two-digit 10 Technology UoEs constructed for evaluation will include all the four-digit codes beneath (i.e. 1001–1099), and these two-digit UoEs will be evaluated through a cross-REC evaluation.

      2.4.1 Low Volume Threshold


Four-digit and two-digit UoEs will only be assessed where there is a meaningful level of data to be evaluated. An institution is only evaluated in ERA in a four-digit or two-digit discipline if the number of research outputs reaches a low volume threshold.

For disciplines where citation analysis is used, no evaluation will be conducted for a four-digit or two-digit FoR at a given institution if the number of indexed journal articles over the six-year reference period is fewer than 50.

For disciplines where peer review is used, no evaluation will be conducted for the FoR at a given institution where, over the six-year reference period, there are fewer than the equivalent of 50 submitted research outputs. Books are given an effective weighting of 5:1 compared with other research outputs for the purposes of determining the low volume threshold in these disciplines; for all other purposes in ERA they are counted as a single research output.
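
The two threshold rules can be summarised in a short sketch. The function name and the shape of the output records below are assumptions made for illustration and are not drawn from SEER or any other ERA system.

# Hypothetical sketch of the low volume threshold rules described above.
LOW_VOLUME_THRESHOLD = 50
BOOK_WEIGHT = 5  # books count 5:1 in peer review disciplines, for threshold purposes only

def meets_low_volume_threshold(outputs, uses_citation_analysis):
    """outputs: iterable of dicts such as {"type": "journal_article", "indexed": True}
    or {"type": "book"}; uses_citation_analysis: True for citation disciplines."""
    if uses_citation_analysis:
        # Citation disciplines: at least 50 indexed journal articles over the reference period.
        indexed = sum(1 for o in outputs
                      if o["type"] == "journal_article" and o.get("indexed", False))
        return indexed >= LOW_VOLUME_THRESHOLD
    # Peer review disciplines: at least the equivalent of 50 outputs, books weighted 5:1.
    weighted = sum(BOOK_WEIGHT if o["type"] == "book" else 1 for o in outputs)
    return weighted >= LOW_VOLUME_THRESHOLD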

For some FoRs at some institutions, there may be insufficient research volume to undertake a valid analysis at the four-digit level, but sufficient research volume at the two-digit level. In these instances, evaluation will take place at the two-digit FoR code level only.

The two-digit profiles include all data from the four-digit FoR codes beneath, regardless of whether they reached the low volume threshold at the four-digit FoR code level. The two-digit FoRs, therefore, form unique UoEs and may present the RECs with a quite different profile from the constituent four-digit FoRs. For example, a two-digit UoE may contain a mix of material which has been evaluated at the four-digit level and material which has not.
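
As a schematic of this aggregation (the data shapes are hypothetical and simplified), a two-digit UoE simply pools everything submitted to its constituent four-digit FoRs:

# Hypothetical sketch: a two-digit UoE aggregates all data from its four-digit FoRs,
# whether or not each four-digit FoR met the low volume threshold.
from collections import defaultdict

def aggregate_to_two_digit(four_digit_outputs):
    """four_digit_outputs: dict mapping four-digit FoR codes to lists of outputs."""
    two_digit = defaultdict(list)
    for code, outputs in four_digit_outputs.items():
        two_digit[code[:2]].extend(outputs)
    return dict(two_digit)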

In instances where an institution does not meet the low volume threshold in a FoR, the UoE will be publicly reported as ‘not assessed’. This means that the data submitted on research outputs, research income, applied measures and esteem measures for the relevant two-digit or four-digit FoR for that institution will be collected but not evaluated under ERA. However, the data submitted will still contribute to the construction of the ERA benchmarks, and all ERA data will be aggregated for national-level reporting irrespective of whether any FoRs within a specific institution meet the low volume threshold.


      2.4.2 Low volume and national benchmarks


For the purposes of generating FoR-specific national benchmarks (referred to as Australian Higher Education Provider (HEP) benchmarks in ERA), the ARC will aggregate outputs within each of the two- and four-digit FoR code levels nationally. HEP benchmarks are used to profile an institution’s performance against other Australian HEPs. Therefore, these benchmarks will include information submitted to a particular FoR from all institutions, including data from the ‘not assessed’ UoEs.
    2.5 Interdisciplinary and multidisciplinary research


As ERA is a discipline-based research evaluation exercise, interdisciplinary and multidisciplinary research is disaggregated into its discipline components. However, RECs will have access to information that shows the nature and extent of inter/multidisciplinary research for each UoE. Each research output can be assigned to a maximum of three four-digit FoRs. For each UoE, RECs are able to view a ‘Discipline Profile’ showing the extent to which the research outputs of that UoE have also been assigned to other four-digit FoRs. This provides additional information for the purposes of assigning UoEs to REC members, and is also contextual discipline information for REC members to consider when undertaking their evaluation.
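
To illustrate, a UoE's interdisciplinary profile could be tallied from output-level FoR assignments along the following lines; the data shapes and function name are hypothetical and are not a description of how SEER computes the Discipline Profile.

# Hypothetical sketch: tally the other four-digit FoRs co-assigned to the outputs of a UoE.
from collections import Counter

def discipline_profile(uoe_for, outputs):
    """outputs: iterable of dicts like {"for_codes": ["0502", "1605"]} (at most three codes each)."""
    profile = Counter()
    for output in outputs:
        codes = output["for_codes"]
        if uoe_for in codes:
            profile.update(code for code in codes if code != uoe_for)
    return profile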

Where multi/interdisciplinary work is being considered, REC members may be assigned between RECs as required to bring appropriate expertise to bear on the evaluation. At the final REC evaluation meeting, all RECs will meet concurrently which also allows for cross-REC expertise to contribute to finalising evaluations.


      2.5.1 Institutional coding


Institutions may add institutional reporting codes that link components of their submission to particular institutional units such as research centres or departments. Following completion of the ERA evaluation, institutions will then be able to use these codes to compile information about, for example, an institutional unit in ‘climate change research’ that had its research outputs submitted for evaluation under a variety of Fields of Research (e.g. environmental science and management, atmospheric sciences, law, soil sciences and demography).

Institutional coding is for institutional use and not for the purposes of ERA evaluation.


    2.6 Reference Period


The collection of data for ERA 2015 is based on several reference periods as detailed in Table 1 below.

Table 1: ERA 2015 reference periods

Data Type          Reference Period                       Years
Research Outputs   1 January 2008 to 31 December 2013     6
Research Income    1 January 2011 to 31 December 2013     3
Applied Measures   1 January 2011 to 31 December 2013     3
Esteem Measures    1 January 2011 to 31 December 2013     3

Data regarding eligible researchers is not collected for a reference period but is based on a single staff census date of 31 March 2014.

    2.7 ERA Submission Journal List


The ERA 2015 Submission Journal List includes 24 028 scholarly journals. An article must be published in a journal included in the list in order to be submitted as a journal article in ERA.
The ERA 2015 Submission Journal List includes journals that meet the following criteria:

  • were active during the ERA 2015 reference period for research outputs (1 January 2008 to 31 December 2013)

  • are scholarly

  • have peer or editorial review policies acceptable to the discipline

  • have an ISSN.

Each journal on the list is assigned up to three FoR codes. The FoRs assigned to a journal are not listed in any order of relevance or importance.

A journal may be assigned either two-digit FoRs or four-digit FoRs. Where the subject matter of a journal is sufficiently broad to cover more than three four-digit FoR codes, the journal has been assigned one or more two-digit FoR codes; where the subject matter is sufficiently broad to cover more than three two-digit FoR codes, the journal has been assigned as Multidisciplinary (MD).
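
The assignment rule just described can be summarised as follows. The function and its inputs are a hypothetical simplification for illustration; the actual list is compiled through a separate ARC process.

# Hypothetical sketch of the FoR assignment rule for journals described above.
def journal_for_assignment(covered_four_digit_fors):
    """covered_four_digit_fors: set of four-digit FoR codes spanned by a journal's subject matter."""
    if len(covered_four_digit_fors) <= 3:
        return sorted(covered_four_digit_fors)      # up to three four-digit FoRs
    two_digit = {code[:2] for code in covered_four_digit_fors}
    if len(two_digit) <= 3:
        return sorted(two_digit)                    # one or more two-digit FoRs
    return ["MD"]                                   # Multidisciplinary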


    2.8 ERA Indicator Development


During 2008, the ARC convened an Indicator Development Group (IDG), comprising experts in research metrics and statistics, to consider, test and recommend appropriate discipline-specific indicators, including measures of quality, applied research and research activity. To test the appropriateness of the proposed indicators for each discipline, the ARC held discipline cluster workshops with discipline experts. The ARC has also further consulted with the sector regarding the refinement of the indicators following the ERA 2010 and ERA 2012 evaluations. The indicator development process has been informed by analytical testing to verify the validity of the indicators. Where an indicator has not been clearly demonstrated to be a valid and robust measure of research quality for a discipline, it has not been included in ERA. The ERA Indicator Principles are included in Section 5: The ERA Indicators—Background, and the details of each indicator are discussed in Section 6: The ERA Indicators—Detail, of this Handbook.
    2.9 Development of arrangements for ERA 2015


The ARC has consulted broadly in the development of arrangements for ERA 2015. In addition to the feedback on ERA 2012 processes provided by the sector and REC members, the ARC issued a sector-wide consultation paper on a range of issues, as well as draft ERA 2015 documentation. The ERA 2015 Submission Guidelines and ERA 2015 Discipline Matrix are informed by the outcomes of these consultations.

For a list of submission-related changes for ERA 2015, see page 7 of the ERA 2015 Submission Guidelines. The changes relevant to the evaluation process include the addition of a new category of non-traditional research outputs for all disciplines—Research Report for an External Body, which consists of four subcategories of reports. The ARC has also provided further clarification of the requirements for the selection of research outputs for peer review within a UoE.




  3. ERA Roles and Responsibilities

    3.1 Expert Review


Expert review informed by discipline-specific indicators is central to ERA evaluations. ERA evaluations are conducted by the members who comprise the RECs. Each four-digit UoE will be assigned to three REC members. The same REC members will automatically be assigned to the two-digit UoEs based on the four-digit assignments. In cases where only the two-digit UoE is evaluated, typically due to the low volume threshold, at least three REC members will be assigned.

Evaluations are informed by the range of indicators identified in the ERA 2015 Discipline Matrix at Appendix 6, with particular focus placed on those that relate most closely to the quality of research outputs—such as citation metrics and peer review.


    3.2 Peer Review


REC members have access to a pool of peer reviewers who have been recruited for ERA 2015. Peer reviewers are assigned by the principal reviewer, a nominated REC member, for each UoE in which peer review is used as an indicator, as identified in the ERA 2015 Discipline Matrix. In each case, the principal reviewer is expected to assign at least two peer reviewers to each UoE.

External peer reviewers report on, but do not rate, the sample of peer review outputs which they have reviewed. Their report informs the evaluations by the REC members. Peer reviewers do not have access to any of the ERA indicators or data presented to REC members, only the sample of outputs nominated by each institution for peer review.


    3.3 Responsibilities of the Research Evaluation Committee (REC)


The responsibilities of a REC are to:

  • assign an agreed rating for all UoEs under each four-digit and two-digit FoR code where there is sufficient volume for an evaluation

  • work with the other RECs to ensure the consistent application of the overall quality standards and common assessment procedures across the exercise

  • provide feedback and advice as requested by the ARC on any aspects of the assessment process

  • report the results to the ARC.


    3.4 Responsibilities of a REC member


The responsibilities of a REC member are to:

  • participate fully in the evaluation process within their REC

  • abide by confidentiality and Conflict of Interest (COI) requirements as detailed in Section 3.10

  • maintain confidentiality of both the deliberations and decisions of the REC

  • identify all instances where they may have a COI or other sensitivity and raise these with the ARC prior to the conflict occurring

  • ensure they adequately prepare for meetings to avoid unnecessary additional administrative costs and inconvenience to other committee members

  • be diligent in completing tasks allocated to them by the REC Chair

  • assign external peer reviewers where required

  • evaluate assigned material and allocate preliminary ratings to each UoE

  • contribute fully, constructively and dispassionately to all REC processes and, within the capacity of their expertise, take ownership of the collective decisions of the REC

  • exercise due skill and care in the performance of their responsibilities.
    3.5 Responsibilities of a REC Chair


The responsibilities of a REC Chair are to:

  • ensure that the REC operates within the policies, guidelines and procedures established by the ARC

  • abide by confidentiality and COI requirements

  • ensure that confidentiality is maintained for the deliberations and decisions of the REC

  • identify instances where there may be COI or other sensitivity and raise these with the ARC prior to conflict occurring

  • contribute fully, constructively and dispassionately to all REC processes and take ownership of the collective decisions of the REC

  • assign material to REC members for evaluation

  • evaluate their own assigned material and give preliminary ratings

  • ensure that evaluations are completed within agreed timeframes

  • chair the REC meeting to review preliminary ratings, and guide the REC to provide final ratings for quality separately for each UoE

  • ensure that REC members have an opportunity to contribute fully to the process and REC activities

  • ensure that REC decisions are documented

  • report on the results to the ARC

  • participate in a review at the conclusion of the REC meeting and report to the ARC on the evaluation processes undertaken by the REC.

In the event that a REC Chair is unable to perform some or all of these responsibilities, the ARC will appoint an Acting Chair from within the REC and will determine which of the REC Chair’s responsibilities the Acting Chair assumes. This will most commonly occur, for example, where the Chair has identified a COI and the ARC appoints an Acting Chair for the purposes of assigning material for evaluation.
    3.6 Responsibilities of a Peer Reviewer


The responsibilities of a peer reviewer are to:

  • evaluate assigned material and provide a report using the peer review template

  • be diligent in completing tasks allocated to them

  • exercise due skill and care in the performance of their responsibilities

  • identify instances where they may have a COI or other sensitivities, raise these with the ARC prior to conflict occurring and comply with the directions of the ARC relating to the management of COI

  • abide by confidentiality requirements.
    3.7 Review of ERA processes and feedback


Throughout their engagement for the purposes of ERA, REC members are invited and encouraged to comment on and provide feedback about all ERA processes. One of the outcomes of the evaluation meeting is that RECs will make recommendations for consideration by the ARC about future improvements for ERA processes. The ARC will also convene a meeting of REC Chairs at the conclusion of the evaluation phase for a range of purposes, including an overarching review of evaluation processes.
    3.8 ERA Scrutiny Committee


The ARC will appoint a Scrutiny Committee for ERA 2015 to:

  • scrutinise the processes followed by the RECs in assessing the ‘home’ UoE of each REC member. A REC member’s ‘home’ UoE would be the UoE associated with their institution and their primary four-digit FoR of expertise

  • scrutinise the outcome for each ‘home’ UoE with the benefit of relevant benchmark information from the ERA 2015 evaluations

  • provide a report to the ARC Chief Executive Officer (CEO) advising of any issues in relation to the evaluation outcomes.
    3.9 Confidentiality


REC members and peer reviewers are required to sign a confidentiality agreement with the ARC prior to their participation in ERA. The agreement covers all aspects of their work with ERA, and the agreement survives the conclusion of their engagement for the purposes of ERA.

REC members and peer reviewers may not contact researchers and/or institutions under any circumstances in relation to material that has been submitted for evaluation in ERA, or seek additional information from any sources. REC members and peer reviewers must not reveal details about any evaluation, deliberations or conclusions, at any time.


    3.10 Conflict of interest (COI)


A COI is any situation where a REC member or peer reviewer has an interest which conflicts, might conflict, or may be perceived to conflict with the interests of the implementation of ERA. Examples of COI include:

  • being employed by, or holding an adjunct or honorary appointment at, the institution whose submission is being assigned

  • having a close personal relationship with someone whose work is significantly incorporated in the UoE task being assigned for evaluation. This could include a partner, spouse, family member or close friend; enmity is also included in this category

  • being a close collaborator with someone whose work is significantly incorporated in the UoE task that is being assigned for evaluation. For example, where a REC member is a close collaborator with authors for 10% or more of the total outputs of a UoE, that would constitute a potential COI

  • other conflicts that a REC member will need to raise and have clarified, including financial interests (for example holding a company directorship, stock ownership or options, patents, royalties, consultancy or grant) which could lead to financial gain to a REC member in circumstances where they have access to information or are able to influence decision-making.

While most COIs will be determined before the assignment of evaluation tasks occurs, REC members and peer reviewers may encounter material with which they have a potential COI during evaluation and are required to declare any potential or actual COI as soon as practicable after it has been identified. In such circumstances, the ARC will address each instance on a case by case basis, usually by reassigning the material to another reviewer.

A REC member or a peer reviewer will never be involved in considerations about UoEs in any discipline from their own institution, or any institution with which they have a declared COI.

    3.11 Research Integrity and Research Misconduct


As specified in the ARC Research Integrity and Research Misconduct Policy, anyone engaged on ARC business, such as ARC College of Experts members, Research Evaluation Committee members, Selection Advisory Committee members, external assessors and contractors, is required to report alleged breaches of research integrity or research misconduct issues identified in relation to ARC-funded business to the ARC Research Integrity Officer.

The policy and contact details for the Research Integrity Officer are available on the ARC website.

Should you identify an alleged breach of research integrity or a research misconduct issue as part of your evaluation, please notify the ARC Research Integrity Officer. A Notification Form for an Allegation of Research Integrity Breach or Misconduct (Attachment A of the policy) can be used to report the allegation.

The Research Integrity Officer will refer the allegation to the relevant institution for investigation in accordance with the requirements of the Australian Code for the Responsible Conduct of Research. Sufficient information should be provided to enable the institution to progress an investigation into the allegation (if required).


    3.12 Other sensitivities


To be eligible for ERA, all research outputs must either be published or made publicly available in the ERA reference period. However, if any research material causes offence or serious concern to a REC member or peer reviewer, they are asked to raise their concern with the ARC as soon as practicable. In such cases the UoE would normally be reassigned.
      3.12.1 Commercially Sensitive research outputs


A research output that includes commercially sensitive information may be included as part of a submission provided the necessary permissions have been obtained. This will be flagged to RECs and peer reviewers.
      3.12.2 Culturally Sensitive research outputs


A research output that is culturally sensitive may be included as part of a submission provided that the ARC is appropriately advised of the sensitivities. This will be flagged to RECs and peer reviewers.
      3.12.3 Australian Government Security Classified research outputs


A research output that includes information classified in line with the Australian Government Protective Security Manual as either ‘In-Confidence’ or greater, or ‘Restricted’ or greater, cannot be included in a submission (this also includes outputs subsequently classified as ‘Sensitive’, ‘For Official Use Only’, or greater under the Australian Government Security Classification System).
    3.13 Assignment outside area of expertise


One of the REC members will be assigned as the principal reviewer for a UoE. The principal reviewer will take the lead role in the discussion of that UoE at the REC meeting.

There will also be a number of cross-REC assignments where REC Chairs will be able to draw on expertise from members outside their own REC. On such occasions, REC members may be asked to evaluate UoEs that do not appear to correspond directly with their area of expertise. REC members’ scholarly judgement and views are extremely valuable in the evaluation and moderation of these UoEs.


    3.14 Copyright


ERA REC members and peer reviewers have access to, and use of, relevant research outputs to conduct ERA peer review. Acting under section 183(1) of the Copyright Act 1968 (Cth), the Commonwealth of Australia, as represented by the ARC, has authorised each ERA REC member and peer reviewer to do acts comprised in the copyright of relevant material for the purposes of ERA. As a result, authorised REC members and peer reviewers may make all uses of relevant material that are necessary or convenient to enable their participation in ERA. The authorisation is strictly limited to their participation in ERA and will not extend to uses for any purpose unrelated to participation in ERA.

Access to research outputs is provided strictly for the purposes of conducting evaluation for ERA. REC members and peer reviewers are not permitted to reproduce or distribute the outputs for any purpose other than participation in ERA. To ensure appropriate protection of copyright material in ERA submissions, REC members and peer reviewers must at all times comply with the authorisation.


  4. The ERA Evaluation Process

    4.1 ERA phases


ERA 2015 consists of a number of phases, including Submission, Assignment, Evaluation and Reporting. Each of these phases is composed of a number of stages or activities. Table 2 below outlines the ERA 2015 phases and evaluation schedule.

Table 2: ERA phases and evaluation schedule

Submission
  Submission of data by eligible institutions to the ARC

Assignment
  REC Chairs assign UoEs to REC members
  REC members (principal reviewers) assign UoEs to peer reviewers

Evaluation
  Stage 1 (9 June to 27 July 2015): Preliminary individual evaluation of UoEs by REC members at the four-digit level, including peer review (where peer review is an identified indicator) of research outputs; evaluation of all assigned UoEs by peer reviewers
  Stage 2A (29 July to 8 September 2015): REC members’ moderation of four-digit evaluations and preliminary independent evaluation of UoEs at the two-digit level
  Stage 2B (10 September to 29 September 2015): REC members’ moderation of two-digit evaluations
  Stage 2C (1 October to 8 October 2015): REC members’ review of moderated four-digit and two-digit evaluations in preparation for the Stage 3 meeting
  Stage 3 (12 October to 16 October 2015): Meeting of all RECs to finalise recommended evaluation outcomes

Reporting
  State of Australian University Research 2015–2016: Volume 1 ERA National Report published

The various stages of the ERA 2015 Evaluation process are outlined in Figure 1 below.



Figure 1: ERA stages and activity


      4.1.1 Submission


Institutions will be given access to the ERA IT system, the System to Evaluate the Excellence of Research (SEER), to upload their ERA data. The data will be verified and validated to ensure that they meet the ERA requirements (see the ERA 2015 Submission Guidelines). The submitted data are used to construct a UoE for each four-digit and two-digit FoR code, which includes all relevant indicators for evaluation as well as the relevant national and international benchmarks.
      4.1.2 Assignment


At the conclusion of the submission phase, UoEs will be assigned to REC members by the REC Chair for evaluation, except in particular instances of identified COI, in which case an Acting Chair will be appointed by the ARC for the purposes of assignment.

Each four-digit UoE will be assigned to three REC members. REC members will automatically be assigned to two-digit UoEs based on their four-digit assignments. There will also be a number of cross-REC assignments where REC Chairs will be able to draw on expertise from members outside their own REC.

Each UoE will have a REC member appointed as principal reviewer who will take a lead role in discussion of the UoE at the Stage 3 Evaluation Meeting. Where peer review is identified as an indicator, external peer reviewers will be assigned for the purposes of constructing the peer review indicator.

REC Chairs and REC members should take account of identified COIs and workload when assigning UoEs for review or evaluation.

The appointed principal reviewer for a UoE will assign peer reviewers for that UoE. Assignment is based on peer reviewer expertise at the two- and four-digit FoR code level. Principal reviewers may also need to consider the expertise of peer reviewers at the six-digit FoR code level to ensure that evaluation is carried out by those with the appropriate expertise. This may be particularly relevant for Indigenous research. The ANZSRC provides alternative groupings to aid the understanding of research from different cultural perspectives which are unique to Australia and New Zealand. Appendix 8 provides a list of six-digit codes relating to Aboriginal and Torres Strait Islander Studies with the related four-digit codes and discipline grouping.

      4.1.3 Evaluation and moderation


Evaluation in ERA is conducted essentially online: REC members access the relevant data, indicators and peer review outputs (where peer review is an indicator) for each assigned UoE through SEER. REC members review the range of relevant indicators to reach a preliminary view (a rating with reference to the ERA rating scale, together with supporting text) about each UoE, and record that view in SEER prior to the REC meeting. Peer review is similarly conducted through SEER, and peer reviewers access the outputs nominated for peer review through SEER.

In the first instance, preliminary evaluations at the four-digit and two-digit levels are conducted independently by REC members. Evaluation is split across several stages as illustrated in Figure 1. In Stage 1, REC members undertake their initial evaluations of four-digit UoEs independently of each other. In Stage 2A, REC members have access to the evaluations of other REC members co-assigned to the same UoEs, allowing them an opportunity to reflect on their preliminary evaluations and providing an opportunity for moderation between REC members’ preliminary ratings.

Moderation is an integral process in ERA—it ensures that each evaluation is conducted as an exchange of views between experts in a discipline and their colleagues in other disciplines. This process promotes the consistent application of the ERA methodology across disciplines. In Stages 1 to 2C, moderation is conducted independently in SEER, with individual REC members considering their own evaluations in light of the posted ratings and comments of other reviewers of the same UoE. It does not involve direct communication with other reviewers.

At the conclusion of the online evaluation stages the RECs will convene to consider all of the preliminary evaluations and agree to final evaluation outcomes for each UoE. The final ratings are the decision of the entire REC and every UoE will be discussed by the REC as a committee, except where REC members are excluded due to an identified COI.

The ratings agreed by the RECs are final. The RECs will deliver their agreed final ratings to the ARC.


      4.1.4 Reporting


The ERA National Report will be produced by the ARC. The National Report will present a comprehensive assessment by discipline of the quality of research activity conducted in Australia’s higher education institutions. This report will provide information on the discipline-specific research activity of each eligible Australian higher education institution and the contribution of each discipline to the national landscape. In addition, the ARC will provide a range of information to individual institutions following the completion of the ERA 2015 evaluations, to assist further with their understanding of the ERA results.
  5. The ERA Indicators: Background

    5.1 Introduction to the ERA Indicator Suite


ERA is based on the principle of expert review informed by indicators. Quantitative and qualitative indicators present significant amounts of data in a readily accessible format. Many of the tabular indicator presentations are complemented by graphical presentations, which display the same data in a different format.

The indicator profiles in ERA serve several functions:



  • to summarise data within a UoE

  • to provide a mechanism for REC members to review subsets of data through drilldown menus

  • to understand how a UoE performs relative to other Australian institutions

  • to understand how a UoE performs relative to the world.

The ERA indicator suite has been developed to align with the research behaviours of each discipline. For this reason, there are differences in the selection of indicators. For example, some disciplines use citation analysis, while others use peer review of research outputs—peer review and citation analysis are not used in combination.



Figure 2 shows ERA 2015 indicators at a glance. Detailed information on which FoR codes use which indicators is available in the ERA 2015 Discipline Matrix (see Appendix 6).

Figure 2: ERA Indicators at a glance (flowchart of ERA 2015 indicators)

