UNIVERSITY RANKINGS – NEW METHOD
Sabah Al-Najjar

Mathematics and Statistics Department, UC – Zayed University

P.O. Box 19282, Dubai, UAE

Email: Sabah.alnajjar@zu.ac.ae



ABSTRACT
In the past few years university rankings have become an unavoidable part of academic life. They have created competition between institutions and have become a marketing tool, as well as a helpful and complementary factor in evaluating universities and their processes. Ranking is the term used for the rating and ordering of higher education institutions. It is controversial, and no ranking is absolutely objective, so any ranking should be treated with caution. A truly comprehensive worldwide ranking is impossible because of the huge differences among countries and the technical difficulty of obtaining comparable data.

Rankings focus not only on universities as a whole but also on particular fields such as teaching and research. Ranking lists originally began at the national level, within a clearly defined political and cultural region. International ranking lists offer highly condensed comparative information, but we might ask how useful and how reliable that information is, whether rankings affect decision-making and the behavior of students and universities, and whether they have influenced student admissions.

One thing is certain: the wider the boundary and the more diverse the setting, the less valid the ranking results. Political authorities and policy makers are usually interested in national and regional rankings.

The purpose of this work is to review the most popular existing ranking systems and their background, to help in understanding some of the pitfalls of university ranking, and to suggest a new objective, reliable and useful method for ranking universities such as those of the Gulf Cooperation Council (GCC).


Keywords: Assessment, Criteria, Indicators, Media, Methodology, Organizations, Research, Standardize, Systems, Variables.



I. BASIC APPROACHES FOR RANKING A UNIVERSITY
There are many approaches, based on a number of principles such as:

  • A multidimensional concept of university quality instead of a one-size-fits-all approach, one that takes into account the diversity of academic institutions, their missions and goals, as well as language and cultural specifics.

  • Separate measurement and presentation of individual indicators, which may be ranked separately, allowing for individual preferences instead of a single overall score.

  • Presentation of ranking results in rank groups (top, middle, bottom) instead of league tables.


II. RANKING METHODOLOGY
There are major differences in the ranking methodologies used:
- different definitions of university quality;
- different criteria and indicators to measure quality; and
- different weightings for each indicator.
For these reasons the ranking results are also very different; methodological decisions therefore play a crucial role in determining the rules to be used.

The ranking process starts with the collection of information from existing data sources, site visits, studies and research. The type and quantity of variables are then decided upon, and indicators and sub-indicators are constructed from the selected variables, standardized and weighted.



Finally, the calculation and the analysis are conducted, and the universities are sorted into ranking order. Indicators thus play a crucial role in determining the results of a ranking. These indicators and criteria usually take into consideration the following: scientific, pedagogic, administrative and socio-economic aspects; student-staff ratio; first-year students' scores in secondary school; teaching and research; library and computer spending; drop-out rate; student and faculty satisfaction; study conditions; employment prospects; and so on.
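As a compact illustration of the standardization and weighting step (using the scale-to-top normalization described later in Sections IV and V, in which the best performer on each indicator receives a score of 100), the overall score of a university can be written as

\[
s_{ik} = 100 \cdot \frac{x_{ik}}{\max_j x_{jk}}, \qquad S_i = \sum_k w_k \, s_{ik},
\]

where \(x_{ik}\) is the raw value of indicator \(k\) for university \(i\), \(s_{ik}\) the standardized score, \(w_k\) the weight given to indicator \(k\), and \(S_i\) the overall score by which the universities are sorted. The notation is introduced here only for illustration; other standardizations, such as z-scores, are equally possible.
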
III. RANKING ORGANIZATIONS
Rankings have a direct impact on the higher education system and have attracted public attention. Since several systems have been proposed to rank academic institutions, the organizations that produce rankings can be classified as follows:
(i) Magazines and media, where ranking is done primarily for commercial reasons, to attract customers. Examples include U.S. News & World Report, Newsweek, Maclean's Magazine, The Financial Times, The Times, the Times Higher Education Supplement (THES), The Economist, Der Spiegel, Le Nouvel Observateur, Libération, etc.
(ii) Educational authorities, for example the Teaching Quality Assessment and the Research Assessment Exercise in the UK, the scientific academies in the university sector in the Netherlands, the teaching and research ranking lists in Germany, the National Research Council's ranking of research doctorate programs in the USA, the Shanghai ranking lists in China, etc.
IV. THE MAIN CURRENT UNIVERSITY RANKINGS


(i) Institute of Higher Education, Shanghai Jiao Tong University. In 2004 Shanghai Jiao Tong University compiled an academic ranking of world universities. It ranks the top universities throughout the world according to a formula that takes into account five criteria: the number of Nobel Prizes and Fields Medals in mathematics won; highly cited researchers; articles published in Nature and Science; articles indexed in the Science Citation Index-Expanded and the Social Science Citation Index; and academic performance per faculty member at each university. For each criterion the highest-scoring institution is assigned a score of 100, and the other institutions are scored as a percentage of the top score. The scores for each indicator are weighted to arrive at a final overall score for an institution. However, this model uses only criteria related to research performance, on the basis of quantitative indicators.




(ii) The Times Higher Education Supplement (THES). THES depends heavily on surveys of thousands of experts and produces a worldwide ranking based on five qualitative and quantitative indicators: a worldwide peer review, international reputation among recruiters, the number of citations per faculty member, international research impact, and the proportion of international faculty. This model relies heavily on criteria related to research performance and international outlook.




(iii) The Sunday Times. Universities are ranked according to nine indicators: student satisfaction, teaching excellence, heads'/peer assessments, research quality, A/AS-level points, graduate unemployment, the proportion of firsts and 2:1s awarded, the student-staff ratio, and the drop-out rate.




(iv) The Leiden Ranking. Leiden University's Centre for Science and Technology Studies (CWTS) has developed ranking lists, based exclusively on bibliometric indicators, for the 100 European universities with the largest number of scientific publications. The ranking proposes four different indicators: the number of publications, the number of citations per publication (CPP), the total number of publications multiplied by the relative impact in the given field (P*CPP/FCSm), and the number of citations per publication divided by the average impact in the given field (CPP/FCSm); these indicators are written out at the end of this section.




(v) The Financial Times Ranking. The objective of the FT ranking is to produce a listing of the business schools that are producing the global managers of the 21st century. Four different rankings are published: Global MBA, Executive MBA, Executive Education and European Business Schools.

Some 15-20 criteria are used across the different FT rankings. The criteria vary, but they are mostly based on four general aspects: the career progress of alumni, the international focus of the program, the research strength of the faculty, and gender diversity among participants, faculty and board.

(vi) The CHE Ranking. In 2002 the most serious attempt to put a German university ranking system in place came to fruition. The Centre for Higher Education Development (CHE) compiled the study together with the German weekly magazine Stern. They looked at 242 nationally recognized universities and professional schools; more than 100,000 students and 10,000 professors took part in the process, and around 30 indicators were measured. Some variable data, such as student numbers, the average study duration and the number of graduations, were also considered. However, judgements on the quality of teaching and of specialist areas played a more decisive role than factors such as the atmosphere at the university or the library equipment.




(vii) Organization of the Islamic Conference (OIC) Rankings. The Statistical, Economic and Social Research and Training Centre for Islamic Countries (SESRTCIC), Ankara, carried out a study to develop a way to evaluate OIC universities, based on their research performance. The coverage included articles published in the period 2001-2006 in journals covered by the Institute for Scientific Information (ISI), viz. the Science Citation Index, the Science Citation Index-Expanded (SCI-Expanded) and the Social Science Citation Index (SSCI). The rankings were based on four indicators: a composite index of research quality, research performance, research volume, and the rate of growth of research quality.




(viii) U.S. News & World Report. The magazine publishes an 'America's Best Colleges' report annually. The rankings are based on two integral factors:

• data collected by U.S. News from educational institutions (via an annual survey or from the school's own website); and

• surveys of the opinions of faculty members and administrators at other institutions.





(ix) The Washington Monthly's annual college guide. The guide treats its categories as equally important, using equal weightings to calculate the final scores. The ranking relies on the following three categories and their sub-components:

• Community service
- military training corps, in particular the percentage of students enrolled in Army and Navy reserve officer training;
- the percentage of alumni serving in the Peace Corps;
- community service projects and the percentage of federal work-study grants allocated to them.

• A school's research
- the amount the institution spends on research;
- the number of Ph.D.s awarded in science and engineering;
- the number of undergraduate alumni who go on to earn a Ph.D. in various subjects.

• Social mobility (helping the poor)
The Shanghai Jiao Tong University ranking and the Times Higher Education Supplement ranking are the most publicly visible ranking systems.
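For reference, the four Leiden indicators described in item (iv) above can be written out as follows (the notation is introduced here only for illustration: \(P\) is the number of publications, \(C\) the total number of citations they receive, and \(FCSm\) the mean field citation score, i.e. the average impact in the given field):

\[
P, \qquad CPP = \frac{C}{P}, \qquad P \cdot \frac{CPP}{FCSm}, \qquad \frac{CPP}{FCSm}.
\]

A value of \(CPP/FCSm\) greater than 1 indicates that a university's publications are cited more often than the world average for their fields.
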
V. NEW METHOD OF RANKING
In an effort to address the many methodological problems of ranking lists, the International Ranking Expert Group (IREG) was founded in 2004 in Bucharest by the UNESCO European Centre for Higher Education (UNESCO-CEPES) and the Institute for Higher Education Policy in Washington, DC. At its second meeting, in Berlin in 2006, IREG set out principles of quality and good practice in the ranking of higher education institutions. These provide a framework for the elaboration and dissemination of rankings, whether national, regional or global in scope, which should ultimately lead to continuous improvement and refinement of the methodologies used to conduct them.

The many methods and hypotheses used in ranking universities have led people to mistrust their effectiveness and to ask whether they are even worth using, especially since the different ranking methods serve different purposes and rely on different indices. A more unified and systematic approach, following unbiased guidelines such as the UNESCO-CEPES recommendations, would be more reliable.

In order to identify excellence in educational and research performance among universities, we take IREG's recommendations as a framework and keep the following questions in mind:

Do young universities need a Nobel Prize winner or a Fields Medal in mathematics?

Do such awards have any relevant effect on the quality of learning, or do they simply serve the university's image?

How can we give the younger universities the chance to prove themselves and demonstrate their academic ability?

A new method is therefore proposed to evaluate universities in a more objective, reliable and useful way. The previous ranking methods do not appear to have sound validity, either in their methodology or in serving the many universities (including those in the GCC) fairly, and the most popular ranking methods depend heavily on research performance.

The goals of this new ranking are:


  • To serve as one method for assessing higher education inputs, processes and outputs. The ranking system should be a useful tool from a marketing perspective and complementary to the work of accrediting authorities and independent review agencies.



  • To have a sufficient level of clarity about its purpose and target group. The indicators should be designed around these objectives.



  • To recognize the diversity of institutions and their various missions and targets.



  • To take account of the historical, economic, cultural and linguistic aspects of the educational system being ranked.

The universities will be assessed and compared using three broad criteria: teaching, research and outreach (i.e. performance in community service). Each criterion in turn consists of several indicators. The table below shows the criteria, their indicators and the respective weightings.



Table: criteria, indicators and respective weightings for the new method

Teaching (weighting 50%)
- Ratios: budget-to-student, laboratory-to-budget and student-to-faculty ratios
- Student satisfaction: courses, quality of teaching, faculty
- Faculty and staff satisfaction: administration, transparency, salary
- Learning resources: library and computer budget, course load, specialized (cultural and global) courses
- Student grades at first enrollment; number of students who leave before completing their degree

Research (weighting 30%)
- Budget allocated both to research and to the university itself
- Articles published per faculty member
- Number of conferences and workshops, and the participation of faculty and staff in them

Outreach (weighting 20%)
- Percentage of graduates in employment after graduation
- Graduates proceeding to further education
- External links with other institutions, businesses, industries, government and organizations

We use the weigh-and-add approach. The overall score is determined from weighted criteria, each measured by sub-indicators that serve as proxies for academic and institutional quality. The weighted scores for the indicators are added to produce an overall score, and comparison of overall scores determines each university's position in the league table. The highest performer on each indicator is given a score of 100, and the remaining universities are scored as a percentage of the top scorer. The scores for each indicator are then weighted as shown above to arrive at a final overall score for the institution. After this final step of calculation and analysis, the universities are sorted into ranking order.
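A minimal sketch of this weigh-and-add calculation is given below. The university names and raw criterion scores are hypothetical, and the three criterion scores are assumed to have already been aggregated from their sub-indicators; only the 50/30/20 weightings are taken from the table above.

# Minimal sketch of the weigh-and-add approach (hypothetical data).

WEIGHTS = {"teaching": 0.50, "research": 0.30, "outreach": 0.20}

# Hypothetical raw criterion scores, assumed to be already aggregated
# from the sub-indicators listed in the table.
raw_scores = {
    "University A": {"teaching": 72.0, "research": 55.0, "outreach": 40.0},
    "University B": {"teaching": 65.0, "research": 80.0, "outreach": 35.0},
    "University C": {"teaching": 90.0, "research": 30.0, "outreach": 50.0},
}

def scale_to_top(scores):
    """Give the best performer on each criterion 100 and express the rest
    as a percentage of that top score."""
    scaled = {name: {} for name in scores}
    for criterion in WEIGHTS:
        top = max(values[criterion] for values in scores.values())
        for name, values in scores.items():
            scaled[name][criterion] = 100.0 * values[criterion] / top
    return scaled

def overall(scaled):
    """Weighted sum of the scaled criterion scores."""
    return {name: sum(WEIGHTS[c] * values[c] for c in WEIGHTS)
            for name, values in scaled.items()}

ranking = sorted(overall(scale_to_top(raw_scores)).items(),
                 key=lambda item: item[1], reverse=True)
for position, (name, score) in enumerate(ranking, start=1):
    print(f"{position}. {name}: {score:.1f}")

Running the sketch prints the universities in descending order of their weighted overall score, which is exactly the league-table ordering described above.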

VI. CONCLUSION

A truly comprehensive worldwide ranking is impossible because of the huge differences among countries and the technical difficulty of obtaining comparable data. A new method has therefore been proposed to evaluate universities in a more objective, reliable and useful way.

It is therefore advisable to start with local rankings before moving on to regional ones, and only then to aim at ranking in a global context. This would help us learn our strengths and weaknesses, which would in turn assist in closing the gaps that globalization exposes.

REFERENCES

Australian Universities Quality Agency (AUQA): Ranking of Higher Education Institutions, ISBN: 1 877090 57 3, Occasional Publications Series No: 6. http://www.auqa.edu.au/files/publications/ranking_of_higher_education_institutions_final.pdf


Christopher Morphew and Bruce Baker, On the Utility of National Datasets and Resource Cost Models for Estimating Faculty Instructional Costs in Higher Education, Journal of Education Finance, 33(1), 2007, pp. 20-48.


Goldstein, H. and Spiegelhalter, D.J., League Tables and Their Limitations: Statistical Issues in Comparisons of Institutional Performance, Journal of the Royal Statistical Society, Series A (Statistics in Society), 159(3), 1996, pp. 385-409.
Hanges, Paul J. and Lyon, Julie S., Relationship Between U.S. News and World Report's and the National Research Council's Ratings/Rankings of Psychology Departments, American Psychologist, 60(9), December 2005, pp. 1035-1037.
How the Guide Was Compiled, The Sunday Times, 23 September 2007.

Jonathan King (Public Affairs), We're No. 2! (Now What?) No, it's not the BCS standings. A new set of university rankings places Berkeley second worldwide. Should we shout the news from the rooftops, or put it gingerly back in Pandora's box?, UC Berkeley News, 2004.
John P.A. Ioannidis, Nikolaos A. Patsopoulos, Fotini K. Kavvoura, Athina Tatsioni, Evangelos Evangelou, Ioanna Kouri, Despina G. Contopoulos-Ioannidis and George Liberopoulos, International Ranking Systems for Universities and Institutions: A Critical Appraisal, BMC Medicine, 5:30, October 2007.
Kimberly D. Elsbach and Roderick M. Kramer, Members' Responses to Organizational Identity Threats: Encountering and Countering the Business Week Rankings, Administrative Science Quarterly, 41, September 1996, pp. 442-476.
N.C. Liu and Y. Cheng, Academic Ranking of World Universities: Methodologies and Problems, Higher Education in Europe, 30(2), 2005.
Michael Sauder and Ryon Lancaster, Do Rankings Matter? The Effects of U.S. News & World Report Rankings on the Admissions Process of Law Schools, Law & Society Review, 40(1), 2006.
Vivien A. Beattie and Alan Goodacre, A New Method for Ranking Academic Journals in Accounting and Finance, University of Stirling Accounting, Finance and Law Working Paper No. 01, 2005.

Academic Rankings of Universities in the OIC Countries, Statistical, Economic and Social Research and Training Centre for Islamic Countries (SESRTCIC), http://www.sesrtcic.org, 2007.

Academic Ranking of World Universities, http://en.wikipedia.org/wiki/Academic_Ranking_of_world_Universities.


"Academics strike back at spurious rankings", http://openaccess.eprints.org/index.php?/archives/251-guid.html
Berlin Principles on Ranking of Higher Education Institutions, www.cepes.ro/hed/meetings/berlin06/Berlin Principles.pdf
Clark Kerr, "The purpose of a university is to make students safe for ideas – not ideas safe for students", http://www.topuniversities.com/worlduniversityrankings/methodology/purpose_amp_app/


College and University Rankings, International Rankings, http://www.library.uiuc.edu/edx/rankint.htm
College and university rankings, http://en.wikipedia.org/wiki/College_and_university_rankings
Performance Ranking of Scientific papers for World Universities, http://ranking.heeact.edu.tw/en-us/2008/TOP/100
How do we identify Highly Cited Researchers?, http://isihighlycited.com/isi_copy/howweidentify.htm
Robert Morse and Sam Flanigan, How We Calculate the Rankings, http://www.usnews.com/articles/education/best-colleges/2008/08/21/how-we-calculat

Ranking Web of World Universities, www.mavir.net



New Method offers Better Way To Rank Universities, Researchers Say, University of Florida News, July 2000, http://news.ufl.edu/2000/07/26/top-us/
University Rankings, World Education News & Reviews, Volume 19, 4, 2006, http://www.wes.org/eWENR/06aug/practical.htm.
The Methodology: A simple overview, TOPUNIVERSITIES, http://www.topuniversities.com/worlduniversityrankings/methodology/simple_overview/
Ranking Web of World universities 2008, http://www.webometrics.info/top4000.asp
THES – QS World University Rankings, http://en.wikipedia.org/wiki/THES_University_Rankings
The Washington Monthly’s Annual College Guide, http://www.washingtonmonthly.com/features/2006/0609.collegeguide.html
U.S. News & World Report, http://en.wikipedia.org/wiki/US_News_&_World_Report




