Personal Research Database



85 (1), 203-217.

Full Text: 2010\Scientometrics85, 203.pdf

Abstract: This paper focuses attention on the ch-index, a recent bibliometric indicator similar to the Hirsch (h) index, to evaluate the published research output of a scientist (Ajiferuke and Wolfram, Proceedings of the 12th international conference of the international society for scientometrics and informetrics, Rio de Janeiro, pp. 798-808, 2009). The ch-index is defined as the number such that, for a general group of scientific publications, ch publications are cited by at least ch different citers while the other publications are cited by no more than ch different citers. The basic difference from the classical h is that, according to ch, the diffusion of one author’s publication is evaluated on the basis of the number of different citing authors (or citers), rather than the number of received citations. The goal of this work is to discuss the pros and cons of ch and identify its connection with h. A large sample of scientists in the Quality Engineering/Management field is analyzed so as to investigate the novel indicator’s characteristics. The analysis is then preliminarily extended to other scientific disciplines. The most important result is that ch is almost insensitive to self-citations and/or citations made by recurrent citers, and it can be profitably used to complement h.

Keywords: Author, Bibliometric, Bibliometric Indicators, Bibliometrics, Citations, Citers, Citing Authors, h-Index, Hirsch Index, Hirsch-Index, Impact, Journals, Quality, Recurrent Citers, Research, Science, Self-Citation, Self-Citations
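
Code Sketch: A minimal illustration of how a ch-style index, as defined in the abstract above, could be computed, assuming each publication is represented by the set of its distinct citing authors (the data model and function name are illustrative, not taken from the paper).

def ch_index(citer_sets):
    """Largest ch such that at least ch publications each have at least ch distinct citers."""
    # Count distinct citers per publication and sort in descending order.
    counts = sorted((len(s) for s in citer_sets), reverse=True)
    ch = 0
    for rank, citers in enumerate(counts, start=1):
        if citers >= rank:
            ch = rank
        else:
            break
    return ch

# Example: three papers cited by 4, 2 and 1 distinct citers give ch = 2.
papers = [{"a", "b", "c", "d"}, {"a", "e"}, {"f"}]
print(ch_index(papers))  # 2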

? Velden, T., Haque, A. and Lagoze, C. (2010), A new approach to analyzing patterns of collaboration in co-authorship networks: Mesoscopic analysis and interpretation. Scientometrics, 85 (1), 219-242.

Full Text: 2010\Scientometrics85, 219.pdf

Abstract: This paper focuses on methods to study patterns of collaboration in co-authorship networks at the mesoscopic level. We combine qualitative methods (participant interviews) with quantitative methods (network analysis) and demonstrate the application and value of our approach in a case study comparing three research fields in chemistry. A mesoscopic level of analysis means that, in addition to the basic analytic unit of the individual researcher as a node in a co-author network, we base our analysis on the observed modular structure of co-author networks. We interpret the clustering of authors into groups as bibliometric footprints of the basic collective units of knowledge production in a research specialty. We find two types of coauthor-linking patterns between author clusters that we interpret as representing two different forms of cooperative behavior: transfer-type connections due to career migrations or one-off services rendered, and stronger, dedicated inter-group collaboration. Hence the generic coauthor network of a research specialty can be understood as the overlay of two distinct types of cooperative networks between groups of authors publishing in a research specialty. We show how our analytic approach exposes field-specific differences in the social organization of research.

Keywords: Bibliometric, Chemistry, Co-Author Networks, Coauthorship, Community Structure, Complex Networks, Disciplines, Growth-Model, International Scientific Collaboration, Journal Literature, Manifestation, Network Analysis, Productivity, Research, Science, Scientific Communication
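
Code Sketch: The mesoscopic analysis described above rests on detecting the modular structure of a co-author network. A small sketch using networkx, with greedy modularity maximization standing in for whatever clustering method the authors actually used; the edge list is invented.

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy co-author network: an edge means two researchers co-authored at least
# one paper; the weight is the number of joint papers (illustrative data only).
edges = [("A", "B", 5), ("B", "C", 4), ("A", "C", 3),   # group 1
         ("D", "E", 6), ("E", "F", 2), ("D", "F", 4),   # group 2
         ("C", "D", 1)]                                  # weak inter-group tie
G = nx.Graph()
G.add_weighted_edges_from(edges)

# Cluster authors into groups by maximizing modularity.
groups = greedy_modularity_communities(G, weight="weight")
for i, members in enumerate(groups, start=1):
    print(f"group {i}: {sorted(members)}")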

? Aguillo, I.F., Bar-Ilan, J., Levene, M. and Ortega, J.L. (2010), Comparing university rankings. Scientometrics, 85 (1), 243-256.

Full Text: 2010\Scientometrics85, 243.pdf

Abstract: Recently there has been increasing interest in university rankings. Annual rankings of world universities are published by QS for the Times Higher Education Supplement, the Shanghai Jiao Tong University, the Higher Education Evaluation and Accreditation Council of Taiwan and, based on Web visibility, by the Cybermetrics Lab at CSIC. In this paper we compare these rankings using a set of similarity measures. For the rankings that have been published for a number of years, we also examine longitudinal patterns. The rankings limited to European universities are compared to the ranking of the Centre for Science and Technology Studies at Leiden University. The findings show that there are reasonable similarities between the rankings, even though each applies a different methodology. The biggest differences are between the rankings provided by the QS-Times Higher Education Supplement and the Ranking Web of the CSIC Cybermetrics Lab. The highest similarities were observed between the Taiwanese and the Leiden rankings for European universities. Overall, the similarities increase when the comparison is limited to European universities.

Keywords: Bibliometric Methods, Comparative Analysis, Leiden Ranking, Ranking, Shanghai Ranking, Taiwan Ranking, Times Ranking, Universities, Webometrics Ranking
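
Code Sketch: One simple way to quantify how similar two university rankings are, offered as an illustration rather than the exact similarity measures used in the paper: rank correlation and top-k overlap over the institutions the rankings share (the university names and positions are invented).

from scipy.stats import spearmanr

# Hypothetical positions of the same five universities in two rankings.
ranking_a = {"Uni1": 1, "Uni2": 2, "Uni3": 3, "Uni4": 4, "Uni5": 5}
ranking_b = {"Uni1": 2, "Uni2": 1, "Uni3": 5, "Uni4": 3, "Uni5": 4}

common = sorted(set(ranking_a) & set(ranking_b))
rho, p_value = spearmanr([ranking_a[u] for u in common],
                         [ranking_b[u] for u in common])
print(f"Spearman rank correlation: {rho:.2f} (p = {p_value:.3f})")

# Overlap of the top-k lists is another simple similarity measure.
k = 3
top_a = {u for u, r in ranking_a.items() if r <= k}
top_b = {u for u, r in ranking_b.items() if r <= k}
print(f"top-{k} overlap: {len(top_a & top_b) / k:.2f}")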

? Fu, L.D. and Aliferis, C.F. (2010), Using content-based and bibliometric features for machine learning models to predict citation counts in the biomedical literature. Scientometrics, 85 (1), 257-270.

Full Text: 2010\Scientometrics85, 257.pdf

Abstract: The most popular method for judging the impact of biomedical articles is the citation count, i.e., the number of citations received. The most significant limitation of the citation count is that it cannot evaluate articles at the time of publication, since citations accumulate over time. This work presents computer models that accurately predict citation counts of biomedical publications within a deep horizon of 10 years using only predictive information available at publication time. Our experiments show that it is indeed feasible to accurately predict future citation counts with a mixture of content-based and bibliometric features using machine learning methods. The models pave the way for practical prediction of the long-term impact of publications, and their statistical analysis provides greater insight into citation behavior.

Keywords: Bibliometric, Bibliometrics, Citation Analysis, Information Retrieval, Machine Learning, Text Categorization
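
Code Sketch: A minimal sketch of a model that combines content-based and bibliometric features to predict citation counts, in the spirit of the abstract above; the features, toy data and the choice of a random-forest regressor are assumptions for illustration, not the authors' actual models.

import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import Pipeline

# Toy training data: abstract text plus simple bibliometric features
# (all values are invented for illustration).
df = pd.DataFrame({
    "abstract": ["gene expression in cancer cells",
                 "randomized trial of a new drug",
                 "protein folding simulation methods",
                 "survey of hospital readmission rates"],
    "n_authors": [3, 7, 2, 5],
    "journal_if": [4.1, 12.3, 2.0, 1.5],
    "citations_10y": [55, 210, 18, 9],   # target: citations after 10 years
})

features = ColumnTransformer([
    ("text", TfidfVectorizer(), "abstract"),                 # content-based features
    ("nums", "passthrough", ["n_authors", "journal_if"]),    # bibliometric features
])
model = Pipeline([("features", features),
                  ("regressor", RandomForestRegressor(random_state=0))])
model.fit(df, df["citations_10y"])
print(model.predict(df.head(1)))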

? Gomez-Sancho, J.M. and Mancebon-Torrubia, M.J. (2010), A new approach to measuring scientific production in JCR journals and its application to Spanish public universities. Scientometrics, 85 (1), 271-293.

Full Text: 2010\Scientometrics85, 271.pdf

Abstract: Scientific production has been evaluated from very different perspectives, the best known of which are essentially based on the impact factors of the journals included in the Journal Citation Reports (JCR). This has been no impediment to the simultaneous issuing of warnings regarding the dangers of their indiscriminate use when making comparisons. This is because the biases incorporated in the elaboration of these impact factors produce significant distortions, which may invalidate the results obtained. Notable among such biases are those generated by the differences in the propensity to cite of the different areas, journals and/or authors, by variations in the period of materialisation of the impact and by the varying presence of knowledge areas in the sample of reviews contained in the JCR. While the traditional evaluation method consists of standardisation by subject categories, recent studies have criticised this approach and offered new possibilities for making inter-area comparisons. In view of such developments, the present study proposes a novel approach to the measurement of scientific activity, in an attempt to lessen the aforementioned biases. This approach consists of combining the employment of a new impact factor, calculated for each journal, with the grouping of the institutions under evaluation into homogeneous groups. An empirical application is undertaken to evaluate the scientific production of Spanish public universities in the year 2000. This application considers both the articles published in the multidisciplinary databases of the Web of Science (WoS) and the data concerning the journals contained in the Sciences and Social Sciences Editions of the Journal Citation Report (JCR). All this information is provided by the Institute of Scientific Information (ISI), via its Web of Knowledge (WoK).

Keywords: Accuracy, Citation, Citation Analysis, Cross-Field, Field-Normalization, Impact Factors, Indicators, ISI, Journal Impact Factor, Performance, Research Evaluation, Universities

? Bookstein, F.L., Seidler, H., Fieder, M. and Winckler, G. (2010), Too much noise in the Times Higher Education rankings. Scientometrics, 85 (1), 295-299.

Full Text: 2010\Scientometrics85, 295.pdf

Abstract: Several individual indicators from the Times Higher Education Survey (THES) database (the overall score, the reported staff-to-student ratio, and the peer ratings) demonstrate unacceptably high fluctuation from year to year. The inappropriateness of the summary tabulations for assessing the majority of the “top 200” universities would be apparent purely for reason of this obvious statistical instability, regardless of other grounds of criticism. There are far too many anomalies in the change scores of the various indices for them to be of use in the course of university management.

Keywords: Rankings, Statistical Noise, Times Higher Education Ranking
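
Code Sketch: The year-to-year instability the authors point to can be made concrete with a simple check of how much an indicator jumps between consecutive years; the scores below are invented, not THES data.

import statistics

# Hypothetical overall scores for one university in consecutive editions.
scores = {2005: 61.2, 2006: 74.8, 2007: 58.9, 2008: 70.1}

years = sorted(scores)
changes = [scores[b] - scores[a] for a, b in zip(years, years[1:])]
print("year-to-year changes:", [round(c, 1) for c in changes])
print("standard deviation of changes:", round(statistics.stdev(changes), 1))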

? Zyczkowski, K. (2010), Citation graph, weighted impact factors and performance indices. Scientometrics, 85 (1), 301-315.

Full Text: 2010\Scientometrics85, 301.pdf

Abstract: A scheme for evaluating the impact of a given scientific paper based on the importance of the papers quoting it is investigated. Introducing a weight for a given citation, dependent on the previous scientific achievements of the author of the citing paper, we define the weighting factor of a given scientist. Technically, the weighting factors are defined by the components of the normalized leading eigenvector of the matrix describing the citation graph. The weighting factor of a given scientist, reflecting the scientific output of other researchers quoting his work, allows us to define the weighted number of citations of a given paper, the weighted impact factor of a journal and the weighted Hirsch index of an individual scientist or of an entire scientific institution.

Keywords: Citation, Citation Graph, Citations, Eigenvector, Google, h-Index, Hirsch Index, Hirsch-Index, Pagerank, Performance Index, Science, Self-Citations, Weighted Bibliometric Indices
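
Code Sketch: The weighting factors described above are the components of the normalized leading eigenvector of a citation-graph matrix. A minimal numerical sketch using power iteration; the toy matrix and the way weighted citations are combined are illustrative assumptions, not the paper's exact construction.

import numpy as np

# Toy author-level citation matrix: C[i, j] = number of times author j cites author i.
C = np.array([[0, 2, 1],
              [3, 0, 0],
              [1, 1, 0]], dtype=float)

# Power iteration to approximate the leading eigenvector.
w = np.ones(C.shape[0]) / C.shape[0]
for _ in range(1000):
    w_next = C @ w
    w_next /= w_next.sum()          # normalize so the weights sum to 1
    if np.allclose(w, w_next, atol=1e-12):
        break
    w = w_next

print("weighting factors:", np.round(w, 3))

# Weighted citation count of a paper cited by authors 0 and 2 (hypothetical).
print("weighted citations:", round(w[0] + w[2], 3))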

? Wiles, L., Olds, T. and Williams, M. (2010), Evidence base, quantitation and collaboration: three novel indices for bibliometric content analysis. Scientometrics, 85 (1), 317-328.

Full Text: 2010\Scientometrics85, 317.pdf

Abstract: Bibliometric measurements, though controversial, are useful in providing measures of research performance in a climate of research competition and marketisation. Numerous bibliometric studies have been performed which rely on traditional indices (such as the journal impact factor and citation index) and provide little descriptive data regarding the actual characteristics of research. The purpose of this study was two-fold: to develop three novel bibliometric indices designed to describe the characteristics of research (relating to evidence base, quantitation and collaboration), and to apply them in a cross-sectional audit of original research articles published in Australian professional association journals across medicine, nursing and allied health in 2007. Results revealed considerable variation in bibliometric indices across these journals. There were emerging clusters of journals that published collaborative research using higher levels of evidence and reported quantitative data, with others featuring articles using lower levels of evidence, fewer quantitative data and less collaboration among authors.

Keywords: Allied Health Occupations, Authorship, Bibliometric, Bibliometrics, Gender, Health Research, Medicine, Nursing Health Occupations, Professional Practice, Research

? Albarran, P., Crespo, J.A., Ortuno, I. and Ruiz-Castillo, J. (2010), A comparison of the scientific performance of the US and the European union at the turn of the 21st century. Scientometrics, 85 (1), 329-344.

Full Text: 2010\Scientometrics85, 329.pdf

Abstract: In this paper, scientific performance is identified with the impact that journal articles have through the citations they receive. In 15 disciplines, as well as in all sciences as a whole, the EU share of total publications is greater than that of the U.S. However, as soon as the citations received by these publications are taken into account the picture is completely reversed. Firstly, the EU share of total citations is still greater than that of the U.S. in only seven fields. Secondly, the mean citation rate in the U.S. is greater than in the EU in every one of the 22 fields studied. Thirdly, since standard indicators, such as normalized mean citation ratios, are silent about what takes place in different parts of the citation distribution, this paper compares the publication shares of the U.S. and the EU at every percentile of the world citation distribution in each field. It is found that in seven fields the initial gap between the U.S. and the EU widens as we advance towards the more cited articles, while in the remaining 15 fields (except for Agricultural Sciences) the U.S. always surpasses the EU when it counts, namely at the upper tail of citation distributions. Finally, for all sciences as a whole the U.S. publication share becomes greater than that of the EU for the top 50% of the most highly cited articles. The data used refer to 3.6 million articles published in 1998-2002, and the more than 47 million citations they received in 1998-2007.

Keywords: Bibliometric Tools, Citation, Citation Analysis, European Paradox, Indicators, National Research Performance, Policy, Research Performance, Science-and-Technology, Scientific Ranking, US, World
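
Code Sketch: The percentile-by-percentile comparison of publication shares described above can be sketched as follows; the citation counts are toy data, not the 3.6-million-article dataset used in the paper.

import numpy as np

# Hypothetical citation counts of individual articles, tagged by region.
us_citations = np.array([0, 1, 3, 8, 15, 40, 120])
eu_citations = np.array([0, 0, 2, 4, 9, 22, 60, 75])
world = np.concatenate([us_citations, eu_citations])

for pct in (50, 75, 90):
    threshold = np.percentile(world, pct)
    us_top = (us_citations >= threshold).sum()
    eu_top = (eu_citations >= threshold).sum()
    total = us_top + eu_top
    print(f"top {100 - pct}% of the world citation distribution: "
          f"US share {us_top / total:.2f}, EU share {eu_top / total:.2f}")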

? Lewison, G. and Turnbull, T. (2010), News in brief and features in New Scientist magazine and the biomedical research papers that they cite, August 2008 to July 2009. Scientometrics, 85 (1), 345-359.

Full Text: 2010\Scientometrics85, 345.pdf

Abstract: New Scientist is a British weekly magazine that is half-way between a newspaper and a scientific journal. It has many news items, and also longer feature articles, both of which cite biomedical research papers, and thus serve to make them better known to the public and to the scientific community, mainly in the UK but about half overseas. An analysis of these research papers shows (in relation to their presence in the biomedical research literature) a strong bias towards the UK, and also towards the USA, Scandinavia and Ireland. There is a reasonable spread of subject areas, although neuroscience is favoured, and coverage of many journals, not just the leading weeklies. Most of the feature articles (but not the news items) in New Scientist include comments by other researchers, who can put the new results in context. Their opinions appear to be more discriminating than those of commentators on research in the mass media, who usually enthuse over the results while counselling patience before a cure for the disease is widely available.

Keywords: Cancer, Cited Papers, Coverage, Health Research, Impact, Media, News Stories, Newspapers, Popular Science Writing, Press, Research, Risks, SARS

? Kaur, H. and Gupta, B.M. (2010), Mapping of dental science research in India: A scientometric analysis of India’s research output, 1999-2008. Scientometrics, 85 (1), 361-376.

Full Text: 2010\Scientometrics85, 361.pdf

Abstract: The study examines India’s performance based on its publication output in dental sciences during 1999-2008, considering several parameters, including the country’s annual average growth rate, global publication share and rank among the 25 most productive countries of the world, national publication output and impact in terms of average citations per paper, international collaboration output and share and the contribution of major collaborative partners, the contribution and impact of the top 25 Indian institutions and the top 15 most productive authors, patterns of communication in national and international journals, and the characteristics of its 45 highly cited papers. The study uses 10 years (1999-2008) of publication data in dental sciences for India and other countries, drawn from the Scopus international multidisciplinary bibliographic database.

Keywords: Dental Citations, Dental Publications, Dental Research, Research, Scientometric Analysis

? Ortega, J.L. and Aguillo, I.F. (2010), Shaping the European research collaboration in the 6th Framework Programme health thematic area through network analysis. Scientometrics, 85 (1), 377-386.

Full Text: 2010\Scientometrics85, 377.pdf

Abstract: This paper aims to analyse the collaboration network of the 6th Framework Programme of the EU, specifically the “Life sciences, genomics and biotechnology for health” thematic area. A collaboration network of 2,132 participant organizations was built, and several variables, such as type of organization and nationality, were added to improve the visualization. Several statistical tests and structural indicators were used to uncover the main characteristics of this collaboration network. Results show that the network is constituted by a dense core of government research organizations and universities which act as large hubs that attract new partners to the network, mainly companies and non-profit organizations.

Keywords: 6th Framework Programme, Biotechnology, Centrality, Emergence, Network Analysis, Research, Research Collaboration, Science, Scientometrics, Web
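
Code Sketch: A small example of the kind of structural indicator that exposes the hub organizations described above, using degree centrality in networkx; the organizations and edges are invented.

import networkx as nx

# Toy collaboration network: an edge means two organizations participated
# in at least one joint project (illustrative data only).
G = nx.Graph()
G.add_edges_from([("UnivA", "CompX"), ("UnivA", "GovLab"), ("UnivA", "NGO1"),
                  ("GovLab", "CompY"), ("GovLab", "UnivB"), ("UnivB", "CompX")])

# Degree centrality highlights hubs that attract many partners.
centrality = nx.degree_centrality(G)
for org, c in sorted(centrality.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{org}: {c:.2f}")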

? Calver, M., Wardell-Johnson, G., Bradley, S. and Taplin, R. (2010), What makes a journal international? A case study using conservation biology journals. Scientometrics, 85 (2), 387-400.

Full Text: 2010\Scientometrics85, 387.pdf

Abstract: The qualitative label ‘international journal’ is used widely, including in national research quality assessments. We determined the practicability of analysing internationality quantitatively using 39 conservation biology journals, providing a single numeric index (IIJ) based on 10 variables covering the countries represented in the journals’ editorial boards, authors and authors citing the journals’ papers. A numerical taxonomic analysis refined the interpretation, revealing six categories of journals reflecting distinct international emphases not apparent from simple inspection of the IIJs alone. Categories correlated significantly with journals’ citation impact (measured by the Hirsch index), with their rankings under the Australian Commonwealth’s ‘Excellence in Research for Australia’ and with some countries of publication, but not with listing by ISI Web of Science. The assessments do not reflect on quality, but may aid editors planning distinctive journal profiles, or authors seeking appropriate outlets.

Keywords: Bibliometrics, Citation, Citation Studies, Conservation Biology, Hirsch Index, International Journal, ISI, Journal Ranking, Journals, Paper, Perspectives, Publication, Research
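
Code Sketch: The abstract does not give the formula behind the IIJ, so the function below only illustrates the general idea: averaging the share of foreign countries across several dimensions of a journal (editorial board, authors, citing authors). The scoring rule and the data are assumptions, not the paper's index.

def internationality(country_lists, home_country):
    # Mean share of non-home-country entries across the supplied dimensions
    # (an illustrative score, not the paper's IIJ).
    shares = []
    for countries in country_lists:
        foreign = sum(1 for c in countries if c != home_country)
        shares.append(foreign / len(countries))
    return sum(shares) / len(shares)

# Hypothetical journal: countries of editorial board members, authors and citers.
board = ["US", "US", "UK", "AU", "DE"]
authors = ["US", "US", "US", "CN", "BR", "UK"]
citers = ["US", "CN", "DE", "DE", "IN", "UK", "US"]
print(round(internationality([board, authors, citers], "US"), 2))  # 0.6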

? Breimer, L.H. and Nilsson, T.K. (2010), A longitudinal and cross-sectional study of Swedish biomedical PhD processes 1991-2009 with emphasis on international and gender aspects. Scientometrics, 85 (2), 401-414.

Full Text: 2010\Scientometrics85, 401.pdf

Abstract: This longitudinal survey of Swedish biomedical PhDs from 1991 to 2009 found a 2.5-fold increase in biomedical PhD graduates, especially women, and mainly non-MDs, while the number of MDs remained fairly constant. The proportion obtaining a biomedical PhD in Sweden in 2006 was two and a half times that in the USA relative to population, and three and a half times relative to GDP, but similar to that of the Netherlands. Female non-MD but not female MD candidates were more likely than men to be examined by female examiners. Fewer of the non-MD than the MD women continued to publish in English after their PhD. The median number of authors per paper in a thesis had increased by one (from 4 to 5) compared with 15-20 years ago. Swedish biomedical research was already well internationalized in 1991, when 38% of the external examiners came from abroad. This rose to 53% in 2003 but by 2009 had returned to 42%. The USA and the UK were the most common countries, while Australia accounted for 2%. When assessed by connection with foreign research teams, Swedish researchers were also internationally well connected. Studies in other countries are needed to assess how generally applicable these findings are. Our findings suggest that the policy and management of Swedish scientific research systems need revision to harmonize with national economic capacity.

Keywords: Bibliometrics, Cross-Border Comparisons, Gender Issues, Higher Education Performance Indicators, Internationalization of Research, PhD Process, Research, Researchers, Thesis

? Assimakis, N. and Adam, M. (2010), A new author’s productivity index: p-index. Scientometrics, 85 (2), 415-427.

Full Text: 2010\Scientometrics85, 415.pdf

Abstract: In this paper a new author productivity index, the golden productivity index, is introduced. The proposed index measures the productivity of an individual researcher by evaluating the number of papers as well as the rank of co-authorship. It provides an efficient method to measure an author’s contribution to the writing of articles, compared to other common methods. It gives emphasis to the first author’s contribution, since traditionally the rank of each author reflects the magnitude of his or her contribution to the article.

Keywords: Articles, Author, Author Rank, Citation Measures, Co-Authorship, Coauthors, Collaboration, Contribution, Credit, Metrics, Multiple Authorship, Order, P-Index, Patterns, Productivity, Publication, Scientists
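
Code Sketch: The abstract does not spell out the formula of the golden productivity index, so the sketch below only illustrates the general idea of a productivity score that rewards first authorship; the 1/rank weights are an assumption for illustration, not the paper's definition of the p-index.

def rank_weighted_productivity(author_positions):
    # Each paper contributes 1/position, so first authors earn full credit
    # and later-listed authors progressively less (assumed weighting).
    return sum(1.0 / pos for pos in author_positions)

# Hypothetical record: first author on two papers, third author on one.
print(rank_weighted_productivity([1, 1, 3]))  # 2.33...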

? Juznic, P., Peclin, S., Zaucer, M., Mandelj, T., Pusnik, M. and Demsar, F. (2010), Scientometric indicators: Peer-review, bibliometric methods and conflict of interests. Scientometrics, 85 (2), 429-441.

Full Text: 2010\Scientometrics85, 429.pdf

Abstract: The paper discusses the role of scientometric indicators in peer-review selection of research project proposals. An ex post facto evaluation was made of three calls for research project proposals in Slovenia: 2003 with a peer review system designed in a way that conflict of interest was not avoided effectively, 2005 with a sound international peer-review system with minimized conflict of interest influence but a limited number of reviewers, and 2008 with a combination of scientometric indicators and a sound international peer review with minimized conflict of interest influence. The hypothesis was that the three different peer review systems would have different correlations with the same set of scientometric indicators. In the last two decision-making systems (2005 and 2008) where conflict of interest was effectively avoided, we have a high percentage (65%) of projects that would have been selected in the call irrespective of the method (peer review or bibliometrics solely). In contrast, in the 2003 call there is a significantly smaller percentage (49%) of projects that would have been selected in the call irrespective of the method (peer review or bibliometrics solely). It was shown that while scientometric indicators can hardly replace the peer-review system as the ultimate decision-making and support system, they can reveal its weaknesses on one hand and on the other can verify peer-review scores and minimize conflict of interest if necessary.

Keywords: Bibliometric, Bibliometrics, Citation, Conflict Of Interests, Counts, Evaluation, EX Post Evaluation, Exercises, Impact, Peer Review Systems, Physics, Publications, Research, Research Project Proposals, Science Policy, Scientific Excellence, Scientometric Indicators, System
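
Code Sketch: The overlap statistic reported above (the share of projects that would have been selected irrespective of the method) can be computed as a simple set intersection; the project identifiers below are made up.

# Hypothetical project proposals selected by each method.
selected_by_peer_review = {"P01", "P02", "P03", "P05", "P08"}
selected_by_bibliometrics = {"P01", "P02", "P04", "P05", "P09"}

both = selected_by_peer_review & selected_by_bibliometrics
overlap = len(both) / len(selected_by_peer_review)
print(f"selected irrespective of the method: {overlap:.0%}")  # 60%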

? Lancho-Barrantes, B.S., Guerrero-Bote, V.P. and Moya-Anegon, F. (2010), The iceberg hypothesis revisited. Scientometrics,


