35 (1), 133-154.
Full Text: 1996\Scientometrics35, 133.pdf
Abstract: Systemic analyses of national research systems are now within the reach of bibliometricians. By systemic we mean comprehensive, time series, institutionally based, sectoral level analyses of national research output. This paper describes such an analysis for the UK, a system comprising 8% of world scientific output. The paper analyses publishing size and the number of publishing institutions for each sector. Then each sector’s intra-sectoral, inter-sectoral and international collaboration is assessed. The paper then examines the data by field, looking at sector publishing profiles across fields, and at how the collaborative patterns vary between fields. It concludes with a summary profile of each institutional sector.
Keywords: Collaboration, Publishing, Research, Science
Schubert, A. (1996), Scientometrics: A citation based bibliography, 1990. Scientometrics, 35 (1), 155-163.
Full Text: 1996\Scientometrics35, 155.pdf
Merton, R.K. (1996), Untitled. Scientometrics, 35 (2), U3.
Full Text: 1996\Scientometrics35, U3.pdf
Glänzel, W., Katz, S., Moed, H. and Schoepflin, U. (1996), Proceedings of the Workshop on ‘Bibliometric Standards’, Rosary College, River Forest, Illinois (USA), Sunday, June 11, 1995. Scientometrics, 35 (2), 165-166.
Full Text: 1996\Scientometrics35, 165.pdf
Keywords: Illinois, USA
Glänzel, W. (1996), The need for standards in bibliometric research and technology. Scientometrics, 35 (2), 167-176.
Full Text: 1996\Scientometrics35, 167.pdf
Abstract: The need for standardisation in bibliometric research and technology is discussed in the context of failing communication within the scientific community, the unsatisfactory impact of bibliometric research outside the community and the observed incompatibility of bibliometric indicators produced by different institutes. The development of bibliometric standards is necessary to improve the reliability of bibliometric results, to guarantee the validity of bibliometric methods and to make bibliometric data compatible. Both conceptual and technical questions are raised. Consequences of the lack of standards are illustrated with typical examples. Finally, particular topics for standardisation are proposed, based on experience gained at ISSRU.
Keywords: Bibliometric, Bibliometric Indicators, Bibliometric Methods, Bibliometric Research, Communication, Community, Development, Indicators, Methods, Reliability, Research, Standards, Technology, Validity
Moed, H.F. (1996), Differences in the construction of SCI based bibliometric indicators among various producers: A first overview. Scientometrics, 35 (2), 177-191.
Full Text: 1996\Scientometrics35, 177.pdf
Abstract: This contribution discusses basic technical-methodological issues with respect to data collection and the construction of bibliometric indicators, particularly at the macro or meso level. It focuses on the use of the Science Citation Index. Its aim is to highlight important decisions that have to be made in the process of data collection and the construction of bibliometric indicators. It illustrates differences in the methodologies applied by several important producers of bibliometric indicators: the Institute for Scientific Information (ISI), CHI Research, Inc., the Information Science and Scientometrics Research Unit (ISSRU) in Budapest, and the Centre for Science and Technology Studies (CWTS) at Leiden University. The observations made in this paper illustrate the complexity of the process of ‘standardisation’ of bibliometric indicators. Moreover, they provide possible explanations for the divergence of results obtained in different studies. The paper concludes with a few general comments related to the need for ‘standardisation’ in the field of bibliometrics.
Keywords: Basic Research, Bibliometric, Bibliometric Indicators, Bibliometrics, Citation, Complexity, First, Indicators, Institute for Scientific Information, ISI, SCI, Science Citation Index, Scientometrics
Katz, J.S. (1996), Bibliometric standards: Personal experience and lessons learned. Scientometrics, 35 (2), 193-197.
Full Text: 1996\Scientometrics35, 193.pdf
Abstract: Bibliometric standards are essential for comparative research. However, these standards cannot be set by committee but must evolve through an ongoing debate. Perhaps the scientometric community needs a refereed forum, dedicated more to methodological issues than to policy matters, in which the standards debate can proceed in a focused and professional manner.
Keywords: Community, Needs, Policy, Research, Standards
Bourke, P. and Butler, L. (1996), Standards issues in a national bibliometric database: The Australian case. Scientometrics, 35 (2), 199-207.
Full Text: 1996\Scientometrics35, 199.pdf
Abstract: In recent years researchers in the Performance Indicators Project at the Australian National University have undertaken a number of projects involving collaboration with colleagues in England or attempts to replicate results obtained by others. All projects have necessitated close scrutiny of the methodologies previously used or to be used, and have made clear the urgent need for comparable standards. In this paper we have focused on two projects: one, an analysis of Australia’s shares of publications and citations, where we sought to learn from the debate on methodology that surrounded the question of the decline in British science; the second, an analysis of astronomy publications in Australia, where we sought to replicate the methodology used in a previous European study.
Keywords: Analysis, Australia, Bibliometric, Citations, Collaboration, Database, England, Methodology, Publications, Science, Standards
Zitt, M. and Teixeira, N. (1996), Science macro-indicators: Some aspects of OST experience. Scientometrics, 35 (2), 209-222.
Full Text: 1996\Scientometrics35, 209.pdf
Abstract: We report on OST’s experience in producing macro-indicators, especially on academic science and ISI sources. This task requires a combination of organizational choices for data handling and processing and of bibliometric choices for selecting indicators appropriate to the missions. Both aspects are briefly studied: the OST database, which also contains non-bibliometric datasets, is organized on the relational principle (RDBMS). The bibliometric indicators selected are classical ones, with a stress on overall coherence. In conclusion, the standardization issue is briefly discussed. Standardization may not be desirable to the same extent for different targets (data, nomenclatures, indicators, procedures, etc.) and must not hinder further research. A natural process of communication and explication may also lead to fruitful convergences, without freezing supposed ‘best ways’.
Keywords: Bibliometric, Bibliometric Assessment, Communication, Database, Indicators, ISI, Journals, Lead, Procedures, Research, Science, Set, Stress, UK Scientific Performance
Gomez, I., Bordons, M., Fernandez, M.T. and Mendez, A. (1996), Coping with the problem of subject classification diversity. Scientometrics, 35 (2), 223-235.
Full Text: 1996\Scientometrics35, 223.pdf
Abstract: The delimitation of a research field in bibliometric studies presents the problem of the diversity of subject classifications used in the sources of input and output data. Classification of documents according to thematic codes or keywords is the most accurate method, mainly used in specialised bibliographic or patent databases. Classification of journals into disciplines presents lower specificity and some shortcomings, such as the change over time of both journals and disciplines and the increasing interdisciplinarity of research. Differences in the criteria on which input and output data classifications are based make it necessary to aggregate data in order to match them. Standardization of subject classifications emerges as an important point in bibliometric studies in order to allow international comparisons, although flexibility is needed to meet the needs of local studies.
Keywords: Bibliometric, Bibliometric Analysis, Bibliometric Studies, Classification, Criteria, Flexibility, Interdisciplinarity, International, ISI, Journals, Needs, Patent, Publications, Research, SCI 1984-89, Spanish Pharmacologists
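Note: a minimal sketch of the aggregation step described in the abstract, assuming two hypothetical classification schemes (an input funding scheme and an output journal-category scheme) mapped onto a common, coarser set of fields; all codes, category names and field labels are illustrative and not those used in the study.

# Hypothetical crosswalks from source-specific subject codes to common fields.
INPUT_SCHEME = {"B120": "Life Sciences", "B340": "Life Sciences", "C210": "Chemistry"}
OUTPUT_SCHEME = {
    "Biochemistry & Molecular Biology": "Life Sciences",
    "Pharmacology & Pharmacy": "Life Sciences",
    "Chemistry, Organic": "Chemistry",
}

def to_common_field(code, scheme):
    """Map a source-specific code or journal category to the common field."""
    return scheme.get(code, "Unclassified")

# Input (funding) and output (publication) records can now be matched at field level.
print(to_common_field("B340", INPUT_SCHEME))                      # Life Sciences
print(to_common_field("Pharmacology & Pharmacy", OUTPUT_SCHEME))  # Life Sciences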
Vinkler, P. (1996), Some practical aspects of the standardization of scientometric indicators. Scientometrics, 35 (2), 237-245.
Full Text: 1996\Scientometrics35, 237.pdf
Abstract: At the present stage of scientometrics, published indicators are mostly incomparable, a fact that impedes the development of the field and makes the users of scientometric results mistrustful. Consequently, standardization of data, methods, indicators and their presentation is urgently needed. For instance, the time periods applied in calculating citation and publication indicators should be standardized across fields and subfields.
Keywords: Citation, Development, Indicators, Methods, Publication, Scientometric, Scientometrics
Arvanitis, R., Russell, J.M. and Rosas, A.Ma. (1996), Experiences with the national citation reports database for measuring national performance: The case of Mexico. Scientometrics, 35 (2), 247-255.
Full Text: 1996\Scientometrics35, 247.pdf
Abstract: The National Citation Report (NCR) is an integrated citation file, supplied by the Institute for Scientific Information (ISI), of an individual country’s articles in the sciences and social sciences. Our experience with the NCR database for Mexico suggests that it is an important addition to the tools available for carrying out bibliometric analysis of research performance. However, in order to generate reliable and accurate indicators from these data files, we recommend that they be handled by specialists well acquainted with ISI information products and with the scientific setup of the country concerned.
Keywords: Analysis, Bibliometric, Bibliometric Analysis, Citation, Database, Indicators, Information, Institute for Scientific Information, ISI, Mexico, Research, Research Performance, Science, Sciences, Social Sciences
McGrath, W.E. (1996), The unit of analysis (objects of study) in bibliometrics and scientometrics. Scientometrics, 35 (2), 257-264.
Full Text: 1996\Scientometrics35, 257.pdf
Abstract: Slow development of bibliometric theory may be due in part to neglect of the unit of analysis - the objects described by variables and about which inferences are made. Problems include: inferences are often made on units other than those sampled, leading to inappropriate conclusions; units in literature reviews and meta-analyses are often not comparable, thus hindering the cumulation of knowledge; confusion arises when names of sampling units in one study might also be the names of variables in other studies - e.g., no. of citations (variable) to papers (sampling unit) and no. of papers (variable) in journals (sampling unit); and information about the unit of analysis, means and variances is lost when data are aggregated. If theory is to advance, scientometrics needs a generic definition of the unit of analysis; a complete list of all known units, classified and structured; meta-analyses; reporting standards, especially when data are aggregated; clear indications of data level (nominal, ordinal, interval, ratio); and conventions for including units in titles, abstracts and keyword or subject indexes.
Keywords: Analysis, Bibliometric, Bibliometrics, Citations, Development, Indications, Information, Journals, Knowledge, Literature, Meta-Analysis, Needs, Neglect, Papers, Reporting, Scientometrics, Standards, Theory
Rao, I.K.R. (1996), Methodological and conceptual questions of bibliometric standards. Scientometrics, 35 (2), 265-270.
Full Text: 1996\Scientometrics35, 265.pdf
Abstract: Bibliometric studies are mostly empirical in nature and are mostly centred around the presentation of facts and data. There are very few studies centred around a theoretical foundation. The facts are gathered either through surveys or from published bibliographies, indexes and databases. Based on these facts, empirical models and principles are being developed. The normative principles and standards have to evolve from logical analyses of the empirical models. The stage is set to integrate the empirical models of bibliometrics into standards. Future bibliometric studies have to address this issue and reach the stage of normative principles.
Keywords: Bibliographies, Bibliometric, Bibliometric Studies, Bibliometrics, Models, Principles, Standards
Lazarev, V.S. (1996), On Chaos in bibliometric terminology. Scientometrics, 35 (2), 271-277.
Full Text: 1996\Scientometrics35, 271.pdf
Abstract: On the basis of a case study of articles on bibliometric selection and ranking, the variance in the terminology for the properties of journals is shown: the same properties are referred to in various ways, while one and the same terms carry different meanings. Similar inconsistencies are found in the terms denoting readers’ activities, which are studied in bibliometrics for the assessment of the use of periodicals. The author concludes that there are actually only two properties of periodicals that are quantitatively assessed, viz. ‘productivity’ and ‘value’. Definitions of these are suggested for the standardization of the terminology for general properties of journals and of readers’ activities.
Keywords: Assessment, Bibliometric, Bibliometrics, Case Study, Chaos, Journals, Periodicals, Ranking, Terminology
Aguillo, I.F. (1996), Increasing the between-year stability of the impact factor in the Science Citation Index. Scientometrics, 35 (2), 279-282.
Full Text: 1996\Scientometrics35, 279.pdf
Abstract: The critical evaluation of scientific productivity in recent years has been carried out with the help of the Journal Citation Reports rankings of journals. The relative performance of each journal is derived from a simple calculation called the Impact Factor. This measure has been widely criticized by scientometricians, but alternative proposals were never adopted, perhaps due to their complexity but also to economic limitations. For informetric purposes this situation has led to a worrying lack of standardization and, worst of all, renders many studies useless for comparative purposes. In order to enhance the comparative value of the impact factor, we develop a simple new method that lengthens the time period used in its calculation. This new index has advantages over the old one.
Keywords: Alternative, Citation, Complexity, Evaluation, Impact Factor, Journal, Journal Citation Reports, Journals, Science, Science Citation Index, Stability
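Note: the abstract does not give the exact construction of the proposed index, so the sketch below only illustrates the general idea of lengthening the citation window used in the impact factor calculation; the function name, data layout and figures are assumptions for illustration.

def impact_factor(cites, items, year, window=2):
    """Impact factor of a journal in `year`, computed over `window` preceding
    publication years: citations received in `year` to items published in those
    years, divided by the number of citable items published in them.
    window=2 reproduces the classical JCR definition; a wider window reduces
    year-to-year fluctuations."""
    prev_years = range(year - window, year)
    cited = sum(cites.get((year, py), 0) for py in prev_years)
    citable = sum(items.get(py, 0) for py in prev_years)
    return cited / citable if citable else 0.0

# Hypothetical journal data: cites[(citing year, publication year)], items[publication year].
cites = {(1995, 1994): 95, (1995, 1993): 120, (1995, 1992): 80, (1995, 1991): 60, (1995, 1990): 45}
items = {1994: 75, 1993: 70, 1992: 60, 1991: 55, 1990: 50}
print(impact_factor(cites, items, 1995, window=2))  # classical two-year impact factor
print(impact_factor(cites, items, 1995, window=5))  # more stable multi-year variant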
Marshakova Shaikevich, I. (1996), The standard impact factor as an evaluation tool of science fields and scientific journals. Scientometrics, 35 (2), 283-290.
Full Text: 1996\Scientometrics35, 283.pdf
Abstract: The standard impact factor for particular fields of science (Ig) and the relative impact factor K for scientific journals are introduced. The technique for calculating the standard impact factor (Ig) for a field is an inherent part of a method which allows a cross-field evaluation of scientific journals. This method for evaluating scientific journals, elaborated in 1988, was aimed at the analysis of Russian journals covered by the SCI database; it was also used for chemical journals (more than 300) and for journals in the life sciences (more than 1000). The results are discussed.
Keywords: Analysis, Database, Evaluation, Impact Factor, Journals, SCI, Science, Sciences, Scientific Journals, Standard, The SCI Database
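Note: the abstract names the standard field impact factor Ig and the relative journal impact factor K but does not spell out their formulas; the sketch below assumes, purely for illustration, that Ig is the mean impact factor of the journals assigned to field g and that K is a journal’s impact factor divided by the Ig of its field.

def relative_impact_factor(journal_if, field_journal_ifs):
    """Relative impact factor K of a journal: its impact factor divided by the
    standard impact factor Ig of its field, here taken to be the mean impact
    factor of the field's journals (an assumed definition)."""
    ig = sum(field_journal_ifs) / len(field_journal_ifs)
    return journal_if / ig

# Hypothetical field with journal impact factors 0.6, 1.2, 1.8 and 2.4 (Ig = 1.5):
# a journal with impact factor 1.8 gets K = 1.2, i.e. 20% above the field standard,
# a value that can be compared across fields with very different citation levels.
print(relative_impact_factor(1.8, [0.6, 1.2, 1.8, 2.4]))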
Glänzel, W. (1996), A bibliometric approach to social sciences. National research performances in 6 selected social science areas, 1990-1992. Scientometrics, 35 (3), 291-307.
Full Text: 1996\Scientometrics35, 291.pdf
Nieminen, P. (1996), Type of empirical research reports as an explanatory factor in citation performance of psychiatric research. Scientometrics, 35 (3), 309-320.
Full Text: 1996\Scientometrics35, 309.pdf
Abstract: In all fields of the human sciences there has long been a debate over whether research in these fields should closely follow the traditional method, with accurate measurements and statistical inference. More qualitative approaches have been proposed, by which is meant that the research aim is to use the data in their qualitative form. The purpose of this study was to describe the differences in citations between qualitative and quantitative empirical reports. A total of 262 published reports of research pertaining to the therapeutic community and psychiatric wards in a variety of treatment settings from 1987 to 1992 were analyzed. The main finding of this study was that quantitative reports were more frequently cited than qualitative ones, even when some confounding factors were controlled.
Keywords: Citation, Citations, Confounding, Differences, Human, Qualitative, Qualitative Approaches, Quantitative, Research, Sciences, Statistical, Traditional, Treatment
Berg, J. and Wagner-Döbler, R. (1996), A multidimensional analysis of scientific dynamics. Part I. Case studies of mathematical logic in the 20th century. Scientometrics, 35 (3), 321-346.
Full Text: 1996\Scientometrics35, 321.pdf
Abstract: Sequences of empirical Lotka-like distributions of the publications of scientific areas are mapped into a multidimensional parameter space. On this basis a new definition of the notion of an epidemic phase of a discipline is introduced. A graphic representation of the parameter space along with results of an exponential regression analysis of the Lotka exponent yield an image of the inner state of a discipline and renders possible a prognosis. Examples, primarily from mathematical logic, are described in detail. The notion of a scientific elite is discussed and the hypotheses of Ortega, Merton, and Price are critically assessed.
Keywords: Analysis, Law, Prognosis, Publications, Regression Analysis
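Note: a minimal sketch of estimating the Lotka exponent mentioned in the abstract, using an ordinary log-log least-squares fit; the paper’s multidimensional parameterisation and regression over time are more elaborate, and the data below are hypothetical.

import math

def lotka_exponent(counts):
    """Estimate the Lotka exponent alpha from a productivity distribution by
    fitting log f(n) = log C - alpha * log n, where f(n) is the number of
    authors with exactly n publications."""
    points = [(math.log(n), math.log(f)) for n, f in counts.items() if f > 0]
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in points)
             / sum((x - mean_x) ** 2 for x in xs))
    return -slope  # the Lotka exponent is the negative of the log-log slope

# Hypothetical distribution: f(n) authors with n papers each.
counts = {1: 1000, 2: 250, 3: 110, 4: 62, 5: 40}
print(lotka_exponent(counts))  # close to 2 for a classical Lotka distribution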
Breimer, L.H. and Breimer, D.D. (1996), The CED / Le DEC: Common European Doctorate, or Doctorat Européen Commun, or dissertations on the Internet. Scientometrics, 35 (3), 347-353.
Full Text: 1996\Scientometrics35, 347.pdf
Abstract: An international electronic thesis system is proposed to provide ready access to doctoral dissertations and ensure uniform standards. To establish common criteria, the publication-based Dutch doctoral degree system was assessed and compared with studies of other national systems. Current Dutch doctoral theses in biomedical fields were of a high standard. 93% of theses were based on published work. The median number of papers per thesis was four, with five authors per paper. The candidate was the key author on 84%. Representative journals of publication ranked in the top quartile of the Science Citation Index with a median rank of 241.
Keywords: Author, Authors, Biomedical, Citation, Dissertations, Doctoral Theses, Internet, Journals, Papers, Publication, Science, Science Citation Index, Standards, Thesis, Universities
Seglen, P.O. (1996), Quantification of scientific article contents. Scientometrics, 35 (3), 355-366.
Full Text: 1996\Scientometrics35, 355.pdf
Abstract: The information contents of 143 biomedical journal articles were quantified by standardized criteria, emphasizing quantitative measurements and estimated labour investments. A hundredfold variability in article information contents was uncovered, producing a Poisson distribution with a median (peak) value at about one-half of the sample mean. Two-thirds of the articles thus had information contents below the average scientific article, testifying to the somewhat excessive fragmentation of the primary scientific literature. The information contents of an article depended on three different factors: (1) the number of pages, which rarely exceeded an upper limit corresponding to the standard article format (7-8 pages); (2) the number of figures plus tables per page, which similarly reached saturation at the standard format value (one per page); and (3) the density of information packaging within each figure and table, for which no upper limit was observed. The latter factor could, therefore, account for virtually all information contents in excess of the standard article format. Differences in the information density of figures and tables were apparently not perceived by a peer reviewer, who tended to overestimate low-contents articles and underestimate high-contents articles. Furthermore, a model evaluation of the article authors indicated that evaluation by contents quantification and by straight article counting might give different results. Since neither peer review nor publication counts could satisfactorily detect differences in the information contents of scientific articles, objective contents quantification would seem to be required for an exact and fair evaluation of scientific productivity.
Keywords: Biomedical, Criteria, Evaluation, Information, Journal, Journal Articles, Less, Literature, Model, Packaging, Peer Review, Peer-Review, Primary, Publication, Publication Counts, Quality, Review, Saturation, Standard, Variability
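Note: a toy decomposition of article information content into the three factors the abstract names (page count, displays per page, and information density per display); the multiplicative combination and the numbers are assumptions for illustration, not Seglen’s actual measure.

def article_information_content(pages, displays_per_page, items_per_display):
    """Toy content measure: pages x (figures + tables per page) x information
    items packed into each figure or table."""
    return pages * displays_per_page * items_per_display

# Two hypothetical articles of the standard format (8 pages, one display per page):
# only the packaging density differs, and it dominates the content in excess of
# the standard format, as the abstract observes.
print(article_information_content(8, 1, 5))    # sparsely packed displays -> 40
print(article_information_content(8, 1, 60))   # densely packed displays  -> 480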
Meneghini, R. (1996), The key role of collaborative work in the growth of Brazilian science in the last ten years. Scientometrics, 35 (3), 367-373.
Full Text: 1996\Scientometrics35, 367.pdf
Abstract: The Brazilian scientific production and its international impact increased considerably in the last 10 years. This increase occurred in spite of a reduction in the resources for science in the same period. The data show that the explanation for this apparent paradox lies in the active process of international and national collaboration which increased in this same period. Collaborative work was supported by several programs of the Brazilian agencies. Advantages and possible drawbacks of the intensification of scientific collaboration for the Brazilian science are discussed.
Keywords: Collaboration, Explanation, Growth, International, Reduction, Science, Scientific Collaboration, Scientific Production, Work
Vinkler, P. (1996), Relationships between the rate of scientific development and citations. The chance for citedness model. Scientometrics, 35 (3), 375-386.
Full Text: 1996\Scientometrics35, 375.pdf
Abstract: The chances for information to be cited (CC) depend on disciplines and topics because of different publication and referencing practices. However, the developmental rate of knowledge strongly influences CC as well. A simple model leads to the conclusion that the faster the publication rate, the greater the CC.
Keywords: Citations, Development, Information, Knowledge, Model, Publication, Referencing, Subfields
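Note: a toy model, not Vinkler’s actual formulation, illustrating the abstract’s conclusion that a faster publication rate raises the chance for citedness; the uniform citation window, exponential growth and parameter values are all assumptions.

def expected_citations(refs_per_paper=20.0, growth_rate=0.05, window=5):
    """Expected citations collected by one paper published in year 0 over the
    next `window` years, assuming the field grows exponentially at
    `growth_rate` and each new paper distributes its references uniformly over
    the preceding `window` years of literature."""
    a = 1.0 + growth_rate
    total = 0.0
    for t in range(1, window + 1):
        citing_cohort = a ** t                                   # relative size of the year-t cohort
        cited_pool = sum(a ** s for s in range(t - window, t))   # papers eligible to be cited
        total += refs_per_paper * citing_cohort / cited_pool
    return total

# A faster-publishing field gives each paper a higher expected citation count.
print(expected_citations(growth_rate=0.02))
print(expected_citations(growth_rate=0.15))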
Winclawska, B.M. (1996), Polish sociology citation index (principles for creation and the first results). Scientometrics, 35 (3), 387-391.
Full Text: 1996\Scientometrics35, 387.pdf
Abstract: The author discusses the inadequacy of Garfield’s Social Sciences Citation Index for measuring the quality of a discipline in a national context. She proposes an alternative measurement tool to Garfield’s index. Sociology was selected as the example: an index of Polish sociology was created, and data from it were compared with data retrieved from the SSCI. The two sets were compared to show the greater ‘sensitivity’ of the locally created index.
Keywords: Alternative, Citation, Context, Data, First, Index, Measure, Measurement, Principles, Quality, Quality of, Sociology, SSCI
Schubert, A. (1996), Scientometrics: A citation based bibliography, 1991. Scientometrics,