Personal Research Database



85 (2), 443-461.

Full Text: 2010\Scientometrics85, 443.pdf

Abstract: A study is described of the rank/JIF (Journal Impact Factor) distributions in the high-coverage Scopus database, using recent data and a three-year citation window. It includes a comparison with an older study of the Journal Citation Report categories and indicators, and a determination of the factors most influencing the distributions. While all the specific subject areas fit a negative logarithmic law fairly well, those with a greater External JIF have distributions with a more sharply defined peak and a longer tail, something like an iceberg. No S-shaped distributions, such as those predicted by Egghe, were found. A strong correlation was observed between the knowledge export and import ratios. Finally, data from both Scopus and ISI were used to characterize the rank/JIF distributions by subject area.
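
As an illustration of what fitting a negative logarithmic law involves, the sketch below assumes the form JIF(r) = a - b*ln(r) for journals ranked r = 1, 2, ... within a subject area and recovers a and b by ordinary least squares; the functional form, the synthetic data and the parameter values are assumptions made for illustration, not the paper's actual fit.

```python
import numpy as np

# Synthetic rank/JIF data for one hypothetical subject area (values invented).
ranks = np.arange(1, 51)                              # journal ranks 1..50
rng = np.random.default_rng(0)
jif = 6.0 - 1.1 * np.log(ranks) + rng.normal(0.0, 0.1, ranks.size)

# Fit the assumed negative logarithmic law  JIF(r) = a - b*ln(r)
# by least squares on the transformed predictor ln(r).
X = np.column_stack([np.ones(ranks.size), np.log(ranks)])
(a, slope), *_ = np.linalg.lstsq(X, jif, rcond=None)
print(f"a = {a:.2f}, b = {-slope:.2f}")               # should recover roughly 6.0 and 1.1
```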

Keywords: Categories, Citation, Citation Analysis, Fields, Impact Factor, ISI, Journal Impact Factor, Journal Impact Measures, Knowledge Export, Pathfinder, Science, Scientometrics

? Cho, C.C., Hu, M.W. and Liu, M.C. (2010), Improvements in productivity based on co-authorship: A case study of published articles in China. Scientometrics, 85 (2), 463-470.

Full Text: 2010\Scientometrics85, 463.pdf

Abstract: The issue of primary interest to this study is the collaboration that has taken place in science and technology (S&T) research in China. Our empirical evidence shows that regions with higher relationship (network) capital enjoy higher knowledge productivity in terms of published articles. Our purpose in this paper is to investigate, using Stata, the relationship between regional publications and co-authored published articles in China over the period 1998 to 2007. The main finding is that the greater the number of co-authored articles a region has, the greater its success in terms of the number of articles published. Indeed, both domestic and international co-authorship have had positive effects on published article levels in China.
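
A regression of the kind described above can be sketched as follows. This is only a hedged illustration in Python (statsmodels) rather than the authors' Stata analysis, and the file name and column names (pubs, domestic_coauth, intl_coauth, region, year) are hypothetical placeholders.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical regional panel, one row per region-year; the file and columns
# are invented placeholders, not the authors' data set.
df = pd.read_csv("china_regions_1998_2007.csv")

# Pooled OLS of regional publications on domestic and international co-authored
# articles, with region and year fixed effects.
model = smf.ols("pubs ~ domestic_coauth + intl_coauth + C(region) + C(year)",
                data=df).fit()
print(model.params[["domestic_coauth", "intl_coauth"]])
```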

Keywords: Academic Research, Co-Authorship, Collaboration, Determinants, Economics, Journals, Knowledge, Knowledge Production Function, Publications, Regional Innovation Systems, Research, Science, Scientific Collaboration, Technology

? Haddow, G. and Genoni, P. (2010), Citation analysis and peer ranking of Australian social science journals. Scientometrics, 85 (2), 471-487.

Full Text: 2010\Scientometrics85, 471.pdf

Abstract: Citation analyses were performed for Australian social science journals to determine the differences between data drawn from Web of Science and Scopus. These data were compared with the tier rankings assigned by disciplinary groups to the journals for the purposes of a new research assessment model, Excellence in Research for Australia (ERA), due to be implemented in 2010. In addition, citation-based indicators, including an extended journal impact factor, the h-index, and a modified journal diffusion factor, were calculated to assess whether such analyses influence the ranking of journals. The findings suggest that the Scopus database provides a higher number of citations for more of the journals. However, there appears to be very little association between the assigned tier ranking of journals and their rank derived from citations data. The implications for Australian social science researchers are discussed in relation to the use of citation analysis in the ERA.
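
As one concrete piece of the citation-based indicators mentioned above, the sketch below computes a standard h-index, the largest h such that h items have received at least h citations each; the citation counts are invented.

```python
def h_index(citations):
    """Largest h such that h of the items have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example: a journal whose articles received these citation counts has h = 4.
print(h_index([10, 8, 5, 4, 3, 0]))
```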

Keywords: Australia, Citation, Citation Analysis, Citation Sources, Citations, Counts, Coverage, Diffusion, ERA, Google-Scholar, h Index, h-Index, Impact, Information, Journal Ranking, Journals, Research, Research Assessment, Research Assessment Exercise, Researchers, Science, Scopus, Social Science Journals, Web-of-Science

? De Marchi, M. and Rocchi, M. (2010), Note on R&D expenditures and fixed capital formation. Scientometrics, 85 (2), 489-494.

Full Text: 2010\Scientometrics85, 489.pdf

Abstract: In this paper we deal with the fixed capital nature of the means of production and labour employed in research and development which generate scientific and technological knowledge. We argue that these R&D current expenditures typically have the nature of fixed investments. We then present an empirical analysis which shows that expenditures on industrial R&D are more strongly linked to the formation of fixed capital than to the formation of capital in general. Applying this conclusion to the economics of research and innovation would make it possible to analyse investments in the production of scientific and technological knowledge with a higher degree of clarity and precision.

Keywords: Capital, Innovation, Production, Research

? Miguel, S., Moya-Anegon, F. and Herrero-Solana, V. (2010), The impact of the socio-economic crisis of 2001 on the scientific system of Argentina from the scientometric perspective. Scientometrics, 85 (2), 495-507.

Full Text: 2010\Scientometrics85, 495.pdf

Abstract: In recent years a number of studies have focused on Argentina’s 2001 economic crisis and its political, social, and institutional repercussions. To date, however, no studies have analyzed its effects upon the country’s scientific system from a scientometric perspective, in terms of resources dedicated to scientific activity and the final output and impact. The present study does so by means of a set of scientometric indicators that reflect economic effort, human resources dedicated to research, publications, collaborative relations, and the international visibility of scientific contributions.

Keywords: 2001, Argentina, Latin-America, Publications, Research, Scientific System, Scientometric Indicators, Socio-Economic Crisis

? Bolanos-Pizarro, M., Thijs, B. and Glänzel, W. (2010), Cardiovascular research in Spain. A comparative scientometric study. Scientometrics, 85 (2), 509-526.

Full Text: 2010\Scientometrics85, 509.pdf

Abstract: A bibliometric analysis of Spanish cardiovascular research is presented. The study focuses on the productivity, visibility and citation impact in an international, notably European context. Special attention is given to international collaboration. The underlying bibliographic data are collected from Thomson Reuters’s Web of Science on the basis of a ‘hybrid’ search strategy combining core journals, lexical terms and citation links especially developed for the field of cardiology.

Keywords: Bibliometric, Bibliometric Analysis, Bibliometric Approach, Cardiovascular Research, Citation, Citations, Co-Authorship, Indicators, International Collaboration, International Scientific Collaboration, Journal Impact, Journals, Output, Research, Research Performance, Science, Spain

? Meyer, M., Debackere, K. and Glänzel, W. (2010), Can applied science be ‘good science’? Exploring the relationship between patent citations and citation impact in nanoscience. Scientometrics, 85 (2), 527-539.

Full Text: 2010\Scientometrics85, 527.pdf

Abstract: There is a rich literature on how science and technology are related to each other. Patent citation analysis is amongst the most frequently used tools to track the strength of these links. In this paper we explore the relationship between patent citations and citation impact in nanoscience. Our observations indicate that patent-cited papers perform better in terms of standard bibliometric indicators than comparable publications that are not linked to technology in this way. More specifically, we found that articles cited in patents are more likely to be cited also by other papers. The share of highly cited papers is the most striking result. Instead of the average of 4% of all papers, 13.8% of the papers cited once or twice in patents fall into this category, and even 23.5% of the papers more frequently cited in patents receive citation rates far above the standard. Our analyses further demonstrate the presence and the relevance of bandwagon effects driving the development of science and technology.

Keywords: Bibliometric, Citation, Citation Analysis, Citation Impact, Citations, Collaboration, Emerging Field, Exploration, Innovation, Interdisciplinarity, Nano-Science, Nanoscience, Nanotechnology, Patent, Patent Citations, Performance, Publications, Science, Science-Technology Linkage, Scientific Literature, Technology

? Jeong, S. and Kim, H.G. (2010), Intellectual structure of biomedical informatics reflected in scholarly events. Scientometrics, 85 (2), 541-551.

Full Text: 2010\Scientometrics85, 541.pdf

Abstract: The purpose of this paper was to analyze the intellectual structure of biomedical informatics as reflected in scholarly events such as conferences, workshops, symposia, and seminars. As analysis variables, ‘call for paper topics’, ‘session titles’ and author keywords from biomedical informatics-related scholarly events were combined with the MeSH descriptors. As analysis cases, the titles and abstracts of 12,536 papers presented at five medical informatics (MI) and six bioinformatics (BI) global-scale scholarly event series during the years 1999-2008 were collected. Then, n-gram terms (MI = 6,958, BI = 5,436) were extracted from the paper corpus and the term co-occurrence network was analyzed. One hundred important topics for each of medical informatics and bioinformatics were identified through the hub-authority metric, and their usage contexts were compared with the k-nearest neighbor measure. To identify research trends, newly popular topics were observed in two-year periods. Over the past 10 years the most important topic in MI has been “decision support”, while in BI it has been “gene expression”. Though the two communities share several methodologies, according to our analysis they do not use them in the same context. This evidence suggests that MI uses technologies to improve productivity in clinical settings, while BI uses algorithms as tools for scientific biological discovery. Though MI and BI are arguably separate research fields, their topics are increasingly intertwined and the gap between the fields is blurring, forming a broad informatics, namely biomedical informatics. Using scholarly events as data sources for domain analysis is the closest way to approximate the forefront of biomedical informatics.
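
A minimal sketch of the hub-authority step described above, assuming a small invented term co-occurrence graph and using the HITS implementation in networkx; the terms and weights are illustrative only, not the paper's data.

```python
import networkx as nx

# Toy term co-occurrence network; edge weights are invented co-occurrence counts.
G = nx.Graph()
G.add_weighted_edges_from([
    ("decision support", "clinical guideline", 12),
    ("decision support", "electronic health record", 9),
    ("gene expression", "microarray", 15),
    ("gene expression", "clustering", 7),
    ("clustering", "microarray", 5),
])

# HITS assigns hub and authority scores; on an undirected graph the two rankings
# coincide, giving a single importance ordering of the terms.
hubs, authorities = nx.hits(G, max_iter=1000)
for term, score in sorted(authorities.items(), key=lambda kv: -kv[1]):
    print(f"{term:28s} {score:.3f}")
```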

Keywords: Author, Bibliometric Analysis, Bioinformatics, Biomedical Informatics, Biotechnology, Co-Word Analysis, Conference, Conferences, Exploratory Analysis, Field, Intellectual Structure, Medical Informatics, Research, Scholarly Event, Science, Social Network Analysis, Undiscovered Public Knowledge

? Perakakis, P., Taylor, M., Mazza, M. and Trachana, V. (2010), Natural selection of academic papers. Scientometrics, 85 (2), 553-559.

Full Text: 2010\Scientometrics85, 553.pdf

Abstract: Academic papers, like genes, code for ideas or technological innovations that structure and transform the scientific organism and consequently the society at large. Genes are subject to the process of natural selection which ensures that only the fittest survive and contribute to the phenotype of the organism. The process of selection of academic papers, however, is far from natural. Commercial for-profit publishing houses have taken control over the evaluation and access to scientific information with serious consequences for the dissemination and advancement of knowledge. Academic authors and librarians are reacting by developing an alternative publishing system based on free-access journals and self-archiving in institutional repositories and global disciplinary libraries. Despite the emergence of such trends, the journal monopoly, rather than the scientific community, is still in control of selecting papers and setting academic standards. Here we propose a dynamical and transparent peer review process, which we believe will accelerate the transition to a fully open and free-for-all science that will allow the natural selection of the fittest ideas.

Keywords: Academic Publishing, Ethics, Evaluation, Journals, Peer Review, Science

? Prathap, G. (2010), The iCE approach for journal evaluation. Scientometrics, 85 (2), 561-565.

Full Text: 2010\Scientometrics85, 561.pdf

Abstract: Recent research has shown that simple graphical representations of research performance can be obtained using two-dimensional maps based on impact (i) and citations (C). The product of impact and citations leads to an energy term (E). Indeed, using E as the third coordinate, three-dimensional landscape maps can be prepared. In this paper, instead of using the traditional impact factor and total citations received for journal evaluation, Article Influence(TM) and Eigenfactor(TM) are used as substitutes. Article Influence becomes a measure of quality (i.e. a proxy for impact factor) and Eigenfactor is a proxy for size/quantity (like citations); taken together, their product is an energy-like term. This can be used to measure the influence/prestige of a journal. It is also possible to propose a p-factor (where p = E^(1/3)) as an alternative measure of the prestige or prominence of a journal, which plays a role equivalent to that of the h-index.
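
Since the abstract gives the relationships explicitly (E is the product of the impact-like and citation-like terms, and p is its cube root), they can be illustrated in a few lines; the Article Influence and Eigenfactor values below are invented.

```python
# Worked example of the iCE quantities; the input values are invented.
article_influence = 2.5                     # quality proxy, playing the role of impact (i)
eigenfactor = 0.04                          # quantity proxy, playing the role of citations (C)

energy = article_influence * eigenfactor    # E = i * C
p_factor = energy ** (1.0 / 3.0)            # p = E^(1/3)
print(f"E = {energy:.3f}, p = {p_factor:.3f}")   # E = 0.100, p ~ 0.464
```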

Keywords: Article Influence (TM), Citations, Eigenfactor, Eigenfactor (TM), Evaluation, h Index, h-Index, Impact Factor, Journal Evaluation, P-Index, Research

? Hagen, N.T. (2010), Deconstructing doctoral dissertations: how many papers does it take to make a PhD? Scientometrics, 85 (2), 567-579.

Full Text: 2010\Scientometrics85, 567.pdf

Abstract: A collection of coauthored papers is the new norm for doctoral dissertations in the natural and biomedical sciences, yet there is no consensus on how to partition authorship credit between PhD candidates and their coauthors. Guidelines for PhD programs vary but tend to specify only a suggested range for the number of papers to be submitted for evaluation, sometimes supplemented with a requirement for the PhD candidate to be the principal author on the majority of submitted papers. Here I use harmonic counting to quantify the actual amount of authorship credit attributable to individual PhD graduates from two Scandinavian universities in 2008. Harmonic counting corrects for the inherent inflationary and equalizing biases of routine counting methods, thereby allowing the bibliometrically identifiable amount of authorship credit in approved dissertations to be analyzed with unprecedented accuracy. Unbiased partitioning of authorship credit between graduates and their coauthors provides a post hoc bibliometric measure of current PhD requirements, and sets a de facto baseline for the requisite scientific productivity of these contemporary PhDs at a median value of approximately 1.6 undivided papers per dissertation. Comparison with previous census data suggests that the baseline has shifted over the past two decades as a result of a decrease in the number of submitted papers per candidate and an increase in the number of coauthors per paper. A simple solution to this shifting baseline syndrome would be to benchmark the amount of unbiased authorship credit deemed necessary for successful completion of a specific PhD program, and then monitor for departures from this level over time. Harmonic partitioning of authorship credit also facilitates cross-disciplinary and inter-institutional analysis of the scientific output from different PhD programs. Juxtaposing bibliometric benchmarks with current baselines may thus assist the development of harmonized guidelines and transparent transnational quality assurance procedures for doctoral programs by providing a robust and meaningful standard for further exploration of the causes of intra- and inter-institutional variation in the amount of unbiased authorship credit per dissertation.
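
Harmonic counting, as used above, gives the author at byline position k of an N-author paper the credit share (1/k) / (1 + 1/2 + ... + 1/N). A minimal sketch of that formula follows; the example byline positions are invented rather than taken from the dissertations analyzed.

```python
def harmonic_credit(position, n_authors):
    """Harmonic share of authorship credit for the author at the given byline position."""
    normaliser = sum(1.0 / k for k in range(1, n_authors + 1))
    return (1.0 / position) / normaliser

# A candidate who is first author on a 4-author paper and second author on a
# 3-author paper accumulates roughly 0.48 + 0.27 = 0.75 undivided papers of credit.
credit = harmonic_credit(1, 4) + harmonic_credit(2, 3)
print(round(credit, 2))
```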

Keywords: Author, Authorship Credit, Baseline, Benchmark, Bias, Bibliometric, Bibliometric Counting, Bibliometry, Ethics, Evaluation, Faculty-Student Collaborations, Graduate, Publication, Theses

? Shin, J.C. and Cummings, W.K. (2010), Multilevel analysis of academic publishing across disciplines: research preference, collaboration, and time on research. Scientometrics, 85 (2), 581-594.

Full Text: 2010\Scientometrics85, 581.pdf

Abstract: This study developed a multilevel model of academic publishing and tested the effects of several predictors on faculty publishing. In particular, the analysis paid special attention to faculty preference, time on research, research collaboration, and faculty discipline. The data used for this study are from the Changing Academic Professions (CAP) survey, the follow-up to the 1992 Carnegie Foundation study. The study found that faculty preference for research affects research publishing. In addition, faculty collaboration with international peers is a critical factor in academic publishing. While time spent on research is related to publishing, time spent on teaching does not have a conflicting effect on faculty research. In the institution-level analysis, institutional goal-orientation and institutional mission were found to have effects on academic publishing. However, the principal determinants of academic publishing were found to lie at the individual faculty member level. For each of these findings, there are subtle differences by academic discipline.
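
The multilevel setup described above can be sketched as a random-intercept model with faculty members nested in institutions. The snippet below is an illustration using statsmodels; the file name and column names (pubs, research_pref, time_research, intl_collab, discipline, institution) are hypothetical stand-ins, not the actual CAP variables.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical faculty-level data; one row per faculty member, with an institution identifier.
df = pd.read_csv("cap_faculty.csv")

# Random-intercept multilevel model: faculty members (level 1) nested in institutions (level 2).
model = smf.mixedlm(
    "pubs ~ research_pref + time_research + intl_collab + C(discipline)",
    data=df,
    groups=df["institution"],
).fit()
print(model.summary())
```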

Keywords: Academic Publication, Departments, Disciplinary Differences, Faculty Research Productivity, Gender, Higher-Education, Interdisciplinary, Research, Research Collaboration, Research Preference, Scientific Productivity, Teaching Effectiveness, Time on Research

? Shapira, P., Youtie, J. and Porter, A.L. (2010), The emergence of social science research on nanotechnology. Scientometrics, 85 (2), 595-611.

Full Text: 2010\Scientometrics85, 595.pdf

Abstract: This article examines the development of social science literature focused on the emerging area of nanotechnology. It is guided by the exploratory proposition that early social science work on emerging technologies will draw on science and engineering literature on the technology in question to frame its investigative activities, but as the technologies and societal investments in them progress, social scientists will increasingly develop and draw on their own body of literature. To address this proposition the authors create a database of nanotechnology-social science literature by merging articles from the Web of Science’s Social Science Citation Index and Arts and Humanities Citation Index with articles from Scopus. The resulting database comprises 308 records. The findings suggest that there are multiple dimensions of cited literature and that social science citations of other social scientists’ works have increased since 2005.

Keywords: Citation, Citations, Collaboration, Emerging Technologies, Interdisciplinarity, Nanoscience, Nanotechnology, Patterns, Publications, Research, Robots, Science, Science Citation Index, Scientometrics, Societal Implications, Technical Change, Technology, Trust, US

? Mingers, J. and Lipitakis, E.A.E.C. (2010), Counting the citations: A comparison of Web of Science and Google Scholar in the field of business and management. Scientometrics, 85 (2), 613-625.

Full Text: 2010\Scientometrics85, 613.pdf

Abstract: Assessing the quality of the knowledge produced by business and management academics is increasingly being metricated. Moreover, emphasis is being placed on the impact of the research rather than simply where it is published. The main metric for impact is the number of citations a paper receives. Traditionally these data have come from the ISI Web of Science, but research has shown that it has poor coverage in the social sciences. A newer and different source for citations is Google Scholar. In this paper we compare the two on a dataset of over 4,600 publications from three UK Business Schools. The results show that Web of Science is indeed poor in the area of management and that Google Scholar, whilst somewhat unreliable, has much better coverage. The conclusion is that Web of Science should not be used for measuring research impact in management.

Keywords: Citations, Databases, Google Scholar, h-Index, Impact, ISI, Journals, Publications, Research, Research Impact, Scopus, Web of Science

? Vieira, P.C. and Teixeira, A.A.C. (2010), Are finance, management, and marketing autonomous fields of scientific research? An analysis based on journal citations. Scientometrics, 85 (3), 627-646.

Full Text: 2010\Scientometrics85, 627.pdf

Abstract: Although there is considerable consensus that Finance, Management and Marketing are ‘science’, some debate remains with regard to whether these three areas comprise autonomous, organized and settled scientific fields of research. In this paper we aim to explore this issue by analyzing the occurrence of citations in the top-ranked journals in the areas of Finance, Management, and Marketing. We put forward a modified version of the model of science as a network, proposed by Klamer and Van Dalen (J Econ Methodol 9(2):289-315, 2002), and conclude that Finance is a ‘relatively autonomous, organized and settled field of research’, whereas Management and (to a larger extent) Marketing are ‘relatively non-autonomous and hybrid fields of research’. Complementary analysis based on sub-discipline rankings using the recursive methodology of Liebowitz and Palmer (J Econ Lit 22:77-88, 1984) confirms the results. In the conclusions we briefly discuss the pertinence of Whitley’s (The intellectual and social organization of the sciences, 1984) theory for explaining cultural differences across these sub-disciplines based on its dimensions of scholarly practices, ‘mutual dependency’ and ‘task uncertainty’.
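
The recursive ranking idea credited above to Liebowitz and Palmer lets a journal's weight depend on the weights of the journals citing it, which amounts to a dominant-eigenvector computation on a normalized cross-citation matrix. The sketch below illustrates that reading with an invented 3-by-3 citation matrix; it is a simplified stand-in, not the authors' exact procedure.

```python
import numpy as np

# Toy cross-citation matrix: entry [i, j] counts citations from journal j to journal i
# (values invented). Each column is normalized so a citing journal distributes one unit.
C = np.array([[0.0, 20.0, 5.0],
              [10.0, 0.0, 15.0],
              [5.0, 10.0, 0.0]])
M = C / C.sum(axis=0, keepdims=True)

# Recursive, impact-adjusted weights: iterate w <- M @ w until convergence,
# i.e. compute the dominant eigenvector of the normalized citation matrix.
w = np.ones(3) / 3
for _ in range(100):
    w = M @ w
    w /= w.sum()
print(np.round(w, 3))
```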

Keywords: Analysis, Autonomy, Citations, Co-Word Analysis, Communication, Departments, Economics Journals, Finance, Index, Journals, Management, Marketing, Nanotechnology, Network, Patterns, Quality, Relative Impacts, Research, Science

? Lopresti, R. (2010), Citation accuracy in environmental science journals. Scientometrics, 85 (3), 647-655.

Full Text: 2010\Scientometrics85, 647.pdf

Abstract: Citations in five leading environmental science journals were examined for accuracy. 24.41% of the 2,650 citations checked were found to contain errors. The largest category of errors was in the author field. Of the five journals, Conservation Biology had the lowest percentage of citations with errors and Climatic Change had the highest. Of the citations with errors that could be checked in Web of Science, 18.18% of the errors caused a search for the cited article to fail. Citations containing electronic links had fewer errors than those without.

Keywords: Author, Citation, Citation Accuracy, Citation Errors, Citations, Environmental Journals, Journals, Science, Web of Science

? De Witte, K. and Rogge, N. (2010), To publish or not to publish? On the aggregation and drivers of research performance. Scientometrics, 85 (3), 657-680.

Full Text: 2010\Scientometrics85, 657.pdf

Abstract: This paper presents a methodology to aggregate multidimensional research output. Using a tailored version of the non-parametric Data Envelopment Analysis model, we account for the large heterogeneity in research output and the individual researcher preferences by endogenously weighting the various output dimensions. The approach offers three important advantages compared to the traditional approaches: (1) flexibility in the aggregation of different research outputs into an overall evaluation score, (2) a reduction of the impact of measurement errors and atypical observations, and (3) a correction for the influences of a wide variety of factors outside the evaluated researcher’s control. As a result, research evaluations are more effective representations of actual research performance. The methodology is illustrated on a data set of all faculty members at a large polytechnic university in Belgium. The sample includes questionnaire items on the motivation and perception of the researcher. This allows us to explore whether motivation and background characteristics (such as age, gender, and retention) of the researchers explain variations in measured research performance.
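
The endogenous weighting described above belongs to the 'benefit of the doubt' family of composite-indicator models: each researcher's outputs are aggregated with the weights most favourable to that researcher, subject to no researcher scoring above one under the same weights. The sketch below illustrates that basic linear program with scipy; the output matrix is invented and the formulation is a simplified textbook version, not the paper's tailored, robust model.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical output matrix: rows = researchers, columns = output dimensions
# (e.g. articles, books, proceedings); all values are invented.
Y = np.array([[4.0, 1.0, 2.0],
              [2.0, 3.0, 1.0],
              [1.0, 1.0, 1.0]])

def bod_score(i, Y):
    """Benefit-of-the-doubt score of unit i: max_w Y[i] @ w subject to
    Y[k] @ w <= 1 for every unit k and w >= 0."""
    res = linprog(c=-Y[i], A_ub=Y, b_ub=np.ones(len(Y)),
                  bounds=[(0, None)] * Y.shape[1])
    return -res.fun

for i in range(len(Y)):
    print(f"researcher {i}: score = {bod_score(i, Y):.2f}")   # approx 1.00, 1.00, 0.60
```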

Keywords: Academic Economists, Belgium, Composite Indicator, Composite Indicators, Conditional Efficiency, Data Envelopment Analysis, Evaluation, Higher Education, Nonparametric Frontier Models, Preferences, Publication Productivity, Research, Research Institutes, Research Output, Research Performance, Research Productivity, Researchers, Retention, Scientific Productivity, Scientometric Indicators, Teaching Effectiveness

? Bornmann, L. and Daniel, H.D. (2010), The validity of staff editors’ initial evaluations of manuscripts: A case study of Angewandte Chemie International Edition. Scientometrics,


