
Scientometrics

Scientometrics is the field of study which concerns itself with measuring and analysing scholarly literature. Scientometrics is a sub-field of informetrics. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts.[1] In practice there is a significant overlap between scientometrics and other scientific fields such as information systems, information science, science of science policy, sociology of science, and metascience. Critics have argued that over-reliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low-quality research.

Historical development

The term scientometrics was introduced by Vasily Nalimov in 1969 under its Russian name naukometriya, which translates to "scientometrics" in English.[2][3][4][5] Modern scientometrics is mostly based on the work of Derek J. de Solla Price and Eugene Garfield. The latter created the Science Citation Index[1] and founded the Institute for Scientific Information, which is heavily used for scientometric analysis. A dedicated academic journal, Scientometrics, was established in 1978. The industrialization of science increased the number of publications and research outcomes, and the rise of computers allowed effective analysis of this data.[6] While the sociology of science focused on the behavior of scientists, scientometrics focused on the analysis of publications.[1] Accordingly, scientometrics is also referred to as the scientific and empirical study of science and its outcomes.[7][8]

The International Society for Scientometrics and Informetrics founded in 1993 is an association of professionals in the field.[9]

Later, around the turn of the century, evaluation and ranking of scientists and institutions came more into the spotlight. Based on bibliometric analysis of scientific publications and citations, the Academic Ranking of World Universities ("Shanghai ranking") was first published in 2004 by Shanghai Jiao Tong University. Impact factors became an important tool for choosing between journals, and rankings such as the Academic Ranking of World Universities and the Times Higher Education World University Rankings (THE ranking) became leading indicators of the status of universities. The h-index became an important indicator of the productivity and impact of a scientist's work, though alternative author-level indicators have been proposed.[10][11]
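The h-index itself is straightforward to compute: a scientist has index h if h of their papers each have at least h citations. A minimal sketch in Python (the function name and the sample citation counts are illustrative, not from any real dataset):

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h of the
    given papers have at least h citations each."""
    h = 0
    # Rank papers from most to least cited; the h-index is the last
    # rank at which the citation count still meets or exceeds the rank.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations
```

Note that the h-index depends only on the multiset of citation counts, not on which papers they belong to, which is why it can be computed from a simple sorted list.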

Around the same time, the interest of governments in evaluating research for the purpose of assessing the impact of science funding increased. As the investments in scientific research were included as part of the U.S. American Recovery and Reinvestment Act of 2009 (ARRA), a major economic stimulus package, programs like STAR METRICS were set up to assess if the positive impact on the economy would actually occur.[12]

Methods and findings

Methods of research include qualitative, quantitative and computational approaches. The main focus of studies has been on institutional productivity comparisons, institutional research rankings, journal rankings,[7][8][13] establishing faculty productivity and tenure standards,[14] assessing the influence of top scholarly articles,[15] and developing profiles of top authors and institutions in terms of research performance.[16]

One significant finding in the field is a principle of cost escalation: achieving further findings at a given level of importance grows exponentially more costly in the expenditure of effort and resources. However, new algorithmic methods in search, machine learning and data mining are showing that this is not the case for many information retrieval and extraction-based problems.[citation needed]

More recent methods rely on open source and open data to ensure transparency and reproducibility in line with modern open science requirements. For instance, the Unpaywall index and attendant research on open access trends is based on data retrieved from OAI-PMH endpoints of thousands of open archives provided by libraries and institutions worldwide.[17]

Common scientometric indexes

Indexes may be classified as article-level metrics, author-level metrics, and journal-level metrics depending on which feature they evaluate.

Impact factor

The impact factor (IF) or journal impact factor (JIF) of an academic journal is a measure reflecting the yearly average number of citations to recent articles published in that journal. It is frequently used as a proxy for the relative importance of a journal within its field; journals with higher impact factors are often deemed to be more important than those with lower ones. The impact factor was devised by Eugene Garfield, the founder of the Institute for Scientific Information (ISI).
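Concretely, the two-year impact factor for a year Y is the number of citations received in Y by items the journal published in Y−1 and Y−2, divided by the number of citable items published in those two years. A minimal sketch of the calculation (the dictionary-based data layout and the sample figures are hypothetical):

```python
def impact_factor(citations, publications, year):
    """Two-year journal impact factor for `year`.

    citations:    dict mapping (citing_year, cited_year) -> citation count
    publications: dict mapping year -> number of citable items published
    """
    prior = (year - 1, year - 2)
    # Citations received in `year` to items from the two preceding years.
    cites = sum(citations.get((year, y), 0) for y in prior)
    # Citable items the journal published in those two years.
    items = sum(publications.get(y, 0) for y in prior)
    return cites / items if items else 0.0

citations = {(2024, 2023): 150, (2024, 2022): 90}
publications = {2023: 60, 2022: 40}
print(impact_factor(citations, publications, 2024))  # (150 + 90) / (60 + 40) = 2.4
```

The definition of "citable item" (typically articles and reviews, but not editorials or letters) is itself a point of contention, since the numerator counts citations to all items while the denominator counts only citable ones.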

Science Citation Index

The Science Citation Index (SCI) is a citation index originally produced by the Institute for Scientific Information (ISI) and created by Eugene Garfield. It was officially launched in 1964. It is now owned by Clarivate Analytics (previously the Intellectual Property and Science business of Thomson Reuters).[18][19][20][21] The larger version (Science Citation Index Expanded) covers more than 8,500 notable and significant journals, across 150 disciplines, from 1900 to the present. These are alternatively described as the world's leading journals of science and technology, because of a rigorous selection process.[22][23][24]

Acknowledgement index

An acknowledgement index (British English spelling[25]) or acknowledgment index (American English spelling[25]) is a method for indexing and analyzing acknowledgments in the scientific literature and, thus, quantifies the impact of acknowledgements. Typically, a scholarly article has a section in which the authors acknowledge entities such as funding, technical staff, colleagues, etc. that have contributed materials or knowledge or have influenced or inspired their work. Like a citation index, it measures influences on scientific work, but in a different sense; it measures institutional and economic influences as well as informal influences of individual people, ideas, and artifacts. Unlike the impact factor, it does not produce a single overall metric, but analyses the components separately. However, the total number of acknowledgements to an acknowledged entity can be measured and so can the number of citations to the papers in which the acknowledgement appears. The ratio of this total number of citations to the total number of papers in which the acknowledged entity appears can be construed as the impact of that acknowledged entity.[26][27]
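The ratio described above can be sketched in a few lines of Python. The record format (a list of papers, each with a set of acknowledged entity names and a citation count) and the sample entities are hypothetical:

```python
def acknowledgement_impact(papers, entity):
    """Impact of an acknowledged entity: total citations to the papers
    that acknowledge it, divided by the number of such papers."""
    acked = [p for p in papers if entity in p["acknowledgements"]]
    if not acked:
        return 0.0
    return sum(p["citations"] for p in acked) / len(acked)

records = [
    {"acknowledgements": {"NSF"}, "citations": 30},
    {"acknowledgements": {"NSF", "J. Doe"}, "citations": 10},
    {"acknowledgements": {"J. Doe"}, "citations": 5},
]
print(acknowledgement_impact(records, "NSF"))  # (30 + 10) / 2 = 20.0
```

In practice the hard part is not this arithmetic but extracting and disambiguating acknowledged entities from free-text acknowledgement sections, which is what systems like the CiteSeer acknowledgement indexer automate.[26][27]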

Altmetrics

In scholarly and scientific publishing, altmetrics are non-traditional bibliometrics[28] proposed as an alternative[29] or complement[30] to more traditional citation impact metrics, such as the impact factor and h-index.[31] The term altmetrics was proposed in 2010,[32] as a generalization of article-level metrics,[33] and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc. Altmetrics use public APIs across platforms to gather data with open scripts and algorithms. Altmetrics did not originally cover citation counts,[34] but instead calculate scholarly impact based on diverse online research output, such as social media, online news media, online reference managers and so on.[35][36] They capture both the impact and the detailed composition of that impact.[32] Altmetrics can be applied to research filtering,[32] promotion and tenure dossiers, grant applications,[37][38] and the ranking of newly published articles in academic search engines.[39]

Criticisms

Critics have argued that over-reliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low quality research.[40]

References and footnotes

  1. Leydesdorff, L. and Milojevic, S., "Scientometrics", arXiv:1208.4566 (2013); forthcoming in: Lynch, M. (ed.), International Encyclopedia of Social and Behavioral Sciences, subsection 85030 (2015).
  2. Vasily Vasilyevich Nalimov, B. M. Mulchenko: Scientometrics: Studies of science as a process of information. Nauka ("Science"), Moscow 1969.
  3. M. V. Borisov, A. I. Maysuradze: Restoring links in a scientific subject index by clustering a heterogeneous network [in Russian]. 2014 (recognition.su [PDF; retrieved 15 May 2021]).
  4. De Solla Price, D., editorial statement. Scientometrics, vol. 1, issue 1 (1978).
  5. Paul Benjamin Lowry, Denton Romans, Aaron Curtis: Global journal prestige and supporting disciplines: A scientometric study of information systems journals. In: Journal of the Association for Information Systems, vol. 5, no. 2, 2004, pp. 29–80, doi:10.17705/1jais.00045.
  6. Lowry, Paul Benjamin; Moody, Gregory D.; Gaskin, James; Galletta, Dennis F.; Humpherys, Sean; Barlow, Jordan B.; Wilson, David W. (2013). "Evaluating journal quality and the Association for Information Systems (AIS) Senior Scholars' journal basket via bibliometric measures: Do expert journal assessments add value?". MIS Quarterly (MISQ), vol. 37, no. 4, pp. 993–1012. Also see a YouTube video narrative of this paper at: https://www.youtube.com/watch?v=LZQIDkA-ke0.
  7. About. In: International Society for Scientometrics and Informetrics. Retrieved 18 January 2021.
  8. Belikov, A. V., Belikov, V. V.: A citation-based, author- and age-normalized, logarithmic index for evaluation of individual researchers independently of publication counts. In: F1000Research, vol. 4, 2015, p. 884, doi:10.12688/f1000research.7070.1, PMC 4654436 (free full text).
  9. Kinouchi, O.: A simple centrality index for scientific social recognition. In: Physica A: Statistical Mechanics and Its Applications, vol. 491, 2018, pp. 632–640, doi:10.1016/j.physa.2017.08.072, arXiv:1609.05273, bibcode:2018PhyA..491..632K.
  10. J. Lane: Assessing the Impact of Science Funding. In: Science, vol. 324, no. 5932, 2009, pp. 1273–1275, doi:10.1126/science.1175335.
  11. Paul Benjamin Lowry, Sean Humphreys, Jason Malwitz, Joshua C. Nix: A scientometric study of the perceived quality of business and technical communication journals. In: IEEE Transactions on Professional Communication, vol. 50, no. 4, 2007, pp. 352–378, doi:10.1109/TPC.2007.908733. Recipient of the Rudolph Joenk Award for Best Paper Published in IEEE Transactions on Professional Communication in 2007.
  12. Douglas L. Dean, Paul Benjamin Lowry, Sean Humpherys: Profiling the research productivity of tenured information systems faculty at U.S. institutions. In: MIS Quarterly, vol. 35, no. 1, 2011, pp. 1–15, doi:10.2307/23043486, JSTOR 23043486.
  13. Gilbert G. Karuga, Paul Benjamin Lowry, Vernon J. Richardson: Assessing the impact of premier information systems research over time. In: Communications of the Association for Information Systems, vol. 19, no. 7, 2007, pp. 115–131, doi:10.17705/1CAIS.01907.
  14. Paul Benjamin Lowry, Gilbert G. Karuga, Vernon J. Richardson: Assessing leading institutions, faculty, and articles in premier information systems research journals. In: Communications of the Association for Information Systems, vol. 20, no. 16, 2007, pp. 142–203, doi:10.17705/1CAIS.02016.
  15. Heather Piwowar, Jason Priem, Richard Orr: The Future of OA: A large-scale analysis projecting Open Access publication and readership. 9 October 2019, doi:10.1101/795310.
  16. E. Garfield: Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas. In: Science, vol. 122, no. 3159, 1955, pp. 108–111, doi:10.1126/science.122.3159.108, PMID 14385826, bibcode:1955Sci...122..108G.
  17. Eugene Garfield: The evolution of the Science Citation Index. In: International Microbiology, vol. 10, no. 1, pp. 65–69, doi:10.2436/20.1501.01.10, PMID 17407063 (upenn.edu [PDF]).
  18. Eugene Garfield: Science Citation Index. In: Science Citation Index 1961, vol. 1, 1963, pp. v–xvi (upenn.edu [PDF; retrieved 27 May 2013]).
  19. History of Citation Indexing. Clarivate Analytics, November 2010, retrieved 4 November 2010.
  20. Science Citation Index Expanded. Retrieved 17 January 2017.
  21. Jiupeng Ma, Hui-Zhen Fu, Yuh-Shan Ho: The Top-cited Wetland Articles in Science Citation Index Expanded: characteristics and hotspots. In: Environmental Earth Sciences, vol. 70, no. 3, December 2012, p. 1039, doi:10.1007/s12665-012-2193-y, bibcode:2009EES....56.1247D.
  22. Yuh-Shan Ho: The top-cited research works in the Science Citation Index Expanded. In: Scientometrics, vol. 94, no. 3, 2012, p. 1297, doi:10.1007/s11192-012-0837-z (edu.tw [PDF]).
  23. Acknowledgement vs. Acknowledgment. 22 September 2012.
  24. C. Lee Giles, Hui Han, Eren Manavoglu: Automatic acknowledgement indexing: expanding the semantics of contribution in the CiteSeer digital library. In: K-CAP '05, pp. 19–26, ISBN 1-59593-163-5, doi:10.1145/1088622.1088627.
  25. C. L. Giles, I. G. Councill: Who gets acknowledged: Measuring scientific contributions through automatic acknowledgment indexing. In: Proc. Natl. Acad. Sci. U.S.A., vol. 101, no. 51, 15 December 2004, pp. 17599–17604, doi:10.1073/pnas.0407743101, PMID 15601767, PMC 539757 (free full text), bibcode:2004PNAS..10117599G (psu.edu [PDF]).
  26. PLOS Collections. In: Public Library of Science (PLOS): "Altmetrics is the study and use of non-traditional scholarly impact measures that are based on activity in web-based environments."
  27. "The 'alt' does indeed stand for 'alternative'." Jason Priem, leading author of the Altmetrics Manifesto; see comment 592.
  28. Janica Chavda, Anika Patel: Measuring research impact: bibliometrics, social media, altmetrics, and the BJGP. In: British Journal of General Practice, vol. 66, no. 642, 30 December 2015, pp. e59–e61, doi:10.3399/bjgp16X683353, PMID 26719483, PMC 4684037 (free full text).
  29. Jason Priem, Dario Taraborelli, Paul Groth, Cameron Neylon: Altmetrics: A manifesto (v 1.01). In: Altmetrics, 28 September 2011 (altmetrics.org).
  30. Peter Binfield: Article-Level Metrics at PLoS – what are they, and why should you care? (Video) In: University of California, Berkeley, 9 November 2009.
  31. Sönke Bartling, Sascha Friesike: Opening Science: The Evolving Guide on How the Internet Is Changing Research, Collaboration and Scholarly Publishing. Springer International Publishing, Cham 2014, ISBN 978-3-319-00026-8, p. 181, doi:10.1007/978-3-319-00026-8 (archive.org): "Altmetrics and article-level metrics are sometimes used interchangeably, but there are important differences: article-level metrics also include citations and usage data; ..."
  32. Paul McFedries: Measuring the impact of altmetrics [Technically Speaking]. In: IEEE Spectrum, vol. 49, no. 8, August 2012, ISSN 0018-9235, p. 28, doi:10.1109/MSPEC.2012.6247557.
  33. Finbar Galligan, Sharon Dyas-Correia: Altmetrics: Rethinking the Way We Measure. In: Serials Review, vol. 39, no. 1, March 2013, pp. 56–61, doi:10.1016/j.serrev.2013.01.003.
  34. Christoph Carl Kling, Steffen Lemke, Athanasios Mazarakis, Isabella Peters: Can We Count on Social Media Metrics? First Insights into the Active Scholarly Use of Social Media. ISBN 9781450355636, doi:10.1145/3201064.3201101.
  35. Peter Weingart: Impact of bibliometrics upon the science system: Inadvertent consequences? In: Scientometrics, vol. 62, no. 1, 1 January 2005, ISSN 0138-9130, pp. 117–131, doi:10.1007/s11192-005-0007-7.
