Bibliometric methods can be used to assess scientific publications and their authors.
Scientific publishing can be evaluated both qualitatively and quantitatively; the two approaches complement each other. In qualitative evaluation, the focus is, above all, on the publication's factual content and its relevance to the field of science in question. The reviewers therefore require experience and expertise in the area. Peer review before publishing is one part of qualitative evaluation.
In quantitative evaluation, in other words bibliometrics, various indicators are produced from publications using mathematical and statistical methods. These indicators illustrate productivity, impact and quality in publishing. Indicators are used when evaluating research, for instance in the State of Scientific Research in Finland reports and in university rankings.
- Bibliometrics for the University Community self-study course
We offer UEF Library customers guidance in utilising bibliometrics, altmetrics and datametrics.
We also offer analysis tools and access to databases acquired by the organisation, as well as training and guidance in using them. For the university administration, we produce tailored research assessment services.
We give advice and guidance on the recommendations for the responsible use of metrics.
Feasibility of bibliometrics in different fields of research
Bibliometric databases such as Web of Science and Scopus favour the so-called exact sciences. These include the natural sciences, such as biochemistry, pharmacy and medicine. Findings in these fields are usually published as articles in English-language journals. The more a field publishes in books (monographs) and in languages other than English, the weaker the results bibliometrics provides. Typical examples are the humanities and the social sciences. These fields often rely on indicators derived from Google Scholar content (see Publish or Perish under Bibliometric analysis tools and databases). The opportunities of bibliometrics are therefore very different for historians and for medical scientists.
You can find the indicators described below in, for example, the following databases:
- Web of Science (e.g. Author Search, Journal Citation Reports)
- Scopus (View Citation Overview)
- Scimago Journal & Country Rank
Articles are usually tallied up by author, sometimes also by subject, institute or organisation. The easiest way to perform an analysis is by author or by organisation, because these publication counts can be obtained directly from bibliometric databases. In addition, your own organisation's publication index (UEF CRIS) and the Research.fi portal are useful.
Citations are usually counted so that the author's citations to their own articles (self-citations) are removed. You can find citation counts in, for example, the Scopus (Author Search - Activate the icon before the author's name - View Citation Overview) and Web of Science (Author Search - Search by author's name - Create Citation Report) databases. Google Scholar also offers citation information.
The Eigenfactor score illustrates how networked journals and their citations are. It accounts for the total number of citations to a journal's articles over the last five years, as well as the most cited individual journal issues. More information (select Eigenfactor Metrics on the service's main page).
Physicist Jorge Hirsch developed the h-index in 2005 for evaluating the productivity of theoretical physicists. Use of the index has gradually spread to other fields of science. The h-index is determined by arranging a person's publications in descending order of citation count and finding the largest rank at which the publication has been cited at least that many times. For example, if a person's h-index is 5, they have five publications that have each been cited at least five times. The h-index thus combines the number of publications with the number of citations. The goal is to capture both the development of the researcher's career as a whole and the significance of their publications. The faster a researcher raises their h-index, the more "impressive" they are; conversely, if a younger and a more mature scientist have the same h-index, the younger one is considered the more "successful" of the two.
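The procedure described above can be sketched in a few lines of Python (a minimal illustration of the definition, not any database's actual implementation):

```python
def h_index(citations):
    """Compute the h-index from a list of per-publication citation counts."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cited in enumerate(counts, start=1):
        if cited >= rank:
            h = rank  # this publication has at least `rank` citations
        else:
            break
    return h

# Five publications cited 10, 8, 5, 4 and 3 times: four of them have
# at least four citations each, but not five with five -> h-index 4
print(h_index([10, 8, 5, 4, 3]))
```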
How to find a researcher's h-index
- Web of Science
- Scopus (See also Scopus tutorials: author search, author details)
- Google Scholar/Publish or perish
The Web of Science-based InCites Journal Citation Reports (JCR) contains, among other things, journals' citation information and impact factor values. It is published once a year, usually in June.
Also known as the impact number, impact measure and citation factor. This is an indicator for a scientific journal: it proportions the number of citations received to the number of articles published. It is calculated for the journal as a whole, never for individual articles or authors. The impact factor for the year 2016, for example, is calculated by adding up the citations made in 2016 to articles published in the journal during the two previous years (2015 + 2014) and dividing this sum by the number of articles published in the journal within the same period. If a journal's impact factor is 3.7, each article published in the journal has been cited an average of 3.7 times during the two years prior to the impact factor year. The IF is an average with a skewed distribution: only some of a journal's articles are cited at all, and a very small share of them are cited widely. Journals with a small IF value may also publish articles that are cited a lot, and journals with a high IF may publish articles that are never cited.
How to find the Journal Impact Factor
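As a sketch, the two-year impact factor is a simple ratio (the journal and the figures below are hypothetical):

```python
def impact_factor(citations, articles):
    """Citations received in the IF year to items published in the two
    preceding years, divided by the number of items published in those
    two years."""
    return citations / articles

# Hypothetical journal: 370 citations in 2016 to the 100 articles
# it published in 2014-2015 -> impact factor 3.7
print(impact_factor(370, 100))
```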
An indicator that shows how many times, on average, a journal's articles have been cited during their publication year. The higher the immediacy index, the more rapidly the journal's articles receive citations. A journal's Immediacy Index value can be found in, for example, the Journal Citation Reports database (Select Journals - Click on a journal title - Key Indicators).
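The immediacy index can likewise be sketched as a same-year citations-per-article ratio (the figures below are hypothetical):

```python
def immediacy_index(citations_in_year, articles_in_year):
    """Citations received in a given year to articles published in that
    same year, divided by the number of those articles."""
    return citations_in_year / articles_in_year

# Hypothetical journal: 120 same-year citations to 80 articles -> 1.5
print(immediacy_index(120, 80))
```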
A Google PageRank algorithm-based impact indicator for a scientific journal. It is calculated in somewhat the same way as the impact factor; however, the citing journal's subject area and "prestige" determine the weight of each citation. The number of citations is counted from only one year and proportioned to the number of articles published in the journal during the three previous years. The weighting of citations aims to normalise the SJR values: the goal is to prevent the undue advantage that certain fields of research easily gain when only impact factor values are used. For more information, see Scimago Journal & Country Rank: choose Journal Rankings, search by a journal's title, and the SJR value can be found in the diagram.
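The PageRank idea behind such indicators can be illustrated with a toy iteration over a journal-to-journal citation matrix. This is a simplified sketch of the general principle, not the actual SJR formula; the example matrix and the damping factor are assumptions for illustration only.

```python
def prestige_scores(C, damping=0.85, iters=200):
    """PageRank-style prestige over a citation matrix C, where C[i][j]
    is the number of citations from journal i to journal j. Citations
    from prestigious journals end up counting for more."""
    n = len(C)
    # Row-normalise: each journal distributes its outgoing citations;
    # journals with no outgoing citations spread their weight uniformly.
    T = []
    for row in C:
        total = sum(row)
        T.append([x / total for x in row] if total else [1.0 / n] * n)
    scores = [1.0 / n] * n  # start with equal prestige
    for _ in range(iters):
        scores = [(1 - damping) / n
                  + damping * sum(scores[i] * T[i][j] for i in range(n))
                  for j in range(n)]
    return scores

# Three hypothetical journals; journal 2 is cited by both others
print(prestige_scores([[0, 1, 1], [0, 0, 1], [1, 0, 0]]))
```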
An indicator based on the number of citations received by a journal, with the goal of improving comparability across fields of science. In a similar spirit to the SJR indicator, the citations received by an individual journal are proportioned to the citation potential of the journal's subject field. You can find a journal's SNIP value in, for instance, the Scopus database (Sources - Search for a journal - Click on the title of a journal - Source Details). More information: How are CiteScore metrics used in Scopus?
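The normalisation idea can be sketched as a ratio of a journal's raw impact to its field's citation potential (all figures below are hypothetical):

```python
def snip(citations_per_paper, field_citation_potential):
    """Normalise a journal's citations-per-paper by the citation
    potential of its subject field, so that journals from low-citation
    fields are not penalised."""
    return citations_per_paper / field_citation_potential

# Hypothetical: a history journal (2.0 citations/paper in a field with
# potential 1.0) outranks a medical journal (4.0 in a field with 4.0).
print(snip(2.0, 1.0), snip(4.0, 4.0))
```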
Bibliometric analysis tools and databases
Scopus is a multidisciplinary literature database produced and maintained by Elsevier. It contains over 50 million article references from 20,000 peer-reviewed journals. Its strength is the standardisation and reliable identification of author and institution names. It allows searching for publications, citations, h-indexes, SJR values and SNIP values. Search results can also be analysed, for instance to identify the essential journals, researchers and institutions of a given subject area. Scopus tutorials: author search, author details.
Web of Science consists of three subdatabases: Science Citation Index (SCI), Social Sciences Citation Index (SSCI) and Arts & Humanities Citation Index (A&HCI). It covers nearly the same journals as Scopus. It was originally produced by the Institute for Scientific Information, later by Thomson Reuters, and today by Clarivate Analytics. Web of Science allows searching for publications, citations and h-indexes, and search results can be analysed in the same way as in Scopus.
The use of Web of Science is significantly hindered by the fact that author names have not been standardised. For example, a Finnish author with the letter ä or ö in their family name may appear in the database under more than ten different spellings. The same issue applies to the names of institutions. For a reliable result, all possible name variants should be included in the search.
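Building such a search pool can be partly automated. The sketch below generates spelling variants for the Finnish letters ä and ö; the substitution table is an assumption covering common transliterations, not an exhaustive list:

```python
from itertools import product

def name_variants(name):
    """Generate plausible spelling variants of a name by substituting
    common transliterations for the letters ä and ö."""
    subs = {"ä": ("ä", "a", "ae"), "ö": ("ö", "o", "oe")}
    options = [subs.get(ch, (ch,)) for ch in name.lower()]
    return sorted({"".join(combo) for combo in product(*options)})

# Three ä's, three options each -> 27 variants, e.g. 'hamalainen'
print(name_variants("hämäläinen"))
```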
JCR, produced by Clarivate Analytics, is the only official source for impact factor values. It contains the impact factor values and other indicators of the journals included in the Web of Science database, and only those. Clarivate Analytics owns the impact factor, and no other party may publish the values, at least not under the same name. Journal Citation Reports can be used as part of the Web of Science database.
Publish or Perish is a free analysis program which can be installed on a personal computer and which utilises the publication data in the Google Scholar database. Google Scholar covers several publication formats, which makes Publish or Perish useful especially in fields other than "article sciences". The reliability of an analysis is impaired by the fact that the exact contents of the Google Scholar database have not been published. More information: Google Scholar/Publish or Perish. Install the Publish or Perish software.
A publication channel ranking by the Finnish science community. The channels are divided into three levels (1 = basic, 2 = leading, 3 = top) and include journals, book series, book publishers and conferences. The channels are evaluated by 23 discipline-based panels of specialists, comprising approximately 200 distinguished scientists who are Finnish or working in Finland. Thirteen per cent of the basic government funding for universities is based on publications in JuFo-ranked channels. Read also: 10 facts about Publication Forum (Responsible Research)
Specialists at the University of Eastern Finland produce over 2,000 scientific publications annually, of which approximately two thirds are published abroad. The primary channels for publishing research findings are international publishing forums, where appropriate to the field of science in question. Publications by university staff can be retrieved from the SoleCRIS research database by, for instance, person name, subject heading or publication type.
Publication information portal JUULI
JUULI is used for browsing and searching the publication information of Finnish research organisations. The portal is maintained by the National Library of Finland in collaboration with the Ministry of Education and Culture and CSC. The data in JUULI is gathered from the publication registers of Finnish universities during the Ministry of Education and Culture's annual data collection. At the moment, the portal contains university publications starting from the year 2011.
Publication reports of university education can be used to observe universities' annual publication activity using various classifications (for example, publication type and Publication Forum rank).
The truth is rarely pure and never simple.
The development of bibliometrics began in the first decades of the 20th century, at a time when scientific research activities were becoming established. There was a desire to analyse and model the development of science itself. Because research findings manifest themselves above all as publications, it was thought that analysing publications could also provide information about the development of science. At this point, the interest in publications was purely academic.
Bibliometrics kept a low profile for a long time within researchers' circles, but in the 2000s it re-emerged. There was a desire to base academic ranking systems, scientific recruitment, tenure track arrangements, personal evaluations, funding decisions and the like on "objective" information, and the possibilities of bibliometrics were noted. Sources of bibliometric information had become easy to use, and indicators could be produced quickly and in large numbers.
Nevertheless, indicators are only numbers, and they need to be interpreted in order to be understood. This is often difficult. What is "small" or "large" in a given field of science, and who defines the size or greatness? Who interprets these numbers in the first place, and for what purpose? An individual scientist applying for funding? The faculty administration in outcome evaluation meetings? The university management tracking the worldwide ranking of their institution?
Using bibliometrics to establish mutual "rankings" between different fields of science is perilous. Differences can be explained, among other things, by the different publishing and citation conventions of different fields. The main rule is that a bibliometric analysis is valid only within one field of science. The Leiden Manifesto for Research Metrics consists of ten principles that should be followed when evaluating research with bibliometric methods.
- Research development and evaluation (UEF)
- San Francisco Declaration on Research Assessment (DORA)
- CoARA (Coalition for Advancing Research Assessment)
- Hong Kong Principles
- Leiden Manifesto for Research Metrics
- Metric Tide
- Good practice in researcher evaluation - recommendation for the responsible evaluation of a researcher in Finland