The field of bibliometrics analyses publications through statistical methods, using citation data to provide insight into the impact of research outputs. Bibliometrics can be used in combination with qualitative indicators such as peer review, and with other measures such as funding received and the number of patents and awards granted.
You can use bibliometrics to:
- analyse your research outputs
- provide evidence of the impact of your research
- find new and emerging areas of research
- identify potential research collaborators
- identify suitable sources in which to publish.
Sources of bibliometric data
Sources of journal information that enable publication and citation analysis include Web of Science, Scopus, SciVal, Google Scholar and Journal Citation Reports. You shouldn’t combine data from different sources to generate a single metric: their coverage differs, so the same metric will not be comparable across sources. When you search for a specific researcher, make sure that you include all their possible name variations (this is made easier if the researcher has an ORCID identifier connecting their outputs).
Some metrics are normalised, which aims to put citations into context. For example, field-weighted citation impact (FWCI) takes into account the differences in citation behaviour across disciplines. An FWCI of 1 means the output performs exactly as expected for the global average in its field; an FWCI of more than 1 means the output is cited more than expected. For example, 1.25 means it is cited 25% more than expected.
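As arithmetic, FWCI is simply the ratio of actual citations to the expected citation count. A minimal sketch follows; the function name is illustrative, and in practice the expected value is calculated by the data provider from the output's subject field, publication type and publication year:

```python
def fwci(citations: int, expected_citations: float) -> float:
    """Field-weighted citation impact: actual citations divided by the
    expected (world-average) citations for comparable outputs."""
    if expected_citations <= 0:
        raise ValueError("expected citations must be positive")
    return citations / expected_citations

# An output with 25 citations where comparable outputs average 20:
print(fwci(25, 20))  # 1.25, i.e. cited 25% more than expected
```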
No single metric will provide a rounded overview of research performance. To practise responsible research metrics you will need to use a “basket of metrics”. Consider the context in which the metrics are used and look at them alongside appropriate qualitative measures.
Find out more by attending a bibliometrics webinar run by the Library’s Research Support team.
Here are some common bibliometric measures. Each metric has a link to more information about what it is and possible use-cases, from Elsevier’s SciVal support hub and the Clarivate support pages (Journal Citation Reports):
- Citation counts: the number of times a research output appears in the reference lists of other documents. Found in: Google Scholar, SciVal, Scopus and Web of Science
- Citations per publication: the average number of citations received per publication. It can apply to an entity such as a researcher, a group of researchers, or an entire institution. Found in: SciVal, Scopus and Web of Science
- Field-weighted citation impact: the ratio of citations received relative to the expected world average for the subject field, publication type and publication year. It can apply to a research output or group of research outputs. Found in: SciVal and Scopus
- H-index: designed to measure an author's productivity and impact. It is the largest number h such that the author has h publications that have each received at least h citations. Found in: Google Scholar, SciVal, Scopus and Web of Science
- Journal Impact Factor: the average number of citations received in a given year by papers published in that journal during the preceding two years. Found in: Journal Citation Reports
- Outputs in top percentiles: the number or percentage of an entity's research outputs that fall within the most-cited publications (for example, the top 1% or 10%) worldwide or in a specific country. Found in: SciVal and Scopus
- Publication counts: Also known as scholarly output. The total number of outputs published by an entity. Found in: Google Scholar, SciVal, Scopus and Web of Science.
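The h-index definition above can be sketched in a few lines. This is an illustrative implementation, not the code used by any particular database:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that the author has h papers with >= h citations."""
    # Rank papers from most to least cited; h is the last rank at which
    # the paper's citation count still meets or exceeds its rank.
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers each have at least 4 citations
```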
Altmetrics
Altmetrics are based on the number of times an article is shared, downloaded or mentioned on social media, in blogs or in newspapers.
You should consider altmetrics alongside traditional bibliometric measures. This will give a wider picture of how a piece of research is being read and discussed. Altmetrics also give a more immediate indication of how an article is received than citations in publications.
The Altmetric doughnut
Altmetric developed the doughnut symbol as a way of visualising the attention a research output has received. It can be attached to an article, dataset or other research object online.
The Altmetric Attention Score in the centre of the doughnut is a count of all of the attention a research output has received in sources such as social media, blogs, newspapers and policy documents. Each source is weighted by the company. For example, newspaper reports get a higher score than tweets.
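In essence, the score is a weighted sum of mention counts per source type. The sketch below illustrates the idea only: the weights shown are hypothetical, as the real weights are set by Altmetric:

```python
# Hypothetical weights for illustration only; Altmetric sets the real ones.
WEIGHTS = {"news": 8, "blog": 5, "tweet": 1}

def attention_score(mentions: dict[str, int]) -> int:
    """Weighted count of mentions: each source type contributes its
    mention count multiplied by that source's weight."""
    return sum(WEIGHTS.get(source, 0) * n for source, n in mentions.items())

print(attention_score({"news": 2, "tweet": 10}))  # 2*8 + 10*1 = 26
```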
The colours in the doughnut reflect the mix of sources. Take a look at this Altmetric example article to see how the score is calculated.
The Altmetric doughnut tool works best with articles published after July 2011, and may underestimate scores for older articles.
How can I find altmetric data?
There are several ways:
- Install the free Altmetric bookmarklet to display altmetrics for any published research output with a DOI.
- Use Figshare – an online repository for research outputs including posters, slideshows, and datasets – to see how many shares and citations an output has had.
- Create a personal profile on ImpactStory and link it to your research outputs online.
Considerations when using altmetrics
A high number of shares or social media mentions does not necessarily mean that an article is of high quality. An article may be mentioned on social media because it contains something amusing or unusual.
Social media can also be easily manipulated and "likes" or mentions can be paid for or generated, so the numbers might not reflect the actual level of public interest in a piece of work.
- Further information about additional metrics and use cases is available from the Metrics Toolkit.
- Metrics in SciVal: what are they and what are their strengths and weaknesses?
- Research metrics guidebook from Elsevier