
Research Metrics: Measuring Research Performance

What is Bibliometrics?

Bibliometrics is the statistical analysis of text-based research outputs (books, journal articles, conference papers, etc.) with respect to citation counts. Depending on how the citation details of outputs are used to analyse research performance, the most commonly used metrics can be classified as author-level, article-level and journal-level.

Types of Bibliometrics

There are various metrics that individual researchers can use to measure their research performance. Bear in mind that these metrics can differ from one calculation to another, depending on the content of the database being used. Another consideration is that research in some disciplines will produce lower values than others, especially where a publishing portfolio is less heavily weighted towards journal articles. For this reason, author metrics should never be used in isolation. Some commonly used metrics include:

h-Index: The h-index is the most commonly recognised author-level metric. It is the largest number h such that the author has published h papers that have each been cited at least h times. For instance, an h-index of 17 means that the author has published at least 17 papers that have each been cited at least 17 times. If the author's 18th most cited publication was cited only 10 times, the h-index would remain at 17; if it was cited 18 or more times, the h-index would rise to 18.
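
To make the calculation concrete, here is a minimal Python sketch of the h-index computed from a plain list of citation counts. It is an illustration of the definition above, not how any particular database implements it.

```python
def h_index(citation_counts):
    """Largest h such that the author has h papers each cited at least h times."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example from the text: 17 papers cited at least 17 times, an 18th cited only 10 times.
citations = [17] * 17 + [10]
print(h_index(citations))  # 17
```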

g-index: This is a variant of the h-index that gives more weight to the most highly cited papers in an author's data set: it is the largest number g such that the author's g most-cited papers have received at least g² citations in total. The g-index is always equal to or higher than the h-index.
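
A short sketch of that definition, under the common convention that g cannot exceed the number of papers in the set (some variants allow it to):

```python
def g_index(citation_counts):
    """Largest g such that the g most-cited papers have at least g*g citations in total."""
    counts = sorted(citation_counts, reverse=True)
    total, g = 0, 0
    for rank, cites in enumerate(counts, start=1):
        total += cites
        if total >= rank * rank:
            g = rank
    return g

citations = [40, 30, 10, 4, 1]
print(g_index(citations))  # 5 (the h-index of the same list is 4)
```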

m-index: Another variant of the h-index, calculated as the h-index divided by the number of years since the author's first publication. The h-index tends to increase with career length, so the m-index is useful where that is a drawback, such as comparing researchers in the same field who have very different career lengths.
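
A small sketch of that ratio; note that whether the first year is counted inclusively varies between sources, so the career-length convention below is an assumption for illustration.

```python
from datetime import date

def m_index(h_index_value, first_publication_year, current_year=None):
    """m-index: h-index divided by the number of years since first publication."""
    current_year = current_year or date.today().year
    # Convention assumed here: simple difference of years, never less than 1.
    career_years = max(current_year - first_publication_year, 1)
    return h_index_value / career_years

# e.g. an h-index of 17 for a career that began in 2009, evaluated in 2024
print(round(m_index(17, 2009, current_year=2024), 2))  # 1.13
```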

h5-index: This index applies a five-year publication and citation window to the standard h-index calculation.

i10-index: Introduced by Google Scholar, the i10-index measures the number of publications by an author with at least 10 citations.
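 
Both of these indices are straightforward to sketch. The example below assumes each publication is represented as a (year, citation count) pair; how the citation window is applied in practice depends on the data source, so the filtering here is illustrative only.

```python
def i10_index(citation_counts):
    """Number of publications with at least 10 citations (Google Scholar's i10-index)."""
    return sum(1 for cites in citation_counts if cites >= 10)

def h5_index(publications, current_year):
    """h-index restricted to publications from the last five years.
    publications: iterable of (year, citation_count) pairs."""
    recent = sorted(
        (cites for year, cites in publications if year >= current_year - 4),
        reverse=True,
    )
    h = 0
    for rank, cites in enumerate(recent, start=1):
        if cites >= rank:
            h = rank
    return h

pubs = [(2024, 12), (2023, 9), (2022, 30), (2018, 50)]
print(i10_index([cites for _, cites in pubs]))  # 3
print(h5_index(pubs, current_year=2024))        # 3
```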

Learn more about using these indices in SciVal

Article Level Metrics are used to measure the visibility of a researcher's work, i.e. how often an output is cited in other publications.

FWCI: SciVal’s field-weighted citation impact (FWCI) is an article-level metric that takes the form of a simple ratio where actual citations of an output are divided by the expected rate for outputs of similar age, subject and publication type.  FWCI has the dual merits of simplicity and ease of interpretation: a value of 2 indicates that an output has achieved twice the expected impact relative to the world literature.  It is a useful addition to the benchmarking toolkit, but should not be used in isolation.
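
FWCI itself relies on citation data and expected citation rates supplied by Scopus/SciVal, but the ratio is simple to illustrate; the expected-citations figure below stands in for whatever benchmark the database provides.

```python
def fwci(actual_citations, expected_citations):
    """Field-weighted citation impact: actual citations divided by the expected
    citations for outputs of similar age, subject and publication type."""
    if expected_citations <= 0:
        raise ValueError("expected citations must be positive")
    return actual_citations / expected_citations

# An output with 12 citations, where similar outputs average 6 citations:
print(fwci(12, 6))  # 2.0 -> twice the expected impact relative to the world literature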

Citations per Publication: indicates the average number of citations for a set of publications. This can be useful for analysing groups and disciplines, and for showcasing performance.
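
A minimal sketch of the average for a set of outputs, for example a research group's publication list:

```python
def citations_per_publication(citation_counts):
    """Average number of citations across a set of publications."""
    if not citation_counts:
        return 0.0
    return sum(citation_counts) / len(citation_counts)

group_outputs = [0, 3, 8, 25, 4]
print(citations_per_publication(group_outputs))  # 8.0
```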

 

Journal Level Metrics are used to measure the impact of a journal and to compare its ranking against other journals. As with other metrics, they should never be used in isolation.

SJR: SCImago Journal Rank (SJR) is a metric based on the idea that 'all citations are not created equal'. With SJR, the subject field, quality and reputation of the citing journal have a direct effect on the value of a citation.

SNIP: Source Normalized Impact per Paper (SNIP) measures contextual citation impact by weighting citations based on the total number of citations in a subject field. The impact of a single citation is given higher value in subject areas where citations are less likely, and vice versa.
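
The actual SNIP calculation is performed by CWTS on Scopus data, but the underlying idea of adjusting a citation's value by the citation density of its field can be shown with a toy example. The formula and the field figures below are invented for illustration and are not the real SNIP method.

```python
def normalised_citation_value(raw_citations, field_mean_citations, reference_mean):
    """Toy illustration of field normalisation (NOT the real SNIP formula):
    a citation counts for more in a field where citations are scarce."""
    return raw_citations * (reference_mean / field_mean_citations)

# 10 citations in a low-citation field (mean 2 citations per paper) versus a
# high-citation field (mean 8), both scaled against a reference mean of 4:
print(normalised_citation_value(10, field_mean_citations=2, reference_mean=4))  # 20.0
print(normalised_citation_value(10, field_mean_citations=8, reference_mean=4))  # 5.0
```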

What are Altmetrics?

Altmetrics use alternative modes of measuring the visibility of a research output, such as mentions in social media, news articles, policies and patents. In doing so they provide broader insight into how your research is being viewed and used beyond traditional citation metrics. This makes them a useful resource for measuring new research outputs that have yet to receive citations in traditional scholarly publications. The University subscribes to Altmetric. For further information, refer to our libguide, Altmetric at Dundee: Who's talking about your research?

What is PlumX?

PlumX is the second major aggregator of altmetrics. Unlike the Altmetric tool, it does not aggregate altmetrics data into a single score. Instead, it assigns metrics to five different categories, detailed below. Although we don't subscribe to PlumX, it is still a useful visualisation tool that provides an indication of where and how our research is being cited beyond traditional publications.