When using Elements you will come across a number of quantitative research metrics. The information below explains the metrics and provides links to further information.
In the publications summary on your home page you may see one or more values for your h-index, with an indication of the data source it relates to. The possible data sources are Scopus, Web of Science and Europe PMC.
Elements does not import your h-index from each data source provider. It calculates the h-indexes for you based on the claimed publications in your Elements profile.
For each of the data sources, Elements calculates the h-index as follows:
1. All claimed publications are identified.
2. Publications that do not have a record from the chosen data source are discarded.
3. The standard formula for calculating the h-index is applied to the remaining records.
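The standard formula in step 3 finds the largest number h such that h of the publications have at least h citations each. A minimal sketch (illustrative only; this is not Elements' own implementation):

```python
def h_index(citation_counts):
    """Return the largest h such that h publications have
    at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    # Walk down the sorted list; position i qualifies while
    # the i-th highest count is still at least i.
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Example: citation counts [10, 8, 5, 4, 3] give an h-index of 4,
# because 4 publications have at least 4 citations each.
print(h_index([10, 8, 5, 4, 3]))
```

Because step 2 discards publications with no record in the chosen data source, each source's citation counts feed a separate calculation, which is why the h-index can differ between sources.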
It is therefore possible that the h-index in Elements will not be the same as the h-index you could find calculated for you in Scopus, Web of Science or Europe PMC.
Please see the metrics toolkit description of the h-index for more information about the limitations of this measure and read the SHU guide to responsible metrics.
When looking at the detailed view of your publications, you may also see Altmetrics and Citation metrics data relating to each individual publication.
You can click on the Altmetrics icon (known as the Altmetrics donut) to see more detail about social media and similar attention to the publication. Please read our web page about altmetrics for more information.
You may also see citation counts from the following databases: Dimensions, EPMC (Europe PMC), Scopus and WoS (Web of Science).
When you look at the detailed view of your publications, you may see a SNIP and SJR displayed below publications that are journal articles. These metrics relate to the journal in which your article was published. It is important to be aware that journal-level bibliometric indicators should not be used to assess individual articles in that journal.
It is important to understand the limitations of any metrics you use and to use them for appropriate purposes.
Please read the SHU guide to responsible metrics.
SHU is also a signatory of the San Francisco Declaration on Research Assessment (DORA). This is a set of recommendations for funding agencies, institutions, publishers, researchers and other stakeholders to improve practices in research assessment and to promote the responsible use of quantitative indicators (metrics).
The Metrics toolkit is a great source of information about using metrics responsibly, including the limitations and use cases for individual indicators.
You may find this short video useful: The Leiden Manifesto for Research Metrics from Diana Hicks et al. on Vimeo. It describes 10 principles to guide the use of metrics in research evaluation and is a video version of the Nature paper: Hicks, D., Wouters, P., Waltman, L., de Rijcke, S. & Rafols, I. (2015). The Leiden Manifesto for research metrics: use these 10 principles to guide research evaluation. Nature, 520, 429-431. doi:10.1038/520429a.