Three key measures of research impact are:
- Quality of the journal – journal rankings such as Scimago
- Quality of the publication/article – times cited, altmetrics
- Personal or departmental measure – the h-index
Other indicators of impact beyond metrics are now being more widely accepted in the research community. The ARC (Australian Research Council) Research Impact Principles and Framework defines research impact as “the demonstrable contribution that research makes to the economy, society, culture, national security, public policy or services, health, the environment, or quality of life, beyond contributions to academia”. Key metrics such as article citations and other indicators such as media mentions should complement each other to build a case study of impact.
When researchers refer to another author’s work in their own published work, they cite it. Such citations can be analysed to measure the usage of the cited work.
A citation index is a compilation of all the cited references from articles published during a particular year or period. A citation index allows you to determine the research impact of your publications according to the number of times they have been cited by other researchers.
You can now find much of the citation data explained on this page via your VU Elements profile. For more information, see the VU Elements Quick Guide.
The availability of citation data varies enormously between disciplines, as does citation behaviour. This means that rankings are only relevant within a discipline, and that journals dedicated to very specific disciplines tend not to have high impact factors.
Scopus and Scimago
Scopus is an abstract and citation database of peer-reviewed literature and quality web sources, with smart tools to track, analyse and visualise research. Along with locating scholarly articles and book chapters, you can use Scopus for:
- Citations of articles (from 1996 onwards)
- Citation score of authors via the ‘Author Search’ function (h-index)
- Citation data of the publications of authors
- The SNIP (Source Normalized Impact per Paper) value of journals.
Scimago is a portal that includes the journal and country scientific indicators developed from the information contained in the Scopus® database. The SJR (Scimago Journal Rank) is a metric applied to the journals indexed in Scimago, based on average citation counts. Scimago presents journal subject category lists in quartile rankings (Q1–Q4).
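The quartile idea can be illustrated with a short Python sketch. The journal names and SJR values below are invented, and this is not Scimago's own implementation – it only shows the general principle of splitting a ranked category into four bands:

```python
# Illustrative only: assign Q1-Q4 to journals in one subject category
# by ranking them on a score such as SJR (values below are invented).
def quartiles(journals):
    """Map journal name -> quartile label, with the best scores in Q1."""
    ranked = sorted(journals, key=journals.get, reverse=True)
    n = len(ranked)
    return {name: f"Q{min(4, i * 4 // n + 1)}" for i, name in enumerate(ranked)}

category = {"Journal A": 2.0, "Journal B": 1.5, "Journal C": 1.0, "Journal D": 0.5}
print(quartiles(category))
# {'Journal A': 'Q1', 'Journal B': 'Q2', 'Journal C': 'Q3', 'Journal D': 'Q4'}
```

A journal's quartile therefore depends on how many journals share its subject category, which is one reason the same journal can sit in different quartiles across categories.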
Google Scholar also indexes citations for individual articles and provides an h-index for researchers with a My Citations profile.
Google Scholar is good for disciplines not well covered by citation databases such as Scopus or Web of Science. However, it is important to carefully check citation data from Google Scholar to ensure there are no duplicates or misattributions.
Other databases with citation links
Many discipline-specific databases give citation counts, e.g.:
And some publisher databases:
However, these databases only count citations from articles indexed within them.
The h-index attempts to measure a researcher's cumulative (journal) output and impact, based on citations across their entire body of work. A researcher with an h-index of h has published h papers that have each been cited at least h times.
You can find your h-index in citation databases such as Scopus and Google Scholar, but be aware that they can only calculate your h-index from the articles they index. The h-index can also be calculated manually by obtaining citation counts for all your published papers. Rank your papers by citation count (from highest to lowest), then find the largest rank h such that the paper at rank h has at least h citations; that value is your h-index.
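The manual calculation described above can be sketched in a few lines of Python (an illustration only, not an official tool):

```python
# Illustrative sketch: compute an h-index from a list of citation counts.
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # highest-cited paper first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

# Example: five papers cited 10, 8, 5, 4 and 3 times
print(h_index([10, 8, 5, 4, 3]))  # 4 (four papers have at least 4 citations each)
```

Note that a highly cited single paper raises the h-index by at most one, which is why the measure rewards sustained output rather than one standout publication.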
Keep in mind that the value of h varies between subject disciplines. The h-index also does not take into account a researcher's career stage or the recency of their work.
Altmetrics (alternative metrics) have emerged as an alternative to more traditional journal and book citation metrics (bibliometrics), such as the impact factor and the h-index.
Altmetrics should ideally be used as a supplement to citation metrics and considered as yet another measure of impact to be included when presenting a ‘case study of impact’. That is, build a case around the different ways your research has had a positive impact on academia, society, culture and/or economy.
The term ‘altmetrics’ is best defined by what it measures:
Altmetrics measure the number of times a research output gets cited, tweeted about, liked, shared, bookmarked, viewed, downloaded, mentioned, favourited, reviewed, or discussed. These numbers are harvested from a wide variety of open source web services that count such instances, including open access journal platforms, scholarly citation databases, web-based research sharing services, and social media.
By nature, altmetrics are a much more immediate measure of how a research output is tracking in the ‘real world’, because they are not slowed down by the lengthy peer-review and editorial process of publishing. A researcher can publish a journal article, for instance, and immediately start generating attention in the form of tweets, bookmarks and so on. The more attention an output receives, the higher the score.
Loria, P. (2013, Apr 16). Altmetrics and open access a measure of public interest [Web log post]. Retrieved from http://aoasg.org.au/altmetrics-and-open-access-a-measure-of-public-interest/
How are altmetrics scored?
There are a number of companies that track and analyse online activity to give an altmetric score. VU researchers can access the altmetric score (provided by altmetric.com) for their individual research outputs via their VU Elements account.
Altmetric.com monitors a range of online sources for mentions of a research output. This involves looking for HTML links to scholarly content, as well as text-mining for publication titles and author names. The online sources include:
- Public policy documents
- Mainstream media
- Online reference managers, e.g. Mendeley
- Post-publication peer-review platforms, e.g. Pubpeer and Publons
- Research highlights, e.g. F1000
- Social media, e.g. Facebook, Twitter, LinkedIn
- Multimedia and other online platforms.
The Altmetric score is an automatically calculated, weighted approximation of all the attention a research output has received. It is based on the volume, sources and authors of that attention. It is appropriate to consider this score as a measure of attention rather than quality: the score does not take into account the nature of the attention.
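As a rough illustration of what a weighted attention score means, the sketch below sums mention counts per source with different weights. The sources and weights here are invented for demonstration; Altmetric.com's actual weighting is its own and is not reproduced here:

```python
# Hypothetical weighted attention score. The weights below are invented
# for illustration; they are NOT Altmetric.com's actual weighting.
WEIGHTS = {
    "news": 8.0,      # mainstream media mentions count heavily
    "policy": 3.0,    # public policy documents
    "twitter": 1.0,   # individual social media posts
    "facebook": 0.25,
}

def attention_score(mentions):
    """Sum weighted mention counts; unknown sources default to weight 1."""
    return sum(WEIGHTS.get(source, 1.0) * count
               for source, count in mentions.items())

# e.g. 2 news stories, 30 tweets and 4 Facebook posts
print(attention_score({"news": 2, "twitter": 30, "facebook": 4}))  # 47.0
```

The point of the weighting is that one news story signals more attention than one tweet, which is also why the score reflects reach rather than the quality of the work itself.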
For more details on how Altmetric.com generates their score see the donut score.
Other altmetric tools that can be used to gain an altmetric score for your research output include:
- ImpactStory – can be used to track attention around a wide range of outputs (such as articles, data, slides etc). Aggregates data from a wide range of online sources and displays in a single report.
- PLoS Impact Explorer – allows you to browse conversations collected by altmetric.com for papers published by PLoS.
- Altmetric.com embeddable badges – Many systems and websites, including VU Elements, display an embedded altmetric.com badge so that researchers can instantly see their altmetric data.
For more information on open access publishing and your funder obligations, contact the Research Librarian or your College Librarian; for contact details, see the Library research support page.