Universities must increase the visibility of their research to demonstrate their contribution to the knowledge economy, improve their chances in the competition for research funding, and position themselves in the global research landscape. It is therefore important to publish your work where it will be visible and have an impact.
Databases such as Clarivate Analytics' Journal Citation Reports (JCR) and Elsevier's Scopus offer various metrics and functionalities for comparing the value and impact of journals. Google Scholar also provides citation metrics for journals.
The JCR provides performance indicators for journals as follows: all journals indexed in the Web of Science are assigned to specific Web of Science subject categories. Within each category, journals are then ranked by Journal Impact Factor to identify those with the highest impact in the field, and the quartile into which each journal falls is also shown.
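To illustrate how the best-known JCR indicator is calculated: the two-year Journal Impact Factor for year Y is the number of citations received in Y by items the journal published in the two preceding years, divided by the number of citable items published in those years. A minimal sketch in Python, using hypothetical figures:

```python
def journal_impact_factor(citations_in_y, citable_items_y1, citable_items_y2):
    """Two-year Journal Impact Factor for year Y.

    citations_in_y: citations received in year Y to items published in Y-1 and Y-2.
    citable_items_y1, citable_items_y2: citable items published in Y-1 and Y-2.
    """
    return citations_in_y / (citable_items_y1 + citable_items_y2)

# Hypothetical journal: 600 citations in 2020 to the 150 + 100
# citable items it published in 2018 and 2019.
jif = journal_impact_factor(600, 150, 100)
print(jif)  # 2.4
```

The quartile rank then follows from sorting all journals in a subject category by this value.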
Scopus is a data source for many bibliometric indicators and related analytical tools for measuring journal performance. Its "compare journals" functionality gives a graphic display of up to 10 selected journals at a time, based on the following indicators:
Scimago Journal Rank indicator (SJR)
Source Normalized Impact per Paper (SNIP)
Impact per Publication (IPP)
Percentage of documents not cited, by year
Percentage of review documents, by year
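The last two indicators in the list are simple proportions. As a sketch, assuming a hypothetical set of document records for one publication year of a journal, they could be computed as:

```python
# Hypothetical document records: (document type, citation count).
documents = [
    ("Article", 12),
    ("Article", 0),
    ("Review", 30),
    ("Article", 0),
    ("Review", 5),
]

# Share of documents that have never been cited.
pct_not_cited = 100 * sum(1 for _, c in documents if c == 0) / len(documents)
# Share of documents that are review articles.
pct_review = 100 * sum(1 for t, _ in documents if t == "Review") / len(documents)

print(pct_not_cited)  # 40.0
print(pct_review)     # 40.0
```

SJR, SNIP and IPP involve field normalisation and citation-network weighting and are computed by the database itself rather than by hand.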
Different databases offer different functionalities to determine citation patterns and rankings. Graphs and maps are usually available for better visualisation.
Comparisons between the citation results of different databases should be made with care, as the date ranges and source material on which the results are based differ between databases. Searching more than one database, such as Web of Science and Scopus, gives more accurate results. Google Scholar metrics also provide a different picture.
Most funding and rating institutions require the number of research contributions and citation counts per researcher. The National Research Foundation (NRF), for instance, requires an author to provide their Web of Science h-index, Scopus h-index, Google Scholar profile, and Google citation profile. Author name ambiguity makes it extremely difficult to calculate these contributions reliably.
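The h-index these profiles report is defined as the largest number h such that the author has h publications with at least h citations each. A minimal sketch of the calculation:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still has at least `rank` citations
        else:
            break
    return h

# An author whose papers were cited 10, 8, 5, 4 and 3 times has an
# h-index of 4: four papers with at least four citations each.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Because each database counts only the citations it indexes, the same author typically has a different h-index in Web of Science, Scopus and Google Scholar.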
To address this, several databases have developed solutions. Thomson Reuters launched ResearcherID in 2008; since 15 April 2019 the Web of Science ResearcherID has been hosted on Publons, where an author's peer review activity is also recorded. Scopus allocates an Author ID to each author name form. Google Scholar provides a platform for profiles and citations. Social media platforms such as ResearchGate, Academia.edu and LinkedIn all have their own identifiers. ORCID (Open Researcher and Contributor ID) developed a hub that links to all of these identifiers. By using ORCID, the researcher only has to populate their ORCID record, and that information can then be shared with the linked databases and platforms.
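An ORCID iD is a 16-character identifier (e.g. 0000-0002-1825-0097, the sample iD ORCID itself uses in its documentation) whose final character is an ISO/IEC 7064 MOD 11-2 check digit. As a sketch, a profile system could validate an iD before linking it:

```python
def valid_orcid(orcid_id):
    """Validate the ISO/IEC 7064 MOD 11-2 check character of an ORCID iD."""
    digits = orcid_id.replace("-", "")
    if len(digits) != 16:
        return False
    total = 0
    for ch in digits[:15]:  # the first 15 characters must be digits
        if not ch.isdigit():
            return False
        total = (total + int(ch)) * 2
    check = (12 - total % 11) % 11
    expected = "X" if check == 10 else str(check)  # 10 is written as 'X'
    return digits[15] == expected

print(valid_orcid("0000-0002-1825-0097"))  # True
print(valid_orcid("0000-0002-1825-0098"))  # False
```

The check digit catches most transcription errors, which matters when an iD is the key that ties together records across several databases.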