Bibliometrics, or citation-based research metrics, are quantitative tools used to measure and evaluate the quality and impact of scholarly work. Bibliometrics can be generated at the article, journal, and researcher level and include indicators such as citation counts, weighted or normalised citation impact, the h-index, and journal impact factors. When demonstrating the impact of your research, metrics tell only part of your story and should always be supplemented with qualitative peer review and a personal narrative statement.
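To make one of these indicators concrete: the h-index is the largest number h such that a researcher has h outputs with at least h citations each. A minimal sketch of that calculation (the function name and example citation counts are illustrative, not drawn from any particular tool):

```python
def h_index(citations):
    """Return the largest h such that h outputs have at least h citations each.

    `citations` is a list of per-output citation counts, in any order.
    """
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank  # this output still "supports" an h-index of `rank`
        else:
            break
    return h

# Five outputs cited 10, 8, 5, 4, and 3 times give an h-index of 4:
# four outputs each have at least 4 citations, but not five with 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Note how one highly cited paper barely moves the h-index: `[25, 8, 5, 3, 3]` still yields 3, which is one reason the h-index should not be read in isolation.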
As a researcher, it is crucial to demonstrate the quantitative and qualitative impact of your research on both the academic and wider community. Being familiar with your bibliometrics ensures that you can demonstrate your impact and engagement for grant applications and academic promotion rounds.
It is essential for researchers to use their online profiles to promote outputs and professional activity; these profiles are the foundation for measuring and demonstrating research impact using bibliometrics. Regularly check that your profiles are up to date, accurate, and disambiguated from other researchers' profiles.
Learn how to update each of the main profiles:
Maintaining profiles will ensure your outputs are correctly attributed to you, allow you to accurately track your impact and analyse your bibliometrics, and enable you to find and be found by potential new collaborators and sources of funding by promoting your research globally.
The Library provides a range of research metrics and bibliometrics tools and resources to help you assess and demonstrate your research quality and impact.
These tools ensure that you can:
Citation metrics may differ between tools, depending on the data source each uses. Scopus, SciVal, Web of Science, and InCites overlap, but because each platform indexes different sources, you should consult more than one tool when gathering your metrics. Alternative measures of impact should also be considered.
Scopus is an abstract and citations database which can be used to find research, identify experts, and gain access to reliable data, metrics, and analytical tools.
Useful metrics: h-index; Documents by author; Citation counts; Field Weighted Citation Impact (FWCI); PlumX alternative metrics; CiteScore; Source Normalized Impact per Paper (SNIP); Scimago Journal Ranking (SJR)
SciVal is Elsevier’s benchmarking and research analytics tool, which measures research performance using Scopus data.
Useful metrics: h-index; Documents by author; Citation counts; Field Weighted Citation Impact (FWCI); Outputs in top citation percentiles; Outputs in top journal percentiles and quartiles; International and industry collaborations; CiteScore; SNIP; SJR
Web of Science is an abstract and citation database similar to Scopus but indexing different data sources.
Useful metrics: h-index; Documents by author; Citation counts; Category Normalized Citation Impact (CNCI); Author Impact Beamplot; Author position (first, last, corresponding); Hot Papers; Highly Cited Papers; Journal Impact factor (JIF)
InCites is Clarivate’s benchmarking and research analytics tool, which measures research performance using Web of Science data.
Useful metrics: h-index; Documents by author; Citation counts; Category Normalized Citation Impact (CNCI); Author position (first, last, corresponding); Hot Papers; Highly Cited Papers; % Docs cited; % Docs in Q1/Q2 journals; Articles in top 1% and 10%; International and industry collaborations; Average percentiles; JIF; Journal Normalized Citation Impact (JNCI); Percentile in subject area
Clarivate’s Journal Citation Reports (JCR) provides journal-level metrics using Web of Science data to help assess the quality and impact of journals.
Useful metrics: Journal Citation Impact (JCI); Journal Impact Factor (JIF); Journal quartile ranks; h-index; Cited half-life; Contributions by country/region; Article Influence Score; 5 Year Impact Factor; Immediacy Index
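Of the journal-level metrics listed above, the Journal Impact Factor has a simple published definition: the JIF for year Y is the number of citations received in Y to items the journal published in Y-1 and Y-2, divided by the number of citable items it published in those two years. A minimal sketch with invented figures:

```python
def journal_impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """JIF for year Y = citations in Y to items published in Y-1 and Y-2,
    divided by citable items (articles and reviews) published in Y-1 and Y-2.
    """
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: 600 citations in 2024 to its 2022-2023 items,
# which numbered 200 citable items.
print(journal_impact_factor(600, 200))  # 3.0
```

Because the window is only two years, the JIF favours fields that cite quickly; the 5 Year Impact Factor and Cited half-life in JCR help correct for that.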
In addition to traditional bibliometrics, there are alternative or supplemental measures that can help demonstrate the impact of your research. Alternative metrics, offered by tools such as Altmetric Explorer and PlumX, measure research impact based on online activity. Unlike traditional citation impact, which takes time to accrue, alternative metrics show real-world online engagement with your research immediately. Your ResearchNow profile displays the Altmetric donut and PlumX plumprint next to your outputs, where applicable; clicking on the donut or plumprint takes you to an interpretation of those metrics. The free Altmetric bookmarklet displays article-level altmetrics for outputs with a DOI.
Alternative measures to demonstrate impact can be especially useful for researchers in fields not well-covered by traditional bibliometrics, such as the arts, humanities, and social sciences. Other ways to demonstrate research impact include:
To demonstrate impact, you may also like to refer to journal quality lists like the Excellence in Research Australia 2023 Journal List or discipline-specific quality lists such as: Australian Business Deans Council (ABDC); Australian Political Studies Association Preferred Journals List; Computer Research and Education Association of Australia Rankings Portal; European Reference Index for the Humanities and Social Sciences Plus; Harzing Journal Quality List; and the Washington and Lee Law Journals.
A meaningful and comprehensive demonstration of research quality and impact should encompass a spread of quantitative and qualitative metrics to reflect the many ways that research can be considered successful.
When looking at your bibliometrics, it is important not to rely solely on citation data for the purposes of evaluation. Citation data should be used in conjunction with informed peer review and with attention to factors that influence citation rates including publication language and schedule, journal history and format, and subject area specialty.
Flinders University Library supports the responsible use of metrics as espoused by the Declaration on Research Assessment (DORA) and the Leiden Manifesto for Research Metrics.
The Library provides guides on using Bibliometric tools to help you start gathering research metrics. Queries about gathering metrics can be submitted through a ServiceOne Library Research query.
For support with narrative statements (like ROPE) and the nuances of grant and promotion applications, please talk to your College Research Support team and the Research Grants and Tenders team. There is also excellent information available on the Research services and support page.