
Changing How We Evaluate Scholarship

While major universities recognize the impact of open scholarship and support its distribution through institutional repositories, they do not go far enough in “counting” open scholarship in its various forms toward rank and advancement. In this post, I compare the recommendations of DORA, the Leiden Manifesto, and HuMetrics on how we can make the evaluation of scholarship more quality-centered. Although none of the three publications defines quality in scholarship, their recommendations for change overlap considerably, even though these organizations come from different disciplines. The four areas of overlap below could serve as general starting points for proposing a shift in how scholarship is evaluated.



1) Quality over quantity. All three publications agree on emphasizing quality as defined by experts in the field, not exclusively by journal metrics. Journal metrics are part of the picture, but not the whole picture. Institutions, funders, and researchers alike should emphasize the quality of research over the venue in which it is published (Bladek, 2014; Hicks et al., 2015). HuMetrics goes further, suggesting a shift from output counts to quality, since research in the humanities is often a lengthier process that traditionally yields fewer publications and citations (Agate et al., 2020). Article-level metrics could help emphasize the quality of a given piece of research over the rating of the journal it appears in.


2) Transparency in metrics used. There should be no “black box” metrics (Hicks et al., 2015). This responds to criticisms that it is not always clear what the Journal Impact Factor (JIF) or other metrics are actually assessing. Criteria should also be clear for funding decisions and for reappointment, promotion, and tenure (RPT). For scholars, DORA recommends “responsible authorship,” with a clear delineation of how each listed author has contributed.


3) Broaden perspectives on impact. DORA suggests that acceptable research outputs should expand to include software and data sets. HuMetrics expands research impact to data curation and project management, as well as other scholarly processes that surround journal publication, such as peer review and editing. By broadening our view of scholarship, we debunk the “myth of the lone scholar” (Agate et al., 2020) and count all meaningful scholarly contributions in RPT decisions. All three mention various forms of scholarship, including open scholarship. We should use altmetrics to quantify these varied types of impact, while recognizing that even altmetrics miss quality contributions to public policy, debates, and conferences (Agate et al., 2020).


4) Base evaluation on values. The Leiden Manifesto suggests that institutions clearly define their values and priorities and align evaluation accordingly. Scholars at a teaching institution, for example, should not be held to the same research standards as those at Research 1 institutions. HuMetrics is concerned with the culture of scholarship and recommends that scholars “establish values-based frameworks that align with, recognize, and reward personal and institutional values enacted in supportive local contexts” (Agate et al., 2020). This allows us to avoid corrosive, competitive cultures built on journal metrics that do not reflect the richness of scholarly work.


These four areas represent the overlapping priorities of three well-known publications recommending change in how we evaluate scholarship. Using these priorities as a guide could help departments analyze and improve their current practices for evaluating research.


Agate, N., Kennison, R., Konkiel, S., Long, C. P., Rhody, J., Sacchi, S., & Weber, P. (2020). The transformative power of values-enacted scholarship. Humanities and Social Sciences Communications, 7(1), 1–12.


Agate, N., Kennison, R., Konkiel, S., Long, C., Rhody, J., & Sacchi, S. (2017). HuMetricsHSS: Towards value-based indicators in the humanities and social sciences.


Bladek, M. (2014). DORA: San Francisco Declaration on Research Assessment (May 2013). College & Research Libraries News, 75(4), 191–196.


Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520, 429–431. https://doi.org/10.1038/520429a

