Judicious Use of Bibliometrics to Supplement Peer Evaluations of Research in Kinesiology

in Kinesiology Review

Peer evaluation of scholarly publications and faculty research agendas is an important responsibility of kinesiology faculty and administrators. These expert disciplinary judgments can be supplemented by the careful use of relevant publication- and scholar-specific bibliometric data. This paper summarizes the misuse of journal-level bibliometrics and the research on more relevant publication- or scholar-specific bibliometrics. Recommendations and examples are presented for use of publication- and scholar-specific metrics as supplementary data for peer evaluation of research in kinesiology. Faculty who are knowledgeable about the meaning and limitations of bibliometrics may effectively use these tools to support judgments and check for potential bias in peer evaluations of research for appointment, tenure, promotion, and awards.

The author is with the Dept. of Health & Human Performance, Texas State University, San Marcos, TX.

Address correspondence to dk19@txstate.edu.
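As an illustration of the scholar-specific metrics the abstract refers to, the h-index (Hirsch, 2005) is a widely used example: a scholar has index h if h of their papers have each been cited at least h times. A minimal sketch of its computation from a list of per-paper citation counts (the function name and sample counts are illustrative, not drawn from the article):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the scholar has
    h papers each cited at least h times (Hirsch, 2005)."""
    h = 0
    # Rank papers from most- to least-cited; h is the last rank at
    # which the citation count still meets or exceeds the rank.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation records for two scholars:
print(h_index([10, 8, 5, 4, 3]))      # -> 4
print(h_index([25, 8, 5, 3, 3, 2]))   # -> 3
```

Note that the second (hypothetical) scholar has more total citations but a lower h-index, which is the kind of nuance that, as the abstract argues, requires knowledgeable interpretation rather than mechanical ranking.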