Scientific evaluation based on citation indexes: History of the development of the impact factor, its weaknesses and proposals for other solutions
DOI:
https://doi.org/10.33448/rsd-v13i9.46878
Keywords:
Impact factor; Citation index; Scientific journal; Journal evaluation.
Abstract
The purpose of this article is to address the context of scientific journal evaluation using citation indexes. This is a descriptive study that seeks to detail the particularities of the Impact Factor (IF) and to broaden the understanding of journal evaluation supported by citation indexes. The objectives of the study are: i) to describe the development of the IF; ii) to identify other bibliometric measures based on the number of citations; iii) to list points of attention that should be observed in scientific evaluation processes based on citation indexes; and iv) to identify alternatives to the indexes produced by large commercial scientific publishers. Despite the growing development of other citation indexes, it is not advisable to limit the evaluation of journals to these measures alone, given the diversity of factors that influence their results. As alternatives to the indexes published by commercial databases, some countries have proposed databases and indexes that evaluate their scientific production more adequately. The development of non-profit databases such as OpenAlex is also noteworthy. In conclusion, a cultural change is needed in the scientific community regarding the evidence used to evaluate research products. Those who carry out scientific evaluations supported exclusively by metrics must bear in mind that the precision granted by numerical results is only apparent. Therefore, the recommended approach is to combine several methods of analysis, with particular attention to peer review.
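For readers unfamiliar with how the IF is calculated, the classic two-year formula introduced by Garfield can be sketched as follows; the symbol names and the example year below are illustrative choices, not taken from the article itself:

\[
\mathrm{JIF}_{Y} \;=\; \frac{\text{citations received in year } Y \text{ by items published in years } Y-1 \text{ and } Y-2}{\text{number of citable items published in years } Y-1 \text{ and } Y-2}
\]

For example, a journal that published 100 citable items in 2022 and 2023, and whose 2022 and 2023 items received 250 citations during 2024, would have a 2024 JIF of 250 / 100 = 2.5.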
License
Copyright (c) 2024 Lívia Rejane Miguel Amaral Schumann; Luciana Calabró
This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors who publish with this journal agree to the following terms:
1) Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
2) Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
3) Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.