CRITICAL CRITERIA FOR EVALUATION OF SCIENTIFIC RESEARCH ACTIVITY IN SCIENCE DIRECT

Authors

  • Antoaneta Angelova-Stanimirova, University of National and World Economy, Bulgaria

DOI:

https://doi.org/10.35120/kij5801171a

Keywords:

control, evaluation, evaluation of research activity, criteria for evaluating research activity

Abstract

Nowadays, the role of universities has expanded beyond the basic assumption of creating and disseminating knowledge. Increasing the effectiveness of research activity enhances the university's image, which in turn means better opportunities to attract and retain highly qualified academic staff, improves the recognition of the university, and increases the number of prospective students.
Naturally, taking action to promote the quality of research is an essential part of university management. It is therefore vital to develop and implement effective mechanisms for monitoring and evaluating the research activities of teachers, which are elements of the control process. In this context, the evaluation of research activity is essential for implementing the control function in universities. Notably, research activity is at the centre of attention of all stakeholders in the university.
The purpose of this report is to outline the criteria for evaluating scientific research activity found in the Science Direct database.
The report raises a research question: what are the criteria for evaluating scientific research activity?
The research toolkit covers the methods of analysis, synthesis, summarization, and descriptive statistics, as well as the "Criteria i" approach.
The research proceeds in three steps. In the first step, "Criteria i" was determined: the Science Direct scientific database was selected, the research period extends to April 2023, and the main limitation is the "open access" filter. In the second step, the search was conducted. The third step involves synthesizing the results.
The study identified 16 criteria for evaluating research activity. The most frequently mentioned are "bibliometrics", "impact factor", "qualitative criteria", and "quantitative criteria". "Number of authors" ranks third, followed by two criteria: "number of reads" and "number of citations".
The report's author concludes that, according to the research carried out, quantitative criteria prevail in the evaluation of scientific research activity.
Recommendations for future research: expanding the research sample for a more comprehensive study; analyzing research evaluation systems and the criteria they use at the state level; comparative analysis of the criteria used by research control systems; and building a system for the control and evaluation of research activity that covers a wide range of criteria and is applicable in Bulgarian practice.


Published

2023-06-01

How to Cite

Angelova-Stanimirova, A. (2023). CRITICAL CRITERIA FOR EVALUATION OF SCIENTIFIC RESEARCH ACTIVITY IN SCIENCE DIRECT. KNOWLEDGE - International Journal, 58(1), 171–177. https://doi.org/10.35120/kij5801171a