BIBLIOMETRIC ANALYSIS OF DIGITAL HUMANITIES RESEARCH: INSIGHTS FROM SCOPUS DATABASE STUDIES (2005–2025)

Authors

Chieu, N. V., Lich, H. T., & Nguyen, T. T.

DOI:

https://doi.org/10.18623/rvd.v22.n2.3136

Keywords:

Digital Humanities, Bibliometrics, Scopus, Publication Trends, Interdisciplinary Research, Knowledge Production

Abstract

This study presents a bibliometric analysis of digital humanities research, drawing on Scopus data spanning 2005 to 2025. It examines publication trends, sources, authorship, geographic distribution, funding sponsors, and document types to trace how this interdisciplinary field has evolved. The results show that scholarly output grew slowly from 2005 to 2017 before rising sharply in 2019. Peer-reviewed outputs are the most common publication type, indicating that the field relies on both traditional humanities dissemination and technical exchange. Digital Scholarship in the Humanities and Lecture Notes in Computer Science rank among the field's most influential publication venues. Authorship is relatively dispersed, with contributions from a wide range of scholars. The United States is the largest contributor, followed by Australia, India, and the United Kingdom. Funding comes from both international and national agencies, suggesting that policymakers recognize the value of digital humanities research. Together, these findings document the dynamic, interdisciplinary growth of digital humanities over the past two decades and indicate how the field may continue to connect technology and cultural scholarship.
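The descriptive tallies the abstract reports (publications per year, per country, per document type) can be reproduced from a Scopus CSV export with a few lines of scripting. The sketch below is illustrative only: the sample rows are made up, and the column names ("Year", "Document Type", "Country") are assumptions about the export format, not a description of this study's actual pipeline.

```python
import csv
import io
from collections import Counter

# Hypothetical excerpt of a Scopus CSV export (invented sample rows;
# header names are an assumption about the export format).
SCOPUS_SAMPLE = """Year,Document Type,Country
2017,Article,United States
2019,Conference Paper,Australia
2019,Article,United States
2019,Article,India
2021,Article,United Kingdom
"""

def count_by(rows, field):
    """Tally records by one field, e.g. publications per year or per country."""
    return Counter(row[field] for row in rows)

rows = list(csv.DictReader(io.StringIO(SCOPUS_SAMPLE)))
per_year = count_by(rows, "Year")
per_country = count_by(rows, "Country")

print(per_year["2019"])               # count of records in the 2019 spike
print(per_country.most_common(1)[0])  # top contributing country in the sample
```

In a real workflow the same counts would be drawn from the full export (tens of thousands of rows) and plotted as a time series; dedicated tools such as the Bibliometrix R package (Aria & Cuccurullo, 2017, cited below) automate these steps.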

References

ADAMS, J. (2013). Collaborations: The rise of research networks. Nature, 500(7461), 557–560. https://pubmed.ncbi.nlm.nih.gov/23075965/

ALTBACH, P. G., & DE WIT, H. (2018). The challenge to higher education internationalization. International Higher Education, (95), 2–4. https://www.universityworldnews.com/post.php?story=20180220091648602

ARCHAMBAULT, É., & LARIVIÈRE, V. (2009). History of the journal impact factor: Contingencies and consequences. Scientometrics, 79(3), 635–649. https://doi.org/10.1007/s11192-007-2036-x

ARCHAMBAULT, É., CAMPBELL, D., GINGRAS, Y., & LARIVIÈRE, V. (2009). Comparing bibliometric statistics obtained from the Web of Science and Scopus. Journal of the American Society for Information Science and Technology, 60(7), 1320–1326. https://doi.org/10.1002/asi.21062

ARIA, M., & CUCCURULLO, C. (2017). Bibliometrix: An R-tool for comprehensive science mapping analysis. Journal of Informetrics, 11(4), 959–975. https://doi.org/10.1016/j.joi.2017.08.007

BAKER, D., & POWELL, J. (2024). Global mega-science: Universities, research collaborations, and knowledge production. Stanford University Press. https://doi.org/10.1515/9781503639102

BERRY, D. M., & FAGERJORD, A. (2017). Digital humanities: Knowledge and critique in a digital age. Polity Press. https://doi.org/10.31400/dh-hun.2019.2.391

BORGMAN, C. L. (2015). Big data, little data, no data: Scholarship in the networked world. MIT Press. https://doi.org/10.7551/mitpress/9963.001.0001

BORNMANN, L., & MUTZ, R. (2015). Growth rates of modern science: A bibliometric analysis based on the number of publications and cited references. Journal of the Association for Information Science and Technology, 66(11), 2215–2222. https://doi.org/10.1002/asi.23329

BOZEMAN, B., & BOARDMAN, C. (2014). Research collaboration and team science: A state-of-the-art review and agenda. Springer. https://www.researchgate.net/publication/264861750_Research_Collaboration_and_Team_Science_A_State-of-the-Art_Review_and_Agenda

DE SOLLA PRICE, D. J. (1963). Little science, big science. Columbia University Press. https://archive.org/details/littlesciencebig0000pric

DONTHU, N., KUMAR, S., MUKHERJEE, D., PANDEY, N., & LIM, W. M. (2021). How to conduct a bibliometric analysis: An overview and guidelines. Journal of Business Research, 133, 285–296. https://doi.org/10.1016/j.jbusres.2021.04.070

ELSEVIER. (2025). Scopus Content Coverage Guide. Elsevier B.V. https://www.researchgate.net/publication/330161507_Scopus_Content_Coverage_Guide

FALAGAS, M. E., PITSOUNI, E. I., MALIETZIS, G. A., & PAPPAS, G. (2008). Comparison of PubMed, Scopus, Web of Science, and Google Scholar: Strengths and weaknesses. The FASEB Journal, 22(2), 338–342. https://doi.org/10.1096/fj.07-9492LSF

GEUNA, A., & MARTIN, B. R. (2003). University research evaluation and funding: An international comparison. Minerva, 41(4), 277–304. https://doi.org/10.1023/B:MINE.0000005155.70870.bd

GLÄNZEL, W., & SCHOEPFLIN, U. (1995). A bibliometric study on ageing and reception processes of scientific literature. Journal of Information Science, 21(1), 37–53. https://www.deepdyve.com/lp/sage/a-bibliometric-study-on-ageing-and-reception-processes-of-scientific-Oei6nN1zka?key=sage

GÓMEZ-NÚÑEZ, A. J., VARGAS-QUESADA, B., DE MOYA-ANEGÓN, F., et al. (2011). Improving SCImago Journal & Country Rank (SJR) subject classification through reference analysis. Scientometrics, 89, 741–758. https://doi.org/10.1007/s11192-011-0485-8

GRANT, M. J., & BOOTH, A. (2009). A typology of reviews: An analysis of 14 review types. Health Information & Libraries Journal, 26(2), 91–108. https://doi.org/10.1111/j.1471-1842.2009.00848.x

HASSAN, M. M., & AHMAD, A. R. (2025). Systematic literature review on the sustainability of higher education institutions (HEIs): Dimensions, practices and research gaps. Cogent Education, 12(1), 2549789. https://doi.org/10.1080/2331186X.2025.2549789

HAZELKORN, E. (2015). Rankings and the reshaping of higher education: The battle for world-class excellence. Springer. https://link.springer.com/book/10.1057/9781137446671

KATZ, J. S., & MARTIN, B. R. (1997). What is research collaboration? Research Policy, 26(1), 1–18. https://doi.org/10.1016/S0048-7333(96)00917-1

KOIBICHUK, V. V., POROSHYN, D. A., & YEFIMENKO, A. Y. (2025). Інформаційні послуги як драйвер змін на ринку праці: бібліометричний підхід [Information services as a driver of change in the labor market: A bibliometric approach]. Sumy State University. https://essuir.sumdu.edu.ua/handle/123456789/99602

KURNIAWAN, W., & MULUK, H. (2025). Temporal bias and psychological susceptibility to misinformation: A systematic review and bibliometric synthesis. SSRN. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5418399

KWIEK, M. (2020). Internationalists and locals: International research collaboration in a resource-poor system. Scientometrics, 124, 57–105. https://doi.org/10.1007/s11192-020-03460-2

LEYDESDORFF, L., & WAGNER, C. (2008). International collaboration in science and the formation of a core group. Journal of Informetrics, 2(4), 317–325. https://doi.org/10.1016/j.joi.2008.07.003

LOTKA, A. J. (1926). The frequency distribution of scientific productivity. Journal of the Washington Academy of Sciences, 16(12), 317–323.

MARGINSON, S. (2016). The worldwide trend to high participation higher education: Dynamics of social stratification in inclusive systems. Higher Education, 72(4), 413–434. https://doi.org/10.1007/s10734-016-0016-x

MOED, H. F. (2005). Citation analysis in research evaluation. Springer. https://doi.org/10.1007/1-4020-3714-7

MOED, H. F. (2017). Applied evaluative informetrics. Springer. https://doi.org/10.1016/j.joi.2018.03.002

MOED, H. F., DE BRUIN, R. E., & VAN LEEUWEN, T. N. (1995). New bibliometric tools for the assessment of national research performance: Database description, overview of indicators, and first applications. Scientometrics, 33(3), 381–422. https://doi.org/10.1007/BF02017338

MOED, H. F. (2017). Applied evaluative informetrics: Part 1 [Preprint]. SSRN. https://ssrn.com/abstract=2969742 or https://doi.org/10.2139/ssrn.2969742

NATIONAL SCIENCE BOARD. (2022). Science and engineering indicators 2022. National Science Foundation. https://ncses.nsf.gov/indicators

NEWMAN, M. E. J. (2001). The structure of scientific collaboration networks. Proceedings of the National Academy of Sciences, 98(2), 404–409. https://doi.org/10.1073/pnas.98.2.404

PASHA, A. T., & SULTAN, I. (2025). Visualizing the intellectual structure of green entrepreneurial orientation and sustainable performance: A bibliometric network analysis. Lead Sci Journal of Management, Innovation and Sustainability, 2(1), 45–63. https://scholarclub.org/index.php/LeadSci/article/view/21

SALMI, J. (2009). The challenge of establishing world-class universities. World Bank Publications. https://doi.org/10.1596/978-0-8213-7865-6

SCHREIBMAN, S., SIEMENS, R., & UNSWORTH, J. (2016). A new companion to digital humanities. John Wiley & Sons. https://doi.org/10.1002/9781118680605

SHIN, J. C., & KEHM, B. M. (2013). Institutionalization of a world-class university in global competition. Springer. https://link.springer.com/book/10.1007/978-94-007-4975-7

SUBRAMANYAM, K. (1983). Bibliometric studies of research collaboration: A review. Journal of Information Science, 6(1), 33–38. https://doi.org/10.1177/016555158300600105

SUKOCO, B. M., YUDHOYONO, A. H., MAHARANI, I. A. K., & PUTRA, I. K. (2025). Bridging the gap: Indonesia’s research trajectory and national development through a scientometric analysis using SciVal. Journal of Open Innovation: Technology, Market, and Complexity, 11(2), 34. https://www.researchgate.net/publication/389594271

THOMPSON, J. B. (2005). Books in the digital age: The transformation of academic and higher education publishing in Britain and the United States. Polity Press. https://doi.org/10.1177/0267323106066687

VAN RAAN, A. F. J. (2003). The use of bibliometric analysis in research performance assessment and monitoring of interdisciplinary scientific developments. TATuP - Zeitschrift für Technikfolgenabschätzung in Theorie und Praxis, 12(1), 20–29. https://doi.org/10.14512/tatup.12.1.20

VAN RAAN, A. F. J. (2019). Measuring science: Basic principles and application of bibliometric indicators. In W. Glänzel (Ed.), Springer handbook of science and technology indicators (pp. 237–280). Springer. https://doi.org/10.1007/978-3-030-02511-3_10

VEUGELERS, R. (2017). The challenge of China’s rise as a science and technology powerhouse. Bruegel Policy Contribution, 19, 1–15. https://ideas.repec.org/p/bre/polcon/21154.html

WARE, M., & MABE, M. (2015). The STM report: An overview of scientific and scholarly journal publishing (4th ed.). International Association of Scientific, Technical and Medical Publishers. https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1008&context=scholcom

WHITLEY, R., GLÄSER, J., & ENGWALL, L. (2018). Reconfiguring knowledge production: changing authority relationships in the sciences and their consequences for intellectual innovation. Oxford University Press. https://global.oup.com/academic/product/reconfiguring-knowledge-production-9780199590193?cc=vn&lang=en&

WILDGAARD, L., SCHNEIDER, J. W., & LARSEN, B. (2014). A review of the characteristics of 108 author-level bibliometric indicators. Scientometrics, 101(1), 125–158. https://doi.org/10.1007/s11192-014-1423-3

YAMAN, İ. (2025). Unveiling technology integration in K-12 EFL/ESL teaching: A bibliometric analysis. Journal of Language Teaching and Learning, 12(2), 86–104. https://www.jltl.com.tr/index.php/jltl/article/view/867

ZAKARIA, N. Z., OSMAN, I., & ZANI, A. M. (2025). Two decades of ethical behaviour: A global bibliometric insight into research trends. Proceedings of TiBEC IX 2025, Atlantis Press. https://www.atlantis-press.com/proceedings/tibec-ix-25/126014500

ZEBAKH, S., RHOUMA, A., & ARVANITIS, R. (2022). Mapping the agricultural research systems in the Maghreb (Algeria, Morocco and Tunisia). Science, Technology and Society, 27(3), 425–446. https://doi.org/10.1177/09717218221078231

Published

2025-10-22

How to Cite

Chieu, N. V., Lich, H. T., & Nguyen, T. T. (2025). BIBLIOMETRIC ANALYSIS OF DIGITAL HUMANITIES RESEARCH: INSIGHTS FROM SCOPUS DATABASE STUDIES (2005–2025). Veredas Do Direito, 22(2), e223136. https://doi.org/10.18623/rvd.v22.n2.3136