Author ORCID Identifier
Ngo Cong-Lem: https://orcid.org/0009-0005-3299-477X
Abstract
The rapid spread of generative artificial intelligence (GenAI) in education has intensified concerns about its impact on learners’ critical thinking, underscoring the need for reliable instruments to assess how students engage critically with AI-generated content. This study adapted, developed, and initially validated the Generative AI–Critical Thinking (GenAI–CrT) scale, an eight-item short-form instrument designed to capture EFL learners’ critical thinking in AI-enhanced learning contexts. Data from 233 Vietnamese undergraduate and graduate students underwent exploratory and confirmatory factor analyses, which supported a theoretically grounded four-factor structure comprising analytical skills, logical reasoning, evidence evaluation, and open-mindedness. The scale demonstrated excellent model fit (χ²(14) = 23.49, p = .053; CFI = .990; TLI = .979; RMSEA = .054; SRMR = .023) and high internal consistency (α = .90). A regression model using a subsample (n = 189) explained approximately 7% of the variance in AI self-efficacy, suggesting that the four analytical dimensions of the GenAI–CrT function collectively, rather than independently, in shaping learners’ confidence in using AI tools. Beyond psychometric validation, this study offers initial evidence that critical thinking operates as an integrated competence underpinning reflective and ethical engagement in AI-enhanced learning. The GenAI–CrT can further support teachers and curriculum designers in assessing and scaffolding learners’ analytical, evaluative, and open-minded interactions with AI-generated content.
First Page
1
Last Page
24
Ethics Approval
Yes
Declaration Statement
Conflict of interest disclosure
The authors have no competing interests to disclose.
Funding statement
This research is funded by the Foundation for Science and Technology Development of Dalat University.
Ethics approval statement
As the researchers’ institution does not currently have a formal human research ethics committee, the study was conducted in accordance with established ethical principles and standard requirements for research involving human participants. All participants were informed about the study’s purpose, provided voluntary informed consent, and were assured of anonymity and confidentiality throughout the research process.
Data availability
Data are available on reasonable request due to privacy and ethical restrictions.
Declaration of Generative AI and AI-assisted Technologies in the Writing Process
During the preparation of this work, the authors used OpenAI’s ChatGPT-5 to assist with language refinement and proofreading. After using this tool, the authors carefully reviewed and edited the content as needed and take full responsibility for the final version of the manuscript.
Author contributions
Conceptualization: CL; Methodology: CL, TT, NK, NQ; Investigation: CL, TT, NK, NQ; Formal analysis: CL; Data curation: TT, NK, NQ; Writing – original draft: CL; Writing – review & editing: CL, TT, NK, NQ.
Participant consent statement
Informed consent was obtained electronically from all participants before they proceeded to complete the instrument.
Acknowledgements
We sincerely thank all participants for their involvement in this study. Their willingness to contribute their time and perspectives was essential to the completion of this research.
Permission to reproduce materials from other sources
N/A
Clinical trial registration
N/A
Recommended Citation
Cong-Lem, N., Nguyen, T. T., Nguyen, K. N., & Nguyen, Q. N. (2026). Development and psychometric validation of a short-form critical thinking scale in generative AI contexts (GenAI–CrT): Evidence from Vietnamese EFL learners. Journal of Educational Technology Development and Exchange (JETDE), 19(1), 1-24. https://doi.org/10.18785/jetde.1901.01
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.
Included in
Curriculum and Instruction Commons, Disability and Equity in Education Commons, Educational Assessment, Evaluation, and Research Commons, Educational Technology Commons, Teacher Education and Professional Development Commons