Generative AI paradigm shift in Higher Education: balancing myths and realities in assessment marking and design
Conaldi, Guido ORCID: 0000-0003-3552-7307 and Mambrini, Francesco (2023) Generative AI paradigm shift in Higher Education: balancing myths and realities in assessment marking and design. In: SHIFT 2024: 'Inclusive Higher Education: Myths and Realities', 10th - 11th Jan, 2024, University of Greenwich, London. (Unpublished)
Abstract
Expanding on the work of Dwivedi et al. (2023) and De Vita et al. (2023), this study investigates the challenges and opportunities associated with the utilization and potential misuse of AI chatbots in the context of higher education, specifically emphasizing the design and implementation of authentic student assessments. These technologies are on the brink of achieving a level of sophistication that will fundamentally transform conventional approaches to student assessments. This study presents an update of the findings initially shared at the Greenwich Business School Learning and Teaching Festival 2023.
Initial academic responses to generative AI were heavily concerned with its role in academic dishonesty and the broader implications for maintaining academic integrity. Nonetheless, this study argues that the strategic adoption of generative AI tools could catalyze the creation of novel, authentic assessment models that fully integrate and capitalize on advancements in AI. Such integration may inspire future students to critically evaluate the role of AI in their academic programmes and encourage them to use AI tools to enhance their critical thinking, learning outcomes, employability, and ethical practice.
Addressing the challenge of identifying the misuse of AI in academia, the study discusses two main strategies. The first involves the use of AI-powered detection tools, which, despite their promise, come with limitations such as high costs and limited adoption. The second strategy focuses on analyzing linguistic features to detect AI-generated content, emphasizing the importance of educators' familiarity with their students' writing styles.
An experiment is conducted using specific AI chatbots to generate essays, aiming to mimic undergraduate students' writing styles. These AI-generated essays are then compared with actual student submissions from a UK university using computational linguistic analysis. The preliminary findings reveal surprising consistencies in the use of language between the AI-generated and student-written essays, including similar use of word classes and syntactic relations. Interestingly, the AI's writing style shows preferences for longer words and more adjectives, aligning along multiple dimensions with the style of top-graded student essays.
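The kind of stylometric comparison described above can be illustrated with a minimal sketch. The features below (average word length and type-token ratio) are simplified stand-ins for the richer measures the study uses, such as word-class and syntactic-relation frequencies, and the sample sentences are invented for illustration only.

```python
import re
from statistics import mean

def stylometric_features(text: str) -> dict:
    """Compute two simple stylometric features of a text:
    average word length and type-token ratio (vocabulary richness)."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    return {
        "avg_word_length": mean(len(w) for w in words),
        "type_token_ratio": len(set(words)) / len(words),
    }

# Invented example sentences, not taken from the study's corpus.
student_text = "The results show that students learn best when feedback is timely."
chatbot_text = "Consequently, pedagogical methodologies necessitate substantial reconsideration."

print(stylometric_features(student_text))
print(stylometric_features(chatbot_text))
```

In a study like this one, such per-essay feature vectors would be computed over full corpora of AI-generated and student-written essays and then compared statistically; the preference for longer words noted in the findings would show up as a higher average word length for the AI-generated texts.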
In conclusion, the study suggests that as generative AI becomes more entrenched in educational settings, the pedagogical methods employed by educators are likely to face more rigorous examination. Against this backdrop, the authors advocate a forward-looking research agenda in this domain, in which the integration of AI in higher education is seen not as a threat but as an opportunity to rethink assessment methods and foster authentic learning experiences. The authors invite further discussion and insights on the future role of AI in the academic landscape.
Item Type: Conference or Conference Paper (Paper)
Uncontrolled Keywords: artificial intelligence in education; authentic assessment; academic integrity; assessment models; AI detection tools; computational linguistic analysis
Subjects: L Education > L Education (General); Q Science > Q Science (General); Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Faculty / School / Research Centre / Research Group: Faculty of Business
Last Modified: 26 Feb 2024 11:16
URI: http://gala.gre.ac.uk/id/eprint/46003