Hallucinations in ChatGPT: An Unreliable Tool for Learning

Zakia Ahmad1*, Wahid Kaiser2 & Sifatur Rahim3
1,2,3 Department of English, University of Asia Pacific – UAP, Dhaka, Bangladesh. *Corresponding author.

Rupkatha Journal, Vol. 15, Issue 4, 2023. https://doi.org/10.21659/rupkatha.v15n4.17
Article History: Received: 30 October 2023. Revised: 17 December 2023. Accepted: 18 December 2023. Published: 19 December 2023.
Abstract

ChatGPT has recently been upgraded to a newer version, ChatGPT 3.5, for its unsubscribed users. Although ChatGPT has become an astonishing phenomenon worldwide for creating realistic texts within seconds, it can disseminate wrong information and misconceptions. Technical experts have identified this problem as hallucination. This paper examines ChatGPT's ability to differentiate between correct and incorrect relations in the questions posed to it. It also explores the efficacy of ChatGPT in helping students acquire linguistic and literary proficiency. The study took the form of exploratory interpretive research. The participants were undergraduate students of English. Data were collected through semi-structured interviews, focus group discussions (FGDs), and input provided to ChatGPT, and all data were analyzed qualitatively. The findings indicate that ChatGPT tends to provide inconsistent information when a series of contextual questions is asked. Because of this hallucination, ChatGPT becomes an unreliable source for language and literature learning.

Keywords: ChatGPT, hallucination, language learning, literature learning, reliability.

Sustainable Development Goals: Quality Education
Citation: Ahmad, Z., Kaiser, W. & Rahim, S. (2023). Hallucinations in ChatGPT: An Unreliable Tool for Learning. Rupkatha Journal, 15(4). https://doi.org/10.21659/rupkatha.v15n4.17