The HELTASA Digital Teaching and Learning Team hosted their first Digital Dialogue for 2023 on 15 March, entitled “How do we navigate the AI hype together?”. Since November 2022, there have been many discussions and resources published about ChatGPT and the potential impacts of Artificial Intelligence (AI) tools in higher education. The session attracted nearly 150 participants, who discussed what ChatGPT, and AI more generally, mean for learning and teaching in South African higher education institutions. The webinar recording and slides from the session are available.

What is generative AI?

The first invited guest speaker, Neil Kramm, from the Centre for Higher Education Research, Teaching and Learning (CHERTL) at Rhodes University, provided an overview of ChatGPT-3.5, noting that it is only one large language model among the many AI tools available. ChatGPT-3.5 generates novel, human-like text, having been trained on text data from the internet. Among its limitations are that it was trained on data only up to 2021 and is not connected to the internet. It will also hallucinate, i.e. confidently state incorrect information or reproduce biased information. Many of these limitations may be overcome in future versions. GPT-4 was released on the day of the webinar and is claimed to significantly reduce hallucinations compared to GPT-3.5. While ChatGPT-3.5 is currently free to use (with an optional paid tier), GPT-4 is only available through a paid subscription.

What is the point of an academic? Considerations for generative AI

The second invited guest speaker, Prof Michael Rowe, from the School of Health and Social Care at the University of Lincoln in the United Kingdom, offered some considerations for the use of AI in higher education and the potential shift in how we think about ourselves as humans in relation to machines. A key consideration is that while AI tools have limitations, they will improve over time. ChatGPT can pass most licensing examinations for careers such as medicine or law, leading to concerns about students outsourcing exams to machines; a counter-consideration is that many traditional examinations may be neither valid nor reliable. Another consideration is the link between writing and thinking: if someone (or something) else is doing the writing, does this mean that we start to think less? At the same time, advances in AI may give us the opportunity to have our own personal virtual assistants or AI tutors.

Universities have taken different approaches to students using ChatGPT, ranging from outright bans to allowing students to use AI tools provided they disclose that, and how, they have used them. Graduates will be entering workplaces where AI tools are used, so students will need to know how to use these technologies. We need to work towards integrating these tools into education rather than trying to legislate against them, especially when technologies advance so rapidly. Another concern is relying on other machines (such as Turnitin) to identify AI-generated text. How we as institutions respond to AI will signal how we think about and relate to our students: do we see them as partners in learning, or do we focus on ‘catching’ students who may cheat? We need to start providing guidelines for the use of AI tools in learning and teaching, and students need to be involved in these conversations. We also need to (re)consider the role and purpose of academics as lecturers.

How can we navigate AI tools together?

In small groups, participants discussed how to approach AI in learning and teaching. Some of the key discussions related to:

Responding to the influx of AI tools: ChatGPT and other tools are here, so we cannot ignore them. As these technologies evolve rapidly, we need to be aware of them and look for ways to integrate them meaningfully. We cannot wait for university leadership to provide guidelines; we need to start moving, taking a bottom-up approach because we are the institution. Many institutions seek to “police” the use of AI tools instead of thinking about how to support learning. There is also a question of timing: whether AI-related policies are needed at all while these tools are still so new.

Opportunities for using AI tools in learning and teaching: These tools could support student reading and writing by providing feedback on early drafts. They could also be useful in giving second-language speakers more feedback. Some students may feel more comfortable interacting with a machine than with a lecturer.

Concerns for using AI tools in learning and teaching: There are equity and access concerns, as those using the more advanced tools will be those who can afford them, with the potential to exacerbate existing inequalities. There are also challenges with inaccurate information and with outputs that may further entrench Eurocentric perspectives.

Support needed

Prompt engineering will be a key skill for using generative AI tools. Both staff and students need to be aware of the opportunities and limitations of generative AI tools, and likewise of AI detection tools. While we may not know which tools students are using and when, we need to have dialogues with our students about how they are using them.

Looking forward

The opportunities presented by AI tools enable us to revisit or rethink our assessments and their role in learning. There may also be a need for changes to academic integrity policies and to our understanding of plagiarism. In April 2023, Turnitin released its AI detection tool. However, it is only available to lecturers and may not be reliable (it can misidentify human-written text as AI-generated and vice versa), so we need to be discerning and have conversations with students about the purposes of a university degree and about knowledge building in the disciplines.

The HELTASA Digital Learning and Teaching Project Team will build on the discussions arising from this dialogue in our next dialogue, to be held in May 2023, which will consider critical AI literacies in particular. Join us there! Our stance is that we need to continue to share within and across universities so that we can build towards a more informed higher education sector together. We invite HELTASA members to share guidelines for supporting staff and students around the use of AI tools, or to collaborate with us on creating these guidelines together.