The introduction of Generative AI (GenAI) tools such as ChatGPT or Copilot into university courses can be both exciting and terrifying for course convenors. The potential for innovation is attractive, but the decision may trigger many unanswered questions and unforeseen challenges. It may not only transform – for better or worse – the learning experience of students, but also take professors on a change journey alongside them.
This blog presents a collaborative reflection on the transformative change journey that a group of postgraduate LSE students and their course leader experienced when the use of GenAI tools was first introduced as a pilot project on a change management course (Leading Organisational Change) in 2023-24. It explores the impact of this change, not just on the students’ learning experience, but also on the overall dynamics of the course.
When people go through changes, they adapt to the new situation by experiencing emotional transitions. The Change Curve – a concept commonly used in change management – can help organisations predict how people might react to change and provide timely support for those who need it. Drawing on Elisabeth Kübler-Ross’s book On Death and Dying (1969), the model suggests that people experiencing change pass through emotional stages similar to those of grief: denial, anger, bargaining, depression and acceptance. The experiences of the students and their course leader closely reflected these stages on the change curve.
1) Shock & Denial
When GenAI was introduced on the course in January 2024, students were allowed to choose whether to use it in lessons and assessments. This was a novel approach, especially considering the broadly restrictive stance prevalent across the wider sector at the time. In this context, the course’s new policy surprised students. Despite an initial survey showing widespread GenAI usage among them, reactions to the new policy were mixed.
Although universities were keen to update their policies after committing to support students and faculty in becoming AI literate, the high-level guidelines that most adopted left plenty of room for interpretation. While flexible guidelines promote academic freedom and adaptability to different disciplines and student cohorts, they may also result in a lack of transparency, inconsistent execution and many grey areas. Perhaps this ambiguity made students hesitant about whether, and how openly, they could use chatbots in their learning. Although some students said that the open-use policy alleviated their previous concerns about academic integrity, others were sceptical, worrying that acknowledging the use of chatbots could disadvantage them.
Student Reflections:
“From our perspective, being able to use GenAI quite freely was shocking for several of us, as it was contrary to what we experienced in many other courses. While some course leaders had allowed GenAI for grammatical improvements or as a learning and research aid, others banned it completely and regarded its use as plagiarism, copyright infringement and/or a violation of LSE’s policy. Given this ambiguity, not only at LSE but in the education sector in general, anxiety was quite high, transparency and trust comparatively low, and a clear path yet to be seen. The weekly media coverage of new GenAI inventions and concerns further contributed to this uncertainty. It felt like being torn between embracing a global change and navigating the unknown, or staying behind for fear of sanction.”
2) Frustration and Depression
Allowing students to openly use GenAI platforms came with a number of challenges that the course convenor had to address. Besides writing a new policy to outline the rules of ethical and responsible GenAI use, the course leader had to design new processes for assignment submissions, create new forms for acknowledging GenAI usage, and develop fair marking criteria.
At the beginning of the journey, these unforeseen and often conflicting tasks can trigger despair, stress and frustration in course leaders. As soon as one administrative issue is resolved, another may appear, growing in scale and scope. Students may need additional support and guidance, while course leaders may themselves struggle to keep pace with a fast-moving technological environment.
Student Reflections:
“We experienced frustration with GenAI policies differing across courses. To act correctly, one always had to go back and check which uses were permitted on the course at hand. Another point of resentment was with the GenAI tools themselves. For example, they would either provide completely wrong responses or offer answers in such a convincing way that it might hinder critical second-guessing. Further frustration arose from the limited use cases for GenAI.

Although it helped with answering questions that one would otherwise have typed into a search engine like Google, certain useful features, such as summarising papers or slides, were constrained by copyright regulations. Therefore, figuring out how best to use GenAI platforms in our learning was a challenge. The LSE-wide introduction of the data-protected version of Copilot in April 2024 might make a positive difference in helping students effectively and ethically integrate GenAI into their learning in the coming academic years. However, this was not available during our course.”
3) Exploration and Decision
As the course progressed, the challenges shifted from administrative tasks and procedural issues to marking and moderation. Because students could decide whether or not to use GenAI in their assignments, it was important to ensure that the marking criteria treated submissions equitably, irrespective of whether GenAI had been used.
Marking assignments in which students have openly used GenAI assistance poses challenges for lecturers, who may be less familiar with GenAI tools. They might find it more difficult to give students feedback on the appropriate use of chatbots, or could miss spotting unethical use. Assignments that consistently show linguistic proficiency yet contain logical and analytical flaws may take longer to evaluate, especially for educators teaching in a language that is not their first. When marking is done by a teaching team with varying familiarity with GenAI and different first languages, managing consistency becomes even harder for course leaders. These demands might set course leaders back to the ‘depression’ stage of the change curve many times during implementation. The shift towards this more proactive approach to GenAI on university modules therefore requires significant time, effort and energy that goes beyond an often already stretched academic workload. Institutions must ensure that course convenors receive the necessary support as they take on this challenge; otherwise, successful GenAI integration will be far less likely. The final decision on whether to embark on this change journey may well depend on the support that course convenors receive.
Student Reflections:
“Through regular exchanges with our course leader, we gained more trust in the course’s AI policy. We embraced experimenting with this technology and learned where it could best support us, where it had been ridiculously unhelpful, and where the ethical considerations lay. We learned that when these tools are used within specific frameworks and predefined ethical guidelines, they can contribute to one’s learning process and strengthen the outcome. Furthermore, as our course was on change management, experiencing the implementation of GenAI in higher education allowed us to witness a profound change at first hand. This helped us reflect on change theories in a new way and primed us for leading change in our future workplaces.”
4) Integration
Much has changed since January 2024 in how universities and the broader higher education sector approach GenAI. On this course, the pilot project was successful and rewarding. Although the transition to a more proactive approach towards GenAI brought some obstacles, it also created new ways to connect and collaborate with students in exploring how to adapt our teaching and learning environment to the new GenAI-enabled context we live in.
Although change implementation appears linear on the change curve, a successful AI transformation requires constant adaptation and critical, self-reflective learning. Nor is change implementation complete for this course, even though our joint educator-student change journey has now ended. Drawing on the pilot, further changes will be needed in the next academic year to integrate what we have learned. By sharing our reflections and joint insights with the academic community, we hope to help those who are still hesitating over whether to embark on this journey, and to contribute in a small way to the cultural and pedagogical transformation that the whole academic sector is experiencing.