AI for T&L in Education: Preparing students for an AI future

AI skills and literacies

Keeping it human

Unlike many other technologies, AI's natural language processing ability can create the illusion of human interaction. We give our AI assistants human names like Siri or Alexa. We assign personas and tones in our prompts to generative AIs. However, to preserve our evaluative judgement and critical thinking, it is vital that we and our students understand that AIs don't "understand" the output they provide. They are not accountable, nor can they verify what they produce (Bender et al., 2021).

Why is it vital that we understand AI is not sentient? If we attribute human characteristics to AI, over time we may come to treat these systems as trustworthy advisors in areas of our lives that would benefit more from human advice, develop unhealthy relationships with them, and accept the outputs of all AIs uncritically, leaving us more vulnerable and less likely to bring critical thinking and evaluative judgement to bear on what they produce (Luccioni and Marcus, 2023).

AI Literacies and Skills

PAIR framework: clearly define the Problem, use AI to explore and experiment, and Review the results through critical thinking and reflection.

(Acar, 2023)

Although this framework by researcher Oguz Acar is aimed at student use of AI, the steps are also useful for academics and learning designers who want to use AI effectively and ethically.

Acar emphasises that if you plan to allow your students to use AI, you need a clear policy about its use and you need to learn about the tools yourself. He goes on to say that your design should align with students' prior knowledge, cognitive abilities and past experiences, and that any time you design for the use of AI in learning, you must make sure the AI is accessible: free to use and, ideally, aligned with UDL principles.

The framework, with the acronym PAIR, breaks down as follows:

P - Problem formulation

Understanding the problem you want to solve before you begin sounds simple enough, but often we come up with a solution first and then shop around for how to implement it. Instead, consider the problem you're trying to solve, who you're solving it for, and what success will look like, rather than starting the problem definition with a solution in mind. This allows you to engage in some divergent thinking and cast a wider net. For instance, if students are having trouble with a particular concept, rather than defaulting to creating a video, break down what you mean by "trouble with the concept". Is it the number of questions coming to staff? Students not connecting the concept to earlier learning? Responses that incompletely answer the question posed?

Use this information to craft your prompt (see the Prompting section for information on prompt creation).

AI - Exploring and experimenting with Artificial Intelligence

Once you understand the focus, scope and boundaries of the problem you're trying to solve, the next step is to choose the best AI tool for the job. There are thousands of AI and AI-powered tools available. Some, like Scopus, are excellent for finding research but won't help you create a scenario or a rubric. Others, like OpenAI's ChatGPT or ChatSonic, are excellent for general writing and basic creative tasks, but if you want to incorporate current events, a tool like Bing that is connected to the live internet is a better choice. Specialised tools exist for coding, transcripts, image generation, citation and a wide range of other applications. The Resource tab in this section provides some links to get you started. Make sure you test whether you can access these tools via the University's internet, as some may be blocked.

R - Review using critical thinking and reflection

Your final step in this framework involves applying evaluative judgement and critical thinking to the output. Is the output factually correct? Does it address your specific need and audience? Does the writing flow well? Are the citations and sources credible? Remember you can and should have a back-and-forth conversation with some AI tools to refine the results. But as the content you create should reflect your expertise, ensure you verify statements, test code and record your methodology where appropriate. Reflect on what you brought to the process, and remember that whatever you have just used to aid your work is a tool for competent humans, not a replacement for them.


AI for personalised learning

From helping educators and students to track progress and activity to answering questions about content and concepts, AI is already being used to support educational outcomes for learners. 

Automating supports based on progress and performance

Tools like OnTask allow educators to set up emails tailored to notify and encourage students based on if/then rules. For instance, IF a student hasn't completed key learning activities by a set date, THEN that student is sent an email pre-written by the instructor that encourages them to do so. If a student seems to be struggling with activities related to a particular concept, an email could be automatically generated to suggest additional resources and supports. 
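
The underlying if/then logic can be illustrated with a short sketch. This is a conceptual illustration only, not OnTask's actual interface; the student records, thresholds and messages below are invented for the example.

```python
# Conceptual sketch of if/then personalised messaging (not OnTask's real API).
# Student records, thresholds and message wording are invented for illustration.
students = [
    {"name": "Aisha", "activities_completed": 2, "quiz_score": 35},
    {"name": "Ben",   "activities_completed": 6, "quiz_score": 82},
]

ACTIVITY_TARGET = 5   # activities expected by the checkpoint date
QUIZ_PASS_MARK = 50   # a score below this triggers a support message

def choose_message(student):
    """Apply simple if/then rules to pick a pre-written instructor message."""
    if student["activities_completed"] < ACTIVITY_TARGET:
        return f"Hi {student['name']}, you still have key activities to finish this week."
    if student["quiz_score"] < QUIZ_PASS_MARK:
        return f"Hi {student['name']}, here are some extra resources on this topic."
    return f"Great progress, {student['name']} - keep it up!"

for s in students:
    print(choose_message(s))
```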

Personalised, cognitive-based learning with custom-trained AIs

Because GPT models can be trained or grounded on specific content, or connected to other systems, the outputs they create can be based on credible, high-quality, vetted content.
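
One common way to ground a general model in vetted material is to retrieve a relevant passage and include it in the prompt. The sketch below illustrates the idea with a toy keyword lookup over an invented corpus; it is not how Khanmigo or any particular product is built.

```python
# Minimal sketch of grounding an AI answer in vetted course content.
# The corpus, retrieval step and downstream model call are hypothetical placeholders.
VETTED_NOTES = {
    "neurons": "Neurons transmit signals via electrical impulses and synapses.",
    "synapses": "Synapses are junctions where neurotransmitters carry signals between neurons.",
}

def retrieve(question: str) -> str:
    """Toy keyword retrieval: return the vetted note whose topic appears in the question."""
    for topic, note in VETTED_NOTES.items():
        if topic in question.lower():
            return note
    return ""

def build_prompt(question: str) -> str:
    """Instruct the model to answer only from the retrieved, vetted passage."""
    context = retrieve(question)
    return (
        "Answer the student's question using ONLY the course notes below. "
        "If the notes do not cover it, say so.\n\n"
        f"Course notes: {context}\n\nQuestion: {question}"
    )

print(build_prompt("How do synapses work?"))
# The resulting prompt would then be sent to whichever generative AI tool you are using.
```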

Khan Academy, an online educational platform that provides video lessons and interactive exercises, has done just that with a tool aimed at both educators and students. Educators (who must pay to use it unless their district covers the cost) can seek ideas for developing lesson plans and content based on Khan Academy's materials, and students can use the AI, dubbed Khanmigo, to seek clarity on concepts and materials without having their work done for them.

Eight South Australian high schools have trialled an AI app, designed in partnership with Microsoft, to explore how AI can support learning, with guardrails built in for student safety.

Researchers at UniDistance Suisse (Baillifard et al., 2023) recently published findings from a study implementing an AI tutor, fed with course materials, in a neuroscience course, and found that active engagement with the AI tutor was associated with an average improvement of up to 15 percentile points.

Harvard University is integrating AI into its introductory coding course to help students debug code and to answer basic questions.

AI is also being combined with other learning technologies, such as virtual- and augmented-reality-enabled immersive learning, as is the case with the South Australian educational technology company Lumination; education systems, schools and industry are trialling various applications of these tools.

Organisations are already able to use services like Microsoft's Azure to create and train their own AI models on internal data.

Individuals can do the same, for a variety of purposes. For instance, inventor and futurist Ray Kurzweil has spent a decade or more on a very personal project: connecting with his late father Fred (whom he lost when he was 22) by feeding an AI his father's journals, musical compositions, lecture notes and letters and training it to produce a simulacrum of a conversation with him.

Personalised, cognitive-based learning with general purpose, generative AIs

Even general-purpose, generative AIs can serve as a jumping-off point for further exploration of a topic or concept when fed a deliberately structured prompt, keeping in mind the risks of "hallucinations", inherent biases and limited access to current and/or paywalled data, as well as privacy and IP concerns. There is also the very real concern that students will rely too heavily on AI writing output and miss out on learning the fundamental writing and analysis skills that help them use AI effectively. Educators need to develop AI skills and experience of their own precisely because of these risks, and teaching students to use AI to support learning means integrating AI use into coursework (Mollick and Mollick, 2023).

AI can be used as a feedback generator, a tutor or a team coach, and can even let your students take on the role of teacher. In addition to their excellent YouTube series, Ethan and Lilach Mollick have written a series for Harvard Business Publishing Education on these topics that includes the wording of detailed prompts for these scenarios. The fifth video in their YouTube series, Practical AI for Instructors and Students, serves as an introduction to using general-purpose, generative AI tools for personalised, cognitive-based learning, with guidance for educators on how to effectively guide and integrate the use of AI in the classroom.

AI agents vs Software

Currently, if you want to create a document, presentation or spreadsheet and create or edit images to accompany them, you need three or four different programs. You may need an entirely different device to send photos and messages while you're out and about, to turn your air conditioning on before you get home, to record your ideas for a presentation you've been asked to give, or to confirm you really DO remember that 3 pm dental appointment.

Microsoft co-founder and self-confessed lover of software Bill Gates predicts that in a few years this will change completely: you will simply tell your device, in everyday language, what you want to accomplish, and it will deliver. Depending on your comfort level with cognitive offloading and with sharing your personal information (and on your employer's IP and privacy policies), your device will understand your unique context and needs. He sees a near future in which we all have a truly personal assistant with us at all times.

There are issues that will need plenty of discussion and resolution, and Gates acknowledges this. These include the widening of the digital divide between the haves and the have-nots, privacy, loss of control, the erosion of skills, the vulnerability of systems to attack, and the risk of building our society on technologies that can let us down. We need only look at the Optus outage in early November 2023, which caused cashless small businesses to lose half a day's income or more, took infrastructure such as trains offline and interfered with the operations of dozens of hospitals.


AI and the future of work

Having competent humans in the loop to direct, reject, edit and tweak AI output is still very much desirable, but we are in the early days of AI's public use.

What is in store for entry-level office jobs that involve writing, administration, coding and graphic design over the next few years? The answer is that no one is quite certain yet.

The jobs most affected are those involving activities that AI can do more efficiently than humans: repetitive, data-driven tasks, entry-level programming, analytics, engineering, graphic design, human resources, marketing, advertising, some aspects of customer support, legal research and, of course, teaching and learning design. Some experts (Eloundou et al., 2023) posit that software and tooling built on top of large language models could complete between 47% and 56% of all worker tasks in the US (depending on the industry and application) significantly faster at the same level of quality.

Research conducted with the Boston Consulting Group (Dell'Acqua et al., 2023), covering 18 different tasks performed at the consulting company, found that of the 758 consultants involved in the study, those who used AI outperformed those who did not, by a considerable margin and on every measure of performance. Interestingly, the consultants who scored lowest before using AI saw the biggest increase in performance, indicating that AI use can be a leveller.

However, when posed a problem that the experimenters knew the AI would answer convincingly but incorrectly, many consultants did worse. Another study, on the use of high-quality AI by 181 recruiters (Dell'Acqua, 2023), found that users became over-reliant on the AI and missed potentially qualified applicants.

Researcher Ethan Mollick's article on AI's "Jagged Frontier" looks at both these studies as well as how he uses AI depending on the task at hand. His conclusion thus far is:

"On some tasks AI is immensely powerful, and on others it fails completely or subtly. And, unless you use AI a lot, you won’t know which is which."  

AI prompting

Prompt engineering is how you construct inputs for generative AIs, and it is vital to ensuring the output is suited to your needs. To do this, you need at least a baseline understanding of the topic at hand, as you will need to use the correct terminology and understand the context in which those terms are used. In the case of AI image generation, you may need a basic knowledge of the type of art you wish to generate, including the colour palettes, styles, periods, framing and other visual considerations, in addition to defining the subject, focus, context, actions and settings you want to see in the final product.
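
As an illustration, the elements above can be assembled into a single image-generation prompt. The subject, style and palette below are invented for the example; adapt them to your own context and tool.

```python
# Assembling the elements of an image-generation prompt (all values are illustrative only).
elements = {
    "subject": "a university lecture theatre",
    "action": "students collaborating in small groups",
    "style": "watercolour illustration",
    "palette": "warm autumn tones",
    "framing": "wide shot",
}

# Combine the elements into a single prompt string for an image-generation tool.
prompt = (
    f"{elements['subject']}, {elements['action']}, "
    f"in the style of a {elements['style']}, {elements['palette']}, {elements['framing']}"
)
print(prompt)
```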

Different AI tools can produce different results. Even the same AI tool can and will produce different outputs from the same input. Practice is key to getting the results you want. 

Some basic rules of prompting are listed below, with a short worked example after the list:

  • Be concise - avoid colloquialisms and slang, use direct language, avoid trying to complete multiple tasks at once 
  • Target word complexity - specifying the level of vocabulary and complexity helps the AI pitch the output at the right level of sophistication
  • Provide context - provide the AI with a persona (a description of a person it should act as), the audience, the situation
  • Specify the output format - with a text AI - are you looking for quiz questions, a rubric, some bullet points from a transcript, a text description of an image, a summary of a document, a research question or paper title, a video script from your notes?
  • Parameters - is there any specific style or tone you're after? An artistic school or style of image? Background colour? How long do you want the output to be in terms of words, minutes, pages, number of bullet points?
  • Ask the AI if it has questions before it starts - you can choose whether to do this, depending on your context, but the AI can come back with surprising questions that help it narrow the context
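
Below is a minimal sketch of a text prompt that applies these rules; the persona, audience, task and constraints are invented for illustration and should be adapted to your own context.

```python
# Building a structured prompt that applies the rules above (all details are illustrative only).
persona = "an experienced first-year biology tutor"
audience = "students encountering cell division for the first time"
task = "write five multiple-choice quiz questions with answers"
parameters = "keep each question under 30 words; use plain, direct language"

prompt = (
    f"Act as {persona}. Your audience is {audience}. "
    f"Task: {task}. Constraints: {parameters}. "
    "Before you start, ask me any clarifying questions you need."
)
print(prompt)
```

Pasting the resulting prompt into a generative AI tool and iterating on it is a quick way to see how each element shifts the output.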

This video from the Wharton School's Ethan and Lilach Mollick demonstrates some AI prompting techniques. The key to becoming proficient at AI prompting is practice.

Video: Practical AI for Instructors and Students Part 3: Prompting AI (Length: 11:41)

Note: since this was written, ChatGPT has rolled out "custom GPTs" for paid subscribers. This functionality allows users to create persistent, contextualised personas that they can share with others. Paid subscribers can also upload PDFs and images for analysis by ChatGPT.