by Caroline Chun
It’s no longer news that education was one of the first professional sectors threatened by the rapid adoption of ChatGPT and generative AI: universities and colleges around the country convened emergency meetings to discuss the risk of students using AI to cheat on their work. But there’s another side to that evolving AI story. Recent research from professors at the University of Pennsylvania’s Wharton School, New York University and Princeton suggests that educators should be just as worried about their own jobs.
In an analysis of professions “most exposed” to the latest advances in large language models like ChatGPT, eight of the top 10 are teaching positions.
“When we ran our analysis, I was surprised to find that educational occupations come out close to the top in many cases,” said Robert Seamans, co-author of the new research study and professor at NYU.
Post-secondary teachers in English language and literature, foreign language, and history topped the list among educators.
Jobs most ‘exposed’ to generative AI
The table shows the jobs that are most likely to encounter generative AI as part of their responsibilities.
2. English language and literature teachers
3. Foreign language and literature teachers
6. Philosophy and religion teachers
8. Political science teachers
9. Criminal justice and law enforcement teachers

Note: All teaching positions listed are at post-secondary institutions. Numbering reflects each occupation’s rank in the study’s overall exposure list.
Source: “How will Language Modelers like ChatGPT Affect Occupations and Industries?” by Ed Felten (Princeton), Manav Raj (University of Pennsylvania) and Robert Seamans (New York University)
While evidence has been growing in recent years that work in highly skilled professions — law, for example — may be influenced by AI, technology has typically been expected to hit routine or rote jobs hardest, with highly skilled labor considered more protected.
But this study finds the opposite to be the case.
“Highly-skilled jobs may be more affected than others,” said Manav Raj, co-author and professor at the University of Pennsylvania’s Wharton School.
But being among the most affected jobs — or, as the study officially describes it, the jobs most “exposed” to AI — does not necessarily mean those human positions will be replaced.
“ChatGPT can be used to help professors generate syllabi or to recommend readings that are relevant to a given topic,” said Raj, who is not currently worried about replacement. It can also design educational slides and in-class exercises. And for topics that are very dense, “ChatGPT can even help educators translate some of those lessons or takeaways in simpler language,” he said.
Education technology company Udemy has been selling language learning modules made with ChatGPT to help language teachers design their courses.
Duolingo, the popular online language learning company, is relying on AI technology to power its Duolingo English Test (DET), an English proficiency exam available online, on demand. The test utilizes ChatGPT to generate text passages for reading comprehension and AI for supporting human proctors in spotting suspicious test-taking behavior.
It is also working with teachers to generate lesson content and speed up the process and scale of adding advanced materials to the platform. “Since not everyone in the world has equal access to great teachers and favorable learning conditions, AI gives us the best chance to scale quality education to everyone who needs it,” said Klinton Bicknell, Duolingo’s head of AI.
What college professors are thinking and doing
Some professors are wary of ChatGPT and its capabilities.
Kristina Reardon, an English professor at Amherst College, says professors must draw a line when using ChatGPT, particularly around its role as a co-author in writing.
“No matter how good ChatGPT gets, I believe we can gain a lot from learning to pre-write, draft, revise, edit, etc. Writing is a process, and it’s an iterative one, and one that helps us think through ideas,” she said.
Many universities have sent professors guidance on using ChatGPT to augment their students’ experiences while still maintaining academic integrity.
Princeton advises professors to be explicit in their syllabi about acceptable uses of ChatGPT, to use it to enhance small-group discussions, and to use it as a tool for comparing students’ own essay drafts with a ChatGPT version.
At Cornell University, every instructor will be free to decide what works best for their own area of teaching, regardless of general university guidelines, says Morten Christiansen, a psychology professor at the school.
Many professors are starting to use ChatGPT in the classroom.
Laurent Dubreuil, professor in French and comparative literature at Cornell, is currently having his students assess the boundaries of academic freedom and censorship, as the most recent versions of ChatGPT are “now coming with set parameters about what is socially and politically acceptable to say — and what should not be uttered.”
Christiansen says ChatGPT can help level the playing field among students. “It can be used as a personal tutor to help them, and there’s an opportunity for students to evaluate what ChatGPT produces,” he said.
In fact, the technology’s imperfections are an opportunity to teach and learn in new ways, honing students’ critical analysis skills by prompting them to ask ChatGPT specific questions related to course content and critique the answers given back to them. Many current generative AI language models produce what AI experts have deemed “hallucinations.”
“ChatGPT will make things up and it will look like it is really confident in what it is saying, including adding references that don’t actually exist,” Christiansen said.
Ethan Mollick, an entrepreneurship professor at Wharton who has become an evangelist within the education world for generative AI experimentation, expects his students to use ChatGPT in every document they produce, whether marketing materials, graphics, blog posts or even new working apps.
“I think we have to realize it’s part of our lives, and we have to figure out how to work with that,” he said.
Mollick does not think exposure means eventual replacement.
“We have to recognize that we need to change how we approach things and embrace this new technology,” he said. “We’ve adapted to other technological changes, and I think this one we will adapt to as well.”