As generative artificial intelligence tools become more common in schools, workplaces and other settings, colleges and universities are weighing how to prevent misuse of AI in the classroom while equipping students for the next chapters of their lives after higher education.
A May 2024 Student Voice survey from Inside Higher Ed and Generation Lab found that nearly one-third of undergraduates (31 percent) don’t know, or are unsure, when or how to use generative AI to help with coursework. Among students who did know when to use AI appropriately, the most common source of that direction was faculty (31 percent).
Methodology
Inside Higher Ed’s annual Student Voice survey was fielded in May in partnership with Generation Lab and had 5,025 total student respondents.
The field dates could put the data “a little behind the curve already on how schools have adapted and instituted policies,” says Chuck Lewis, an English professor at Beloit College and director of its writing program. “I think, even as quickly as this fall, I bet those numbers would change pretty significantly nationally.”
The sample includes more than 3,500 four-year students and 1,400 two-year students. More than one-third of respondents are post-traditional (attending a two-year institution or age 25 or older), 16 percent are exclusively online learners and 40 percent are first-generation students.
The complete data set, with interactive visualizations, is available here. In addition to questions about their academics, the survey asked students about health and wellness, college experience, and preparation for life after college.
Experts say providing clear and transparent communication about when AI can or should be used in the classroom is critical and requires faculty buy-in and understanding of related tools.
From Fearful to Future-Looking
Only 16 percent of Student Voice respondents (n=817) said they knew when to use AI because their college or university had published a policy on appropriate use cases for generative AI for coursework.
Students aren’t floundering in confusion without reason: 81 percent of college presidents said in early 2024 that they had yet to publish a policy governing the use of AI, including in teaching and research, according to Inside Higher Ed’s 2024 presidents’ survey.
Similarly, only a minority of provosts (20 percent) said earlier this year that their institution had published a policy governing the use of AI, according to Inside Higher Ed’s 2024 chief academic officers’ report.
When ChatGPT first launched in November 2022, administrators and others working in higher education initially panicked over how students could use the tool for plagiarism.
Slowly, as new generative AI tools have emerged and a growing number of employers have signaled that AI skills may be necessary in the workforce, college and university leaders have turned a corner, treating AI as a career development skill or walking back the use of AI plagiarism detectors, says Afia Tasneem, senior director of strategic research at the consulting firm EAB.
“Just a few months later, there was noticeable recognition that this was not a technology that you could just ban and declare victory and go home,” says Dylan Ruediger, senior program manager of the research enterprise at Ithaka S+R. “And since then, I’ve seen most institutions trying to find frameworks for thinking about generative AI as pedagogically useful.”
In the Classroom
Student Voice data found that when students did know when to use generative AI, it was because at least some of their professors had addressed the issue in class (31 percent) or had included a policy in their syllabus (29 percent).
The biggest challenge in getting students AI ready is getting faculty on board, Tasneem says. A June survey from Ithaka S+R found that two in five faculty members were familiar with AI, but only 14 percent were confident in their ability to use it in their teaching.
“If you look at university policies around student use of generative AI, they will quite often kick that decision to individual instructors and advise students to follow the rules that each instructor gives them,” Ruediger says.
Faculty members generally fall into three camps, Tasneem says: those who require students to use AI, those who prohibit AI use entirely and those who allow limited use when appropriate.
At Beloit College in Wisconsin, the policy is to have no institutional-level policy, says Chuck Lewis, director of the writing program. “Faculty need to develop an informed, transparent and clear policy regarding their own classes and their own pedagogies.”
Like many of his colleagues in writing programs, Lewis was confronted early with the potential of AI in writing and how it could be used to circumvent student effort. But Lewis quickly realized that this technology was larger than reproducing writing samples and could also serve as a tool for deeper thinking.
“AI is an opportunity for us to revisit and maybe rethink or reinforce, but at least to rearticulate, all kinds of things that we think we know or believe about, for instance, learning and writing,” Lewis says. “It defamiliarizes us, in some sense, with our expectations and our norms. It’s an opportunity to go back and think, ‘Well, what is it about relationships?’ In terms of audience and purpose and whatnot.”
One example: In a creative writing course, Lewis and his students debated when it’s OK to let technology produce your writing, such as using suggested replies to a text message or email or sending a message to someone on an online dating site.
“If we can step away from this overdetermined sense of what we think we’re doing in the classroom, and think about these other places where we’re producing and consuming content, it, again, sort of defamiliarizes us with what we want and why.”
In the Student Voice survey, learners at private institutions were more likely to say their professors had a policy in the syllabus (37 percent), compared to their peers at four-year publics (31 percent) or two-year publics (24 percent), which Lewis says may be due to the nature of private liberal arts colleges. “It’s very consistent with our mission and our brand to be very engaged with student processes.”
As colleges and universities elevate generative AI skills as a career competency or a factor that is central to the student experience in higher education, policies remain a challenge.
“As long as individual instructors have ultimate say over how it gets used in their classroom, it’s likely that there will be instructors who prefer not to allow the use of generative AI,” says Ruediger of Ithaka. “The general turn towards thinking about how to leverage generative AI, that’s happened already, and what happens next will largely depend on whether or not people are successful in finding effective ways to use it to actually foster teaching and learning.”
Equity Gaps
Student Voice data highlighted awareness gaps among historically disadvantaged student groups.
Forty percent of students at two-year public institutions said they were not sure about appropriate use, compared to 28 percent of public four-year students and 21 percent of private four-year students.
Adult learners (ages 25 and up) were more likely to say they’re not aware of appropriate use (43 percent) than their traditional-aged (18- to 24-year-old) peers (28 percent). First-generation students were likewise more likely to be unsure about appropriate use cases for AI (34 percent) than their continuing-generation peers (28 percent).
“I think a bad outcome would be to have knowledge about how to leverage this tool become part of the hidden curriculum,” Ruediger says. “It really underscores the need to be transparent and clear, to make sure that it’s fostering equitable use and access.”
Part of this trend could be tied to the type of institution students are attending, Lewis says, with students from less privileged backgrounds historically more likely to attend two- or four-year institutions that have yet to address AI at the faculty level.
It also hints at larger systemic disparities in who is and is not using AI, says EAB’s Tasneem.
Women, for example, are less likely to say they’re comfortable using AI, and people from marginalized backgrounds are more likely to say they avoid using tools such as ChatGPT that can regurgitate racist, sexist, ageist and other discriminatory points of view, Tasneem adds.
Institutional leaders should be mindful of these awareness gaps and understand that graduates who lack AI skills can be displaced in the workplace, resulting in inequities later, Tasneem says.
Around one-quarter of Student Voice respondents said they’ve done their own research to understand when they should use generative AI in the classroom. Men were most likely to say they’d researched appropriate use of ChatGPT on their own (26 percent), while first-gen students, adult learners (20 percent) and two-year students (19 percent) were least likely to say so.
Nontraditional students and first-generation learners are more likely to be uncertain about making choices in their higher education experiences, Tasneem says. “They feel like they don’t know what’s going on, which makes it all the more important for faculty members to be transparent and clear about policies to level the playing field about what’s expected and prohibited. No one should have to do research by themselves or be in doubt about AI use.”
Put Into Practice
As colleges and universities consider how to deliver policy and inform students of appropriate AI use, experts recommend campus leaders:
Survey Says
The vast majority of provosts (92 percent) said faculty or staff have asked for additional training related to developments in generative AI, and, as of May, around three-quarters of institutions had offered training in the previous 18 months to address faculty concerns or questions about AI, according to Inside Higher Ed’s 2024 provosts’ survey.
- Offer professional development and education. To prepare community members for working alongside AI, institutions should offer workshops and training geared toward both students and faculty members, Tasneem says. Only 8 percent of Student Voice respondents (n=413) said they knew of appropriate AI use in their courses because their institution had provided information sessions, trainings or workshops on the subject. “As we learn more and as institutions start using it more for academics and operations, we’ll start to see more tailored training, discipline-specific training,” she predicts.
- Provide sample language. Some colleges have created syllabus templates for professors to adapt and apply to their courses. The University of Washington’s center for teaching and learning has three samples for professors who encourage, prohibit or conditionally allow students to use AI.
- Identify champions. To encourage hesitant faculty members to engage with artificial intelligence tools, administrators can elevate faculty or staff members who are enthusiastic about the technology to bring their colleagues on board, Ruediger says.
- Communicate regularly with students. Appropriate AI use is not a topic that can be covered once and then never revisited, Lewis says. “It can’t just be boilerplate in a syllabus; it has to be tied again and again to specific contexts.” Faculty should investigate different elements of learning, such as researching, brainstorming and editing, and talk about specific ways AI can be applied at various stages of the process.
- Set guiding principles. Application of how AI is used in the curriculum should remain at the professor’s discretion, experts agree. But a college- or universitywide policy can reaffirm the institution’s values and mission for how to approach AI with ethics, Tasneem says.
- Consider academic dishonesty policies. Allowing AI use to be a professor-level decision, while beneficial for teaching and learning, may create some challenges for addressing academic integrity as students navigate differing policies in various courses, Lewis says. “This is about to get much more complicated in terms of the kinds of infractions that are going to come up, because they’re going to be much more variable.”
Should using generative AI be a part of a student’s core curriculum or a career competency? Tell us your thoughts.